This problem is from an ACM programming contest and has a number of good solutions, so don't search the web too quickly.
The problem is as follows:
You are given a histogram, and want to find the maximum area rectangle that can be drawn within that histogram.
Basically, given an array $A[1], A[2], \dots, A[n]$ of, say, integers, find a contiguous sub-array $A[i], \dots, A[j]$ such that $(j-i+1)\times \min \{A[i], \dots, A[j]\}$ is maximized.
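For example, with $A = (2, 1, 4, 5, 1, 3, 3)$, the best choice is $i = 3, j = 4$: the area is $2 \times \min\{4, 5\} = 8$, which beats taking the whole array ($7 \times 1 = 7$).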
An $O(n \log n)$ solution is acceptable, but $O(n)$ solutions exist.
[Solution]
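One well-known $O(n)$ technique uses a monotonic stack: scan left to right, keeping a stack of indices of bars with increasing heights; whenever a bar is popped, you have just discovered the widest window in which that bar is the minimum. The C++ sketch below illustrates this idea. It is one standard approach, not necessarily the intended solution; it assumes non-negative heights, and the function name and sample input are illustrative.

```cpp
#include <bits/stdc++.h>
using namespace std;

// Largest rectangle in a histogram in O(n) with a monotonic stack.
// For each bar A[k], find the widest window in which A[k] is the
// minimum; the answer is the best A[k] * width over all k.
long long largestRectangle(const vector<long long>& A) {
    int n = (int)A.size();
    stack<int> st;          // indices of bars with increasing heights
    long long best = 0;
    for (int i = 0; i <= n; ++i) {
        // Position n acts as a sentinel of height 0 so everything pops.
        long long h = (i == n) ? 0 : A[i];
        while (!st.empty() && A[st.top()] >= h) {
            long long height = A[st.top()];
            st.pop();
            // The popped bar is the minimum from just past the new
            // stack top up to i-1.
            int left = st.empty() ? 0 : st.top() + 1;
            best = max(best, height * (i - left));
        }
        st.push(i);
    }
    return best;
}

int main() {
    vector<long long> A = {2, 1, 4, 5, 1, 3, 3};
    cout << largestRectangle(A) << "\n";   // prints 8 (bars 4 and 5)
    return 0;
}
```

Each index is pushed and popped at most once, which gives the $O(n)$ bound; the sentinel bar of height 0 at position $n$ flushes the stack at the end.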