
Saturday, May 23, 2015

Max area in histogram [Solution]

The problem:

Given an array A of integers, A[1], A[2], …, A[n], find a contiguous sub-array A[i], …, A[j] such that (j − i + 1) × min{A[i], …, A[j]} is maximum. (If the A[k] are bar heights, this is the area of the largest rectangle that fits inside the histogram, hence the title.)

Solution

This is a classic ACM programming contest problem.

A bunch of solutions can be found here: http://www.informatik.uni-ulm.de/acm/Locals/2003/html/judge.html

Look at Problem H.

But the reason for this blog post is to present a different solution, using Cartesian trees.

A Cartesian tree for an array A is a binary tree that is heap-ordered on the elements of A, and whose in-order traversal gives back the array in order: A[1], A[2], …, A[n]. See the diagram below (swiped from the wiki page).

[Diagram of a Cartesian tree, from the Wikipedia page]

These trees can be constructed in O(n) time (see the wiki page for more details).
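For concreteness, here is a minimal sketch of the standard stack-based O(n) construction, in Python (the min-heap version we need below; the Node class and function names are just illustrative):

class Node:
    def __init__(self, value):
        self.value = value  # the element of A stored at this node
        self.left = None
        self.right = None

def build_cartesian_tree(A):
    # Min-heap Cartesian tree: in-order traversal yields A, and every
    # node's value is <= all values in its subtree.
    stack = []  # the rightmost spine; values increase from bottom to top
    for x in A:
        node = Node(x)
        last = None
        # Pop spine nodes with larger values; the last one popped
        # becomes the left child of the new node.
        while stack and stack[-1].value > x:
            last = stack.pop()
        node.left = last
        if stack:
            stack[-1].right = node  # attach as the new rightmost node
        stack.append(node)
    return stack[0] if stack else None  # bottom of the stack is the root

Each element is pushed once and popped at most once, so the whole construction is O(n).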

To solve the problem, we use divide and conquer.

First, we construct a Cartesian tree of A that is a min-heap, so the root holds the minimum element of A.

Now notice that if A[m] is the minimum element of A, then the optimum solution is either n × A[m] (take the whole array), or the optimum solution for one of A[1], …, A[m−1] and A[m+1], …, A[n]. (Any sub-array containing A[m] has minimum A[m], so it might as well be the whole array.)

The Cartesian tree of A gives us m (the root), and the Cartesian trees for A[1], …, A[m−1] (the left subtree) and A[m+1], …, A[n] (the right subtree)!

Thus, if we annotate each node N of the Cartesian tree with the number of nodes N_c in its subtree, we just need to compute the max of N.value × N_c over all nodes, where N.value is the element of A stored at N.
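Here is a sketch of that final pass in Python, building on the construction above (max_rectangle is my name for it, and the traversal is iterative post-order so deep trees don't hit Python's recursion limit):

def max_rectangle(A):
    # Max over all nodes of (subtree size) * (node value).
    root = build_cartesian_tree(A)
    if root is None:
        return 0  # convention for an empty array
    best = float("-inf")  # also handles arrays with negative entries
    size = {}  # id(node) -> number of nodes in that node's subtree
    stack = [(root, False)]
    while stack:
        node, processed = stack.pop()
        if node is None:
            continue
        if processed:
            # Children are done; their sizes are already recorded.
            s = 1 + size.get(id(node.left), 0) + size.get(id(node.right), 0)
            size[id(node)] = s
            best = max(best, s * node.value)
        else:
            stack.append((node, True))
            stack.append((node.left, False))
            stack.append((node.right, False))
    return best

print(max_rectangle([3, 1, 2]))  # 3: either [3] alone or the whole array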

This is an O(n) time solution.

For another application of Cartesian trees, try solving this puzzle on this blog: Construct BST.
