Here is the link to the original problem. Let us try to simplify the problem statement a little bit.
There is a dining house with an infinite number of guests but a finite number of pancakes. Each guest holds a plate which is either empty or has some pancakes on it. Every minute, one of the following actions happens.
- Each guest with a non-empty plate eats one pancake.
- The server declares it a special minute: all guests stop eating, and the server chooses a guest with a non-empty plate and transfers some of that guest's cakes to another guest's plate.
The problem is to choose the special minutes so that the total time to finish all the cakes is minimized.
We are given the number of pancakes on each plate initially.
Let us walk through a small example.
[1, 2, 4] – If we don’t declare any special minutes, all the cakes will be finished in 4 minutes, the maximum number of cakes on any plate.
What if we declare a special minute and move 2 cakes from the 3rd guest’s plate to an empty plate?
[1,2,2,2] – Now the eating takes only 2 minutes, plus 1 special minute, so all the cakes are finished in 3 minutes.
Note that we cannot reduce the time any further: to bring the eating time down to 1 minute we would have to declare 3 more special minutes (one to split each plate holding 2 cakes), which costs more than the single eating minute we would save.
Note that a special minute is only ever useful for moving cakes onto an empty plate: since there are infinitely many guests, an empty plate is always available, and adding cakes to a non-empty plate can never reduce the eating time. For example, merging [2, 2] into [4] turns 2 eating minutes into 4.
A simple brute force algorithm suffices because the constraint is small (at most 1000 pancakes on any plate).
- Calculate the frequency of each plate size, i.e. how many plates hold each number of pancakes (this lets the inner loop run over distinct sizes instead of individual plates).
- For each candidate chunk size between 1 and 1000:
- Split every plate into chunks of at most size cakes by moving the excess to empty plates; a plate with p[i] cakes needs ceil(p[i] / size) - 1 moves. Sum up these moves over all plates.
- Add size to the number of moves; this is the total time to finish for this candidate.
- Update the minimum time (see the sketch after this list).
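To make the per-size cost concrete, here is a small C++ helper (my own sketch; the name minutesFor and the plates vector are illustrative, not from the original post):

```cpp
#include <vector>

// Time to finish if no plate may keep more than `size` cakes:
// a plate of p cakes is split into ceil(p / size) chunks, which costs
// ceil(p / size) - 1 special minutes; eating then takes `size` minutes.
int minutesFor(const std::vector<int>& plates, int size) {
    int moves = 0;
    for (int p : plates)
        moves += (p + size - 1) / size - 1;  // ceil(p / size) - 1
    return moves + size;
}
```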
Let us trace the algorithm for the above example.
[1,2,4]
- Divide them into size-1 chunks – [1,1,1,1,1,1,1] – 4 moves + 1 = 5 minutes.
- Divide them into size-2 chunks – [1,2,2,2] – 1 move + 2 = 3 minutes.
- Divide them into size-3 chunks – [1,2,3,1] – 1 move + 3 = 4 minutes.
The minimum among them is 3 minutes. For simplicity I have not shown the divisions for sizes greater than 3: size 4 gives 0 moves + 4 = 4 minutes, and larger sizes only give larger values.
In any case, the answer never exceeds the maximum number of cakes on any plate, since declaring no special minutes finishes in exactly that many minutes.
Here is a C++ implementation of the same idea, written as a minimal self-contained sketch; the input format assumed here (n followed by the n plate sizes) may differ from the judge's, so adapt the I/O as needed.
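```cpp
#include <algorithm>
#include <iostream>
#include <vector>

int main() {
    int n;
    std::cin >> n;
    std::vector<int> plates(n);
    for (int& p : plates) std::cin >> p;

    int maxP = *std::max_element(plates.begin(), plates.end());
    int best = maxP;  // no special minutes: eating alone takes max(p) minutes
    for (int size = 1; size <= maxP; ++size) {
        int moves = 0;
        for (int p : plates)
            moves += (p + size - 1) / size - 1;  // ceil(p / size) - 1
        best = std::min(best, moves + size);
    }
    std::cout << best << '\n';
    return 0;
}
```

For the example above (entered as 3 followed by 1 2 4), this prints 3, matching the trace.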
Note: I did not solve this problem during the competition. I was inspired by the solutions of those who solved it successfully (user: kyc); I only composed this solution analysis. If somebody has a better solution, please feel free to post it in a comment.