Given a matrix consisting only of 0s and 1s, with each of its rows sorted (i.e. all 0s appear before the 1s), the problem is to efficiently find the maximum number of 1s present in any row.

For example consider the following matrix.

0 0 1 1

0 0 0 1

0 1 1 1

0 0 0 0

The maximum number of 1s in any row is **three**, which appears in the third row.

The general approach is to find the number of 1s in each row and update the maximum.

There are two ways to find the number of 1s.

- One simple approach is to linearly search for the first 1 and count the 1s from there. This takes O(n) time per row in the worst case, so the overall time complexity for an n × n matrix is O(n^2).
- A faster method is to binary search for the first 1, since each row is sorted. This takes only O(log n) time per row, so the overall complexity is O(n log n).

An even more efficient approach is as follows.

- Start with the rightmost element in the first row and count the number of 1s until we reach a 0.
- Move to the next row and start checking elements from the previous column.
- If the element is 0, just move on to the next row.
- Otherwise, count the 1s until we reach a 0 or the first column.

- Repeat the above steps for all the rows. At the end, we have the maximum number of 1s.

This approach runs in just O(n + m) time, where n and m are the number of rows and columns respectively, because the column pointer only ever moves left across the whole traversal.

Here is the C++ implementation of the above. I have used the lower_bound() method from the standard library to find the leftmost occurrence of a 1. See my previous post to understand how this method works.