### HackerRank Deque-STL

Given an array of integers, find the maximum value in each of its contiguous subarrays of size k. This HackerRank problem is meant to be solved in C++ and, as its name suggests, using a deque.

For instance, if we are given the array {3, 4, 6, 3, 4} and k is 2, we have to consider four subarrays of size 2:
`{3,4} {4,6} {6,3} {3,4}`
And the expected solution is
`{4, 6, 6, 4}`

The original HackerRank problem asks for a function that outputs its result to standard output. I didn't much like this requirement. As a TDD developer, I'm used to letting tests drive the code development, and having to check standard output to verify a function's behavior is no fun. So I slightly changed the function signature, making it return a vector containing the results, and I used the original function as a simple adapter to the original problem. Something like this:
```cpp
std::vector<int> maxInSubs(int data[], int n, int k)
{
    // ...
}

// ...
void printKMax(int arr[], int n, int k)
{
    auto data = maxInSubs(arr, n, k);
    std::copy(data.begin(), data.end(), std::ostream_iterator<int>(std::cout, " "));
    std::cout << '\n';
}
```
#### First (naive) attempt

Just do what we are asked to do: for each subarray, find its maximum value and push it to the result vector.
```cpp
std::vector<int> maxInSubs(int data[], int n, int k)
{
    std::vector<int> results;
    for (int i = 0; i < n - k + 1; ++i)
    {
        results.push_back(*std::max_element(data + i, data + i + k));
    }
    return results;
}
```
Clean and simple and, when k and n are small, not even too slow. However, since each window is scanned in full, the cost is O(n·k); for k comparable to a large n we can say bye-bye to performance.

#### Patched naive attempt

We could be tempted to salvage the algorithm explained above, observing that the slowness is due to the n - k + 1 calls to max_element(). We can avoid a substantial number of those calls by checking the values of the elements leaving and entering the current window, for instance in this way:
```cpp
std::vector<int> results{ *std::max_element(data, data + k) };  // 1

for (int beg = 1, end = k + 1; end <= n; ++beg, ++end)  // 2
{
    if (data[end - 1] > results.back())  // 3
    {
        results.push_back(data[end - 1]);
    }
    else if (data[beg - 1] < results.back())  // 4
    {
        results.push_back(results.back());
    }
    else  // 5
    {
        results.push_back(*std::max_element(data + beg, data + end));
    }
}
```
1. Initialize the result vector with the max element for the first interval.
2. Keep beg and end as loop variables, describing the current window to check.
3. The new right element of the window is bigger than the max for the previous window. Surely it is the max for this one.
4. The element that has just left the window is smaller than the previous max. Surely the max is still in the window.
5. Otherwise, we'd better check which is the current max.

A smartly designed input array could defeat this simple algorithm. However, on HackerRank they didn't spend too much time on this matter, and this solution is accepted with full marks.

#### Solution with a deque

For a more elegant solution, we should minimize the repeated checks we perform on the data elements. Right, but how? Until this moment I hadn't paid attention to the huge hint HackerRank gives us: "Use a deque!", they shout from the name of the problem itself.

The point is that I want to perform a cheap cleanup on each window, so that I can just pick the right element from it without scanning the entire interval.

Let's use the deque as a buffer that stores only the reasonable candidates for the maximum. Since we want to remove from this buffer the candidates that are no longer valid when the window moves, we keep in it their indices into the original data array rather than their values.

Here is how I initialize it:
```cpp
std::deque<int> candidates{ 0 };  // 1
for (int i = 1; i < k; ++i)
{
    pushBack(candidates, data, i);  // 2
}
```
1. We could safely say that the first element in data is a good candidate as max for its first subarray.
2. Push the index i back into candidates, but first ensure the previous candidates are still useful.

Since the code in pushBack() is also going to be used afterward, I made a function of it:
```cpp
void pushBack(std::deque<int>& candidates, int data[], int i)
{
    while (!candidates.empty() && data[i] >= data[candidates.back()])  // 1
        candidates.pop_back();
    candidates.push_back(i);
}
```
1. There is no use keeping a candidate when the newcomer is at least as big, so remove it.

Now the front of candidates holds the index of a maximum element of the first window on data; behind it there may be indices of smaller elements that could become the max when the window moves. Possibly the deque holds just one element, but it is certainly not empty.

We are ready for the main loop:
```cpp
for (int i = k; i < n; ++i)
{
    results.push_back(data[candidates.front()]);  // 1

    if (candidates.front() <= i - k)  // 2
        candidates.pop_front();

    pushBack(candidates, data, i);  // 3
}
results.push_back(data[candidates.front()]);  // 4
```
1. As said above, we know that candidates is not empty and its front is the index of a max value in the current window. Good. Push it to results.
2. Now we prepare for the next window: if the front candidate has fallen out of it, we remove it.
3. Push the index of the new element among the candidates, following the algorithm described above. This kills the candidates that are not bigger than the newcomer, leaving a deque whose front is surely the index of the biggest element.
4. Remember to push the last candidate in the results, and then the job is done.

Does this solution look more convincing to you? Full C++ code and test case on GitHub.