We are given an integer representing a total amount, and we want to reach it by adding coins of given denominations. Our job is to find the minimum number of coins required.
If the denominations guarantee a safe greedy choice at every step, the greedy solution is the best one. However, when, as in this case, the denominations are [1, 3, 4], we have to look elsewhere.
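To see why greedy fails here, consider a sketch of the standard greedy strategy (a hypothetical helper, not part of the original post): always take the largest coin that fits.

```python
def greedy_coins(target, denominations):
    """Greedy strategy: repeatedly take the largest coin that fits.
    Shown only to demonstrate it is suboptimal for [1, 3, 4]."""
    count = 0
    for coin in sorted(denominations, reverse=True):
        count += target // coin  # take as many of this coin as possible
        target %= coin
    return count

# For a target of 6, greedy picks 4 + 1 + 1, i.e. 3 coins,
# while the optimum is 3 + 3, i.e. only 2 coins.
print(greedy_coins(6, [1, 3, 4]))
```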
A plain recursive approach is correct, but it leads to exponential time complexity.
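Such a recursive solution could look like this (my sketch, not the post's code): it gives the right answer, but recomputes the same subproblems over and over, hence the exponential blowup.

```python
def min_coins_rec(target, denominations=(1, 3, 4)):
    """Naive recursion: minimum coins to form `target`.
    Correct, but exponential - subproblems are recomputed repeatedly."""
    if target == 0:
        return 0
    return 1 + min(min_coins_rec(target - coin, denominations)
                   for coin in denominations if coin <= target)
```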
Let's solve it instead using a Dynamic Programming approach:
```python
DENOMINATIONS = [1, 3, 4]

def min_coins(target):
    cache = [0] * (target + 1)          # 1
    for i in range(1, target + 1):      # 2
        cache[i] = cache[i-1] + 1       # 3
        for coin in DENOMINATIONS[1:]:  # 4
            if coin <= i:
                other = cache[i-coin] + 1
                cache[i] = min(cache[i], other)
    return cache[-1]                    # 5
```

1. Each step needs to look at previous results, so we keep them in a cache. Notice that we also keep a dummy extra value at index zero; it is not strictly necessary, but it keeps our code simpler.
2. Let's loop on all the "real" elements; the zeroth one, the dummy, is left untouched with its original value of 0.
3. Assuming that "1" is among the available denominations, we can always select it, so every amount is reachable. This is usually a safe assumption; without it the code would have to be more complex, and we would also have to handle amounts that cannot be formed at all.
4. Let's check all the other denominations. If the current coin fits, fetch from the cache the result for the remaining amount, add one to it, and store it at the current position if it is less than the value calculated so far.
5. Finally return the last calculated value.
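If we cannot assume that denomination "1" is available, a common variant (a sketch of mine, not the post's code) initializes the cache with infinity instead of leaning on `cache[i-1] + 1`:

```python
import math

def min_coins_any(target, denominations):
    """DP variant that does not assume denomination 1 exists.
    math.inf in the result means the target cannot be formed."""
    cache = [0] + [math.inf] * target
    for i in range(1, target + 1):
        for coin in denominations:
            if coin <= i:
                cache[i] = min(cache[i], cache[i - coin] + 1)
    return cache[-1]

print(min_coins_any(6, [1, 3, 4]))  # 2, i.e. 3 + 3
print(min_coins_any(5, [2, 4]))     # inf: odd amounts are unreachable
```

The extra cost is only the initialization; the double loop is the same as in the version above.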
I pushed the test case and Python script to GitHub.