Consider the coin change problem with denominations (c^k, c^(k-1), ..., c^2, c, 1) for integers c >= 2 and k >= 0. Prove that the greedy algorithm G, which always picks the largest possible coin, produces a shortest change sequence.

-----------------------------

Solution:

Claim 1: the optimal solution for changing n has the form d = (d_k, d_(k-1), ..., d_2, d_1, d_0), where the d_i are the "digits" in the c-ary representation of n, i.e.

  0 <= d_k        (no upper bound)
  0 <= d_i < c    for i = 0 .. k-1

Example: c=3, k=2, n=70 gives d = (7, 2, 1), since 7*9 + 2*3 + 1 = 70.

d is unique (like regular c-ary number representations) and can be computed by iteratively taking the remainder when dividing by c (d_i = n_i % c) and continuing with the quotient n_{i+1} = |_ n_i / c _| until i = k-1; d_k is then n_k.

Proof of Claim 1: Suppose d' is optimal and d' != d. Then there exists i < k with d'_i >= c (if all d'_i with i < k were < c, then d' = d by uniqueness). Now increase d'_{i+1} by 1 and decrease d'_i by c to obtain d''. Since c * c^i = c^{i+1}, d'' represents the same value n, but uses c - 1 >= 1 fewer coins than d', a contradiction to d' being optimal.

Claim 2: G computes d.

For i = k .. 0, G computes d_i by counting how often c^i can be subtracted from the remaining value while staying >= 0. Induction on i:

Base (i = k): d_k is correctly computed because the coins for i < k can add up to at most (c-1) * (c^{k-1} + ... + c + 1) = c^k - 1, so everything beyond that must be accounted for by d_k. G computes d_k = |_ n / c^k _| by counting the number of times c^k can be subtracted.

Step: Assume G has computed (d_k, ..., d_i) correctly. Then the remaining value r = d_{i-1} * c^{i-1} + ... + d_1 * c + d_0 satisfies r < c^i, and since the digits below i-1 contribute less than c^{i-1}, the next digit is d_{i-1} = |_ r / c^{i-1} _|, which is exactly what G computes by repeated subtraction of c^{i-1}.

So G has computed d when it stops, and by Claim 1, d is optimal.
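
As an illustration of Claims 1 and 2, here is a minimal Python sketch (the function names digits and greedy are mine, not part of the problem) that computes d both by the division/remainder rule and by greedy subtraction:

def digits(n, c, k):
    # d_i = n_i % c and n_{i+1} = n_i // c for i = 0 .. k-1; d_k is the last quotient.
    d = []
    for _ in range(k):
        d.append(n % c)       # digit d_i, guaranteed < c
        n //= c
    d.append(n)               # d_k, no upper bound
    return d[::-1]            # (d_k, ..., d_1, d_0)

def greedy(n, c, k):
    # G: for i = k .. 0, take coin c^i as often as it still fits.
    d = []
    for i in range(k, -1, -1):
        d.append(n // c**i)   # how often c^i can be subtracted while staying >= 0
        n %= c**i
    return d                  # (d_k, ..., d_1, d_0)

assert digits(70, 3, 2) == greedy(70, 3, 2) == [7, 2, 1]   # the example above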
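
The theorem can also be sanity-checked exhaustively on small instances. The sketch below (again only illustrative, reusing the greedy function above) compares the greedy coin count against a textbook dynamic-programming minimum:

def min_coin_counts(limit, coins):
    # Standard DP: best[v] = minimum number of coins summing to v, for v = 0 .. limit.
    best = [0] + [float("inf")] * limit
    for v in range(1, limit + 1):
        best[v] = min(best[v - coin] + 1 for coin in coins if coin <= v)
    return best

for c in range(2, 5):
    for k in range(4):
        coins = [c**i for i in range(k + 1)]
        best = min_coin_counts(200, coins)
        for n in range(1, 201):
            assert sum(greedy(n, c, k)) == best[n]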