This is a homework problem. Let A[] be an array of integers and let K be an integer window size. Generate an array M of the minimums seen in a window of size K as it slides over A. I found an article with a solution to this problem, but I did not understand why it has O(n) complexity. Can anybody explain it to me?
@Andre: "the homework tag, like other so-called 'meta' tags, is now discouraged."Roger Pate– Roger Pate2010-11-08 09:46:09 +00:00Commented Nov 8, 2010 at 9:46
Oh, didn't know that. Saw it requested on stackoverflow.com/questions/4114917/…. Maybe there should be a note in the tag's description? – AndreKR, Nov 8, 2010 at 9:50
1 Answer
This tends to catch people out. You might expect O(N^2) time, reasoning that updating the window's data structure can take O(N) work and there are O(N) window positions. The key observation is that each element of A is added to the structure exactly once and removed at most once, so the total work over the entire slide across A is O(N).
This yields an amortised cost of O(1) per window move: a single slide may remove many elements at once, but averaged over all moves, advancing the window by one element takes O(1) time.
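For concreteness, here is a minimal Python sketch of the monotonic-deque technique that such articles typically describe (the function name sliding_window_min and the example input are my own, not taken from the linked article):

    from collections import deque

    def sliding_window_min(A, K):
        """Return M where M[i] = min(A[i], ..., A[i+K-1])."""
        M = []
        dq = deque()  # indices into A, kept so that A[dq[0]] <= ... <= A[dq[-1]]
        for i, x in enumerate(A):
            # Pop indices whose values can never again be a window minimum:
            # x is smaller (or equal) and will stay in the window longer.
            while dq and A[dq[-1]] >= x:
                dq.pop()
            dq.append(i)
            # Drop the front index once it falls out of the current window.
            if dq[0] <= i - K:
                dq.popleft()
            # Record a minimum once the first full window has been seen.
            if i >= K - 1:
                M.append(A[dq[0]])
        return M

    print(sliding_window_min([4, 2, 12, 11, -5], 2))  # [2, 2, 11, -5]

Each index enters dq exactly once (the append) and leaves at most once (via one of the two pops), which is precisely the "added once, removed once" argument above: 2N deque operations in total, hence O(N) overall.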