Calvin Secrest (24,812 Points)
Worst case scenarios?
So I understand that O(n), linear time, means an algorithm's running time grows in proportion to the size of the data set. For example, inserting into a sorted list can hit its worst case because every data item may have to be compared with the inserted value. What I'm confused about is how the worst case arises for binary search (logarithmic time), and even further for quadratic time, cubic time, etc. Sorry if I didn't explain myself clearly!
Thanks in advance
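To make that linear-time worst case concrete, here is a minimal Python sketch (my own illustration, not from the course): inserting into a sorted list by scanning from the left. If the new value is larger than every existing element, all n existing items are compared before the insert happens, which is the O(n) worst case.

```python
def insert_sorted(items, value):
    """Insert value into an already-sorted list, scanning left to right.

    Returns the number of comparisons made, so the best and worst
    cases are easy to see.
    """
    for i, item in enumerate(items):
        if value < item:
            items.insert(i, value)
            return i + 1          # stopped after i + 1 comparisons
    items.append(value)           # value was >= every existing element
    return len(items) - 1         # compared against all n original items

nums = [22, 443, 457, 482, 500]
print(insert_sorted(nums, 100))   # early stop: only 2 comparisons
print(insert_sorted(nums, 999))   # worst case: compared with every item
```

Best case (value belongs near the front) takes one comparison; worst case (value belongs at the end) takes n, which is why the operation is described as O(n).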
Binary search example
input - 9 sorted numbers between 1 and 1000, e.g. [22(0), 443(1), 457(2), 482(3), 500(4, middle), 698(5), 811(6), 977(7), 979(8)]
- searching for position of 500
- 1) look in the middle => found 500 on first try
=> position of element found on first try best case scenario for binary search
searching for position of 977
- 1) look in the middle (index 4) => 500
- 2) look at position 6 => 811
- 3) look at position 7 => 977
=> element found on the third try, near the worst case for binary search: with n = 9 elements, (base 2 log) log₂(9) ≈ 3.17 ~ 3, which is the big(O) estimate of the worst-case number of tries
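The walkthrough above can be made runnable. Here is a minimal Python sketch (my own, using the standard iterative binary search) that returns both the index found and the number of tries, so the best and worst cases of the example array are visible:

```python
def binary_search(items, target):
    """Iterative binary search over a sorted list.

    Returns (index, tries): the position of target and how many
    midpoint checks were needed, or (-1, tries) if not found.
    """
    lo, hi = 0, len(items) - 1
    tries = 0
    while lo <= hi:
        tries += 1
        mid = (lo + hi) // 2      # check the middle of the remaining range
        if items[mid] == target:
            return mid, tries
        elif items[mid] < target:
            lo = mid + 1          # discard the left half
        else:
            hi = mid - 1          # discard the right half
    return -1, tries

nums = [22, 443, 457, 482, 500, 698, 811, 977, 979]
print(binary_search(nums, 500))   # -> (4, 1)  best case: middle on try 1
print(binary_search(nums, 977))   # -> (7, 3)  found on the third try
print(binary_search(nums, 979))   # -> (8, 4)  the true worst case here
```

Note that searching for 979 actually takes 4 tries on this array, because after three halvings one element is still left to check; that matches ⌈log₂(n)⌉ rounding up rather than down. Either way, the point stands: each try halves the remaining range, so the worst case grows logarithmically, not linearly.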