Computer Science > Introduction to Algorithms > Time Complexity > Quasilinear Time

Shouldn't Merge Sort be O(n * log(n) + log(n))?

When you think about Merge Sort, you split the set (log(n)) and then join the groups back together while sorting each group (n * log(n)). In sum, that should be n * log(n) + log(n).
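To make the two terms concrete, here is a rough Python sketch (the `stats` dict and all names are my own, not from the course) that runs Merge Sort on a reversed list and tallies the recursion depth alongside the comparisons done while merging:

```python
import math

def merge_sort(items, stats, depth=0):
    """Sort a list recursively, recording recursion depth and merge comparisons.

    `stats` is a hypothetical tally dict, used only to show where the
    log(n) and n*log(n) factors come from.
    """
    stats["max_depth"] = max(stats["max_depth"], depth)
    if len(items) <= 1:
        return items
    mid = len(items) // 2
    left = merge_sort(items[:mid], stats, depth + 1)
    right = merge_sort(items[mid:], stats, depth + 1)
    # Merge the two sorted halves; each comparison places one element.
    merged, i, j = [], 0, 0
    while i < len(left) and j < len(right):
        stats["comparisons"] += 1
        if left[i] <= right[j]:
            merged.append(left[i])
            i += 1
        else:
            merged.append(right[j])
            j += 1
    merged.extend(left[i:])
    merged.extend(right[j:])
    return merged

n = 1024
stats = {"max_depth": 0, "comparisons": 0}
result = merge_sort(list(range(n, 0, -1)), stats)
# Splitting only contributes log2(n) levels of recursion (depth 10 here),
# while the merge work summed across all levels is on the order of n*log2(n).
print(stats, math.log2(n))
```

Note the asymmetry this exposes: the split only determines the number of levels (log2(n)), but every level does work proportional to n during the merge, which is where the dominant n * log(n) term comes from.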

2 Answers

Thanks, but the Stack Overflow post is not very helpful, as the answers just explain what I already said. The set first gets split (log(n) operations), then concatenated while sorting again per split (n * log(n)). In total: n * log(n) + log(n).

Here is my post on SO

The reason that O(n*log(n) + log(n)) simplifies to just O(n*log(n)) is that the limit, as n approaches infinity, of the ratio of n*log(n) + log(n) to n*log(n) is 1. In other words, as the input set grows arbitrarily large, n*log(n) + log(n) and n*log(n) grow at 'almost exactly' the same rate, so we just write O(n*log(n)). I hope this helps. I don't think this course is intended to get into the weeds with calculus, but limits aren't too hard to understand intuitively. Overlaying the graphs of n*log(n) and n*log(n) + log(n) for large n, as in the video examples, might help illustrate the point.
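That limit is easy to check numerically. A quick sketch (illustrative values only): the ratio works out to exactly 1 + 1/n, so the extra log(n) term vanishes in relative terms as n grows.

```python
import math

def ratio(n):
    """Ratio of n*log(n) + log(n) to n*log(n); algebraically this is 1 + 1/n."""
    big = n * math.log2(n) + math.log2(n)
    return big / (n * math.log2(n))

for n in (10, 1_000, 1_000_000):
    print(f"n = {n:>9}: ratio = {ratio(n):.7f}")
```

For n = 10 the extra term still adds 10% to the total, but by n = 1,000,000 it adds only one millionth, which is why Big O notation keeps only the dominant n*log(n) term.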