Shouldn't Merge Sort be O(n * log(n) + log(n))?

When you think about Merge Sort, you split the set (log(n) levels of splits) and then you join the pieces back together, sorting at each level (n * log(n)). In sum, that should be log(n) + n * log(n).
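
Here is a minimal merge sort sketch (my own Python illustration, not the course's code) showing where I think each of those costs comes from:

```python
def merge_sort(items):
    """Sort a list by recursively splitting it and merging the halves."""
    if len(items) <= 1:
        return items

    # Splitting: each recursive call halves the list, so the recursion
    # is about log(n) levels deep.
    mid = len(items) // 2
    left = merge_sort(items[:mid])
    right = merge_sort(items[mid:])

    # Merging: across any single level of the recursion, every element
    # is touched once, so each level costs about n work -> n * log(n) total.
    merged = []
    i = j = 0
    while i < len(left) and j < len(right):
        if left[i] <= right[j]:
            merged.append(left[i])
            i += 1
        else:
            merged.append(right[j])
            j += 1
    merged.extend(left[i:])
    merged.extend(right[j:])
    return merged

print(merge_sort([5, 2, 9, 1, 7, 3]))  # [1, 2, 3, 5, 7, 9]
```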

2 Answers

Thanks, but the Stack Overflow post is not very helpful, as the answers just explain what I already said. The set first gets split (log(n) operations), then merged back together with sorting at each level (n * log(n)). In total: n * log(n) + log(n).

Here is my post on SO

The reason that O(n*log(n) + log(n)) simplifies to just O(n*log(n)) is that the limit, as n approaches infinity, of the ratio of n*log(n) + log(n) to n*log(n) is 1. In other words, as the input set grows arbitrarily large, n*log(n) + log(n) and n*log(n) grow at essentially the same rate, so we just write O(n*log(n)). I hope this helps. I don't think this course is intended to get into the weeds with calculus, but limits aren't too hard to understand intuitively. Overlaying the graphs of n*log(n) and n*log(n) + log(n) for large n, as in the video examples, might help illustrate the point.
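
If it helps, here is a quick Python snippet (my own illustration, not from the course) that computes that ratio for growing n. Since the ratio works out to exactly 1 + 1/n, it heads toward 1 very quickly:

```python
import math

# Ratio of (n*log(n) + log(n)) to (n*log(n)) for growing n.
# As n gets large the ratio approaches 1, which is why the
# lower-order log(n) term is dropped in big-O notation.
for n in [10, 100, 10_000, 1_000_000, 100_000_000]:
    dominant = n * math.log2(n)
    total = dominant + math.log2(n)
    print(f"n = {n:>11,}  ratio = {total / dominant:.8f}")
```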