Why is this not calculating the correct min() and max() value?
Since the challenge doesn't show the table output when you make a mistake, I have no clue what I'm doing wrong in the code below:
SELECT MIN(rating) AS "star_min", MAX(rating) AS "star_max" FROM reviews GROUP BY movie_id HAVING movie_id = 6;
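In case it helps: a likely cause (an assumption, since the full challenge prompt isn't shown here) is the use of `GROUP BY ... HAVING` to pick out a single movie. `HAVING` filters *groups after* aggregation, while selecting one movie's rows is a job for `WHERE`, which filters *rows before* aggregation, so no `GROUP BY` is needed at all. A minimal sketch using Python's built-in `sqlite3` with a made-up `reviews` table (the sample ratings are hypothetical):

```python
import sqlite3

# Tiny in-memory stand-in for the reviews table (hypothetical data).
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE reviews (movie_id INTEGER, rating INTEGER)")
conn.executemany(
    "INSERT INTO reviews (movie_id, rating) VALUES (?, ?)",
    [(6, 2), (6, 5), (6, 4), (7, 1), (7, 3)],
)

# WHERE filters the rows down to movie 6 *before* MIN/MAX run,
# so no GROUP BY or HAVING clause is needed.
row = conn.execute(
    'SELECT MIN(rating) AS "star_min", MAX(rating) AS "star_max" '
    "FROM reviews WHERE movie_id = 6"
).fetchone()
print(row)  # (2, 5)
```

For what it's worth, the original `GROUP BY movie_id HAVING movie_id = 6` form does execute in SQLite and returns the same numbers, since `movie_id` is a grouped column; but automated graders for this kind of challenge often expect the `WHERE` form, and `WHERE` is the idiomatic choice whenever the filter doesn't involve an aggregate.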