Grigory Rudko
4,015 Points

In line 21, Amit used an Int (10) as a parameter, but it should be a Double. The code still works. Why?
2 Answers
Martin Wildfeuer
Courses Plus Student 11,071 Points

An integer literal like 10 is basically shorthand for 10.0 when it is used where a floating-point type is expected, so the following is valid code:
var floatingPointType: Double = 0.1 // 0.1
floatingPointType += 10 // 10.1 (the literal 10 is inferred as the Double 10.0)
It does not work the other way round, however:
var integerType: Int = 1
integerType += 0.1 // Error: Swift will not implicitly convert the Double 0.1 to an Int
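The same rule answers the original question: when a parameter is declared as Double, passing the integer literal 10 works because the compiler infers the literal as 10.0. Here is a minimal sketch of that situation; the function name and parameter are hypothetical, not the actual code from the video:

// Hypothetical function standing in for the one from the video:
// its parameter is declared as Double.
func interest(onBalance balance: Double) -> Double {
    return balance * 0.05
}

let a = interest(onBalance: 10)              // Works: the literal 10 is inferred as 10.0

let deposit: Int = 10
// let b = interest(onBalance: deposit)      // Error: an Int variable is not converted implicitly
let c = interest(onBalance: Double(deposit)) // A stored Int needs an explicit conversion

Note that the implicit behaviour applies only to literals; once a value is stored in an Int variable, you must convert it with Double() yourself.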
Grigory Rudko
4,015 Points

I had this feeling. Thanks!