
JavaScript

parseInt does not work the way it is expected to.

I'd like to parse this hexadecimal number into a decimal number. The first line works fine, but the second line does not produce the correct answer.

parseInt("0xA some random string", 16) ---> 10

parseInt("0xA", 16) ---> 16

The hexadecimal number 0xA is 10 in decimal, so the first line is correct but the second is not. Could you explain?

parseInt("0xA", 16) works, but it's parseInt(0xA, 16) that is not working. Since the function is supposed to convert the first argument into string and then parse it, it should work too.

1 Answer

Howdy!

Boy, is this a tough one. I feel like I've kind of been going in circles here, but I think I can at least get you going in a good direction. I'm going to give a little extra clarity for other readers. Let's first take a look at exactly what parseInt(string, radix) is doing.

The parseInt() function parses a string argument and returns an integer of the specified radix.

string is the value to parse. If string is not a string, then it is converted to one. Leading whitespace in the string is ignored.

radix is an integer between 2 and 36 that represents the radix (the base in mathematical numeral systems).
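To make those two rules concrete, here's a quick sketch (standard parseInt behavior, nothing beyond built-in JavaScript assumed):

```javascript
// Non-string arguments are coerced with String() before parsing,
// and leading whitespace in the string is ignored.
parseInt("  42", 10);  // 42 -- leading whitespace is skipped
parseInt(42.9, 10);    // 42 -- 42.9 becomes "42.9"; parsing stops at the "."
parseInt(0xF, 16);     // 21 -- 0xF is the number 15, and "15" read in base 16 is 21
```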

Since the issue for you seems to be when not passing a string, I'm thinking something funky is happening with the coercion. Here's what I found out:

var num = 0xA
var str = '0xA'
num // 10
str // '0xA'
0xA // 10

// It looks like JS stored num as the decimal value 10 -- the hex notation is gone. Now we can compare.

parseInt(num, 16) // 16
parseInt(str, 16) // 10
parseInt(0xA, 16) // 16
parseInt(10, 16) // 16
parseInt('10', 16) // 16

/*
 * Either 0xA is being converted before parseInt is run,
 * or parseInt's internal toString call is converting it to a
 * decimal string, which is the default behavior seen here:
 */

// default
0xA.toString() // '10'
// passing a radix
0xA.toString(16) // 'a'

So maybe the real question is: is parseInt() the best option for what I am doing? To answer you more directly: by the time parseInt(0xA, 16) runs, the literal 0xA is already just the number 10, so the call is really doing parseInt('10', 16), which is 16.
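If the goal is simply "get the decimal value of hex input," here's a sketch of the built-in options (no extra libraries or helpers assumed):

```javascript
// Three ways to end up with decimal 10:
const fromLiteral = 0xA;                 // already the number 10; nothing to parse
const fromParse   = parseInt("0xA", 16); // 10 -- parseInt accepts the "0x" prefix in base 16
const fromNumber  = Number("0xA");       // 10 -- Number() also understands hex strings
console.log(fromLiteral, fromParse, fromNumber); // 10 10 10
```

The takeaway: parseInt is for strings. If you already have a numeric literal, it already is the decimal value, and there is nothing left to convert.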