# Not really sure if I am doing this correctly...

```
var userNumber = parseInt(prompt("Enter a number"));
var userNumber2 = parseInt(prompt("Enter another number"));
var randomNumber = Math.floor(Math.random() * userNumber * userNumber2) + 1;
document.write(randomNumber);
```

Hi Gary,

I think the problem is that you need to subtract the first number from the second (integer2 - integer1) to get the size of the range:

```
var entry1 = prompt("Please type a number");
var integer1 = parseInt(entry1);
var entry2 = prompt("Please type another number");
var integer2 = parseInt(entry2);

var randomNum = Math.floor(Math.random() * (integer2 - integer1 + 1)) + integer1;

document.write(randomNum);
```
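If you want to convince yourself that this formula really stays inside the range, a quick sanity check (just a sketch of my own; the loop and the fixed example values are not part of the actual answer) is to draw many numbers and confirm each one lands between the two inputs:

```javascript
// Sanity check: the formula should only ever produce 8, 9, or 10
// when integer1 = 8 and integer2 = 10 (example values, not user input).
var integer1 = 8;
var integer2 = 10;

for (var i = 0; i < 10000; i++) {
  var randomNum = Math.floor(Math.random() * (integer2 - integer1 + 1)) + integer1;
  if (randomNum < integer1 || randomNum > integer2) {
    throw new Error("Out of range: " + randomNum);
  }
}
console.log("All draws were between " + integer1 + " and " + integer2);
```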

If you're still stuck, post back.

-Dan

Thanks for the reply, Dan. OK, so the user enters two different numbers, then we generate a random number: we subtract userNumber from userNumber2, add 1 to it, multiply that by Math.random(), and round the decimal down with Math.floor(). I am not sure why I am subtracting, though.

The reason you are subtracting is so the algorithm works for any range. I actually forgot to mention that you have to add the lower number back after you multiply by the Math.random() number. I will walk through the steps for you:

```
var entry1 = prompt("Please type a number");
var integer1 = parseInt(entry1);
var entry2 = prompt("Please type another number");
var integer2 = parseInt(entry2);

var randomNum = Math.floor(Math.random() * (integer2 - integer1 + 1)) + integer1;

document.write(randomNum);
// Let's say integer1 = 8 and integer2 = 10
// First you do integer2 - integer1 = 2
// Then you add 1, so it will be 3
// After that, Math.random() generates a random number; let's say it is 0.82
// 0.82 * 3 = 2.46
// Then Math.floor() rounds it down to 2
// Lastly you add integer1 + 2 = 10
// So the output is 10. This is correct because 10 is between 8 and 10.
```

I hope this helps; I broke it down for you step by step. If you still don't understand, let me know.
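One more thing worth noting (my own addition, and the helper name is just for illustration): the formula assumes the second number entered is the larger one. If the user types them in the other order, integer2 - integer1 goes negative and the result falls outside the range. Wrapping the formula with Math.min and Math.max makes it work either way:

```javascript
// Hypothetical helper: returns a random integer between a and b inclusive,
// regardless of which argument is larger.
function randomBetween(a, b) {
  var min = Math.min(a, b);
  var max = Math.max(a, b);
  // Math.random() is in [0, 1), so the floor is in {0, ..., max - min},
  // and adding min shifts it into [min, max].
  return Math.floor(Math.random() * (max - min + 1)) + min;
}

// Both calls produce a number from 8 to 10, whatever the argument order.
console.log(randomBetween(8, 10));
console.log(randomBetween(10, 8));
```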

Oh OK, I understand now. Thanks for the breakdown. I was so caught up in adding and multiplying variables that I didn't actually take the time to do the math.

This actually doesn't make sense to me, your breakdown at least. Everything is smooth until the second-to-last line, where you state, "Lastly you add integer2 - 2 = 10" (which isn't even addition but subtraction, and integer2 - 2 would be 8, not 10). I don't think your reasoning is correct, but then again, I could be wrong.

Thanks for pointing this out. I am sorry for the mistake; I have already edited the post. On the other hand, I am just trying to help, and from some teaching experience I have found that the same method doesn't work for everyone. If you want, I can try to explain it differently. Let me know if you still don't understand, and what you don't understand, after the change.

I figured it was just a small mistake, but I assumed I had missed something. I got it, though. Thanks.