
iOS

OOP

Hello,

I have watched the lesson, though I do not understand this part:

struct Point {
    let x: Int
    let y: Int

    init(x: Int, y: Int) {
        self.x = x
        self.y = y
    }

    /// Returns surrounding points in range of
    /// the current one
    func points(inRange range: Int = 1) -> [Point] {
    var results = [Point]()

    let lowerBoundOfXRange = x - range
    let upperBoundOfXRange = x + range

    let lowerBoundOfYRange = y - range
    let upperBoundOfYRange = y + range

    for xCoordinate in lowerBoundOfXRange...upperBoundOfXRange {
        for yCoordinate in lowerBoundOfYRange...upperBoundOfYRange {
            let coordinatePoint = Point(x: xCoordinate, y: yCoordinate)
            results.append(coordinatePoint)
        }
    }
    return results
}

}

Even if I change the value for range to 2 or 3, Xcode still shows the values for (x - range; x + range) and (y - range; y + range) as -1; 1 for x and -1; 1 for y. I also do not understand why it shows -1 and 1 when we have not provided any values for the x and y coordinates yet. Here is a screenshot for better understanding: https://screencast.com/t/MnK20aJ0K

As you can see in the screenshot, I entered 2 as the range value, yet the x - range / x + range and y - range / y + range values are unchanged.

Any help is much appreciated!!!

2 Answers

Well, I pasted your code into Xcode and removed the = 1 from your range parameter, since I define it at instance creation.

And when I do

let point = Point(x: 2, y: 2)
point.points(inRange: 2)

I get an array of points from (0, 0) up to (4, 4), which is correct.

Remember, this is an instance method! You have to create an instance, let point = Point(x: 2, y: 2), and then call the method on the instance: point.methodName()

Second example:

// create instance and assign to your constant
let yourConstant = Point(x: 2, y: 2)

// call the points() method on your instance which is assigned to yourConstant
yourConstant.points(inRange: 2)

Hope this helps :-)

Hey Marcus, thank you for the reply. Yep, I understand that it is an instance method. What was unclear to me is why the results showed coordinates from -1...+1 even when I changed the default value for range to 2 or 3, or even deleted the default value. This is either a Swift trick, or my Xcode is functioning incorrectly and not showing the correct result.


Because you're setting the value in the method signature as a default, instead of passing it when calling the method.

point.points(inRange: X) is where you set the range,

not at func points(inRange range: Int = 1) -> [Point] — that = 1 is only a default that applies when you call points() with no argument.
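To make the default-parameter behavior concrete, here is a minimal sketch based on the Point struct from the question (the method body is trimmed to the essentials; the behavior is the same):

```swift
struct Point {
    let x: Int
    let y: Int

    /// Returns all points within `range` of this one.
    /// `range` defaults to 1 when the caller omits it.
    func points(inRange range: Int = 1) -> [Point] {
        var results = [Point]()
        for xCoordinate in (x - range)...(x + range) {
            for yCoordinate in (y - range)...(y + range) {
                results.append(Point(x: xCoordinate, y: yCoordinate))
            }
        }
        return results
    }
}

let point = Point(x: 2, y: 2)

// Explicit argument: range 2 gives a 5x5 grid, (0, 0) through (4, 4).
let wide = point.points(inRange: 2)
print(wide.count)   // 25

// No argument: the default range of 1 applies, a 3x3 grid around (2, 2).
let narrow = point.points()
print(narrow.count) // 9
```

So changing the default in the signature only changes what happens when you call points() with no argument; a call like points(inRange: 2) always wins over the default.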