If anyone here has played Agar.io before, you'll probably be familiar with the way the camera moves: it slides toward your mouse at a speed that depends on how far the mouse is from the center of the screen. I'm trying to recreate this movement using an SKCameraNode and touchesMoved:
override func touchesMoved(_ touches: Set<UITouch>, with event: UIEvent?) {
    let pos = touches.first!.location(in: view)
    let center = view!.center
    // Offset of the touch from the center of the screen.
    let xDist = pos.x - center.x
    let yDist = pos.y - center.y
    // Scale the movement by how far the touch is from the center.
    let distance = sqrt((xDist * xDist) + (yDist * yDist)) * 0.02
    camera?.position.x += xDist * distance * 0.02
    camera?.position.y -= yDist * distance * 0.02
}
Surprisingly, this doesn't look too bad. However, there are two problems with it.
The first is that I want to use a UIPanGestureRecognizer for this, since it would work far better with multiple touches. I just can't work out how to do it, because I can't see a way to use the recognizer to get the distance from the touch to the center of the screen.
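The closest thing I can picture is something like the sketch below, but I have no idea if this is how the recognizer is meant to be used — the setup in didMove(to:) and the handlePan method are just my guesses:

override func didMove(to view: SKView) {
    // Guess: attach the pan recognizer to the SKView when the scene loads.
    let pan = UIPanGestureRecognizer(target: self, action: #selector(handlePan(_:)))
    view.addGestureRecognizer(pan)
}

@objc private func handlePan(_ recognizer: UIPanGestureRecognizer) {
    guard let view = view else { return }
    // location(in:) gives the touch position in view coordinates,
    // so this should be the same center offset as in touchesMoved.
    let pos = recognizer.location(in: view)
    let xDist = pos.x - view.center.x
    let yDist = pos.y - view.center.y
    let distance = sqrt((xDist * xDist) + (yDist * yDist)) * 0.02
    camera?.position.x += xDist * distance * 0.02
    camera?.position.y -= yDist * distance * 0.02
}

Even if that compiles, I don't know whether it would behave sensibly with more than one finger down.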
The second problem is that this movement isn't smooth. If the user stops moving their finger for a single frame, there is a huge jump since the camera snaps to an absolute halt. I'm awful at maths, so I can't think of a way to implement a nice smooth decay (or attack).
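The sort of thing I imagine is easing the camera toward the movement each frame instead of applying it directly, maybe a lerp along these lines — though the 0.1 factor is a complete guess and I don't know if this is the proper way to do it:

// Sketch: move a value a fixed fraction of the way toward its target each
// frame instead of jumping straight to it. 0.1 is just a placeholder.
func eased(_ current: CGFloat, toward target: CGFloat) -> CGFloat {
    return current + (target - current) * 0.1
}

That way the camera would cover a fraction of the remaining distance every frame rather than stopping dead, but I don't know how to fit it into what I have.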
Any help would be fantastic!
EDIT: I'm now doing this:
private var difference = CGVector()

override func touchesMoved(_ touches: Set<UITouch>, with event: UIEvent?) {
    let pos = touches.first!.location(in: view)
    let center = view!.center
    difference = CGVector(dx: pos.x - center.x, dy: pos.y - center.y)
}

override func update(_ currentTime: TimeInterval) {
    let distance = sqrt((difference.dx * difference.dx) + (difference.dy * difference.dy)) * 0.02
    camera?.position.x += difference.dx * distance * 0.02
    camera?.position.y -= difference.dy * distance * 0.02
}
All I need to know now is a good way to get the difference variable to smoothly increase from the time the user touches the screen.
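The only idea I've come up with is to remember when the touch began and ramp a multiplier from 0 to 1 over a fixed time, roughly like this — the touchStart property and the half-second ramp are just made up:

private var touchStart: TimeInterval = 0   // would be set in touchesBegan

override func update(_ currentTime: TimeInterval) {
    // Ramp from 0 up to 1 over the half second after the touch began.
    let ramp = min(CGFloat(currentTime - touchStart) / 0.5, 1)
    let distance = sqrt((difference.dx * difference.dx) + (difference.dy * difference.dy)) * 0.02
    camera?.position.x += difference.dx * distance * 0.02 * ramp
    camera?.position.y -= difference.dy * distance * 0.02 * ramp
}

That still wouldn't ease back out when the touch ends, which is part of why I tried the fade approach below.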
EDIT 2: Now I'm doing this:
private var difference = CGVector()
private var touching: Bool = false
private var fade: CGFloat = 0

private func touched(_ touch: UITouch) {
    let pos = touch.location(in: view)
    let center = view!.center
    difference = CGVector(dx: pos.x - center.x, dy: pos.y - center.y)
}

override func touchesBegan(_ touches: Set<UITouch>, with event: UIEvent?) {
    touching = true
    touched(touches.first!)
}

override func touchesMoved(_ touches: Set<UITouch>, with event: UIEvent?) {
    touched(touches.first!)
}

override func touchesEnded(_ touches: Set<UITouch>, with event: UIEvent?) {
    touching = false
    touched(touches.first!)
}

override func touchesCancelled(_ touches: Set<UITouch>, with event: UIEvent?) {
    touching = false
}

override func update(_ currentTime: TimeInterval) {
    if touching {
        if fade < 0.02 { fade += 0.0005 }
    } else {
        if fade > 0 { fade -= 0.0005 }
    }
    let distance = sqrt((difference.dx * difference.dx) + (difference.dy * difference.dy)) * 0.01
    camera?.position.x += difference.dx * distance * fade
    camera?.position.y -= difference.dy * distance * fade
}
Can someone please help me before this gets any worse? The fade variable is incremented in a crude, frame-dependent way and the result still isn't smooth. I just need a hint at a better way to do this.
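For what it's worth, the direction I've been thinking about is making the fade time-based instead of a fixed per-frame increment, something like the sketch below — the lastUpdate property and the quarter-second ramp are just assumptions, and I have no idea if this is a sensible approach:

private var lastUpdate: TimeInterval = 0

override func update(_ currentTime: TimeInterval) {
    // Delta time since the last frame, so the fade speed doesn't depend on frame rate.
    let dt = lastUpdate > 0 ? CGFloat(currentTime - lastUpdate) : 0
    lastUpdate = currentTime

    // Ramp fade between 0 and 1 over roughly a quarter of a second either way.
    let step = dt / 0.25
    fade = touching ? min(fade + step, 1) : max(fade - step, 0)

    // Same movement as before, but fade now runs 0...1 instead of 0...0.02,
    // so the 0.02 factor moves back into the movement itself.
    let distance = sqrt((difference.dx * difference.dx) + (difference.dy * difference.dy)) * 0.01
    camera?.position.x += difference.dx * distance * 0.02 * fade
    camera?.position.y -= difference.dy * distance * 0.02 * fade
}

Even then the ramp is linear, so if there's a nicer easing curve (or a completely different approach) I'd really appreciate a pointer.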