iPhone Gestures aren't simple or obvious.
The iPhone is essentially nothing more than a small touch screen device. Like a good dish at a fine restaurant, it's how the ingredients are mixed that makes the magic. I believe the core of this magic is the device's inherent mobility and its intuitive interface.
Just as on the PalmPilot before it, the gestures are what make the interface unique and usable. When I first started working with the iPhone SDK back in early 2008, something hit me like a metric ton of bricks: I couldn't find an API that detects swipes! That's right. That particular ingredient that makes the iPhone magical isn't even in the OS. It's up to application programmers to implement it!
The gestures typically used in iPhone applications are:
- pinch open and close
- swipe up, down, left and right
- flick up, down, left, and right
Other interesting gestures, not typically used in iPhone applications but probably most interesting for full screen OpenGL applications (a rough measurement sketch follows after this list):
- one finger circle clockwise and counter clockwise (swirl)
- two finger circle clockwise and counter clockwise (rotating fixed radius pinch)
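Two finger gestures fall out of the same bookkeeping we'll use for the swipe below: record a baseline in touchesBegan: and compare against it in touchesMoved:. Here's a minimal sketch measuring a pinch (distance change) and a two finger rotation (angle change). The class name TwoFingerView and the NSLog placeholder are mine, not Apple's; note that multipleTouchEnabled is off by default, so you have to turn it on.

#import <UIKit/UIKit.h>
#include <math.h>

@interface TwoFingerView : UIView
{
    CGFloat initialDistance; // baseline distance between the two fingers
    CGFloat initialAngle;    // baseline angle of the line joining them
}
@end

@implementation TwoFingerView

- (id)initWithFrame:(CGRect)frame
{
    if ((self = [super initWithFrame:frame]))
    {
        self.multipleTouchEnabled = YES; // off by default; required for two finger input
    }
    return self;
}

- (void)touchesBegan:(NSSet *)touches withEvent:(UIEvent *)event
{
    NSSet *all = [event allTouches];
    if ([all count] == 2)
    {
        NSArray *pair = [all allObjects];
        CGPoint a = [[pair objectAtIndex:0] locationInView:self];
        CGPoint b = [[pair objectAtIndex:1] locationInView:self];
        initialDistance = hypotf(a.x - b.x, a.y - b.y);
        initialAngle = atan2f(a.y - b.y, a.x - b.x);
    }
}

- (void)touchesMoved:(NSSet *)touches withEvent:(UIEvent *)event
{
    NSSet *all = [event allTouches];
    if ([all count] != 2)
        return;
    // NB: NSSet ordering isn't guaranteed; a production version would track
    // each UITouch pointer so a and b stay consistent between calls.
    NSArray *pair = [all allObjects];
    CGPoint a = [[pair objectAtIndex:0] locationInView:self];
    CGPoint b = [[pair objectAtIndex:1] locationInView:self];
    CGFloat pinchDelta = hypotf(a.x - b.x, a.y - b.y) - initialDistance; // + is open, - is close
    CGFloat rotationDelta = atan2f(a.y - b.y, a.x - b.x) - initialAngle; // radians of two finger circle
    NSLog(@"pinch %f rotation %f", pinchDelta, rotationDelta);
}

@end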
While digging around on Google for cookbook touch APIs for the iPhone, you'll find all kinds of obsolete information. Part of the reason the SDK was kept under wraps with the NDA was to squelch dissemination of some of its half-baked early APIs. Imagine that: it wasn't just a scary corporate conspiracy! The nerd media pushed hard against the NDA, and we've got what you see now: plenty of obsolete iPhone code on the web.
If you've found iPhone code that refers to touches as "clicks", you're looking at very old code. Move on. Light bulb time, right? Touches are nothing more than mouse inputs circa Windows 3. The iPhone SDK simulator is making a little more sense to you right now, isn't it?
If you've found iPhone code that dispatches on touch "phases", you're also looking at old code. Originally all touch events were pushed through a single call; in that call you would read the phase, and it would tell you "began", "moved", or "ended". Now we have separate calls for each:
UIResponder
- (void)touchesBegan:(NSSet *)touches withEvent:(UIEvent *)event
- (void)touchesMoved:(NSSet *)touches withEvent:(UIEvent *)event
- (void)touchesEnded:(NSSet *)touches withEvent:(UIEvent *)event
- (void)touchesCancelled:(NSSet *)touches withEvent:(UIEvent *)event
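Don't forget touchesCancelled:. The system sends it when something like an incoming call interrupts the gesture mid-stream, and any state you stashed in touchesBegan: has to be thrown away rather than interpreted as a finished gesture. A minimal sketch, reusing the gestureStartPoint variable from the swipe example below:

- (void)touchesCancelled:(NSSet *)touches withEvent:(UIEvent *)event
{
    // The system interrupted us (incoming call, alert, etc.).
    // Forget the gesture in progress instead of treating it as a swipe.
    gestureStartPoint = CGPointZero;
}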
It's strange that something so core to the magic of this device isn't in the OS. You're left having to measure gestures in the application, which means each application will likely have its own idea of what makes up a particular gesture.
Take, for example, a simple horizontal swipe. We have to measure the amount of finger travel in the application, in both the X and Y directions. If there is enough movement in the X direction, we call it a horizontal swipe; if there is enough movement in the Y direction, we call it a vertical swipe. If we're looking for a horizontal swipe, we have to reject it when there is too much Y movement, and vice versa for the vertical swipe.
All these limits are application specific! This leads to an inconsistent interface among all applications on the platform!
CGPoint gestureStartPoint;
const float kMinimumGestureLength = 25; // points of travel before we call it a swipe
const float kMaximumVariance = 5;       // points of drift allowed on the other axis

- (void)touchesBegan:(NSSet *)touches withEvent:(UIEvent *)event
{
    UITouch *touch = [touches anyObject];
    gestureStartPoint = [touch locationInView:self.view];
}

- (void)touchesMoved:(NSSet *)touches withEvent:(UIEvent *)event
{
    UITouch *touch = [touches anyObject];
    CGPoint currentPosition = [touch locationInView:self.view];
    CGFloat deltaX = fabsf(gestureStartPoint.x - currentPosition.x);
    CGFloat deltaY = fabsf(gestureStartPoint.y - currentPosition.y);

    if (deltaX >= kMinimumGestureLength && deltaY <= kMaximumVariance)
    {
        // handle horizontal swipe
    }
    else if (deltaY >= kMinimumGestureLength && deltaX <= kMaximumVariance)
    {
        // handle vertical swipe
    }
}
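The same measurements can also tell a flick from a swipe, since a flick is just a swipe that covers its ground quickly. A hedged sketch, extending touchesBegan: from the example above; the gestureStartTime variable and the kMinimumFlickSpeed constant are arbitrary names and values of mine:

NSTimeInterval gestureStartTime;
const float kMinimumFlickSpeed = 500; // points per second; tune per application

- (void)touchesBegan:(NSSet *)touches withEvent:(UIEvent *)event
{
    UITouch *touch = [touches anyObject];
    gestureStartPoint = [touch locationInView:self.view];
    gestureStartTime = touch.timestamp; // seconds since system boot
}

- (void)touchesEnded:(NSSet *)touches withEvent:(UIEvent *)event
{
    UITouch *touch = [touches anyObject];
    CGPoint endPosition = [touch locationInView:self.view];
    CGFloat distance = hypotf(endPosition.x - gestureStartPoint.x,
                              endPosition.y - gestureStartPoint.y);
    NSTimeInterval elapsed = touch.timestamp - gestureStartTime;
    if (elapsed > 0 && distance / elapsed >= kMinimumFlickSpeed)
    {
        // moved fast enough to count as a flick rather than a plain swipe
    }
}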
"1000 new APIs for developers" says Apple's press for the OS 3.0 beta. You'd think one of them should have filled this gap? Nope.