Weaning the touchscreen off shifting touch coordinates a couple of millimeters up
I think few people have noticed that the physical coordinates of a finger touch and the coordinates iOS reports are slightly different: iOS produces a point displaced about 1.5 mm up from the actual touch. This is done for usability: a point close to the nail feels more natural than one lying under the fingertip, and the area of the screen you are tapping remains better visible.
To make it clearer what I am talking about, download any drawing app (for example, Bamboo Paper; not my app, and free), lock screen auto-rotation, draw a short horizontal line, then turn the device upside down (locking auto-rotation is required for this) and try to continue the line. Most likely, the continuation will end up below the original.
In most applications this behavior goes unnoticed, but in some it can be harmful. For example, in a sketching app where auto-rotation is disabled and the user is expected to rotate the device however they like, touch handling must be identical regardless of the current orientation. Or in a board game such as chess, where the shift means that the player with the black pieces (whose interface is upside down in portrait mode) will miss when trying to move a piece and grab the piece from the square below instead.
In my tests on a first-generation iPad the shift was about 7 pixels. Naturally, the result will differ on other devices, so it needs to be measured there as well.
I could not find a public API for controlling this shift in the Apple documentation, so I had to look for workarounds.
In the case of a sketching app, you can simply apply a correction in touchesMoved: and similar methods:
- (void)touchesMoved:(NSSet *)touches withEvent:(UIEvent *)event {
    //...
    CGPoint point = [[touches anyObject] locationInView:self];
    // The correction is positive because the y axis points downward
    point.y = point.y + 7;
    //...
}
However, this is not always acceptable, especially when there is existing code that you want to reuse without changes.
As a radical solution, in the first version of this article I showed an example that patches absolutely all calls to locationInView: and previousLocationInView: on UITouch instances:
#import <UIKit/UIKit.h>
#import <objc/runtime.h>

@interface UITouch (Adjusted)
- (CGPoint)adjustedLocationInView:(UIView *)view;
- (CGPoint)adjustedPreviousLocationInView:(UIView *)view;
@end

@implementation UITouch (Adjusted)

// After the implementations are exchanged in +load, calling
// adjustedLocationInView: here actually invokes the original
// locationInView:, so this is not infinite recursion.
- (CGPoint)adjustedLocationInView:(UIView *)view {
    CGPoint point = [self adjustedLocationInView:view];
    point.y = point.y + 7;
    return point;
}

- (CGPoint)adjustedPreviousLocationInView:(UIView *)view {
    CGPoint point = [self adjustedPreviousLocationInView:view];
    point.y = point.y + 7;
    return point;
}

+ (void)load {
    Class class = [UITouch class];

    Method locInViewMethod = class_getInstanceMethod(class, @selector(locationInView:));
    Method adjLocInViewMethod = class_getInstanceMethod(class, @selector(adjustedLocationInView:));
    method_exchangeImplementations(locInViewMethod, adjLocInViewMethod);

    Method prevLocInViewMethod = class_getInstanceMethod(class, @selector(previousLocationInView:));
    Method adjPrevLocInViewMethod = class_getInstanceMethod(class, @selector(adjustedPreviousLocationInView:));
    method_exchangeImplementations(prevLocInViewMethod, adjPrevLocInViewMethod);

    NSLog(@"UITouch class is adjusted now.");
}

@end
Here, the method_exchangeImplementations runtime function swaps the implementations of the standard UITouch methods for ours with the correction. Note that the file with this category does not need to be imported into any other .h or .m file. It is enough to add it to the project; the +load method will be called automatically, because the +load message is sent to every class and category when it is added to the runtime.
Upd. However, as it turned out in the comments, starting with iOS 5 Apple asks developers not to use method_exchangeImplementations in applications. Thanks to wicharek for the information. Therefore it is better to apply the correction in any other way you prefer, for example one of those suggested in the comments.
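One such alternative, shown here only as a minimal sketch (the category and method names are hypothetical, not taken from the comments), is a plain category method that applies the offset explicitly and is called instead of locationInView: wherever the correction is needed, without touching the runtime:
// A non-swizzling alternative (hypothetical names): call this method
// explicitly instead of locationInView: where the correction is needed.
@interface UITouch (ManualAdjustment)
- (CGPoint)manuallyAdjustedLocationInView:(UIView *)view;
@end

@implementation UITouch (ManualAdjustment)
- (CGPoint)manuallyAdjustedLocationInView:(UIView *)view {
    CGPoint point = [self locationInView:view]; // the original, untouched method
    point.y = point.y + 7;                      // the same empirically measured correction
    return point;
}
@end
The downside is that existing code has to be changed to call the new method, which is exactly what the swizzling approach avoided.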
So, after the correction is applied, drawing on the view works identically in any orientation of the device. However, this does not solve the problem with the chess pieces: when deciding which view to deliver a touch event to, the shifted point is still used, and we may end up hitting the wrong piece.
To determine which view should handle a touch, UIView's hitTest:withEvent: method is used; it works roughly as follows (a simplified sketch of this logic is shown after the list):
- pointInside:withEvent: is called on self;
- if the result is NO, hitTest:withEvent: returns nil, i.e. the view does not respond to the touch;
- if the result is YES, the method recursively sends hitTest:withEvent: to all its subviews;
- if one of the subviews returns a non-nil object, the outer hitTest:withEvent: returns that object;
- if all subviews return nil, or the view has no subviews, self is returned.
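For illustration, here is roughly what that logic looks like in code. This is only a simplified sketch of the behavior described above, not Apple's actual implementation; among other things it ignores the hidden, alpha and userInteractionEnabled checks:
// Simplified sketch of the default hit-testing logic (illustration only)
- (UIView *)hitTest:(CGPoint)point withEvent:(UIEvent *)event {
    if (![self pointInside:point withEvent:event]) {
        return nil;          // the point is outside this view
    }
    // Front-most subviews are checked first, hence the reverse order
    for (UIView *subview in [self.subviews reverseObjectEnumerator]) {
        CGPoint converted = [self convertPoint:point toView:subview];
        UIView *hit = [subview hitTest:converted withEvent:event];
        if (hit != nil) {
            return hit;      // a subview claimed the touch
        }
    }
    return self;             // no subview did, so this view handles the touch
}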
Thus, for touches to land correctly on the chess pieces, it is enough to override hitTest:withEvent: in the root view that contains all the subviews for which the correction is needed:
@implementation ChessBoardView
- (UIView *)hitTest:(CGPoint)point withEvent:(UIEvent *)event {
    // Shift the point by the same correction before hit-testing
    point.y = point.y + 7;
    UIView *hitView = [super hitTest:point withEvent:event];
    return hitView;
}
@end
After that, hitting the pieces becomes just as convenient for the player with black as for the player with white.