[This post is by Adam Powell, one of our more touchy-feely Android engineers. — Tim Bray]
The word "multitouch" gets thrown around quite a bit and it's not always clear what people are referring to. For some it's about hardware capability, for others it refers to specific gesture support in software. Whatever you decide to call it, today we're going to look at how to make your apps and views behave nicely with multiple fingers on the screen.
This post is going to be heavy on code examples. It will cover creating a custom View that responds to touch events and allows the user to manipulate an object drawn within it. To get the most out of the examples you should be familiar with setting up an Activity and the basics of the Android UI system. Full project source will be linked at the end.
We'll begin with a new View class that draws an object (our application icon) at a given position:
public class TouchExampleView extends View {
    private Drawable mIcon;
    private float mPosX;
    private float mPosY;

    private float mLastTouchX;
    private float mLastTouchY;

    public TouchExampleView(Context context) {
        this(context, null, 0);
    }

    public TouchExampleView(Context context, AttributeSet attrs) {
        this(context, attrs, 0);
    }

    public TouchExampleView(Context context, AttributeSet attrs, int defStyle) {
        super(context, attrs, defStyle);
        mIcon = context.getResources().getDrawable(R.drawable.icon);
        mIcon.setBounds(0, 0, mIcon.getIntrinsicWidth(), mIcon.getIntrinsicHeight());
    }

    @Override
    public void onDraw(Canvas canvas) {
        super.onDraw(canvas);

        canvas.save();
        canvas.translate(mPosX, mPosY);
        mIcon.draw(canvas);
        canvas.restore();
    }

    @Override
    public boolean onTouchEvent(MotionEvent ev) {
        // More to come here later...
        return true;
    }
}

MotionEvent
The Android framework's primary point of access for touch data is the android.view.MotionEvent class. Passed to your views through the onTouchEvent and onInterceptTouchEvent methods, MotionEvent contains data about "pointers," or active touch points on the device's screen. Through a MotionEvent you can obtain X/Y coordinates as well as size and pressure for each pointer. MotionEvent.getAction() returns a value describing what kind of motion event occurred.
One of the more common uses of touch input is letting the user drag an object around the screen. We can accomplish this in our View class from above by implementing onTouchEvent as follows:
@Override
public boolean onTouchEvent(MotionEvent ev) {
    final int action = ev.getAction();
    switch (action) {
    case MotionEvent.ACTION_DOWN: {
        final float x = ev.getX();
        final float y = ev.getY();

        // Remember where we started
        mLastTouchX = x;
        mLastTouchY = y;
        break;
    }

    case MotionEvent.ACTION_MOVE: {
        final float x = ev.getX();
        final float y = ev.getY();

        // Calculate the distance moved
        final float dx = x - mLastTouchX;
        final float dy = y - mLastTouchY;

        // Move the object
        mPosX += dx;
        mPosY += dy;

        // Remember this touch position for the next move event
        mLastTouchX = x;
        mLastTouchY = y;

        // Invalidate to request a redraw
        invalidate();
        break;
    }
    }

    return true;
}

The code above has a bug on devices that support multiple pointers. While dragging the image around the screen, place a second finger on the touchscreen then lift the first finger. The image jumps! What's happening? We're calculating the distance to move the object based on the last known position of the default pointer. When the first finger is lifted, the second finger becomes the default pointer and we have a large delta between pointer positions, which our code dutifully applies to the object's location.
If all you want is info about a single pointer's location, the methods MotionEvent.getX() and MotionEvent.getY() are all you need. MotionEvent was extended in Android 2.0 (Eclair) to report data about multiple pointers, and new actions were added to describe multitouch events. MotionEvent.getPointerCount() returns the number of active pointers. getX and getY now accept an index to specify which pointer's data to retrieve.
Index vs. ID
At a higher level, touchscreen data from a single snapshot in time may not be immediately useful, since touch gestures involve motion over time spanning many motion events. A pointer index does not necessarily match up across consecutive events; it only indicates the data's position within the MotionEvent. However, this is not work that your app has to do itself. Each pointer also has an ID mapping that stays persistent across touch events. You can retrieve this ID for each pointer using MotionEvent.getPointerId(index) and find an index for a pointer ID using MotionEvent.findPointerIndex(id).
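To make the index/ID distinction concrete, here is a plain-Java sketch, not the Android API itself: it models a MotionEvent's pointers as an ordered list of IDs, with hypothetical stand-ins for getPointerId and findPointerIndex. When the first finger lifts, the remaining pointer's index shifts, but its ID still finds it:

```java
import java.util.ArrayList;
import java.util.List;

public class PointerIdSketch {
    // Hypothetical stand-in for MotionEvent.getPointerId(index):
    // an index is just a position in the event's pointer list.
    static int getPointerId(List<Integer> pointers, int index) {
        return pointers.get(index);
    }

    // Hypothetical stand-in for MotionEvent.findPointerIndex(id):
    // returns -1 if the pointer is no longer present.
    static int findPointerIndex(List<Integer> pointers, int id) {
        return pointers.indexOf(id);
    }

    public static void main(String[] args) {
        // Two fingers down, with IDs 4 and 7.
        List<Integer> pointers = new ArrayList<>(List.of(4, 7));
        System.out.println(findPointerIndex(pointers, 7)); // index 1

        // The first finger (ID 4) lifts; every later pointer's index shifts down.
        pointers.remove(Integer.valueOf(4));
        System.out.println(findPointerIndex(pointers, 7)); // now index 0
        System.out.println(getPointerId(pointers, 0));     // still ID 7
    }
}
```

This is exactly why tracking an ID across events, then converting it back to an index per event, is the robust pattern.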
Feeling Better?
Let's fix the example above by taking pointer IDs into account.
private static final int INVALID_POINTER_ID = -1;

// The 'active pointer' is the one currently moving our object.
private int mActivePointerId = INVALID_POINTER_ID;

// Existing code ...

@Override
public boolean onTouchEvent(MotionEvent ev) {
    final int action = ev.getAction();
    switch (action & MotionEvent.ACTION_MASK) {
    case MotionEvent.ACTION_DOWN: {
        final float x = ev.getX();
        final float y = ev.getY();

        mLastTouchX = x;
        mLastTouchY = y;

        // Save the ID of this pointer
        mActivePointerId = ev.getPointerId(0);
        break;
    }

    case MotionEvent.ACTION_MOVE: {
        // Find the index of the active pointer and fetch its position
        final int pointerIndex = ev.findPointerIndex(mActivePointerId);
        final float x = ev.getX(pointerIndex);
        final float y = ev.getY(pointerIndex);

        final float dx = x - mLastTouchX;
        final float dy = y - mLastTouchY;

        mPosX += dx;
        mPosY += dy;

        mLastTouchX = x;
        mLastTouchY = y;

        invalidate();
        break;
    }

    case MotionEvent.ACTION_UP: {
        mActivePointerId = INVALID_POINTER_ID;
        break;
    }

    case MotionEvent.ACTION_CANCEL: {
        mActivePointerId = INVALID_POINTER_ID;
        break;
    }

    case MotionEvent.ACTION_POINTER_UP: {
        // Extract the index of the pointer that left the touch sensor
        final int pointerIndex = (action & MotionEvent.ACTION_POINTER_INDEX_MASK)
                >> MotionEvent.ACTION_POINTER_INDEX_SHIFT;
        final int pointerId = ev.getPointerId(pointerIndex);
        if (pointerId == mActivePointerId) {
            // This was our active pointer going up. Choose a new
            // active pointer and adjust accordingly.
            final int newPointerIndex = pointerIndex == 0 ? 1 : 0;
            mLastTouchX = ev.getX(newPointerIndex);
            mLastTouchY = ev.getY(newPointerIndex);
            mActivePointerId = ev.getPointerId(newPointerIndex);
        }
        break;
    }
    }

    return true;
}

There are a few new elements at work here. We're switching on action & MotionEvent.ACTION_MASK now rather than just action itself, and we're using a new MotionEvent action constant, MotionEvent.ACTION_POINTER_UP.
ACTION_POINTER_DOWN and ACTION_POINTER_UP are fired whenever a secondary pointer goes down or up. If there is already a pointer on the screen and a new one goes down, you will receive ACTION_POINTER_DOWN instead of ACTION_DOWN. If a pointer goes up but there is still at least one touching the screen, you will receive ACTION_POINTER_UP instead of ACTION_UP.
The ACTION_POINTER_DOWN and ACTION_POINTER_UP events encode extra information in the action value. ANDing it with MotionEvent.ACTION_MASK gives us the action constant, while ANDing it with ACTION_POINTER_INDEX_MASK gives us the index of the pointer that went up or down. In the ACTION_POINTER_UP case our example extracts this index and ensures that the active pointer ID is not referring to a pointer that is no longer touching the screen. If it was, we select a different pointer to be active and save its current X and Y position. Since this saved position is used in the ACTION_MOVE case to calculate the distance to move the onscreen object, we will always calculate the distance to move using data from the correct pointer.
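To make the bit manipulation concrete, here is a runnable plain-Java sketch that mirrors the numeric values of these constants from the Android source (ACTION_MASK = 0xff, ACTION_POINTER_UP = 6, ACTION_POINTER_INDEX_MASK = 0xff00, ACTION_POINTER_INDEX_SHIFT = 8) so it works without the framework; the packing/unpacking helpers are illustrative:

```java
public class ActionMaskSketch {
    // Numeric values mirrored from android.view.MotionEvent so this
    // sketch runs on a plain JVM.
    static final int ACTION_MASK = 0xff;
    static final int ACTION_POINTER_UP = 6;
    static final int ACTION_POINTER_INDEX_MASK = 0xff00;
    static final int ACTION_POINTER_INDEX_SHIFT = 8;

    // The low byte carries which action occurred.
    static int unpackAction(int action) {
        return action & ACTION_MASK;
    }

    // The next byte carries which pointer index the action applies to.
    static int unpackPointerIndex(int action) {
        return (action & ACTION_POINTER_INDEX_MASK) >> ACTION_POINTER_INDEX_SHIFT;
    }

    public static void main(String[] args) {
        // A hypothetical packed value: the pointer at index 1 went up.
        int packed = (1 << ACTION_POINTER_INDEX_SHIFT) | ACTION_POINTER_UP; // 0x0106
        System.out.println(unpackAction(packed) == ACTION_POINTER_UP); // true
        System.out.println(unpackPointerIndex(packed));                // 1
    }
}
```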
This is all the data that you need to process any sort of gesture your app may require. However, dealing with this low-level data can be cumbersome when working with more complex gestures. Enter GestureDetectors.
GestureDetectors
Since apps can have vastly different needs, Android does not spend time cooking touch data into higher-level events unless you specifically request it. GestureDetectors are small filter objects that consume MotionEvents and dispatch higher-level gesture events to listeners specified during their construction. The Android framework provides two GestureDetectors out of the box, but you should also feel free to use them as examples for implementing your own if needed. GestureDetectors are a pattern, not a prepackaged solution. They're not just for complex gestures such as drawing a star while standing on your head; they can even make simple gestures like fling or double tap easier to work with.
android.view.GestureDetector generates gesture events for several common single-pointer gestures used by Android, including scrolling, flinging, and long press. For Android 2.2 (Froyo) we've also added android.view.ScaleGestureDetector for processing the most commonly requested two-finger gesture: pinch zooming.
Gesture detectors follow the pattern of providing a method public boolean onTouchEvent(MotionEvent). This method, like its namesake in android.view.View, returns true if it handles the event and false if it does not. In the context of a gesture detector, a return value of true implies that there is an appropriate gesture currently in progress. GestureDetector and ScaleGestureDetector can be used together when you want a view to recognize multiple gestures.
To report detected gesture events, gesture detectors use listener objects passed to their constructors. ScaleGestureDetector uses ScaleGestureDetector.OnScaleGestureListener. ScaleGestureDetector.SimpleOnScaleGestureListener is offered as a helper class that you can extend if you don't care about all of the reported events.
Since we are already supporting dragging in our example, let's add support for scaling. The updated example code is shown below:
private ScaleGestureDetector mScaleDetector;
private float mScaleFactor = 1.f;

// Existing code ...

public TouchExampleView(Context context, AttributeSet attrs, int defStyle) {
    super(context, attrs, defStyle);
    mIcon = context.getResources().getDrawable(R.drawable.icon);
    mIcon.setBounds(0, 0, mIcon.getIntrinsicWidth(), mIcon.getIntrinsicHeight());

    // Create our ScaleGestureDetector
    mScaleDetector = new ScaleGestureDetector(context, new ScaleListener());
}

@Override
public boolean onTouchEvent(MotionEvent ev) {
    // Let the ScaleGestureDetector inspect all events.
    mScaleDetector.onTouchEvent(ev);

    final int action = ev.getAction();
    switch (action & MotionEvent.ACTION_MASK) {
    case MotionEvent.ACTION_DOWN: {
        final float x = ev.getX();
        final float y = ev.getY();

        mLastTouchX = x;
        mLastTouchY = y;
        mActivePointerId = ev.getPointerId(0);
        break;
    }

    case MotionEvent.ACTION_MOVE: {
        final int pointerIndex = ev.findPointerIndex(mActivePointerId);
        final float x = ev.getX(pointerIndex);
        final float y = ev.getY(pointerIndex);

        // Only move if the ScaleGestureDetector isn't processing a gesture.
        if (!mScaleDetector.isInProgress()) {
            final float dx = x - mLastTouchX;
            final float dy = y - mLastTouchY;

            mPosX += dx;
            mPosY += dy;

            invalidate();
        }

        mLastTouchX = x;
        mLastTouchY = y;
        break;
    }

    case MotionEvent.ACTION_UP: {
        mActivePointerId = INVALID_POINTER_ID;
        break;
    }

    case MotionEvent.ACTION_CANCEL: {
        mActivePointerId = INVALID_POINTER_ID;
        break;
    }

    case MotionEvent.ACTION_POINTER_UP: {
        final int pointerIndex = (ev.getAction() & MotionEvent.ACTION_POINTER_INDEX_MASK)
                >> MotionEvent.ACTION_POINTER_INDEX_SHIFT;
        final int pointerId = ev.getPointerId(pointerIndex);
        if (pointerId == mActivePointerId) {
            // This was our active pointer going up. Choose a new
            // active pointer and adjust accordingly.
            final int newPointerIndex = pointerIndex == 0 ? 1 : 0;
            mLastTouchX = ev.getX(newPointerIndex);
            mLastTouchY = ev.getY(newPointerIndex);
            mActivePointerId = ev.getPointerId(newPointerIndex);
        }
        break;
    }
    }

    return true;
}

@Override
public void onDraw(Canvas canvas) {
    super.onDraw(canvas);

    canvas.save();
    canvas.translate(mPosX, mPosY);
    canvas.scale(mScaleFactor, mScaleFactor);
    mIcon.draw(canvas);
    canvas.restore();
}

private class ScaleListener extends ScaleGestureDetector.SimpleOnScaleGestureListener {
    @Override
    public boolean onScale(ScaleGestureDetector detector) {
        mScaleFactor *= detector.getScaleFactor();

        // Don't let the object get too small or too large.
        mScaleFactor = Math.max(0.1f, Math.min(mScaleFactor, 5.0f));

        invalidate();
        return true;
    }
}

This example merely scratches the surface of what ScaleGestureDetector offers. The listener methods receive a reference to the detector itself as a parameter that can be queried for extended information about the gesture in progress. See the ScaleGestureDetector API documentation for more details.
Now our example app allows a user to drag with one finger and scale with two, and it correctly handles passing active pointer focus between fingers as they contact and leave the screen. You can download the final sample project at http://code.google.com/p/android-touchexample/. It requires the Android 2.2 SDK (API level 8) to build and a 2.2 (Froyo) powered device to run.
From Example to Application
In a real app you would want to tweak the details of how zooming behaves. When zooming, users will expect content to zoom about the focal point of the gesture as reported by ScaleGestureDetector.getFocusX() and getFocusY(). The specifics of this will vary depending on how your app represents and draws its content.
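One common way to do this, assuming a draw transform of the form screen = pos + scale * content (an assumption about your drawing model, not code from the sample project), is to adjust the pan offset so that the content point under the gesture's focal point stays fixed as the scale changes. The math is pure and easy to test in isolation:

```java
public class FocalZoomSketch {
    // Given a draw transform screen = pos + scale * content (one axis),
    // return the new pan offset that keeps the content point currently
    // under 'focus' fixed on screen while the scale changes.
    static float panToKeepFocusFixed(float pos, float oldScale,
                                     float newScale, float focus) {
        // Content coordinate currently under the focal point:
        final float content = (focus - pos) / oldScale;
        // Solve focus = newPos + newScale * content for newPos:
        return focus - newScale * content;
    }

    public static void main(String[] args) {
        // Zooming 1x -> 2x about focus 100 with pos 0: the content point
        // that was under the fingers stays under the fingers.
        float newPos = panToKeepFocusFixed(0f, 1f, 2f, 100f);
        System.out.println(newPos); // -100.0
    }
}
```

In the example view you would apply this once per onScale callback, on each axis, using detector.getFocusX() and getFocusY() as the focal point.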
Different touchscreen hardware may have different capabilities: some panels may only support a single pointer, others may support two pointers but with position data unsuitable for complex gestures, and others may support precise positioning data for two pointers and beyond. You can query what type of touchscreen a device has at runtime using PackageManager.hasSystemFeature().
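A minimal sketch of such a runtime check might look like the following; the feature constant is a real PackageManager constant for panels that track two or more pointers independently, while the helper class and method are illustrative, not part of the sample project:

```java
import android.content.Context;
import android.content.pm.PackageManager;

// Illustrative helper: decide at runtime whether to enable
// two-finger gestures such as pinch zoom.
public class TouchCapability {
    public static boolean supportsDistinctMultitouch(Context context) {
        PackageManager pm = context.getPackageManager();
        // True only on touchscreens that report independent tracking of
        // at least two pointers, which complex gestures rely on.
        return pm.hasSystemFeature(
                PackageManager.FEATURE_TOUCHSCREEN_MULTITOUCH_DISTINCT);
    }
}
```

An app could use this to fall back to single-finger or button-based zoom controls on less capable hardware.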
As you design your user interface, keep in mind that people use their mobile devices in many different ways and not all Android devices are created equal. Some apps might be used one-handed, making multiple-finger gestures awkward. Some users prefer using directional pads or trackballs to navigate. Well-designed gesture support can put complex functionality at your users' fingertips, but also consider designing alternate means of accessing application functionality that can coexist with gestures.