Out-of-the-box, Android’s standard UI components support a range of Android gestures, but occasionally your app may need to support more than just onClick!
In this Android gesture tutorial, we’ll cover everything you need to implement touch gestures in your own apps. We’ll create several simple applications that provide an insight into the core concepts of touch gestures, including how Android records the “lifecycle” of a gesture, and how it tracks the movement of individual fingers in a multi-touch interaction.
To help demonstrate how this information might translate into real-world projects, we’ll also be creating an application that allows the user to zoom in and out of an image, using the pinch gesture. Finally, since Android 10 is poised to completely overhaul Android’s gesture support, we’ll be looking at how you can update your applications to support Android’s new gesture-based navigation, including how to ensure your app’s own gestures don’t conflict with Android 10’s system-wide gestures.
Read also: Building your Android UI: Everything you need to know about views
Touch gestures allow users to interact with your app using touch.
Android supports a range of touch gestures, including tap, double tap, pinch, swipe, scroll, long press, drag and fling. Although drag and fling are similar, drag is the type of scrolling that occurs when a user drags their finger across the touchscreen, while a fling gesture occurs when the user drags and then lifts their finger quickly.
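To make the drag/fling distinction concrete, here’s a minimal plain-Java sketch that classifies a completed movement gesture by its lift-off speed. Note that the GestureClassifier class name and the 1,000 px/s threshold are illustrative assumptions, not framework values; in a real app you’d compare against the device-appropriate thresholds from ViewConfiguration, such as getScaledMinimumFlingVelocity().

```java
//Hypothetical helper: classifies a movement gesture as a drag or a fling,
//based on the pointer's speed at the moment it lifts off the screen.
//The threshold below is an assumption for illustration only.
public class GestureClassifier {

    static final float FLING_THRESHOLD_PX_PER_SEC = 1000f;

    //Returns "fling" if the lift-off speed exceeds the threshold, else "drag"
    public static String classify(float xVelocity, float yVelocity) {
        double speed = Math.hypot(xVelocity, yVelocity);
        return speed > FLING_THRESHOLD_PX_PER_SEC ? "fling" : "drag";
    }

    public static void main(String[] args) {
        System.out.println(classify(200f, 150f));   //slow movement: drag
        System.out.println(classify(1800f, 900f));  //fast lift-off: fling
    }
}
```

The x and y velocities here are exactly the values a VelocityTracker reports, as we’ll see later in this tutorial.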
Android gestures can be divided into two broad categories: movement-based gestures, such as drag, scroll and fling, and non-movement-based gestures, such as tap, double tap and long press.
In Android, the individual fingers or other objects that perform a touch gesture are referred to as pointers.
A touch event starts when the user places one or more pointers on the device’s touchscreen, and ends when they remove those pointers from the screen. Every Android gesture plays out within this window.
While one or more pointers are in contact with the screen, MotionEvent objects gather information about the touch event. This information includes the touch event’s movement, in terms of X and Y coordinates, and the pressure and size of the contact area.
A MotionEvent also describes the touch event’s state, via an action code. Android supports a long list of action codes, but some of the core ones include ACTION_DOWN, ACTION_MOVE, ACTION_UP, ACTION_POINTER_DOWN, ACTION_POINTER_UP and ACTION_CANCEL.
The MotionEvent objects transmit the action code and axis values to the onTouchEvent() event callback method of the View that received the touch event. You can use this information to interpret the pattern of the touch gesture and react accordingly. Note that each MotionEvent object will contain information about all the pointers that are currently active, even if those pointers haven’t moved since the previous MotionEvent was delivered.
While Android does try to deliver a consistent stream of MotionEvents, it’s possible for an event to be dropped or modified before it’s delivered successfully. To provide a good user experience, your app should be able to handle inconsistent MotionEvents, for example receiving an ACTION_DOWN event without ever having received an ACTION_UP for the “previous” gesture.
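One defensive pattern for handling an inconsistent stream is to track whether a gesture is currently “active,” and abandon the stale gesture whenever a new down event arrives before the previous up. Here’s a plain-Java sketch of that idea; the GestureStateTracker class and its method names are hypothetical, not part of the Android framework:

```java
//A minimal sketch of defensive gesture-state tracking: if a new "down"
//arrives while a gesture is still marked active (meaning the previous
//"up" was dropped), the stale gesture is abandoned rather than letting
//it corrupt our state.
public class GestureStateTracker {

    private boolean gestureActive = false;

    //Call on ACTION_DOWN; returns true if a stale gesture was abandoned
    public boolean onDown() {
        boolean staleGestureAbandoned = gestureActive;
        gestureActive = true;
        return staleGestureAbandoned;
    }

    //Call on ACTION_UP or ACTION_CANCEL
    public void onUpOrCancel() {
        gestureActive = false;
    }

    public boolean isGestureActive() {
        return gestureActive;
    }

    public static void main(String[] args) {
        GestureStateTracker tracker = new GestureStateTracker();
        tracker.onDown();
        //The matching ACTION_UP is never delivered...
        boolean abandoned = tracker.onDown();
        System.out.println("Stale gesture abandoned: " + abandoned);
    }
}
```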
To help illustrate the “lifecycle” of a touch gesture, let’s create an application that retrieves the action code for each MotionEvent object and then prints this information to Android Studio’s Logcat.
In the following code, we’re intercepting each touch event by overriding the onTouchEvent() method, and then checking for the ACTION_UP, ACTION_DOWN, ACTION_MOVE and ACTION_CANCEL action codes.
The onTouchEvent() method will be triggered every time a pointer’s position, pressure or contact area changes.
In the following code, I’m also using getActionMasked() to retrieve the action being performed:
import androidx.appcompat.app.AppCompatActivity;
import androidx.core.view.MotionEventCompat;
import android.os.Bundle;
import android.util.Log;
import android.view.MotionEvent;

public class MainActivity extends AppCompatActivity {

    private static final String TAG = "MyActivity";

    @Override
    protected void onCreate(Bundle savedInstanceState) {
        super.onCreate(savedInstanceState);
        setContentView(R.layout.activity_main);
    }

    @Override
    public boolean onTouchEvent(MotionEvent event) {
        int myAction = MotionEventCompat.getActionMasked(event);
        switch (myAction) {
            case (MotionEvent.ACTION_UP):
                Log.i(TAG, "Up action");
                return true;
            case (MotionEvent.ACTION_DOWN):
                Log.d(TAG, "Down action");
                return true;
            case (MotionEvent.ACTION_MOVE):
                Log.d(TAG, "Move action");
                return true;
            case (MotionEvent.ACTION_CANCEL):
                Log.d(TAG, "Cancel action");
                return true;
            default:
                return super.onTouchEvent(event);
        }
    }
}
Install this application on your physical Android smartphone or tablet, and experiment by performing various touch gestures. Android Studio should print different messages to Logcat, based on where you are in the touch gesture’s lifecycle.
You can also listen for touch events by using the setOnTouchListener() method to attach a View.OnTouchListener to your View object. The setOnTouchListener() method registers a callback that will be invoked every time a touch event is sent to its attached View. For example, here we’re invoking a callback every time the user touches an ImageView:
View imageView = findViewById(R.id.my_imageView);
imageView.setOnTouchListener(new View.OnTouchListener() {
    public boolean onTouch(View v, MotionEvent event) {
        //To do: Respond to touch event//
        return true;
    }
});
Note that if you use View.OnTouchListener, then you shouldn’t create a listener that returns false for the ACTION_DOWN event. Since ACTION_DOWN is the starting point of every touch gesture, returning false for it tells Android your listener isn’t interested in the rest of the gesture, so the listener won’t be called for any subsequent events.
Touch gestures aren’t always precise! For example, it’s easy for your finger to shift slightly when you’re just trying to tap a button, especially if you’re using your smartphone or tablet on the go, or you have manual dexterity issues.
To help prevent accidental scrolling, Android gestures use the concept of “touch slop” which is the distance, in pixels, that a pointer can travel before a non-movement based gesture, such as a tap, becomes a movement-based gesture, such as a drag.
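The slop check itself is just a distance comparison. Here’s a plain-Java sketch; the 16 px value is an illustrative stand-in, since a real app should read the device-appropriate value from ViewConfiguration.get(context).getScaledTouchSlop():

```java
//A plain-Java sketch of the touch slop check: a gesture only "becomes"
//a drag once the pointer has moved further than the slop distance from
//the point where it first touched down. The 16 px value below is an
//illustrative assumption, not the real device value.
public class TouchSlopCheck {

    static final float TOUCH_SLOP_PX = 16f;

    //Returns true once the pointer has travelled beyond the slop radius
    public static boolean isDrag(float downX, float downY,
                                 float currentX, float currentY) {
        double distance = Math.hypot(currentX - downX, currentY - downY);
        return distance > TOUCH_SLOP_PX;
    }

    public static void main(String[] args) {
        //A slight wobble during a tap stays below the slop radius
        System.out.println(isDrag(100f, 100f, 105f, 105f));
        //A deliberate 30 px movement crosses it
        System.out.println(isDrag(100f, 100f, 130f, 100f));
    }
}
```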
When using movement-based gestures, you need to ensure that the user is in control of any onscreen movement that occurs. For example, if the user is dragging an object across the screen, then the speed that this object is travelling must match the speed of the user’s gesture.
You can measure the velocity of a movement-based gesture, using Android’s VelocityTracker class. In the following Activity, I’m using VelocityTracker to retrieve the speed of a gesture, and then printing the velocity to Android Studio’s Logcat:
import android.app.Activity;
import android.util.Log;
import android.view.MotionEvent;
import android.view.VelocityTracker;

public class MainActivity extends Activity {

    public static final String TAG = "Velocity";
    private VelocityTracker myVelocityTracker;

    @Override
    public boolean onTouchEvent(MotionEvent event) {
        obtainVelocityTracker(event);
        switch (event.getAction()) {
            case MotionEvent.ACTION_UP:
                final VelocityTracker velocityTracker = myVelocityTracker;

                //Determine the pointer’s velocity//
                velocityTracker.computeCurrentVelocity(1000);

                //Retrieve the velocity for each pointer//
                float xVelocity = myVelocityTracker.getXVelocity();
                float yVelocity = myVelocityTracker.getYVelocity();

                //Log the velocity in pixels per second//
                Log.i(TAG, "xVelocity: " + xVelocity + ", yVelocity: " + yVelocity);

                //Reset the velocity tracker to its initial state, ready to record the next gesture//
                myVelocityTracker.clear();
                break;
            default:
                break;
        }
        return true;
    }

    private void obtainVelocityTracker(MotionEvent event) {
        if (myVelocityTracker == null) {
            //Retrieve a new VelocityTracker object//
            myVelocityTracker = VelocityTracker.obtain();
        }
        myVelocityTracker.addMovement(event);
    }
}
Install this application on your Android device and experiment by performing different movement-based gestures; the velocity of each gesture should be printed to the Logcat window.
Assuming that you’re using common Android gestures, such as tap and long press, you can use Android’s GestureDetector class to detect gestures without having to process the individual touch events.
To detect a gesture, you’ll need to create an instance of GestureDetector, and then call its onTouchEvent(MotionEvent) method from your View’s onTouchEvent(MotionEvent) method. You can then define how each recognized gesture should be handled in the callback.
Read also: Exploring Android Q: Adding bubble notifications to your app
Let’s create an application where the user can zoom in and out of an ImageView, using gestures. To start, create a simple layout that contains an image:
<?xml version="1.0" encoding="utf-8"?>
<LinearLayout xmlns:android="http://schemas.android.com/apk/res/android"
    android:layout_width="match_parent"
    android:layout_height="match_parent"
    android:orientation="horizontal">

    <ImageView
        android:id="@+id/imageView"
        android:layout_width="wrap_content"
        android:layout_height="match_parent"
        android:scaleType="matrix"
        android:src="@drawable/myImage" />

</LinearLayout>
To create the zoom in/zoom out effect, I’m using ScaleGestureDetector, which is a convenience class that can listen for a subset of scaling events, plus the SimpleOnScaleGestureListener helper class.
In the following Activity, I’m creating an instance of ScaleGestureDetector for my ImageView, and then forwarding every touch event to it from the Activity’s onTouchEvent(MotionEvent) method. Finally, I’m defining how the application should handle the scale gesture.
import android.os.Bundle;
import android.view.MotionEvent;
import android.widget.ImageView;
import android.view.ScaleGestureDetector;
import android.graphics.Matrix;
import androidx.appcompat.app.AppCompatActivity;

public class MainActivity extends AppCompatActivity {

    private Matrix imageMatrix = new Matrix();
    private ImageView imageView;
    private float scale = 2f;
    private ScaleGestureDetector gestureDetector;

    @Override
    protected void onCreate(Bundle savedInstanceState) {
        super.onCreate(savedInstanceState);
        setContentView(R.layout.activity_main);
        imageView = (ImageView) findViewById(R.id.imageView);

        //Instantiate the gesture detector//
        gestureDetector = new ScaleGestureDetector(MainActivity.this, new ImageListener());
    }

    @Override
    public boolean onTouchEvent(MotionEvent event) {
        //Let gestureDetector inspect all events//
        gestureDetector.onTouchEvent(event);
        return true;
    }

    //Implement the scale listener//
    private class ImageListener extends ScaleGestureDetector.SimpleOnScaleGestureListener {

        //Respond to scaling events//
        @Override
        public boolean onScale(ScaleGestureDetector detector) {
            //Apply the scaling factor reported by this scale event//
            scale *= detector.getScaleFactor();

            //Set a maximum and minimum size for our image//
            scale = Math.max(0.2f, Math.min(scale, 6.0f));
            imageMatrix.setScale(scale, scale);
            imageView.setImageMatrix(imageMatrix);
            return true;
        }
    }
}
Try installing this app on a physical Android smartphone or tablet, and you’ll be able to shrink and expand your chosen image, using pinch-in and pinch-out gestures.
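The heart of the zoom effect is the two-line scale calculation inside onScale(). Isolated as a plain-Java helper (the ScaleClamp name is ours, not Android’s), it’s easy to verify that the running scale factor always stays within the 0.2–6.0 range used in the Activity:

```java
//The scale calculation from the onScale() callback, isolated so the
//math is easy to check: multiply the running scale by this event's
//incremental factor, then clamp the result to the allowed range.
public class ScaleClamp {

    static final float MIN_SCALE = 0.2f;
    static final float MAX_SCALE = 6.0f;

    public static float apply(float currentScale, float scaleFactor) {
        float scale = currentScale * scaleFactor;
        return Math.max(MIN_SCALE, Math.min(scale, MAX_SCALE));
    }

    public static void main(String[] args) {
        //A pinch-out that tries to grow the image past the cap is clamped
        System.out.println(apply(5f, 2f));
        //A pinch-in that tries to shrink it too far is clamped, too
        System.out.println(apply(0.5f, 0.1f));
    }
}
```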
Some gestures, such as the pinch gesture, require multiple pointers. The first pointer still triggers the usual ACTION_DOWN and ACTION_UP events, but every time an additional pointer makes contact with the screen, Android generates an ACTION_POINTER_DOWN event, and every time a non-primary pointer leaves the screen, it generates a matching ACTION_POINTER_UP event.
For example, in the following Activity I’m detecting whether a gesture is single-touch or multi-touch and then printing an appropriate message to Android Studio’s Logcat. I’m also printing the action code for each event, and the X and Y coordinates for each pointer, to provide more insight into how Android tracks individual pointers:
import android.app.Activity;
import android.util.Log;
import android.view.MotionEvent;
import androidx.core.view.MotionEventCompat;

public class MainActivity extends Activity {

    public static final String TAG = "SingleorMulti";

    @Override
    public boolean onTouchEvent(MotionEvent event) {
        int action = MotionEventCompat.getActionMasked(event);
        String actionCode = "";
        switch (action) {
            case MotionEvent.ACTION_DOWN:
                actionCode = "Down";
                break;
            case MotionEvent.ACTION_POINTER_DOWN:
                actionCode = "Pointer Down";
                break;
            case MotionEvent.ACTION_MOVE:
                actionCode = "Move";
                break;
            case MotionEvent.ACTION_UP:
                actionCode = "Up";
                break;
            case MotionEvent.ACTION_POINTER_UP:
                actionCode = "Pointer Up";
                break;
            case MotionEvent.ACTION_OUTSIDE:
                actionCode = "Outside";
                break;
            case MotionEvent.ACTION_CANCEL:
                actionCode = "Cancel";
                break;
        }
        Log.i(TAG, "The action is: " + actionCode);

        if (event.getPointerCount() > 1) {
            Log.i(TAG, "Multi-Touch event");
        } else {
            Log.i(TAG, "Single Touch event");
        }

        //Retrieve the X and Y position of the pointer that triggered this event//
        int index = MotionEventCompat.getActionIndex(event);
        int xPos = (int) MotionEventCompat.getX(event, index);
        int yPos = (int) MotionEventCompat.getY(event, index);
        Log.i(TAG, "xPosition: " + xPos + ", yPosition: " + yPos);

        return true;
    }
}
When handling touch events within a ViewGroup, it’s possible that the ViewGroup may have children that are targets for different touch events than the parent ViewGroup.
To ensure each child View receives the correct touch events, you’ll need to override the onInterceptTouchEvent() method. This method is called every time a touch event is detected on the surface of a ViewGroup, allowing you to intercept a touch event before it’s dispatched to the child Views.
If the onInterceptTouchEvent() method returns true, then the child View that was previously handling the touch event will receive an ACTION_CANCEL, and subsequent events will be sent to the parent’s onTouchEvent() method instead.
For example, in the following snippet we’re deciding whether to intercept a touch event, based on whether it’s a scrolling event:
@Override
public boolean onInterceptTouchEvent(MotionEvent ev) {
    final int action = MotionEventCompat.getActionMasked(ev);
    if (action == MotionEvent.ACTION_CANCEL || action == MotionEvent.ACTION_UP) {
        mIsScrolling = false;
        //Do not intercept the touch event//
        return false;
    }
    switch (action) {
        case MotionEvent.ACTION_MOVE: {
            if (mIsScrolling) {
                //Intercept the touch event//
                return true;
            }
        }
        ...
    }
    return false;
}

@Override
public boolean onTouchEvent(MotionEvent ev) {
    //To do: Handle the touch event//
}