Using gestures instead of hardware or software buttons to navigate a mobile OS isn't exactly perfect, and that's certainly true on Android right now. But what if the phone could learn whether a swipe from the edge of the screen meant you wanted to scroll around in the app, go back, or return to the home screen? Google may be trying to find out.

Quinny899 on the XDA-Developers forums spotted a new TensorFlow Lite model within Android 12, invoked from the SystemUI app's EdgeBackGestureHandler code, along with an associated vocabulary file called "backgesture" listing the package names of 43,000 apps (two of them from Quinny899).

This model presumably uses recorded swipe data, specifically start and end pixels, from those apps to determine whether a swipe should navigate within the app or trigger a system back action.
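To illustrate the idea, here is a purely hypothetical sketch of a classifier that weighs a swipe's start and end points. Google's actual TensorFlow Lite model and its input features are not public, so the function name, thresholds, and logic below are assumptions for illustration only:

```python
def classify_swipe(start_x: float, end_x: float, screen_width: float,
                   back_threshold: float = 0.1) -> str:
    """Hypothetical classifier: label an edge swipe as a system 'back'
    gesture or an in-app swipe, based only on start/end pixels.

    A swipe that starts near a screen edge and travels inward farther
    than back_threshold (a fraction of screen width) is treated as
    'back'; anything else is treated as in-app navigation. The real
    Android model is learned per-app, not a fixed rule like this.
    """
    edge_zone = 0.05 * screen_width  # assumed edge detection zone
    starts_at_edge = (start_x <= edge_zone or
                      start_x >= screen_width - edge_zone)
    travel = abs(end_x - start_x)
    if starts_at_edge and travel >= back_threshold * screen_width:
        return "back"
    return "in-app"

print(classify_swipe(2, 300, 1080))    # edge start, long travel
print(classify_swipe(500, 520, 1080))  # mid-screen start, short travel
```

A learned model would replace the fixed thresholds with per-app behavior derived from recorded swipes, which is what would make the tolerance "variable" rather than a hard preset.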

Current gesture sensitivity settings

Model-based gesture navigation can reportedly be activated by triggering a flag within Android 12 Developer Preview 1. From there, it will take some time to determine whether variable swipe tolerances work out better for users than the current hard presets.

For more about the Android 12 launch, check out our announcement post detailing what’s new here. If you want to install the developer preview on your own device, find out how in our Android 12 download guide.