Working with the Touchscreen in Mobile Apps
Multi-touch, pressure, hover and Apple Pencil — what the touch stack actually exposes, and how to build gestures that feel native.
The touchscreen isn't just an input device — it's the *only* input on most phones. Both platforms expose a rich event stream (touch, pressure, tilt, pencil) that you usually consume through gesture recognisers. Get them right and your app feels native; get them wrong and users will silently churn. This guide covers the building blocks and the rough edges.
Key Takeaways
- Use UIGestureRecognizer (iOS), MotionEvent (Android) or React Native Gesture Handler / Reanimated for production gestures.
- 3D Touch is gone (replaced by Haptic Touch), and Force Touch was retired from Apple Watch in watchOS 7; both platforms support hover (Apple Pencil on iPad, Samsung S Pen).
- Always design for multi-touch: pinch + pan together is the table-stakes example.
- Palm rejection, edge filtering and accessibility (Switch Control, AssistiveTouch) are part of "good touch UX".
Touchscreen at a Glance
What It Is & How It Works
What it is. A capacitive sensor stack (with EMR digitizers for styluses like the Samsung S Pen and Surface Pen) that reports finger / stylus contacts as a stream of events.
How it works. Each contact has an id, position, pressure (where supported), tilt, radius and timestamp. The OS feeds these into gesture recognisers; you compose recognisers into higher-level interactions.
Units & signal. Position in points (iOS) or pixels (Android); pressure normalised to roughly 0–1 where the hardware supports it; contact radius in px; tilt in degrees; velocity in px/s.
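As a sketch, a per-contact sample can be modelled like this (field names are illustrative, not any platform's actual API):

```typescript
// Illustrative model of one touch sample. These names are hypothetical —
// each platform exposes the same signals under its own API.
interface TouchSample {
  id: number;          // stable per contact while the finger stays down
  x: number;           // points (iOS) or pixels (Android)
  y: number;
  pressure: number;    // ~0–1 where supported; often a constant otherwise
  tiltDeg: number;     // stylus tilt from vertical, in degrees
  radius: number;      // contact patch radius, px
  timestampMs: number;
}

// Velocity between two samples of the same contact, in px/s.
function velocity(a: TouchSample, b: TouchSample): { vx: number; vy: number } {
  const dt = (b.timestampMs - a.timestampMs) / 1000;
  if (dt <= 0) return { vx: 0, vy: 0 };
  return { vx: (b.x - a.x) / dt, vy: (b.y - a.y) / dt };
}
```

Gesture recognisers are essentially state machines over streams of these samples, keyed by `id`.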
What You Can Build With It
Drag and drop
Move list items, image tiles or canvas objects.
Example: A kanban board reordering cards.
Pinch + pan + zoom
Maps, photo viewers and design tools.
Example: A photo editor zooming into a 50 MP image with two-finger pinch.
Pressure / pencil drawing
Variable-width strokes from Apple Pencil or Samsung S Pen.
Example: A note-taking app drawing pressure-sensitive ink.
Custom gestures
Swipe-to-action, two-finger long-press, edge swipes.
Example: A mail app with swipe-left to archive.
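A swipe-to-action decision like that archive gesture usually combines drag distance with release velocity, so both a slow deliberate drag and a quick flick trigger it. A minimal sketch (the thresholds are illustrative, not values from any platform guideline):

```typescript
// Decide on release whether a leftward swipe should archive the row.
// Thresholds are illustrative assumptions, tune per design.
function shouldArchive(translationX: number, velocityX: number): boolean {
  const DISTANCE_THRESHOLD = -120; // px dragged left
  const VELOCITY_THRESHOLD = -800; // px/s flicked left
  return translationX < DISTANCE_THRESHOLD || velocityX < VELOCITY_THRESHOLD;
}
```

Feed it the translation and velocity your gesture library reports in its end callback.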
Permissions & Setup
Touch input requires no permissions on either platform — but you should still respect accessibility settings.
iOS · Info.plist
No permission required
Android · AndroidManifest.xml
No permission required
Code Examples
Setup
- Expo: `npx expo install react-native-gesture-handler react-native-reanimated`
- iOS: prefer UIGestureRecognizer subclasses; SwiftUI: use .gesture()
- Android: prefer GestureDetector; Compose: use pointerInput { detectTransformGestures(...) }
import { Gesture, GestureDetector } from 'react-native-gesture-handler';
import Animated, { useAnimatedStyle, useSharedValue } from 'react-native-reanimated';

export function ZoomImage({ source }: { source: any }) {
  const scale = useSharedValue(1);
  const savedScale = useSharedValue(1);
  const tx = useSharedValue(0);
  const ty = useSharedValue(0);
  const savedTx = useSharedValue(0);
  const savedTy = useSharedValue(0);

  // Accumulate across gestures so a second pinch or pan continues from
  // the current transform instead of snapping back to the start.
  const pinch = Gesture.Pinch()
    .onUpdate(e => { scale.value = savedScale.value * e.scale; })
    .onEnd(() => { savedScale.value = scale.value; });

  const pan = Gesture.Pan()
    .onUpdate(e => {
      tx.value = savedTx.value + e.translationX;
      ty.value = savedTy.value + e.translationY;
    })
    .onEnd(() => {
      savedTx.value = tx.value;
      savedTy.value = ty.value;
    });

  const composed = Gesture.Simultaneous(pinch, pan);

  const style = useAnimatedStyle(() => ({
    transform: [
      { translateX: tx.value },
      { translateY: ty.value },
      { scale: scale.value },
    ],
  }));

  return (
    <GestureDetector gesture={composed}>
      <Animated.Image source={source} style={[{ width: 300, height: 300 }, style]} />
    </GestureDetector>
  );
}

Tip: With Newly, you describe the feature you want and the AI agent wires up the sensor, permissions, and UI for you. Try it free.
Best Practices
Compose, don't reinvent
Use platform recognisers / Reanimated; rolling your own gesture FSM is the most common source of touch bugs.
Tune touch latency
Use Hermes + Reanimated worklets, CADisplayLink callbacks, or Choreographer to keep gestures on the UI/main thread frame budget.
Respect accessibility tools
AssistiveTouch, Switch Control and TalkBack synthesise touches; do not block standard gestures.
Differentiate fingers from pencils
Both platforms tag pencil events; route them to drawing tools and ignore palm contacts.
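One way to sketch that routing in TypeScript — the pointer-kind strings and the palm-radius threshold are assumptions for illustration, not any library's exact API:

```typescript
type PointerKind = 'touch' | 'stylus' | 'mouse';

// Route stylus contacts to the ink engine, and drop large-radius finger
// contacts that are likely palms while a stylus is active.
function routeContact(
  kind: PointerKind,
  radiusPx: number,
  stylusActive: boolean,
): 'draw' | 'gesture' | 'ignore' {
  if (kind === 'stylus') return 'draw';
  // Heuristic palm rejection: big contact patches during stylus use.
  const PALM_RADIUS_PX = 20; // illustrative threshold
  if (stylusActive && radiusPx > PALM_RADIUS_PX) return 'ignore';
  return 'gesture';
}
```

The key design choice is routing by pointer type first, then filtering by contact size, so a quick two-finger undo gesture still works mid-stroke.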
Common Pitfalls
Catching touches you should bubble
A swipe in your custom view that should reach a parent ScrollView causes "frozen" scroll.
Mitigation: Use simultaneous / require-to-fail relationships; test scroll containers explicitly.
Janky gestures on the JS thread
Doing animation math on the React Native JS thread drops frames.
Mitigation: Move work to Reanimated worklets or native drivers.
Ignoring edge swipes
Both OSes reserve edge gestures (back, control center). Custom edge swipes will fight the OS.
Mitigation: Avoid the first 20–30 pt from each screen edge for custom interactions.
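A simple guard for that mitigation, using an inset in the middle of that range (the 24 pt value is an illustrative choice, not a platform constant):

```typescript
// Reject gesture starts inside the OS-reserved edge band.
// EDGE_INSET is an illustrative value in the 20–30 pt range.
const EDGE_INSET = 24;

function startsInEdgeBand(
  x: number,
  y: number,
  width: number,
  height: number,
): boolean {
  return (
    x < EDGE_INSET ||
    x > width - EDGE_INSET ||
    y < EDGE_INSET ||
    y > height - EDGE_INSET
  );
}
```

Call it with the initial touch position when a custom gesture begins, and bail out (or hand the touch back to the system) when it returns true.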
When To Use It (And When Not To)
Good fit
- Anything the user touches, ever
- Drawing, pinching, panning, swiping
- Pencil-driven note-taking and design
- Custom gestures with thoughtful fallbacks
Look elsewhere if…
- Replacing system gestures (back, home, control center)
- Hidden invisible gestures users can't discover
- Gestures requiring four or more simultaneous fingers (they collide with system multitasking gestures)
- Touch logic blocking the main thread
Frequently Asked Questions
Is 3D Touch still a thing?
No — Apple began phasing it out with the iPhone XR and dropped it entirely with the iPhone 11 line; long-press now drives Haptic Touch. Force Touch was likewise retired from Apple Watch in watchOS 7.
How do I know if a touch came from Apple Pencil?
On iOS, check `UITouch.type == .stylus`; in SwiftUI, drop down to UIKit (e.g. via `UIViewRepresentable`) to inspect the touch, since SwiftUI gestures don't expose the touch type directly. In React Native, libraries such as react-native-gesture-handler report a pointer type on their gesture events.
Why does my pan feel laggy?
Almost always a JS-thread bottleneck. Move math to Reanimated worklets or use native drivers.
Can I read pressure on Android?
Yes — `MotionEvent.getPressure()` returns a value normalised to roughly 0–1 (it can exceed 1.0 on some hardware); devices without a pressure sensor typically report a constant 1.0.
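When mapping that raw reading to something like stroke width, clamp first so out-of-range hardware values don't blow up the mapping. A sketch (the width range is an illustrative assumption):

```typescript
// Map a raw pressure reading to a stroke width. Clamp first, because
// Android's MotionEvent.getPressure() can exceed 1.0 on some hardware.
function strokeWidth(pressure: number, minW = 1, maxW = 8): number {
  const p = Math.min(1, Math.max(0, pressure));
  return minW + p * (maxW - minW);
}
```

Devices that always report 1.0 simply draw at the maximum width, which is a sensible fallback for hardware with no real pressure data.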
Build with the Touchscreen on Newly
Ship a touchscreen-powered feature this week
Newly turns a description like “use the touchscreen to drag and drop” into a real React Native app — permissions, native modules and UI included. Full source code is yours, and you can publish to the App Store and Google Play directly from the dashboard.
Want a deeper dive on the underlying APIs? See UIKit's touch-handling guides, Android's MotionEvent documentation and React Native Gesture Handler.
