Gyroscope in Mobile Apps: A Practical Guide
Use the gyroscope to track rotational velocity for AR, gaming and real-time orientation on iOS and Android — with copy-pasteable Expo, Swift and Kotlin examples.
The gyroscope measures how fast your device is rotating around each of its three axes. Where the accelerometer answers "where is gravity?", the gyroscope answers "how fast am I turning?". This makes it the foundation of AR, motion-controlled games and any app that needs precise orientation in real time.
Key Takeaways
- A gyroscope reports angular velocity in radians per second around X, Y and Z.
- Integrating gyroscope data over time gives you orientation, but it drifts within seconds without correction.
- Real apps fuse the gyroscope with the accelerometer (and magnetometer) — that fused stream is what ARKit and ARCore use under the hood.
- Expo, Core Motion and SensorManager all expose the same {x, y, z} stream in radians per second, with slightly different axis and sign conventions.
Gyroscope at a Glance
What It Is & How It Works
What it is. A MEMS gyroscope is a tiny vibrating-mass sensor that measures the Coriolis force when the device rotates. It outputs three values — the rate of rotation around each axis — many times per second.
How it works. You subscribe to a sample stream and read three numbers per sample. To get an actual orientation you would integrate angular velocity over time, but in practice you ask the platform for fused orientation (deviceMotion on iOS, ROTATION_VECTOR on Android) instead of doing the maths yourself.
Units & signal. iOS Core Motion reports rotationRate.x/y/z in radians/second. Android reports the same units via TYPE_GYROSCOPE. Expo wraps the platform values directly.
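To make the integration step concrete, here is a minimal sketch in plain TypeScript. The constant sample array and the `integrateYaw` helper are illustrative, not a platform API:

```typescript
// Each sample: rotation rate around the Z axis in rad/s, taken at a fixed interval.
function integrateYaw(samplesRadPerSec: number[], dtSeconds: number): number {
  // Riemann-sum integration: yaw += rate * dt for every sample.
  return samplesRadPerSec.reduce((yaw, rate) => yaw + rate * dtSeconds, 0);
}

// 40 samples of a constant 0.5 rad/s spin at 20 Hz (dt = 0.05 s) ≈ 1 rad of yaw.
const yaw = integrateYaw(Array(40).fill(0.5), 0.05);
console.log(yaw.toFixed(2)); // "1.00"
```

The same arithmetic explains drift: a bias of just 0.01 rad/s integrates into roughly 0.6 rad (about 34°) of phantom rotation per minute, which is why the fused orientation APIs exist.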
What You Can Build With It
Augmented reality
AR frameworks use the gyroscope to keep virtual content fixed to the world while the camera moves.
Example: A "view in your room" AR feature in a furniture app.
First-person games
Tilt and rotate the device to look around in a game without using touch.
Example: A roller-coaster VR clip you steer by turning your head.
Stabilising the camera UI
Counter-rotate UI elements based on gyro readings so a horizon line stays level on screen.
Example: A real-estate camera that draws a level guide while you frame a shot.
Detecting rotation gestures
Spin-to-refresh or wrist-flick gestures map directly to a yaw rotation peak.
Example: A photo app that flips between front/back camera on a wrist twist.
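If you want to detect such a gesture from the raw stream, a thresholded peak detector with a cooldown is often enough. The 5 rad/s threshold, the 500 ms cooldown and the helper names below are illustrative assumptions to tune on real devices:

```typescript
// A wrist twist shows up as a short spike in |z| rotation rate.
// 5 rad/s is roughly 286°/s — fast deliberate twists clear it, normal handling doesn't.
function isWristTwist(zRadPerSec: number, thresholdRadPerSec = 5): boolean {
  return Math.abs(zRadPerSec) > thresholdRadPerSec;
}

// Debounce so one physical twist fires the gesture exactly once.
function makeTwistDetector(cooldownMs = 500) {
  let lastFiredAt = -Infinity;
  return (zRadPerSec: number, nowMs: number): boolean => {
    if (isWristTwist(zRadPerSec) && nowMs - lastFiredAt > cooldownMs) {
      lastFiredAt = nowMs;
      return true;
    }
    return false;
  };
}
```

Wire it to the stream with something like `Gyroscope.addListener(({ z }) => { if (detect(z, Date.now())) flipCamera(); })`, where `flipCamera` is your own handler.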
Permissions & Setup
Same as the accelerometer: iOS prompts for Motion & Fitness on first read; Android does not require a runtime permission.
iOS · Info.plist: add `NSMotionUsageDescription` with a short user-facing reason.
Android · AndroidManifest.xml: no special permission keys required.
Code Examples
Subscribe to {x, y, z} angular velocity at the rate you actually need.
Setup
- Expo: `npx expo install expo-sensors`
- iOS: import CoreMotion, add `NSMotionUsageDescription` to Info.plist
- Android: request the default `TYPE_GYROSCOPE` from `SensorManager`
import { useEffect, useState } from 'react';
import { Gyroscope } from 'expo-sensors';
import { Text, View } from 'react-native';

export function GyroView() {
  const [data, setData] = useState({ x: 0, y: 0, z: 0 });

  useEffect(() => {
    Gyroscope.setUpdateInterval(50); // ms — 20 Hz
    const sub = Gyroscope.addListener(setData);
    return () => sub.remove();
  }, []);

  return (
    <View>
      <Text>x: {data.x.toFixed(2)} rad/s</Text>
      <Text>y: {data.y.toFixed(2)} rad/s</Text>
      <Text>z: {data.z.toFixed(2)} rad/s</Text>
    </View>
  );
}

Tip: With Newly, you describe the feature you want and the AI agent wires up the sensor, permissions, and UI for you. Try it free.
Best Practices
Prefer fused orientation for AR/UI
Use Core Motion `deviceMotion` (iOS) or `TYPE_ROTATION_VECTOR` (Android) instead of integrating raw gyro values yourself.
Calibrate at startup
Stationary devices should read close to zero. If they consistently report a non-zero offset, capture the bias for a second and subtract it.
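A minimal bias-capture sketch, assuming you can buffer a second or so of samples while the device sits still. `Vec3`, `estimateBias` and `correct` are hypothetical helpers, not part of any sensor API:

```typescript
type Vec3 = { x: number; y: number; z: number };

// Average samples taken while the device is stationary; the mean is the bias.
function estimateBias(stationarySamples: Vec3[]): Vec3 {
  const n = stationarySamples.length;
  const sum = stationarySamples.reduce(
    (acc, s) => ({ x: acc.x + s.x, y: acc.y + s.y, z: acc.z + s.z }),
    { x: 0, y: 0, z: 0 },
  );
  return { x: sum.x / n, y: sum.y / n, z: sum.z / n };
}

// Subtract the captured bias from every subsequent reading.
function correct(sample: Vec3, bias: Vec3): Vec3 {
  return { x: sample.x - bias.x, y: sample.y - bias.y, z: sample.z - bias.z };
}
```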
Pause when off-screen
Stop updates when the user backgrounds the app or the relevant screen unmounts. Background gyro is rarely useful and is expensive.
Throttle UI updates
Read at 50–100 Hz internally, but update React state at 15–20 Hz to keep the JS thread responsive.
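One way to do this is a small time-based throttle around your state setter. A sketch, where the wrapper and its 66 ms interval (about 15 Hz) are illustrative choices, not a library API:

```typescript
// Wrap a UI callback so it runs at most once per minIntervalMs,
// while the underlying sensor keeps sampling at full rate.
function throttle<T>(fn: (value: T) => void, minIntervalMs: number) {
  let lastCallAt = -Infinity;
  return (value: T, nowMs: number = Date.now()) => {
    if (nowMs - lastCallAt >= minIntervalMs) {
      lastCallAt = nowMs;
      fn(value);
    }
  };
}
```

In the Expo example above you would pass `throttle(setData, 66)` to `Gyroscope.addListener` instead of `setData`.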
Common Pitfalls
Drift over time
Integrating raw rotation rate to estimate orientation accumulates error. After ~30 seconds you can be 10°+ off.
Mitigation: Use the platform-provided fused orientation, or fuse with the accelerometer.
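If you do fuse yourself, the lightest-weight option is a complementary filter: trust the gyro short-term, and pull toward the accelerometer's gravity-derived angle long-term. This is a simpler technique than the Kalman-style filters the platforms use internally; the function below is a sketch for the roll axis only, with `alpha = 0.98` as an assumed tuning value:

```typescript
// alpha near 1 trusts the gyro integration; the (1 - alpha) share pulls toward
// the accelerometer's roll estimate, which is noisy but does not drift.
function complementaryRoll(
  prevRollRad: number,
  gyroXRadPerSec: number, // rotation rate around X (the roll axis)
  accelRollRad: number,   // roll derived from gravity, e.g. atan2(ay, az)
  dtSeconds: number,
  alpha = 0.98,
): number {
  const gyroRoll = prevRollRad + gyroXRadPerSec * dtSeconds;
  return alpha * gyroRoll + (1 - alpha) * accelRollRad;
}
```

With a constant 0.01 rad/s gyro bias and a level device, raw integration would drift 0.1 rad in 10 seconds; the filter instead settles near 0.005 rad because the accelerometer term keeps correcting it.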
Inconsistent sign conventions
iOS and Android both report right-handed rotation rates, but the axes are defined relative to the device body, and Android's raw sensor frame does not follow screen rotation. Mixing frames flips your camera.
Mitigation: Stick to one of the wrapper libraries (Expo, react-native-sensors, ARKit/ARCore) that normalise the axes.
Reading too often on Android
`SENSOR_DELAY_FASTEST` can fire at 200+ Hz and crush battery life and CPU.
Mitigation: Pick `SENSOR_DELAY_GAME` (50 Hz) or `SENSOR_DELAY_UI` (15 Hz) unless you have a specific reason.
Forgetting some devices have no gyro
Cheap tablets and old feature phones can lack a real gyroscope.
Mitigation: Check availability and gracefully fall back to accelerometer-only behaviour or a touch UI.
When To Use It (And When Not To)
Good fit
- AR placement and 3D camera control
- Look-around / first-person input in games
- Detecting rotation-based gestures (twist, flick, spin)
- Stabilising on-screen indicators (level, horizon, compass needle)
Look elsewhere if…
- Long-running orientation tracking — drifts without sensor fusion
- Estimating distance moved — the gyro tells you how fast you are turning, not how far you have travelled
- Replacing the magnetometer for absolute heading
- Reading from the lock screen on iOS (no Core Motion access)
Frequently Asked Questions
What is the difference between gyroscope and accelerometer?
The accelerometer measures linear acceleration (including gravity); the gyroscope measures angular velocity. Together they tell you both how the device is moving and how it is rotating.
Why does my orientation drift over time?
Numerical integration of any noisy signal accumulates error. Use the OS-provided fused orientation (rotation vector / device motion) which corrects gyro drift with the accelerometer and magnetometer.
Is the gyroscope available on every phone?
It is on virtually every smartphone made since 2012, but not on every cheap tablet. Always feature-detect before relying on it.
Can I run the gyroscope in the background?
Yes on Android with a foreground service. On iOS, general background sensor streaming is not allowed; you would need a specific background mode like CarPlay or Location.
Build with the Gyroscope on Newly
Ship a gyroscope-powered feature this week
Newly turns a description like “use the gyroscope for augmented reality” into a real React Native app — permissions, native modules and UI included. Full source code is yours, and you can publish to the App Store and Google Play directly from the dashboard.
Want a deeper dive on the underlying APIs? See Expo Sensors, Apple Core Motion and Android sensor framework.
