Face Recognition in Mobile Apps
Face ID, Android face unlock and Vision face APIs — what each does, when to use them, and how to ship a respectful face-aware experience.
"Face recognition" on mobile actually splits into two unrelated things: secure biometric authentication that you should always delegate to the OS, and face detection / landmark APIs that your app can use directly for fun camera features. This guide walks through both, plus the privacy questions you should be ready to answer.
Key Takeaways
- Use Face ID (LocalAuthentication) and BiometricPrompt for any *authentication* use case — never roll your own face-unlock.
- For face *detection* (bounding boxes, landmarks, expressions), use Vision (iOS) or ML Kit Face Detection (Android).
- Identity-grade face recognition (1:N matching) requires server-side ML and explicit user consent.
- Always ship a non-face fallback for accessibility and reliability.
Face APIs at a Glance
What It Is & How It Works
What it is. A spectrum from secure 3D face authentication (Face ID, structured light) to simple 2D face detection on the camera preview.
How it works. For auth, the OS uses an IR depth sensor or the front camera, with anti-spoofing built in. For detection, an on-device ML model returns bounding boxes and landmarks per frame.
Units & signal. Authentication APIs return a result enum. Detection APIs return rectangles, landmark points, head Euler angles and (optionally) expression probabilities (smiling, eyes-open).
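To make the detection signal concrete, here is an illustrative TypeScript shape loosely mirroring what expo-face-detector / ML Kit return per frame (exact field names vary by SDK and are assumptions here), plus a small helper that turns a classification probability into a usable boolean:

```typescript
// Illustrative per-frame detection result; field names loosely mirror
// expo-face-detector / ML Kit but vary by SDK.
interface DetectedFace {
  bounds: { origin: { x: number; y: number }; size: { width: number; height: number } };
  yawAngle: number;              // head rotation around the vertical axis, degrees
  rollAngle: number;             // in-plane tilt, degrees
  smilingProbability?: number;   // 0..1, present only when classification is enabled
  leftEyeOpenProbability?: number;
  rightEyeOpenProbability?: number;
}

// Treat a face as "smiling" only when the classifier is confident.
function isSmiling(face: DetectedFace, threshold = 0.7): boolean {
  return (face.smilingProbability ?? 0) >= threshold;
}
```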
What You Can Build With It
Face unlock auth
Replace passwords with the OS-managed face prompt.
Example: A messaging app re-auths users with Face ID after 5 minutes idle.
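The idle-timeout part of that flow is plain arithmetic. A minimal sketch (the 5-minute window and function name are our choices; when it returns true you would show the OS prompt, e.g. `LocalAuthentication.authenticateAsync()` from `expo-local-authentication`):

```typescript
const IDLE_TIMEOUT_MS = 5 * 60_000; // re-auth after 5 minutes idle

// Decide whether the OS biometric prompt should be shown again.
function needsReauth(lastActiveMs: number, nowMs: number, timeoutMs = IDLE_TIMEOUT_MS): boolean {
  return nowMs - lastActiveMs >= timeoutMs;
}
```

Keeping this check in pure code makes the policy easy to test; the biometric prompt itself stays entirely with the OS.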
Live filters and AR effects
Detect landmarks to overlay glasses, masks or beauty filters.
Example: A camera app that adds dog ears in real time.
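Positioning an overlay is simple geometry on two eye landmarks. A hedged sketch (the 2.2 width factor and names are our illustration, not any SDK's API):

```typescript
type Point = { x: number; y: number };

// Compute where to draw a glasses overlay from the two eye landmarks.
function glassesTransform(leftEye: Point, rightEye: Point) {
  const cx = (leftEye.x + rightEye.x) / 2;
  const cy = (leftEye.y + rightEye.y) / 2;
  const dx = rightEye.x - leftEye.x;
  const dy = rightEye.y - leftEye.y;
  return {
    center: { x: cx, y: cy },               // midpoint between the eyes
    width: Math.hypot(dx, dy) * 2.2,        // overlay wider than the eye distance
    rotationRad: Math.atan2(dy, dx),        // tilt with the head roll
  };
}
```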
Liveness checks for KYC
Combine face detection with motion prompts ("blink", "turn head") to verify a real person.
Example: An onboarding flow that scans the user's ID and matches a selfie.
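One way to sketch the "blink, then turn" check is a small pass over per-frame detector output; the thresholds and field names below are illustrative assumptions, not any SDK's contract:

```typescript
// A frame's liveness-relevant signals, as reported by the face detector.
interface LivenessFrame {
  eyesOpenProbability: number; // e.g. min of left/right eye-open probabilities
  yawDeg: number;              // head turn, degrees
}

// Passes when the eyes close and reopen (a blink), then the head turns.
function passesLiveness(frames: LivenessFrame[]): boolean {
  let eyesWereClosed = false;
  let blinked = false;
  let turned = false;
  for (const f of frames) {
    if (f.eyesOpenProbability < 0.2) eyesWereClosed = true;
    else if (eyesWereClosed) blinked = true;          // closed, then reopened
    if (blinked && Math.abs(f.yawDeg) > 25) turned = true; // turn after the blink
  }
  return blinked && turned;
}
```

Ordering matters: requiring the turn after the blink makes replaying a static photo or a single recorded gesture harder.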
Accessibility and presence
Pause video when the user looks away or detect attention for kids' apps.
Example: A reading app that pauses when the user's gaze leaves the screen.
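Presence detection should be debounced so a brief glance away doesn't pause playback. A minimal sketch using the detector's head yaw angle (frame counts and thresholds are our assumptions):

```typescript
// Pause only after the user has looked away (or no face is visible) for
// several consecutive frames, so brief glances don't interrupt the video.
function makePresenceDetector(awayFrames = 10, yawLimitDeg = 30) {
  let consecutiveAway = 0;
  return function shouldPause(faceYawDeg: number | null): boolean {
    const away = faceYawDeg === null || Math.abs(faceYawDeg) > yawLimitDeg;
    consecutiveAway = away ? consecutiveAway + 1 : 0;
    return consecutiveAway >= awayFrames;
  };
}
```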
Permissions & Setup
For authentication, the OS shows a system prompt and returns success/failure. For detection, you must request camera permission and respect the user's decision.
iOS · Info.plist
- `NSFaceIDUsageDescription` (auth)
- `NSCameraUsageDescription` (detection)
Android · AndroidManifest.xml
- `android.permission.USE_BIOMETRIC` (auth)
- `android.permission.CAMERA` (detection)
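As config fragments, the declarations look like this (the usage strings are illustrative; write your own, since the OS shows them verbatim in the permission prompt):

```xml
<!-- iOS · Info.plist -->
<key>NSFaceIDUsageDescription</key>
<string>We use Face ID to confirm it's you before opening the app.</string>
<key>NSCameraUsageDescription</key>
<string>The camera is used for live face filters.</string>

<!-- Android · AndroidManifest.xml -->
<uses-permission android:name="android.permission.USE_BIOMETRIC" />
<uses-permission android:name="android.permission.CAMERA" />
```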
Code Examples
Setup
- Expo: install `expo-camera` and `expo-face-detector` (or use the Vision plugin)
- iOS: import Vision and AVFoundation; add NSCameraUsageDescription
- Android: add the ML Kit face detection dependency — either the bundled artifact (`com.google.mlkit:face-detection`) or the Google Play services variant (`com.google.android.gms:play-services-mlkit-face-detection`)
```tsx
import * as React from 'react';
// Note: `onFacesDetected` / `faceDetectorSettings` are props of the classic
// `Camera` component used together with `expo-face-detector` (deprecated as of
// Expo SDK 51) — pin an SDK that still ships it, or use a VisionCamera plugin.
import { Camera, CameraType } from 'expo-camera';
import * as FaceDetector from 'expo-face-detector';
import { View, Text } from 'react-native';

export function FaceCamera() {
  const [perm, request] = Camera.useCameraPermissions();
  const [count, setCount] = React.useState(0);

  // Detection needs the camera permission; ask before rendering the preview.
  if (!perm?.granted) {
    return <Text onPress={() => request()}>Tap to allow camera</Text>;
  }

  return (
    <View style={{ flex: 1 }}>
      <Camera
        style={{ flex: 1 }}
        type={CameraType.front}
        onFacesDetected={({ faces }) => setCount(faces.length)}
        faceDetectorSettings={{
          mode: FaceDetector.FaceDetectorMode.fast,
          detectLandmarks: FaceDetector.FaceDetectorLandmarks.all,
          runClassifications: FaceDetector.FaceDetectorClassifications.all,
        }}
      />
      <Text style={{ position: 'absolute', top: 60, left: 20, color: 'white' }}>
        Faces: {count}
      </Text>
    </View>
  );
}
```

Tip: With Newly, you describe the feature you want and the AI agent wires up the sensor, permissions, and UI for you. Try it free.
Best Practices
Separate authentication from detection
Use OS biometrics for "is the right user here", and detection APIs for "is there a face on screen".
Run detection on-device when you can
Vision and ML Kit run locally, so camera frames never need to leave the device; that is a concrete privacy guarantee you can communicate to users.
Throttle detection
Run face detection every 3-5 frames in low-power mode; full-rate is rarely needed.
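A simple way to do this is a time-based throttle in front of the detector callback (a sketch; the 150 ms default roughly matches processing every few frames at 30 fps):

```typescript
// Process at most one frame per interval; drop the rest.
function makeFrameThrottle(minIntervalMs = 150) {
  let lastRunMs = -Infinity;
  return function shouldProcess(nowMs: number): boolean {
    if (nowMs - lastRunMs < minIntervalMs) return false;
    lastRunMs = nowMs;
    return true;
  };
}
```

Call `shouldProcess(Date.now())` at the top of the frame callback and return early when it is false.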
Be transparent about identity matching
Anything beyond "we counted a face" requires explicit consent and a clear retention policy.
Common Pitfalls
Building custom face-unlock
It's less secure than the OS prompt and may violate platform guidelines.
Mitigation: Always use Face ID / BiometricPrompt for authentication.
Sending faces to a server without consent
Privacy laws (GDPR, BIPA) treat biometric data strictly.
Mitigation: Process on-device whenever possible, or get explicit opt-in.
Ignoring lighting
Face detection accuracy collapses in low or back-lit scenes.
Mitigation: Combine with the ambient light sensor; suggest the user move to better light.
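If you have access to grayscale preview pixels, a crude brightness check can trigger that suggestion before detection starts failing (the threshold is an assumption; tune it against real frames):

```typescript
// Rough low-light check on a grayscale preview frame (pixel values 0-255).
function isTooDark(lumaPixels: Uint8Array, threshold = 40): boolean {
  if (lumaPixels.length === 0) return true; // no data: treat as unusable
  let sum = 0;
  for (let i = 0; i < lumaPixels.length; i++) sum += lumaPixels[i];
  return sum / lumaPixels.length < threshold;
}
```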
When To Use It (And When Not To)
Good fit
- Frictionless re-authentication
- Live AR camera effects
- Liveness checks during onboarding
- Accessibility-driven gaze and presence detection
Look elsewhere if…
- Identifying people without consent
- Identity verification in low-light first-time flows
- Replacing 2FA for highly sensitive systems
- Uploading raw selfies to a server when on-device works
Frequently Asked Questions
Is Face ID a face recognition API?
No — Face ID is a local biometric authenticator. Your app receives only a yes/no, never an image or template.
Can I do 1:N face matching on-device?
Yes for small databases (a few hundred faces) using on-device embeddings; beyond that you generally need a server.
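For a small database, the matching step is just nearest-neighbor search over embedding vectors. A sketch assuming you already have embeddings from an on-device model (the 0.6 similarity threshold is illustrative and model-dependent):

```typescript
// Cosine similarity between two embedding vectors of equal length.
function cosine(a: number[], b: number[]): number {
  let dot = 0, na = 0, nb = 0;
  for (let i = 0; i < a.length; i++) {
    dot += a[i] * b[i];
    na += a[i] * a[i];
    nb += b[i] * b[i];
  }
  return dot / (Math.sqrt(na) * Math.sqrt(nb));
}

// Return the best-matching identity, or null when nothing is close enough.
function bestMatch(
  query: number[],
  db: { id: string; embedding: number[] }[],
  minSimilarity = 0.6
): string | null {
  let best: { id: string; score: number } | null = null;
  for (const entry of db) {
    const score = cosine(query, entry.embedding);
    if (!best || score > best.score) best = { id: entry.id, score };
  }
  return best && best.score >= minSimilarity ? best.id : null;
}
```

Linear scan is fine at a few hundred entries; past that, both accuracy management and index size push you toward a server.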
Do I need a privacy policy disclosure?
Yes. Even on-device processing should be explicitly disclosed if you store or transmit any feature data.
Which is more accurate, Face ID or Android face unlock?
Face ID uses a 3D structured-light depth map and is typically rated higher than 2D camera-only Android face unlock; Android Class 3 (BIOMETRIC_STRONG) implementations close the gap.
Build with Face Recognition on Newly
Ship a face recognition-powered feature this week
Newly turns a description like “add face-unlock auth with face recognition” into a real React Native app — permissions, native modules and UI included. Full source code is yours, and you can publish to the App Store and Google Play directly from the dashboard.
Want a deeper dive on the underlying APIs? See Apple's Vision and LocalAuthentication documentation, Android's BiometricPrompt, and ML Kit Face Detection.
