Infrared Sensors in Mobile Apps
Where infrared shows up on phones — Face ID flood / dot projectors, IR illuminators on Android, IR blasters and depth sensors — and what your app can actually use.
Infrared sensors do a lot of quiet work on a smartphone — Face ID, low-light face unlock, proximity detection, depth sensing, even controlling your TV on a few Android phones. Most of that hardware is locked behind higher-level APIs (Face ID, biometric auth, depth). This guide explains what is actually directly accessible and how to handle the rest.
Key Takeaways
- You almost never read infrared pixels directly. Instead you use the higher-level API that wraps them — Face ID, biometric prompt, depth, proximity.
- A handful of Android phones (Samsung Galaxy S4–S6, HTC One, many Xiaomi models, Huawei Mate / P30 Pro) ship a Consumer IR transmitter (IR blaster) usable via `ConsumerIrManager`.
- iPhones and most modern Androids do not include a public IR-blaster API.
- The TrueDepth flood/dot projectors are infrared and are exposed to your app only through ARKit/Vision face APIs, never as raw IR.
Infrared on Smartphones
What It Is & How It Works
What it is. Several distinct technologies share the "infrared" label: structured-light IR (Face ID), ToF IR (LiDAR / Android ToF), narrow-band IR illuminators for face unlock and night vision, IR blasters for remote control, and IR receivers for proximity detection.
How it works. You don't poke an IR sensor directly. Use the relevant high-level API: LocalAuthentication / BiometricPrompt for face unlock; ARKit/Vision for face geometry; ConsumerIrManager for IR-blaster output on supporting Androids.
Units & signal. For consumer IR (IR blaster), you transmit a pattern of alternating on/off durations, expressed in microseconds, at a fixed carrier frequency (typically 38 kHz). For everything else, the API hides physical units from you.
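As a concrete sketch of that pulse format, the widely used NEC remote protocol encodes a one-byte address and one-byte command as exactly the kind of microsecond on/off array that `ConsumerIrManager.transmit()` expects. The helper below is illustrative (the class and method names are ours, not from any SDK):

```java
class NecEncoder {
    /** Builds a ConsumerIrManager-style pattern (alternating on/off
     *  microsecond durations) for the common NEC remote protocol. */
    static int[] buildNecPattern(int address, int command) {
        int[] pattern = new int[2 + 4 * 8 * 2 + 1]; // leader + 32 bits + stop
        int i = 0;
        pattern[i++] = 9000;   // 9 ms leader burst
        pattern[i++] = 4500;   // 4.5 ms leader space
        // NEC payload: address, inverted address, command, inverted command
        int[] payload = { address, ~address & 0xFF, command, ~command & 0xFF };
        for (int b : payload) {
            for (int bit = 0; bit < 8; bit++) {     // NEC sends LSB first
                pattern[i++] = 560;                 // every bit starts with a 560 µs burst
                pattern[i++] = ((b >> bit) & 1) == 1 ? 1690 : 560; // long space = 1
            }
        }
        pattern[i] = 560;      // trailing stop burst
        return pattern;
    }
}

// On a device with an emitter you would then send it at 38 kHz, e.g.:
// ConsumerIrManager ir = (ConsumerIrManager)
//         context.getSystemService(Context.CONSUMER_IR_SERVICE);
// if (ir != null && ir.hasIrEmitter()) {
//     ir.transmit(38000, NecEncoder.buildNecPattern(0x20, 0x0F));
// }
```

The resulting array always has 67 entries: the two-element leader, 32 bits at two durations each, and the stop burst.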
What You Can Build With It
Face unlock / Face ID
Authenticate the user with the secure enclave; you receive a yes/no, not images.
Example: A banking app gating access behind Face ID.
Remote control (IR blaster)
Send IR codes to TVs, ACs and projectors on supported Android devices.
Example: A universal remote app for older Samsung Galaxy phones.
Depth and AR
TrueDepth and the ARCore Depth API use IR indirectly — TrueDepth via its dot projector, ARCore via a ToF sensor where one exists (falling back to depth-from-motion) — and your app reads the resulting depth map.
Example: A try-on glasses app using TrueDepth depth via ARKit.
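When a hardware depth frame does reach your code on Android, it typically arrives as `ImageFormat.DEPTH16`: each 16-bit sample packs a millimeter range in its low 13 bits and a 3-bit confidence code in its high bits. A minimal decoder, following the bit layout documented for that format (class and method names are ours):

```java
class Depth16 {
    /** Low 13 bits of a DEPTH16 sample: range in millimeters. */
    static int rangeMm(short sample) {
        return sample & 0x1FFF;
    }

    /** High 3 bits: confidence code. Per the DEPTH16 format, code 0 means
     *  "no confidence assigned" (treat as fully confident); otherwise the
     *  fraction is (code - 1) / 7. */
    static float confidence(short sample) {
        int code = (sample >> 13) & 0x7;
        return code == 0 ? 1.0f : (code - 1) / 7.0f;
    }
}
```

For example, a raw sample of `0xE123` decodes to a range of `0x0123` (291 mm) with confidence code 7.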
Proximity / pocket detection
The infrared proximity sensor surfaces as a near/far reading: Android nominally reports a distance in centimeters, but most sensors only ever return two values.
Example: A walkie-talkie app blanking the screen near the user's ear.
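On Android the sensor behind this is exposed through `SensorManager` as `Sensor.TYPE_PROXIMITY`. Because most units are effectively binary — they report either roughly zero or their maximum range — the usual interpretation is "anything below max range counts as near". A sketch (the helper name is ours):

```java
class ProximityGate {
    /** Most proximity sensors are binary: they report ~0 cm or their
     *  maximum range. Treat any reading below max range as "near". */
    static boolean isNear(float distanceCm, float maxRangeCm) {
        return distanceCm < maxRangeCm;
    }
}

// Wire-up on a device (sketch):
// Sensor s = sensorManager.getDefaultSensor(Sensor.TYPE_PROXIMITY);
// In onSensorChanged(SensorEvent event):
//     boolean near = ProximityGate.isNear(event.values[0], s.getMaximumRange());
//     if (near) blankScreen();
```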
Permissions & Setup
Face ID requires `NSFaceIDUsageDescription` plus the LocalAuthentication prompt. Consumer IR on Android requires the manifest permission `TRANSMIT_IR`; it is granted at install time and never shows a runtime prompt.
iOS · Info.plist
NSFaceIDUsageDescription (Face ID)
NSCameraUsageDescription (TrueDepth, indirectly)
Android · AndroidManifest.xml
android.permission.TRANSMIT_IR (Consumer IR, where present)
Code Examples
Setup
- iOS: import LocalAuthentication for Face ID; ARKit / Vision for IR depth
- Android: check `getSystemService(Context.CONSUMER_IR_SERVICE)`; declare `TRANSMIT_IR` in the manifest
- Cross-platform: there is no portable IR API; use platform-specific code
```js
// Face ID auth via expo-local-authentication
import * as LocalAuthentication from 'expo-local-authentication';

export async function authenticate() {
  const supported = await LocalAuthentication.hasHardwareAsync();
  const enrolled = await LocalAuthentication.isEnrolledAsync();
  if (!supported || !enrolled) return false;
  const result = await LocalAuthentication.authenticateAsync({
    promptMessage: 'Unlock with Face ID',
  });
  return result.success;
}
```

Tip: With Newly, you describe the feature you want and the AI agent wires up the sensor, permissions, and UI for you. Try it free.
Best Practices
Use the abstraction, not the chip
BiometricPrompt and LocalAuthentication know about hardware enrolment, fallback PINs, error states. Don't reinvent them.
Detect IR-blaster availability before showing a remote UI
`hasIrEmitter()` is a one-line check; missing UI beats a broken remote.
Never store raw biometric data
You never even receive it. Both platforms keep IR captures inside the secure enclave.
Document privacy expectations
If you use Face ID or face mesh, explain in plain English what you do and don't store.
Common Pitfalls
Trying to capture raw IR camera frames
Public APIs do not expose them on either platform.
Mitigation: Reframe the feature in terms of depth, face geometry or biometric authentication.
Assuming every Android has an IR blaster
Only a small minority of phones on the market today include one.
Mitigation: Hide the feature unless `hasIrEmitter()` returns true.
Bypassing biometric prompts
Trying to skip the system-provided prompts is a fast track to App Store rejection.
Mitigation: Always use the official LocalAuthentication / BiometricPrompt flows.
When To Use It (And When Not To)
Good fit
- Biometric authentication via the OS prompts
- Universal remote apps targeting IR-blaster Android phones
- Indirect IR usage through depth or face APIs
- Privacy-respecting, system-mediated face / fingerprint flows
Look elsewhere if…
- Capturing raw IR images on iOS or modern Android
- Cross-platform IR remote-control features
- Building bespoke face authentication
- Long-range IR communication on mass-market phones
Frequently Asked Questions
Can I read images from the IR camera?
Not on iOS, and not from a public API on Android. The IR sensor is reserved for the OS's biometric and depth pipelines.
Does the iPhone have an IR blaster?
No iPhone has ever shipped a consumer IR transmitter.
How do I use Face ID?
Use the LocalAuthentication framework on iOS or expo-local-authentication on React Native — that's the entire API.
Where can I get IR codes for TVs?
The LIRC remote database is the canonical open-source collection. Many remote SDKs (Logitech Harmony, AnyMote) also provide tested code sets.
Build with the Infrared Sensor on Newly
Ship an infrared-sensor-powered feature this week
Newly turns a description like “use the infrared sensor to face unlock / face id” into a real React Native app — permissions, native modules and UI included. Full source code is yours, and you can publish to the App Store and Google Play directly from the dashboard.
Want a deeper dive on the underlying APIs? See Expo Sensors, Apple Core Motion and Android sensor framework.
