LiDAR Scanner in Mobile Apps

Use the LiDAR scanner on iPhone Pro and iPad Pro for high-precision depth, AR mesh, instant placement and 3D scanning.

Timothy Lindblom

Founder, Newly

LiDAR — Light Detection And Ranging — is a time-of-flight sensor that fires invisible infrared dots and measures how long they take to bounce back. Apple ships LiDAR on Pro iPhones and Pro iPads, where it powers instant AR placement, scene reconstruction and the RoomPlan API. This guide is iOS-first because Android phones do not yet ship a comparable consumer LiDAR.


Key Takeaways

  • LiDAR returns a true 3D depth map at up to ~5 m, indoors and outdoors, even in darkness (it emits its own infrared light).
  • It is exposed through ARKit (`ARWorldTrackingConfiguration.sceneReconstruction = .mesh`) and RoomPlan, not as a raw sensor.
  • There is no public Android equivalent today — most Android depth comes from ToF or ARCore Depth API.
  • Combine LiDAR with the rear camera to get coloured meshes you can save as USDZ for AR Quick Look.

LiDAR at a Glance

  • Practical range: ~5 m
  • Typical accuracy: cm-level
  • iPhone / iPad availability: Pro models only
  • Shipping on Android today: No

What It Is & How It Works

What it is. A solid-state time-of-flight scanner that emits an array of IR dots and measures their return time. It produces a sparse depth point cloud the OS densifies into a continuous depth map and an AR mesh.

How it works. In ARKit, you opt in to scene reconstruction. The framework returns `ARMeshAnchor` objects covering the room as the user pans the device. You can also read the per-frame depth map for finer-grained access.

Units & signal. Depth in metres. Mesh in 3D world coordinates. Confidence per face (low/medium/high).
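A minimal sketch of that opt-in, assuming a RealityKit `ARView` named `arView` (all names here are illustrative):

```swift
import ARKit
import RealityKit

func startScanning(on arView: ARView) {
    let config = ARWorldTrackingConfiguration()
    // Opt in to the LiDAR mesh; gate it so non-Pro devices still run.
    if ARWorldTrackingConfiguration.supportsSceneReconstruction(.mesh) {
        config.sceneReconstruction = .mesh
    }
    // Per-frame depth map (plus confidence) for finer-grained access.
    if ARWorldTrackingConfiguration.supportsFrameSemantics(.sceneDepth) {
        config.frameSemantics.insert(.sceneDepth)
    }
    arView.session.run(config)
}

// Later, per frame: depth in metres as a Float32 CVPixelBuffer.
// let depth = arView.session.currentFrame?.sceneDepth?.depthMap
// let confidence = arView.session.currentFrame?.sceneDepth?.confidenceMap
```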

What You Can Build With It

Instant AR placement

Thanks to LiDAR, you can drop virtual objects into the scene with no "scan the floor" pre-step.

Example: A furniture preview app that places a chair the moment the camera opens.
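A sketch of that placement via a raycast from the screen centre, assuming an `ARView` running with scene reconstruction enabled and a `chair` model loaded elsewhere (hypothetical asset):

```swift
import RealityKit

func placeChair(in arView: ARView, chair: ModelEntity) {
    let centre = CGPoint(x: arView.bounds.midX, y: arView.bounds.midY)
    // With LiDAR, this hit-tests real geometry immediately, no plane scan needed.
    guard let hit = arView.raycast(from: centre,
                                   allowing: .estimatedPlane,
                                   alignment: .horizontal).first else { return }
    let anchor = AnchorEntity(world: hit.worldTransform)
    anchor.addChild(chair)
    arView.scene.addAnchor(anchor)
}
```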

Room-scale 3D scanning

Use RoomPlan to capture floor plans and 3D layouts of rooms.

Example: A real-estate listing tool that captures a walkable 3D tour.
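A minimal RoomPlan sketch: `RoomCaptureView` drives the entire guided-scan UI for you (check `RoomCaptureSession.isSupported` first on non-LiDAR devices):

```swift
import RoomPlan
import UIKit

final class ScanViewController: UIViewController {
    private var captureView: RoomCaptureView!

    override func viewDidLoad() {
        super.viewDidLoad()
        captureView = RoomCaptureView(frame: view.bounds)
        view.addSubview(captureView)
        // Start the guided room scan with default settings.
        captureView.captureSession.run(configuration: RoomCaptureSession.Configuration())
    }

    func finishScan() {
        // Results are delivered through the view's delegate callbacks.
        captureView.captureSession.stop()
    }
}
```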

Precise measuring

Measure walls, doors, distance to objects with cm-level accuracy.

Example: A "measure my window" app for blinds.
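Measuring reduces to a raycast plus a distance calculation; a sketch, assuming an `ARView` with an active session:

```swift
import RealityKit
import simd

// Distance in metres from the camera to whatever is under the screen centre.
func distanceToCentre(in arView: ARView) -> Float? {
    let centre = CGPoint(x: arView.bounds.midX, y: arView.bounds.midY)
    guard let hit = arView.raycast(from: centre,
                                   allowing: .estimatedPlane,
                                   alignment: .any).first else { return nil }
    let p = hit.worldTransform.columns.3          // hit point in world space
    let camera = arView.cameraTransform.translation
    return simd_distance(camera, SIMD3(p.x, p.y, p.z))
}
```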

Object occlusion & physics

Hide virtual objects behind real geometry; bounce balls off the actual sofa.

Example: A children's game where physics objects collide with real walls.
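With scene reconstruction running, RealityKit can use the LiDAR mesh directly for occlusion and as static collision geometry, a one-liner per behaviour on the `ARView`:

```swift
import RealityKit

func enableSceneUnderstanding(on arView: ARView) {
    arView.environment.sceneUnderstanding.options.insert(.occlusion) // hide virtual behind real
    arView.environment.sceneUnderstanding.options.insert(.physics)   // bounce off real surfaces
    arView.environment.sceneUnderstanding.options.insert(.collision) // allow contact/raycast queries
}
```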

Permissions & Setup

LiDAR access goes through ARKit and the camera permission. There is no separate "LiDAR permission".

iOS · Info.plist

  • NSCameraUsageDescription
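For example (the description string is whatever fits your app):

```xml
<key>NSCameraUsageDescription</key>
<string>We use the camera and LiDAR scanner to scan your room.</string>
```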

Android · AndroidManifest.xml

  • n/a — no consumer LiDAR

Code Examples

Setup

  • Hardware: iPhone 12 Pro / 13 Pro / 14 Pro / 15 Pro / 16 Pro and any iPad Pro since 2020
  • iOS: import `ARKit`, set `sceneReconstruction = .mesh`
  • Cross-platform: there is no portable LiDAR API; for Android use ARCore Depth API as a substitute
// LiDAR is iOS-only. The easiest path from React Native is a thin native
// module that wraps ARKit. Below is a sketch assuming a hypothetical
// Vision Camera community depth extension, if you only need a depth map.
import { StyleSheet } from 'react-native';
import { Camera, useCameraDevice } from 'react-native-vision-camera';

export function LidarPreview() {
  const device = useCameraDevice('back');
  if (device == null) return null;
  return (
    <Camera
      style={StyleSheet.absoluteFill} // Camera requires an explicit style
      device={device}
      isActive={true}
      pixelFormat="depth" // hypothetical, provided by the community extension
    />
  );
}

Tip: With Newly, you describe the feature you want and the AI agent wires up the sensor, permissions, and UI for you. Try it free.

Best Practices

  • Use RoomPlan for floor plans

    The RoomPlan API does walls/doors/windows segmentation for you. Don't roll your own from raw mesh.

  • Throttle mesh updates

    Mesh anchors arrive every few frames. Re-process only when the new mesh adds significant area.

  • Mind the 5 m range

    Anything past ~5 m is unreliable; show a "move closer" hint when scanning a large area.

  • Provide a non-LiDAR fallback

    Most users will be on a non-Pro device. Detect availability and fall back to plane detection.
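The throttling advice above can be sketched as an `ARSessionDelegate` that skips mesh updates which barely grew (the ~10% threshold is an assumption, tune it for your workload):

```swift
import ARKit

final class MeshThrottler: NSObject, ARSessionDelegate {
    private var lastFaceCount: [UUID: Int] = [:]

    func session(_ session: ARSession, didUpdate anchors: [ARAnchor]) {
        for case let mesh as ARMeshAnchor in anchors {
            let faces = mesh.geometry.faces.count
            let previous = lastFaceCount[mesh.identifier] ?? 0
            // Skip updates that add less than ~10% new geometry.
            guard faces > previous + max(previous / 10, 1) else { continue }
            lastFaceCount[mesh.identifier] = faces
            // ...expensive re-processing goes here
        }
    }
}
```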

Common Pitfalls

Treating LiDAR as available everywhere

Only a fraction of iOS users have it; the majority of devices in the wild are non-Pro.

Mitigation: Always check `ARWorldTrackingConfiguration.supportsSceneReconstruction(.mesh)` before relying on it.
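That check, with the plane-detection fallback from the best practices above, looks like:

```swift
import ARKit

// Gate on LiDAR support and fall back to classic plane detection.
func makeConfiguration() -> ARWorldTrackingConfiguration {
    let config = ARWorldTrackingConfiguration()
    if ARWorldTrackingConfiguration.supportsSceneReconstruction(.mesh) {
        config.sceneReconstruction = .mesh
    } else {
        config.planeDetection = [.horizontal, .vertical] // non-LiDAR fallback
    }
    return config
}
```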

Mesh holes in shiny / black surfaces

Mirrors, glass and matte-black objects absorb or reflect IR oddly.

Mitigation: Show the live mesh during scanning so the user can re-scan problem areas.

Battery and thermal cost

LiDAR + AR mesh is one of the most expensive camera modes you can run.

Mitigation: Run for as short as possible; respect `ProcessInfo.thermalState` and stop at `serious`.
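A sketch of that thermal guard, assuming you hold a reference to the running `ARSession`:

```swift
import ARKit
import Foundation

var thermalToken: NSObjectProtocol? // keep the token alive for the observer's lifetime

func observeThermalState(for session: ARSession) {
    thermalToken = NotificationCenter.default.addObserver(
        forName: ProcessInfo.thermalStateDidChangeNotification,
        object: nil,
        queue: .main
    ) { _ in
        switch ProcessInfo.processInfo.thermalState {
        case .serious, .critical:
            session.pause() // stop scanning before the OS throttles you
        default:
            break
        }
    }
}
```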

Privacy expectations

Scanning a room captures 3D layout that is potentially identifying.

Mitigation: Document on-device processing; don't upload meshes without explicit consent.

When To Use It (And When Not To)

Good fit

  • High-end AR experiences where Pro devices are the target
  • Room-scale 3D scanning, real estate, interior design
  • Indoor measurement apps requiring cm accuracy
  • AR experiences that benefit from instant placement

Look elsewhere if…

  • Apps that must ship to every device — LiDAR is rare
  • Long-range outdoor scanning beyond 5 m
  • Cross-platform projects without an Android substitute plan
  • High-precision face capture (use TrueDepth for faces)

Frequently Asked Questions

Which devices have LiDAR?

iPhone 12 Pro and later Pro models, plus all iPad Pro models since 2020. No Pro = no LiDAR.

Do Android phones have LiDAR?

Not in the same form. Some have time-of-flight sensors (similar idea, lower fidelity), but there is no equivalent consumer API surface.

How accurate is the depth?

On the order of 1–2 cm for surfaces 0.3–3 m away. Falls off rapidly past 5 m and on shiny / dark materials.

Can I export the scan?

Yes. `ARMeshAnchor` geometry can be converted to a ModelIO `MDLAsset` and exported as USDZ (or OBJ), ready for AR Quick Look or sharing via the Files app.

Build with the LiDAR Scanner on Newly

Ship a LiDAR scanner-powered feature this week

Newly turns a description like “use the LiDAR scanner for instant AR placement” into a real React Native app — permissions, native modules and UI included. Full source code is yours, and you can publish to the App Store and Google Play directly from the dashboard.

Start Building Your App

Want a deeper dive on the underlying APIs? See Expo Sensors, Apple Core Motion and the Android sensor framework.
