Vision SDK for iOS
v0.11.0
- AI and AR features for drivers that run on today’s mobile and embedded applications
- Build augmented reality navigation with turn-by-turn directions and custom objects
- Create custom alerts for speeding, lane departures, tailgating, and more
- Neural networks run on-device: real-time performance without taxing your data plan
The Mapbox Vision SDK for iOS is a library for interpreting road scenes in real-time directly on iOS devices using the device’s built-in camera. Features include:
- Classification and display of regulatory and warning signs
- Object detection for vehicles, pedestrians, road signs, and traffic lights
- Semantic segmentation of the roadway into 10 different classes
- Augmented reality navigation with global coverage
- Support for external cameras: WiFi or wired connection
The Vision SDK for iOS is composed of three libraries you can interact with directly: Vision, Vision AR, and Vision Safety. All three depend on the Vision Core module.
- Vision (`Vision`) is the primary library, needed for any application of Mapbox Vision. Its components enable camera configuration; display of classification, detection, and segmentation layers; lane feature extraction; and other interfaces. Vision accesses real-time inference running in Vision Core.
- Vision AR (`VisionAR`) is an add-on module for Vision used to create customizable augmented reality experiences. It allows configuration of the user’s route visualization: lane material (shaders, textures), lane geometry, occlusion, custom objects, and more. Read more in the AR navigation guide.
- Vision Safety (`VisionSafety`) is an add-on module for Vision used to create customizable alerts for speeding, nearby vehicles, cyclists, pedestrians, lane departures, and more. Read more in the Safety alerts guide.
Vision Core is the core logic of the system, including all machine learning models. Importing any of the Vision-related modules listed above into your project automatically brings Vision Core in as well.
The Vision SDK for iOS is written in Swift 4.2 and can be used with iOS 11.2 and higher on iPhone 6s or newer.
Use of the Vision SDK requires that the device be mounted with a clear view of the road. We strongly recommend using a dashboard or windshield mount to keep your phone oriented correctly while you drive. We have tested a few options and have seen positive results with two mounts (option 1 and option 2).
To set up the Vision SDK you will need to download the SDK, install the frameworks relevant to your project, and complete a few configuration steps.
You must download the relevant frameworks from vision.mapbox.com/install before continuing. You can download the framework directly or import it into your project with CocoaPods or Carthage. This will require that you are logged into your Mapbox account.
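If you install with CocoaPods, the dependencies can be declared in your Podfile. The sketch below assumes the pod names match the module names and uses a hypothetical target name, `MyVisionApp`; confirm the exact pod names against the install page.

```ruby
# Podfile
platform :ios, '11.2'

target 'MyVisionApp' do
  use_frameworks!
  pod 'MapboxVision'     # core Vision library (required)
  pod 'MapboxVisionAR'   # optional: AR navigation
  pod 'MapboxVisionSafety' # optional: safety alerts
end
```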
After downloading or importing the SDK into your project, configure the following in your Xcode project.
Mapbox APIs require a Mapbox account and access token.
- Get an access token from the Mapbox account page.
- In the project editor, select the application target, then go to the Info tab.
- Under the “Custom iOS Target Properties” section, set `MGLMapboxAccessToken` to your access token.
- In order for the SDK to track the user’s location, set `NSLocationWhenInUseUsageDescription` to a description of location usage.
- In order for the SDK to access the camera, set `NSCameraUsageDescription` to a description of camera usage.
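Taken together, the Info.plist entries above look like the following. The token value and the usage-description strings are placeholders; write descriptions that reflect your app’s actual use.

```xml
<key>MGLMapboxAccessToken</key>
<string>YOUR_MAPBOX_ACCESS_TOKEN</string>
<key>NSLocationWhenInUseUsageDescription</key>
<string>This app uses your location to provide navigation and safety alerts.</string>
<key>NSCameraUsageDescription</key>
<string>This app uses the camera to analyze the road scene in real time.</string>
```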
Required: Import the relevant modules; `MapboxVision` is required.

```swift
import MapboxVision
// OPTIONAL: include Vision AR functionality
import MapboxVisionAR
// OPTIONAL: include Vision Safety functionality
import MapboxVisionSafety
```
Required: Initialize a video source and create an instance of `VisionManager` with it.

```swift
let videoSource = CameraVideoSource()
let visionManager = VisionManager.create(videoSource: videoSource)
```
Optional: If you want to subscribe to Vision events, set the delegate on `VisionManager`.

```swift
// `self` should implement the `VisionManagerDelegate` protocol
visionManager.delegate = self
```
Optional: If you want to subscribe to AR events, create a `VisionARManager` and set its delegate.

```swift
// Create the AR module
let visionARManager = VisionARManager.create(visionManager: visionManager)
// `self` should implement the `VisionARManagerDelegate` protocol
visionARManager.delegate = self
```
Optional: If you want to subscribe to Safety events, create a `VisionSafetyManager` and set its delegate.

```swift
// Create the Safety module
let visionSafetyManager = VisionSafetyManager.create(visionManager: visionManager)
// `self` should implement the `VisionSafetyManagerDelegate` protocol
visionSafetyManager.delegate = self
```
Required: Control the flow of events with the video source and your `VisionManager` instance.

```swift
// Start receiving events
videoSource.start()
visionManager.start()

// Stop receiving events
visionManager.stop()
```
Required: Clean up resources when you no longer need them.

```swift
videoSource.stop()
visionManager.stop()
// AR and/or Safety modules should be destroyed first
visionARManager.destroy()
visionSafetyManager.destroy()
// Finally, destroy the `VisionManager` instance
visionManager.destroy()
```
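The required steps above can be tied to a view controller’s lifecycle. The following is a minimal sketch for a UIKit app, using only the calls shown in this guide; the class name and lifecycle placement are illustrative, not prescribed by the SDK.

```swift
import UIKit
import MapboxVision

class VisionViewController: UIViewController {
    private var videoSource: CameraVideoSource!
    private var visionManager: VisionManager!

    override func viewDidLoad() {
        super.viewDidLoad()
        // Create the video source and the Vision manager once
        videoSource = CameraVideoSource()
        visionManager = VisionManager.create(videoSource: videoSource)
    }

    override func viewWillAppear(_ animated: Bool) {
        super.viewWillAppear(animated)
        // Start receiving frames and Vision events
        videoSource.start()
        visionManager.start()
    }

    override func viewDidDisappear(_ animated: Bool) {
        super.viewDidDisappear(animated)
        // Stop processing while the screen is not visible
        videoSource.stop()
        visionManager.stop()
    }

    deinit {
        // Release Vision resources when the controller goes away
        visionManager?.destroy()
    }
}
```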
After installing the framework, you will need to set up the device in the vehicle. Some things to consider when choosing and setting up a mount:
- Generally, shorter mounts vibrate less. Mounting to the windshield or directly to the dashboard are both options.
- Place the phone near or behind your rearview mirror. Note that your local jurisdiction may have limits on where mounts may be placed.
- Make sure the phone’s camera view is unobstructed (you will be able to tell with any of the video screens open).
Read more about setting up your development environment for testing the capabilities of the Vision SDK in the Testing and development guide.