
Vision SDK for iOS

Current version: v0.13.1

  • AR navigation
  • Scene segmentation
  • Sign detection
  • Safety alerts
  • Object detection
  • Lane detection
  • External camera support

The Mapbox Vision SDK is a software library that uses powerful neural networks running on today’s devices to understand the roadway in real time.

The main SDK features are:

  • AR navigation: Build navigation with global coverage, turn-by-turn directions, and custom objects in augmented reality
  • Scene segmentation: Expose clearly distinguished cars, lanes, curbs, and other road surfaces like sidewalks
  • Sign detection: Recognize road signs for speed limits, construction, turn restrictions, and more (see all sign types)
  • Safety alerts: Create custom alerts for speeding, tailgating, and potential collisions
  • Object detection: Detect and track nearby vehicles, bicycles, pedestrians, road signs, construction cones, and traffic lights
  • Lane detection: Detect the vehicle’s current lane to enable lane-level navigation
  • External camera support: Integrate custom camera implementations

SDK structure

The Vision SDK is composed of three frameworks you can interact with: Vision, Vision AR, and Vision Safety. Both Vision AR and Vision Safety depend on the Vision framework.

  • The Vision framework is the primary library, needed for any application that integrates Mapbox Vision. Its components enable camera configuration, display of classification, detection, and segmentation layers, lane feature extraction, and other interfaces. The Vision SDK's segmentation provides developers with the following pieces of lane information: number of lanes, lane widths, lane edge types, and directions of travel for each lane. A set of points describing each lane edge is also available.

  • The Vision AR framework is an add-on module for Vision used to create customizable augmented reality experiences. It allows configuration of the user’s route visualization: AR lane, AR fence, and their materials (shaders, textures), geometry and occlusion. Read more in the AR guide.

  • The Vision Safety framework is an add-on module for Vision used to create customizable alerts for speeding, nearby vehicles, cyclists, pedestrians, lane departures, and more. It notifies drivers about road conditions and potential hazards. For example, developers can use sign classification to track speed limits and other critical signage, keep the most recently observed speed limit, and trigger an alert when the vehicle’s detected speed exceeds that limit. Read more in the Safety alerts guide.
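The speed-alert flow just described can be sketched in plain Swift. This is a minimal, self-contained model of the logic only; the type and method names are illustrative assumptions, not the SDK's actual API, and in practice the speed limit and detected speed arrive through Vision Safety's own interfaces.

```swift
import Foundation

// Illustrative sketch of the speed-alert logic: remember the most recently
// observed speed limit and flag detected speeds that exceed it.
// (All names here are assumptions, not the Vision Safety API.)
struct SpeedAlertMonitor {
    // Most recently observed speed limit, in meters per second.
    private(set) var lastObservedLimit: Double?

    // Called when sign classification recognizes a new speed limit sign.
    mutating func didObserveSpeedLimit(_ limit: Double) {
        lastObservedLimit = limit
    }

    // True when the detected vehicle speed exceeds the last observed limit.
    // An optional tolerance avoids alerting on marginal overshoot.
    func shouldAlert(detectedSpeed: Double, tolerance: Double = 0) -> Bool {
        guard let limit = lastObservedLimit else { return false }
        return detectedSpeed > limit + tolerance
    }
}
```

Before any limit has been observed the monitor stays silent, which matches the "most recently observed" behavior described above.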

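The lane information listed above for the Vision framework (number of lanes, lane widths, edge types, directions of travel, and the points describing each edge) can be modeled roughly as follows. The types and property names below are illustrative assumptions for clarity, not the SDK's actual classes.

```swift
import Foundation

// Illustrative model of the lane information Vision's segmentation exposes.
// (Type and property names are assumptions, not the SDK's actual API.)
enum LaneEdgeType {
    case solid
    case dashed
    case curb
    case unknown
}

enum LaneDirection {
    case forward
    case backward
}

struct LaneEdge {
    let type: LaneEdgeType
    // Polyline of points describing the edge, in a road-relative frame.
    let points: [(x: Double, y: Double)]
}

struct Lane {
    let width: Double            // meters
    let direction: LaneDirection
    let leftEdge: LaneEdge
    let rightEdge: LaneEdge
}

struct RoadDescription {
    let lanes: [Lane]
    var laneCount: Int { lanes.count }
}
```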

The Mapbox Vision SDK for iOS uses Swift 4.2 and can be used with iOS 11.2 and higher on iPhone 6s or newer.

To work with only the foundational components of Vision (segmentation, detection, and classification layers), developers need only import the Vision SDK.
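As an illustration, a CocoaPods setup for the foundational components only might look like the sketch below. The pod names and minimum platform version shown are assumptions; consult the installation guide for the current ones.

```ruby
# Podfile — include only the base Vision framework when the AR and Safety
# add-ons are not needed (pod names are illustrative assumptions).
platform :ios, '11.2'

target 'MyApp' do
  use_frameworks!

  pod 'MapboxVision'
  # Add-on modules, only if needed:
  # pod 'MapboxVisionAR'
  # pod 'MapboxVisionSafety'
end
```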

Use of the Vision SDK in production requires that the device be positioned with a clear view of the road. We strongly recommend using a dashboard or windshield mount to keep your phone oriented correctly while you drive.

We have tested a few options and have seen positive results with two mounts (option 1 and option 2).

During development you can set up a testing environment. Find more details in the Testing and development section.

Documentation for the Mapbox Vision SDK for iOS comes in the form of examples, tutorials, and an API reference.

If you can't find what you're looking for, reach out to our support team.


While the Vision SDK is using the camera, you must display the Mapbox watermark on screen. Read more about attribution requirements in our terms of service.
