
Draw safety alerts using the Mapbox Vision SDK for iOS

Prerequisites

Familiarity with Xcode and Swift, and completion of the Mapbox Vision SDK for iOS Install and configure guide.

The Mapbox Vision SDK for iOS is a library for interpreting road scenes in real time directly on iOS devices using the device’s built-in camera. The Vision SDK detects many types of objects including cars, people, road signs, and more.

In this tutorial, you'll learn how to use the Vision Safety module from the Mapbox Vision SDK and apply it to detect overspeeding and potential collisions with cars.

Note

In the example used in this tutorial, all custom alerts have predefined images. In your own application, you can use your own images.

Getting started

Here are the resources that you need before getting started:

  • An application including Mapbox Vision SDK for iOS. Before starting this tutorial, go through the Install and configure steps in the Vision SDK for iOS documentation. This will walk you through how to install the Vision SDK and configure your application.
  • Recorded session. This tutorial is based on a recorded driving session through a city and on the replay capabilities of MapboxVision. You will use VisionReplayManager to play back the recorded session.
    • Check our Testing and development guide to familiarize yourself with record and replay functionality.
    • You can download the recorded session used in this tutorial below.
Download sample session

To configure your application with a prerecorded session:

  1. Unzip the contents to a folder on your local machine.
  2. Go to your Xcode project's Info.plist and set UIFileSharingEnabled to YES to enable file sharing through Finder.
  3. Install the app on the device (⌘ + R).
  4. Connect your device, choose it in Finder under the Locations section, and select the Files tab.
  5. Drag and drop the folder with the recorded session onto your app. The session is now available in the Documents folder inside the app container.
  6. In your code, use the VisionReplayManager.create(recordPath:) method to create an instance of VisionReplayManager by providing the path to the recorded session. A sketch of locating and verifying this path follows the list.
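
Before creating the replay manager, you may want to confirm that the session folder actually arrived in the app's Documents directory. The following is a minimal sketch using FileManager; the helper name recordedSessionPath is only illustrative, and the folder name safety-alerts-drawing matches the path used later in this tutorial.

private func recordedSessionPath() -> String? {
    // Documents directory of the app container (where Finder drops the files)
    guard let documentsPath = NSSearchPathForDirectoriesInDomains(.documentDirectory,
                                                                  .userDomainMask,
                                                                  true).first else { return nil }
    let path = documentsPath.appending("/safety-alerts-drawing")

    // make sure the recorded session folder was actually copied onto the device
    var isDirectory: ObjCBool = false
    guard FileManager.default.fileExists(atPath: path, isDirectory: &isDirectory),
          isDirectory.boolValue else { return nil }

    return path
}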

Configure Vision SDK lifecycle

In this tutorial, you'll use VisionReplayManager to run a prerecorded session (which includes video and telemetry data) and find speed signs and cars in the video. The VisionReplayManager class is the main object for registering for events from the Vision SDK and controlling its delivery. For production applications or testing in a live environment, use VisionManager instead of VisionReplayManager. See the Next steps section for details.

To set up the Vision SDK:

  1. Create a VisionReplayManager instance with a recorded session path.

     private var visionManager: VisionReplayManager!

     // Documents directory path with files uploaded via Finder
     let documentsPath =
         NSSearchPathForDirectoriesInDomains(.documentDirectory,
                                             .userDomainMask,
                                             true).first!
     let path = documentsPath.appending("/safety-alerts-drawing")

     // create VisionReplayManager with a path to recorded session
     visionManager = try? VisionReplayManager.create(recordPath: path)

  2. Register its delegate.

     // register its delegate
     visionManager.delegate = self

  3. Create an instance of VisionSafetyManager configured with the VisionReplayManager instance.

     // create VisionSafetyManager and register as its delegate to receive safety related events
     visionSafetyManager = VisionSafetyManager.create(visionManager: visionManager)

  4. Register its delegate to receive safety related events.

     // register its delegate
     visionSafetyManager.delegate = self

  5. Create a VisionPresentationViewController and configure it with VisionReplayManager to display camera frames.

     private let visionViewController = VisionPresentationViewController()

     // configure Vision view to display sample buffers from video source
     visionViewController.set(visionManager: visionManager)
     // add Vision view as a child view
     addVisionView()

     private func addVisionView() {
         addChild(visionViewController)
         view.addSubview(visionViewController.view)
         visionViewController.didMove(toParent: self)
     }

  6. Start delivering events by calling start on VisionReplayManager.

     override func viewWillAppear(_ animated: Bool) {
         super.viewWillAppear(animated)

         visionManager.start()
     }

  7. Stop delivering events by calling stop on VisionReplayManager.

     override func viewDidDisappear(_ animated: Bool) {
         super.viewDidDisappear(animated)

         visionManager.stop()
     }
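
Besides starting and stopping event delivery, release the Vision SDK objects when the view controller is deallocated. The complete example at the end of this tutorial does this in deinit:

deinit {
    // free up VisionSafetyManager's resources
    visionSafetyManager.destroy()

    // free up VisionManager's resources
    visionManager.destroy()
}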

Set up views to draw alerts

Overspeeding alert

To show an overspeeding alert you'll use a UIView element named alertOverspeedingView:

 
private var alertOverspeedingView: UIView!

private func addOverspeedingAlertView() {
    alertOverspeedingView = UIImageView(image: UIImage(named: "alert"))
    alertOverspeedingView.isHidden = true
    alertOverspeedingView.translatesAutoresizingMaskIntoConstraints = false
    view.addSubview(alertOverspeedingView)
    NSLayoutConstraint.activate([
        alertOverspeedingView.topAnchor
            .constraint(equalToSystemSpacingBelow: view.safeAreaLayoutGuide.topAnchor, multiplier: 1),
        view.safeAreaLayoutGuide.trailingAnchor
            .constraint(equalToSystemSpacingAfter: alertOverspeedingView.trailingAnchor, multiplier: 1)
    ])
}

Collision detections

You will draw a collision state using a custom CollisionDetectionView with a custom border color to show a bounding box around potential collision objects:

 
// Custom UIView to draw a red bounding box
class CollisionDetectionView: UIView {
    override init(frame: CGRect) {
        super.init(frame: frame)

        // Transparent view with a red border
        backgroundColor = .clear
        layer.borderWidth = 3
        layer.borderColor = UIColor.red.cgColor
    }

    required init?(coder: NSCoder) {
        fatalError("init(coder:) has not been implemented")
    }
}

Having a custom class also makes it easy to find detection views among UIView's subviews. If you need to remove all CollisionDetectionView objects from their superview, you can use the following code:

 
// remove `CollisionDetectionView` objects from the view
for subview in view.subviews {
    if subview.isKind(of: CollisionDetectionView.self) {
        subview.removeFromSuperview()
    }
}
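
An equivalent, more compact way to remove the same views (an optional alternative, not required by the SDK):

// remove all `CollisionDetectionView` instances in one pass
view.subviews
    .filter { $0 is CollisionDetectionView }
    .forEach { $0.removeFromSuperview() }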

Respond to SDK events

Implement VisionManagerDelegate

Note

All delegate methods are called on a background thread, so consider dispatching execution to DispatchQueue.main when working with UI elements, or to a queue of your choice for data synchronization.

You need to implement several methods from VisionManagerDelegate to get the necessary data.

Use visionManager(_:didUpdateVehicleState:) to get the latest vehicle state, and save it so you can use it later:

 
private var vehicleState: VehicleState?

func visionManager(_ visionManager: VisionManagerProtocol,
                   didUpdateVehicleState vehicleState: VehicleState) {
    // dispatch to the main queue in order to sync access to `VehicleState` instance
    DispatchQueue.main.async { [weak self] in
        // save the latest state of the vehicle
        self?.vehicleState = vehicleState
    }
}

Use visionManagerDidCompleteUpdate(_:) to know when the whole update iteration is completed. That means all the data that came from the delegate methods is in sync. When the update is completed, you can use all the data to update the UI:

 
func visionManagerDidCompleteUpdate(_ visionManager: VisionManagerProtocol) {
    // dispatch to the main queue in order to work with UIKit elements
    DispatchQueue.main.async { [weak self] in
        // update UI elements
        self?.updateOverspeedingDrawing()
        self?.updateCollisionDrawing()
    }
}

Implement VisionSafetyManagerDelegate

Note

All delegate methods are called on a background thread, so consider dispatching execution to DispatchQueue.main when working with UI elements, or to a queue of your choice for data synchronization.

You need to implement several methods from VisionSafetyManagerDelegate.

Use visionSafetyManager(_:didUpdateRoadRestrictions:) to know when the road restrictions are updated. When VisionSafetyManagerDelegate provides new information about road restrictions (speed limits, more specifically), store it in a local variable.

Then you'll use this information in the visionManagerDidCompleteUpdate(_ visionManager: VisionManagerProtocol) method to decide if a speed limit has been exceeded:

 
private var speedLimits: SpeedLimits?

func visionSafetyManager(_ visionSafetyManager: VisionSafetyManager,
                         didUpdateRoadRestrictions roadRestrictions: RoadRestrictions) {
    // dispatch to the main queue in order to sync access to `SpeedLimits` instance
    DispatchQueue.main.async { [weak self] in
        // save currently applied speed limits
        self?.speedLimits = roadRestrictions.speedLimits
    }
}

Use visionSafetyManager(_:didUpdateCollisions:) to get collision objects. In this method, filter the collision objects whose detection class is car and store them in a local array:

 
private var carCollisions = [CollisionObject]()

func visionSafetyManager(_ visionSafetyManager: VisionSafetyManager,
                         didUpdateCollisions collisions: [CollisionObject]) {
    // we will draw collisions with cars only, so we need to filter `CollisionObject`s
    let carCollisions = collisions.filter { $0.object.detectionClass == .car }

    // dispatch to the main queue in order to sync access to `[CollisionObject]` array
    DispatchQueue.main.async { [weak self] in
        // update current collisions state
        self?.carCollisions = carCollisions
    }
}

Handle overspeeding

As soon as all necessary data is in sync within the visionManagerDidCompleteUpdate(_ visionManager: VisionManagerProtocol) method, you can draw an overspeeding alert:

 
private func updateOverspeedingDrawing() {
    // when update is completed all the data has the most current state
    guard let vehicle = vehicleState, let limits = speedLimits else { return }

    // decide whether speed limit is exceeded by comparing it with the current speed
    let isOverSpeeding = vehicle.speed > limits.speedLimitRange.max
    alertOverspeedingView.isHidden = !isOverSpeeding
}

Handle collisions

As soon as all necessary data is in sync within the visionManagerDidCompleteUpdate(_ visionManager: VisionManagerProtocol) method, you can draw custom collision detection views.

Remove CollisionDetectionView objects from the view:

 
// remove `CollisionDetectionView` objects from the view
for subview in view.subviews {
    if subview.isKind(of: CollisionDetectionView.self) {
        subview.removeFromSuperview()
    }
}

Iterate through the array of CollisionObjects and, for each item, complete the following steps to show a collision detection view.

First, calculate absolute coordinates of the bounding box, given the relative coordinates of the rectangle around the detected object on the camera frame:

 
let relativeBBox = carCollision.lastDetection.boundingBox
let cameraFrameSize = carCollision.lastFrame.image.size.cgSize

// calculate absolute coordinates
let bboxInCameraFrameSpace = CGRect(x: relativeBBox.origin.x * cameraFrameSize.width,
                                    y: relativeBBox.origin.y * cameraFrameSize.height,
                                    width: relativeBBox.size.width * cameraFrameSize.width,
                                    height: relativeBBox.size.height * cameraFrameSize.height)

// at this stage, bbox has the coordinates in the camera frame space
// you should convert it to the view space, preserving the aspect ratio

Convert the coordinates of the bounding box from the camera frame space to the view space, preserving the aspect ratio:

 
// first, construct left-top and right-bottom coordinates of a bounding box
var leftTop = CGPoint(x: bboxInCameraFrameSpace.origin.x,
                      y: bboxInCameraFrameSpace.origin.y)
var rightBottom = CGPoint(x: bboxInCameraFrameSpace.maxX,
                          y: bboxInCameraFrameSpace.maxY)

// then convert the points from the camera frame space into the view frame space
leftTop = leftTop.convertForAspectRatioFill(from: cameraFrameSize,
                                            to: view.bounds.size)
rightBottom = rightBottom.convertForAspectRatioFill(from: cameraFrameSize,
                                                    to: view.bounds.size)

// finally, construct a bounding box in the view frame space
let bboxInViewSpace = CGRect(x: leftTop.x,
                             y: leftTop.y,
                             width: rightBottom.x - leftTop.x,
                             height: rightBottom.y - leftTop.y)
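
The conversion above relies on a convertForAspectRatioFill(from:to:) extension on CGPoint. If this helper is not available in your project, the following is a minimal sketch of such a conversion, assuming the camera frame is rendered with aspect-fill scaling (scaled to fill the view, centered, with the overflow cropped); treat it as an illustration rather than the SDK's exact implementation.

extension CGPoint {
    // Sketch: map a point from the camera frame coordinate space
    // into the view coordinate space under aspect-fill rendering
    func convertForAspectRatioFill(from frameSize: CGSize, to viewSize: CGSize) -> CGPoint {
        // aspect fill uses the larger of the two axis scale factors
        let scale = max(viewSize.width / frameSize.width, viewSize.height / frameSize.height)
        // offsets that center the scaled frame within the view (negative when cropped)
        let xOffset = (viewSize.width - frameSize.width * scale) / 2
        let yOffset = (viewSize.height - frameSize.height * scale) / 2
        return CGPoint(x: x * scale + xOffset, y: y * scale + yOffset)
    }
}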

Having the bounding box with coordinates in the view space, draw a collision detection alert:

 
// draw a collision detection alert
let view = CollisionDetectionView(frame: bboxInViewSpace)
self.view.addSubview(view)

Final result

Here is the complete code for the demo:

SafetyAlertsViewController
import MapboxVision
import MapboxVisionSafety
import UIKit

/**
 * "Safety alerts" example demonstrates how to utilize events from MapboxVisionSafetyManager
 * to alert a user about exceeding allowed speed limit and potential collisions with other cars.
 */

// Custom UIView to draw a red bounding box
class CollisionDetectionView: UIView {
    override init(frame: CGRect) {
        super.init(frame: frame)

        // Transparent view with a red border
        backgroundColor = .clear
        layer.borderWidth = 3
        layer.borderColor = UIColor.red.cgColor
    }

    required init?(coder: NSCoder) {
        fatalError("init(coder:) has not been implemented")
    }
}

class SafetyAlertsViewController: UIViewController {
    private var visionManager: VisionReplayManager!
    private var visionSafetyManager: VisionSafetyManager!

    private let visionViewController = VisionPresentationViewController()

    private var alertOverspeedingView: UIView!

    private var vehicleState: VehicleState?
    private var speedLimits: SpeedLimits?
    private var carCollisions = [CollisionObject]()

    override func viewDidLoad() {
        super.viewDidLoad()

        // Documents directory path with files uploaded via Finder
        let documentsPath =
            NSSearchPathForDirectoriesInDomains(.documentDirectory,
                                                .userDomainMask,
                                                true).first!
        let path = documentsPath.appending("/safety-alerts-drawing")

        // create VisionReplayManager with a path to recorded session
        visionManager = try? VisionReplayManager.create(recordPath: path)
        // register its delegate
        visionManager.delegate = self

        // create VisionSafetyManager and register as its delegate to receive safety related events
        visionSafetyManager = VisionSafetyManager.create(visionManager: visionManager)
        // register its delegate
        visionSafetyManager.delegate = self

        // configure Vision view to display sample buffers from video source
        visionViewController.set(visionManager: visionManager)
        // add Vision view as a child view
        addVisionView()

        // add view to draw overspeeding alert
        addOverspeedingAlertView()
    }

    override func viewWillAppear(_ animated: Bool) {
        super.viewWillAppear(animated)

        visionManager.start()
    }

    override func viewDidDisappear(_ animated: Bool) {
        super.viewDidDisappear(animated)

        visionManager.stop()
    }

    deinit {
        // free up VisionSafetyManager's resources
        visionSafetyManager.destroy()

        // free up VisionManager's resources
        visionManager.destroy()
    }

    private func addVisionView() {
        addChild(visionViewController)
        view.addSubview(visionViewController.view)
        visionViewController.didMove(toParent: self)
    }

    private func addOverspeedingAlertView() {
        alertOverspeedingView = UIImageView(image: UIImage(named: "alert"))
        alertOverspeedingView.isHidden = true
        alertOverspeedingView.translatesAutoresizingMaskIntoConstraints = false
        view.addSubview(alertOverspeedingView)
        NSLayoutConstraint.activate([
            alertOverspeedingView.topAnchor
                .constraint(equalToSystemSpacingBelow: view.safeAreaLayoutGuide.topAnchor, multiplier: 1),
            view.safeAreaLayoutGuide.trailingAnchor
                .constraint(equalToSystemSpacingAfter: alertOverspeedingView.trailingAnchor, multiplier: 1)
        ])
    }

    // MARK: - Handle VisionSafety events

    private func updateCollisionDrawing() {
        // remove `CollisionDetectionView` objects from the view
        for subview in view.subviews {
            if subview.isKind(of: CollisionDetectionView.self) {
                subview.removeFromSuperview()
            }
        }

        // iterate the collection of `CollisionObject`s and draw each of them
        for carCollision in carCollisions {
            let relativeBBox = carCollision.lastDetection.boundingBox
            let cameraFrameSize = carCollision.lastFrame.image.size.cgSize

            // calculate absolute coordinates
            let bboxInCameraFrameSpace = CGRect(x: relativeBBox.origin.x * cameraFrameSize.width,
                                                y: relativeBBox.origin.y * cameraFrameSize.height,
                                                width: relativeBBox.size.width * cameraFrameSize.width,
                                                height: relativeBBox.size.height * cameraFrameSize.height)

            // at this stage, bbox has the coordinates in the camera frame space
            // you should convert it to the view space, preserving the aspect ratio

            // first, construct left-top and right-bottom coordinates of a bounding box
            var leftTop = CGPoint(x: bboxInCameraFrameSpace.origin.x,
                                  y: bboxInCameraFrameSpace.origin.y)
            var rightBottom = CGPoint(x: bboxInCameraFrameSpace.maxX,
                                      y: bboxInCameraFrameSpace.maxY)

            // then convert the points from the camera frame space into the view frame space
            leftTop = leftTop.convertForAspectRatioFill(from: cameraFrameSize,
                                                        to: view.bounds.size)
            rightBottom = rightBottom.convertForAspectRatioFill(from: cameraFrameSize,
                                                                to: view.bounds.size)

            // finally, construct a bounding box in the view frame space
            let bboxInViewSpace = CGRect(x: leftTop.x,
                                         y: leftTop.y,
                                         width: rightBottom.x - leftTop.x,
                                         height: rightBottom.y - leftTop.y)

            // draw a collision detection alert
            let view = CollisionDetectionView(frame: bboxInViewSpace)
            self.view.addSubview(view)
        }
    }

    private func updateOverspeedingDrawing() {
        // when update is completed all the data has the most current state
        guard let vehicle = vehicleState, let limits = speedLimits else { return }

        // decide whether speed limit is exceeded by comparing it with the current speed
        let isOverSpeeding = vehicle.speed > limits.speedLimitRange.max
        alertOverspeedingView.isHidden = !isOverSpeeding
    }
}

extension SafetyAlertsViewController: VisionManagerDelegate {
    func visionManager(_ visionManager: VisionManagerProtocol,
                       didUpdateVehicleState vehicleState: VehicleState) {
        // dispatch to the main queue in order to sync access to `VehicleState` instance
        DispatchQueue.main.async { [weak self] in
            // save the latest state of the vehicle
            self?.vehicleState = vehicleState
        }
    }

    func visionManagerDidCompleteUpdate(_ visionManager: VisionManagerProtocol) {
        // dispatch to the main queue in order to work with UIKit elements
        DispatchQueue.main.async { [weak self] in
            // update UI elements
            self?.updateOverspeedingDrawing()
            self?.updateCollisionDrawing()
        }
    }
}

extension SafetyAlertsViewController: VisionSafetyManagerDelegate {
    func visionSafetyManager(_ visionSafetyManager: VisionSafetyManager,
                             didUpdateRoadRestrictions roadRestrictions: RoadRestrictions) {
        // dispatch to the main queue in order to sync access to `SpeedLimits` instance
        DispatchQueue.main.async { [weak self] in
            // save currently applied speed limits
            self?.speedLimits = roadRestrictions.speedLimits
        }
    }

    func visionSafetyManager(_ visionSafetyManager: VisionSafetyManager,
                             didUpdateCollisions collisions: [CollisionObject]) {
        // we will draw collisions with cars only, so we need to filter `CollisionObject`s
        let carCollisions = collisions.filter { $0.object.detectionClass == .car }

        // dispatch to the main queue in order to sync access to `[CollisionObject]` array
        DispatchQueue.main.async { [weak self] in
            // update current collisions state
            self?.carCollisions = carCollisions
        }
    }
}


Next steps

Use real-time data

When you're done testing, follow these steps to start working with real-time data.

  1. Change the type of the visionManager variable to VisionManager.
  2. Create and save a CameraVideoSource instance.

     // create a video source obtaining buffers from camera module
     cameraVideoSource = CameraVideoSource()

  3. Create VisionManager with the cameraVideoSource you created above.

     // create VisionManager with video source
     visionManager = VisionManager.create(videoSource: cameraVideoSource!)

  4. Start CameraVideoSource along with VisionManager in viewWillAppear(_:), as shown in the sketch after this list.
  5. Stop CameraVideoSource along with VisionManager in viewDidDisappear(_:).
  6. Update drawing assets and logic in the updateCollisionDrawing and updateOverspeedingDrawing methods if needed.
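
A minimal sketch of steps 4 and 5, assuming cameraVideoSource is stored as a property and that CameraVideoSource exposes start() and stop() as in the MapboxVision examples:

override func viewWillAppear(_ animated: Bool) {
    super.viewWillAppear(animated)

    // start the camera video source together with VisionManager
    cameraVideoSource.start()
    visionManager.start()
}

override func viewDidDisappear(_ animated: Bool) {
    super.viewDidDisappear(animated)

    // stop delivering events and release the camera
    visionManager.stop()
    cameraVideoSource.stop()
}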