Testing and development

Testing visual detection and classification

You can point your device at a prerecorded video playing on a separate monitor to test some features that rely only on visual detection and classification.

Device requirements

You must use a physical iOS device for all development and testing with Vision-related products. You can use the device's built-in camera or an external camera connected to the device. The iOS Simulator in Xcode is not supported.

Video requirements

You can use prerecorded videos from a dashboard-mounted camera to test some features like detections, segmentations, and sign classifications. You can use any dashboard camera video (for example, from YouTube), or record your own to be sure it captures the local traffic signs and road markings important to your application.

For the best results when recording your own videos, you should use the same in-car setup described in the Requirements and Device setup sections of this documentation. Some tips for recording a video:

  • The Vision SDK works best under good lighting conditions. Because lighting can affect the reliability of certain features, collect video under every lighting condition you'd like to test.
  • Plan your route ahead of time to include a diversity of driving situations that may be important to your application (roads with different lane configurations, various speed limits, encounters with pedestrians and cyclists, etc.).

Development environment setup

After you have selected or recorded a video, set up your physical development environment. You will need:

  • A physical device with a built-in camera (or with an external camera source, set up following the code example)
  • A monitor to play the video on
  • A method for positioning the device to point at the video

Position the device so the camera is pointed at the video. The video should fill the entire screen of your device.
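With the device in place, your app runs against the on-screen video exactly as it would on the road. The sketch below shows a minimal setup assuming the Vision SDK for iOS; class and method names such as `CameraVideoSource` and `VisionManager.create(videoSource:)` are taken from that SDK and may differ between versions, so check the API reference for your release:

```swift
import UIKit
import MapboxVision

final class VisionTestViewController: UIViewController {
    // The built-in camera is the video source; pointed at the monitor,
    // it captures the prerecorded video as if it were a live road scene.
    private let videoSource = CameraVideoSource()
    private var visionManager: VisionManager!

    override func viewDidLoad() {
        super.viewDidLoad()
        visionManager = VisionManager.create(videoSource: videoSource)
        visionManager.delegate = self
    }

    override func viewWillAppear(_ animated: Bool) {
        super.viewWillAppear(animated)
        // Start processing, then start the camera feed.
        visionManager.start()
        videoSource.start()
    }

    override func viewDidDisappear(_ animated: Bool) {
        super.viewDidDisappear(animated)
        videoSource.stop()
        visionManager.stop()
    }
}

// Delegate callbacks (detections, classifications, etc.) are delivered here.
extension VisionTestViewController: VisionManagerDelegate {}
```

Nothing in this flow depends on GPS, which is why it works with a prerecorded video on a monitor.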

What to expect

This approach can help you test some features of the SDK, but not all of them, since several features rely on GPS and other sensors.

Examples of features that can be tested using this setup:

  • Road sign classification, which determines the signType for road signs that appear in the video.
  • Lane detection.
  • Vehicle, pedestrian, and bicycle detection (2D screen location of objects only).
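For example, sign classification results arrive through the Vision delegate. The following is a sketch assuming the Vision SDK for iOS delegate API (`visionManager(_:didUpdateFrameSignClassifications:)` and the `FrameSignClassifications` type); `ViewController` stands in for whatever class owns your `VisionManager`, and exact names may vary by SDK version:

```swift
import MapboxVision

extension ViewController: VisionManagerDelegate {
    func visionManager(_ visionManager: VisionManagerProtocol,
                       didUpdateFrameSignClassifications frameSignClassifications: FrameSignClassifications) {
        // Log the type of each sign classified in the current video frame.
        // When testing against a prerecorded video, these should match the
        // signs visible on the monitor.
        for classification in frameSignClassifications.signs {
            print("Detected sign of type: \(classification.sign.type)")
        }
    }
}
```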

Examples of features that cannot be tested using this setup:

  • AR Navigation cannot be tested using this setup because it requires GPS data. A simple simulated location is not a workable alternative, because the location must match the location shown in the video as it plays.
  • Vision Safety cannot be tested using this setup because Vision Safety requires the current vehicle speed and the ability to estimate distance for objects in view.

Testing with sensor data

Several features of the Vision SDK, including Vision AR and Vision Safety, require GPS and other sensors. We do not currently offer tools for recording driving sessions with sensor data or for replaying that data in an iterative development setup. We are working on simulation tools and a set of test scenarios with video, GPS, and IMU data to let developers iterate more rapidly; these tools will ship in a future SDK release.

You will need to do all testing of these features in a vehicle with a device set up according to the instructions in the Requirements and Device setup sections of this documentation.