Classes

The following classes are available globally.

  • Presents segmentation and detection events emitted from VisionManager, as well as raw video frames.

    The displayed content depends on the current value of visualizationMode.

    In Segmentation and Detections modes, VisionPresentationViewController displays the corresponding events while VisionManager or VisionReplayManager is started.

    In Clear mode, raw video frames are displayed.

    Declaration

    Swift

    class VisionPresentationViewController : UIViewController
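
    As a rough usage sketch (the .segmentation case name, the MapboxVision module name, and the containment pattern are assumptions based on the mode names above, not a confirmed API):

    import UIKit
    import MapboxVision // assumed module name

    final class ContainerViewController: UIViewController {
        private let presentation = VisionPresentationViewController()

        override func viewDidLoad() {
            super.viewDidLoad()
            // Show segmentation events; .clear would show raw video frames.
            presentation.visualizationMode = .segmentation
            // Standard UIKit view controller containment.
            addChild(presentation)
            view.addSubview(presentation.view)
            presentation.view.frame = view.bounds
            presentation.didMove(toParent: self)
        }
    }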
  • Cubic bezier spline in the world space.

    See more

    Declaration

    Swift

    class BezierCubic3D : NSObject
  • Object representing the state of the camera. The precision of the camera properties depends on its calibration, so it is recommended to use them only after the Camera.isCalibrated value becomes true.

    Declaration

    Swift

    class Camera : NSObject
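
    A minimal sketch of gating on calibration, per the note above (the handler context is an assumption; only isCalibrated comes from the docs):

    import MapboxVision // assumed module name

    // `camera` would typically arrive via a VisionManager update callback.
    func handle(_ camera: Camera) {
        // Camera properties are low-precision until calibration completes.
        guard camera.isCalibrated else { return }
        // Safe to rely on the calibrated camera properties here.
    }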
  • Intrinsic camera parameters representing the source of video frames.

    Declaration

    Swift

    class CameraParameters : NSObject
  • Single result of object detection.

    Declaration

    Swift

    class Detection : NSObject
  • Single frame with meta information.

    Declaration

    Swift

    class Frame : NSObject
  • All detections for a single frame.

    Declaration

    Swift

    class FrameDetections : NSObject
  • Segmentation result for a single frame.

    Declaration

    Swift

    class FrameSegmentation : NSObject
  • All sign classifications for a single frame.

    Declaration

    Swift

    class FrameSignClassifications : NSObject
  • Coordinate in Mercator.

    This coordinate system is used to express an object’s geographic location as it would appear on a map. Examples include the GPS position of the ego-vehicle and the positions of landmarks or detected objects, which can be used for localization.

    Currently we use a spherical Earth model for geodesic calculations. Each point is specified by its longitude and latitude.

    Longitude ranges from -180 to +180 degrees, where 0 is the Greenwich meridian; the positive direction (+) is to the East and the negative direction (-) is to the West.

    Latitude ranges from -90 to +90 degrees, where 0 is the Equator; the positive direction (+) is to the North and the negative direction (-) is to the South.

    Declaration

    Swift

    class GeoCoordinate : NSObject
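
    A plain-Swift sketch grounded only in the documented ranges (the validation helper is hypothetical, not part of the SDK):

    // Check values against the documented ranges before building a GeoCoordinate.
    func isValidGeoCoordinate(lon: Double, lat: Double) -> Bool {
        (-180.0...180.0).contains(lon) && (-90.0...90.0).contains(lat)
    }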
  • Object representing geographical coordinates.

    Declaration

    Swift

    class GeoPosition : NSObject
  • Image object with attributes.

    Declaration

    Swift

    class Image : NSObject
  • Size of an image.

    Declaration

    Swift

    class ImageSize : NSObject
  • Single lane object.

    Declaration

    Swift

    class Lane : NSObject
  • Lane edge description.

    Declaration

    Swift

    class LaneEdge : NSObject
  • Represents pixel coordinates on an image. The top left corner of the image has the coordinate (x: 0, y: 0).

    This coordinate system is used to represent the position of an object relative to the frame, expressed as a pair of x and y values.

    Declaration

    Swift

    class Point2D : NSObject
  • Object aggregating information about road markup and road geometry.

    Declaration

    Swift

    class RoadDescription : NSObject
  • Object representing available sign information.

    Declaration

    Swift

    class Sign : NSObject
  • Single result of sign instance classification.

    Declaration

    Swift

    class SignClassification : NSObject
  • Location of the vehicle with meta information.

    Declaration

    Swift

    class VehicleState : NSObject
  • Class that encapsulates image buffer and its format.

    Declaration

    Swift

    class VideoSample : NSObject
  • Point in the ISO coordinate system (the unit is meters).

    This coordinate system is used to represent the position of an object relative to the device camera’s position in physical space. Its origin is the camera’s position projected onto the road plane.

    Declaration

    Swift

    class WorldCoordinate : NSObject
  • Object aggregating information about objects and their positions in the world around the vehicle. The description includes both static and dynamic objects.

    Declaration

    Swift

    class WorldDescription : NSObject
  • Description of an object in the world.

    Declaration

    Swift

    class WorldObject : NSObject
  • Object encapsulating work with the camera device.

    Declaration

    Swift

    open class CameraVideoSource : ObservableVideoSource
  • Helper class that handles observers: storing, releasing, and notifying them. Observers are held weakly by the instance of the class. You may derive your video source from this class to avoid managing observers yourself.

    Warning

    The implementation uses a non-recursive lock, so you must not call the add(observer:) or remove(observer:) methods from within the notify closure.

    Declaration

    Swift

    open class ObservableVideoSource : NSObject, VideoSource
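
    A sketch of a custom video source built on this class (the videoSource(_:didOutput:) observer callback name is an assumption about VideoSourceObserver; the notify closure is named in the warning above):

    import MapboxVision // assumed module name

    final class FileVideoSource: ObservableVideoSource {
        func emit(_ sample: VideoSample) {
            // Fan the sample out to all registered observers. Do not call
            // add(observer:) or remove(observer:) inside this closure:
            // the lock is non-recursive (see the warning above).
            notify { observer in
                observer.videoSource(self, didOutput: sample)
            }
        }
    }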
  • Declaration

    Swift

    public class BaseVisionManager : VisionManagerProtocol
    extension BaseVisionManager: VisionDelegate
  • The main object for registering for events from the SDK and starting and stopping their delivery. It also provides some useful functions for performance configuration and data conversion.

    Lifecycle of VisionManager (see the sketch after the declaration below):

    1. create
    2. start
    3. startRecording (optional)
    4. stopRecording (optional)
    5. stop; the lifecycle may then proceed to destroy or back to start
    6. destroy

    Declaration

    Swift

    public final class VisionManager : BaseVisionManager
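
    A lifecycle sketch following the numbered steps above (the create(videoSource:) factory, the CameraVideoSource initializer, and the startRecording(to:) signature are assumptions; only the step order comes from the docs):

    import MapboxVision // assumed module name

    let videoSource = CameraVideoSource()
    let visionManager = VisionManager.create(videoSource: videoSource) // 1. create
    visionManager.start()                                  // 2. start
    visionManager.startRecording(to: "recordings/session") // 3. startRecording (optional)
    visionManager.stopRecording()                          // 4. stopRecording (optional)
    visionManager.stop()                                   // 5. stop
    visionManager.destroy()                                // 6. destroy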
  • VisionReplayManager is a counterpart of VisionManager that uses recorded video and telemetry instead of real-time data. Use it to debug and test Vision-based functionality in a development environment before testing in a vehicle. After creating it with a specific recorded session, use it in the same workflow as VisionManager.

    Lifecycle of VisionReplayManager (see the sketch after the declaration below):

    1. create
    2. start
    3. stop; the lifecycle may then proceed to destroy or back to start
    4. destroy

    Important

    This class is intended for debugging purposes only. Do NOT use session replay in a production application.

    Declaration

    Swift

    public final class VisionReplayManager : BaseVisionManager
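
    A debug-only lifecycle sketch per the steps above (the create(recordPath:) factory name and its argument are assumptions):

    import MapboxVision // assumed module name

    let replayManager = VisionReplayManager.create(recordPath: "recordings/session") // 1. create
    replayManager.start()   // 2. start
    replayManager.stop()    // 3. stop
    replayManager.destroy() // 4. destroy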