Classes
The following classes are available globally.
-
Presents segmentation and detection events emitted from VisionManager as well as raw video frames. Displayed content depends on the current value of visualizationMode. In Segmentation and Detections modes VisionPresentationViewController displays the corresponding events when VisionManager or VisionReplayManager is started. In Clear mode raw video frames are displayed:
- in case of VisionManager: from the VideoSource provided during VisionManager creation, on new events,
- in case of VisionReplayManager: from recorded video when VisionReplayManager is started.
Declaration
Swift
class VisionPresentationViewController : UIViewController
-
Cubic bezier spline in the world space.
Declaration
Swift
class BezierCubic3D : NSObject
-
Object representing the state of the camera. The precision of the camera properties depends on its calibration, so it’s recommended to use them only once the Camera.isCalibrated value becomes true.
Declaration
Swift
class Camera : NSObject
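The calibration caveat above can be sketched as a simple guard. The type and properties below are stand-ins, not the SDK's Camera API: only the isCalibrated flag and the "wait for calibration" pattern come from the description; the field-of-view property is hypothetical.

```swift
// Minimal stand-in for the SDK's Camera type. Only the documented
// isCalibrated flag is taken from the description above; fieldOfView
// is a hypothetical property used for illustration.
struct CameraState {
    var isCalibrated: Bool
    var fieldOfView: Double // hypothetical, degrees
}

// Use camera properties only once calibration has completed,
// as the description recommends; return nil otherwise.
func horizontalFOV(of camera: CameraState) -> Double? {
    guard camera.isCalibrated else { return nil }
    return camera.fieldOfView
}
```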
-
Intrinsic camera parameters representing the source of video frames.
Declaration
Swift
class CameraParameters : NSObject
-
Single result of object detection.
Declaration
Swift
class Detection : NSObject
-
Single frame with meta information.
Declaration
Swift
class Frame : NSObject
-
All detections for a single frame.
Declaration
Swift
class FrameDetections : NSObject
-
Segmentation result for a single frame.
Declaration
Swift
class FrameSegmentation : NSObject
-
All sign classifications for a single frame.
Declaration
Swift
class FrameSignClassifications : NSObject
-
Coordinate in Mercator.
This coordinate system is used to locate an object’s geographic location as it would appear on a map. Examples include the GPS position of the ego-vehicle, the positions of landmarks used for localization, or the positions of detected objects.
Currently a Spherical Earth Model is used for geodesic calculations. Each point is specified using longitude, latitude.
Longitude ranges from -180 to 180 degrees, where 0 is the Greenwich meridian, the positive direction (+) is to the East, and the negative direction (-) is to the West.
Latitude ranges from -90 to +90 degrees, where 0 is the Equator, the positive direction (+) is to the North, and the negative direction (-) is to the South.
Declaration
Swift
class GeoCoordinate : NSObject
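The documented ranges can be expressed as a simple validity check. This is a sketch in plain Swift; GeoCoordinate's actual initializer is not shown on this page, so bare Double parameters stand in for it.

```swift
// Validity check for the ranges documented above.
// Longitude: -180...180 degrees, 0 = Greenwich meridian, + = East, - = West.
// Latitude:   -90...90  degrees, 0 = Equator,            + = North, - = South.
func isValidGeoCoordinate(longitude: Double, latitude: Double) -> Bool {
    (-180.0...180.0).contains(longitude) && (-90.0...90.0).contains(latitude)
}
```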
-
Object representing geographical coordinates.
Declaration
Swift
class GeoPosition : NSObject
-
Image object with attributes.
Declaration
Swift
class Image : NSObject
-
Size of an image.
Declaration
Swift
class ImageSize : NSObject
-
Single lane object.
Declaration
Swift
class Lane : NSObject
-
Lane edge description.
Declaration
Swift
class LaneEdge : NSObject
-
Represents pixel coordinates on an image. The top left corner of an image has the coordinate (x: 0, y: 0).
This coordinate system is used to represent the position of an object relative to the frame. The origin is the top left corner of the frame. The position of an object is expressed as x, y.
Declaration
Swift
class Point2D : NSObject
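A quick illustration of the top-left-origin convention described above: with rows running top to bottom and columns left to right, a pixel's row-major buffer index follows directly. The struct below is a stand-in for Point2D, whose actual initializer is not shown on this page.

```swift
// Stand-in for Point2D: origin (x: 0, y: 0) is the top left corner.
struct PixelPoint { var x: Int; var y: Int }

// Row-major index of a pixel inside a frame buffer, assuming one value
// per pixel: rows run top to bottom, columns run left to right.
func bufferIndex(of point: PixelPoint, imageWidth: Int) -> Int {
    point.y * imageWidth + point.x
}
```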
-
Object aggregating information about road markup and road geometry.
Declaration
Swift
class RoadDescription : NSObject
-
Object representing available sign information.
Declaration
Swift
class Sign : NSObject
-
Single result of sign instance classification.
Declaration
Swift
class SignClassification : NSObject
-
Location of our vehicle with meta information.
Declaration
Swift
class VehicleState : NSObject
-
Class that encapsulates image buffer and its format.
Declaration
Swift
class VideoSample : NSObject
-
Point in the ISO coordinate system (unit is a meter).
This coordinate system is used to represent the position of an object relative to the device camera’s position in physical space. The origin of this coordinate system is a point projected from a camera position to a road plane.
Declaration
Swift
class WorldCoordinate : NSObject
-
Object aggregating information about objects and their position in the world around a vehicle. Description includes static and dynamic objects.
Declaration
Swift
class WorldDescription : NSObject
-
Description of the object in the world.
Declaration
Swift
class WorldObject : NSObject
-
Object encapsulating work with camera device.
Declaration
Swift
open class CameraVideoSource : ObservableVideoSource
-
Helper class handling observers: storing, releasing, notifying. Observers are held weakly by the instance of the class. You may inherit your video source from this class to avoid handling observers yourself.
Warning
The implementation uses a non-recursive lock, thus you must not call the add(observer:) or remove(observer:) methods from the notify closure.
Declaration
Swift
open class ObservableVideoSource : NSObject, VideoSource
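The behavior described above (weakly held observers, a non-recursive lock, and the resulting ban on add/remove from inside notify) can be sketched in plain Swift. This is an illustrative stand-in, not the SDK's implementation; all names below are hypothetical.

```swift
import Foundation

// Sketch of the observer handling ObservableVideoSource is described to
// provide: observers are stored weakly, and a non-recursive lock guards
// the list -- so add/remove must not be called from the notify closure.
final class ObserverStore<Observer: AnyObject> {
    private struct WeakBox { weak var value: Observer? }
    private var boxes: [WeakBox] = []
    private let lock = NSLock() // non-recursive, as in the warning above

    func add(_ observer: Observer) {
        lock.lock(); defer { lock.unlock() }
        boxes.append(WeakBox(value: observer))
    }

    func remove(_ observer: Observer) {
        lock.lock(); defer { lock.unlock() }
        boxes.removeAll { $0.value === observer || $0.value == nil }
    }

    // Calling add(_:) or remove(_:) from inside `body` would deadlock,
    // because the same non-recursive lock is already held here.
    func notify(_ body: (Observer) -> Void) {
        lock.lock(); defer { lock.unlock() }
        boxes.removeAll { $0.value == nil } // drop released observers
        boxes.forEach { $0.value.map(body) }
    }
}
```

Inheriting from ObservableVideoSource, as the description suggests, spares you from writing this bookkeeping yourself.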
-
Declaration
Swift
public class BaseVisionManager : VisionManagerProtocol
extension BaseVisionManager: VisionDelegate
-
The main object for registering for events from the SDK, starting and stopping their delivery. It also provides some useful functions for performance configuration and data conversion.
Lifecycle of VisionManager:
1. create
2. start
3. startRecording (optional)
4. stopRecording (optional)
5. stop, then the lifecycle may proceed with destroy or start
6. destroy
Declaration
Swift
public final class VisionManager : BaseVisionManager
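The lifecycle above can be modeled as a small state machine over the allowed transitions. This mirrors only the documented order of calls; it is not SDK code, and the enum and function names are hypothetical.

```swift
// Sketch of the documented VisionManager lifecycle, not SDK API:
// create -> start -> [startRecording -> stopRecording] -> stop,
// then destroy or start again.
enum LifecycleState { case created, started, recording, stopped, destroyed }

func canTransition(from: LifecycleState, to: LifecycleState) -> Bool {
    switch (from, to) {
    case (.created, .started),   // create, then start
         (.started, .recording), // startRecording (optional)
         (.recording, .started), // stopRecording (optional)
         (.started, .stopped),   // stop
         (.stopped, .started),   // after stop, may start again...
         (.stopped, .destroyed): // ...or destroy
        return true
    default:
        return false
    }
}
```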
-
VisionReplayManager is a counterpart of VisionManager that uses recorded video and telemetry instead of realtime data. Use it to debug and test functions that use Vision in a development environment before testing in a vehicle. Use it in the same workflow as you use VisionManager, after creating it with a specific recorded session.
Lifecycle of VisionReplayManager:
1. create
2. start
3. stop, then the lifecycle may proceed with destroy or start
4. destroy
Important
This class is intended for debugging purposes only. Do NOT use session replay in a production application.
Declaration
Swift
public final class VisionReplayManager : BaseVisionManager