
Use the Mapbox Vision SDK for Android with an external USB camera

Prerequisite

Familiarity with the Vision SDK, Android Studio, and Kotlin or Java.

The Mapbox Vision SDK for Android is a library for interpreting road scenes in real time directly on Android devices. By default the Vision SDK uses the device's internal camera via the Camera2 API, but you can also use an external camera or any other video source (for example, a file or an internet stream). Using an external camera can improve your app's usability, since the user won't need to mount the phone on the car windshield.

To use an external source, you will need to implement a custom VideoSource. This tutorial shows how to implement a custom VideoSource for a connected USB camera.

Here's an example setup: the Vision SDK running on a Samsung S10+ with a Logitech C920 USB camera.

Getting started

Before starting this tutorial, go through the Install and configure steps in the Vision SDK for Android documentation. This will walk you through how to install the Vision SDK and configure your application.

USB camera on Android

Some Android devices support some USB cameras via the default Camera/Camera2 APIs, but device coverage is limited. This tutorial uses the UVCCamera library to get a video stream from the USB camera. The library itself uses modified versions of libusb and libuvc to handle USB cameras.

Camera and device positioning

The USB camera should be mounted under the windshield as stated in the requirements. Device positioning requirements depend on which Vision features you're using:

  • If you're using Vision features that do not require camera calibration or sensor data (like segmentation and detections), there are no strict requirements with regard to phone position.
  • If you're using the USB camera with other features (like Vision AR or Vision Safety) that do require camera calibration and sensor data (including gyro, accelerometer, and gravity), you will need to mount the phone so that sensor data coming from the device is reliable. This means that phone orientation should be fixed and aligned with the orientation of the camera as much as possible.

Add the UVCCamera library

Add the following maven repository to the repositories section of the top-level build.gradle:

build.gradle
repositories {
maven { url 'http://raw.github.com/saki4510t/libcommon/master/repository/' }
}
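Note that this repository is served over plain `http`. If you build with Gradle 7.0 or newer, insecure repository URLs are rejected by default, so you may need to mark the URL as allowed. This is a sketch assuming a Groovy DSL build script:

```groovy
repositories {
    maven {
        url 'http://raw.github.com/saki4510t/libcommon/master/repository/'
        // Required on Gradle 7+ because the repository is not served over https.
        allowInsecureProtocol = true
    }
}
```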

Next, add the dependencies. You can build the UVCCamera library yourself or use the prebuilt libuvccamera-release.aar from this tutorial. Assuming you've put the library into the libs directory of your project, add the dependencies to the project-level build.gradle:

build.gradle
dependencies {
implementation fileTree(dir: 'libs', include: ['*.aar'])
implementation("com.serenegiant:common:$uvccamera_common") {
exclude module: 'support-v4'
}
}

You will also need the android.permission.CAMERA permission to access the USB camera. Add it to AndroidManifest.xml:

AndroidManifest.xml
<uses-permission android:name="android.permission.CAMERA" />
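Optionally, Android can launch your activity automatically when a USB camera is plugged in if you declare an intent filter for USB device attachment. This is a sketch of that standard Android USB host pattern, not something the tutorial requires; the activity name matches the class used later in this tutorial, and the `device_filter` resource name is an assumption — it would contain your camera's vendor and product IDs:

```xml
<activity android:name=".UsbVideoSourceActivity">
    <intent-filter>
        <action android:name="android.hardware.usb.action.USB_DEVICE_ATTACHED" />
    </intent-filter>
    <!-- Points to an xml resource listing the USB devices this activity handles. -->
    <meta-data
        android:name="android.hardware.usb.action.USB_DEVICE_ATTACHED"
        android:resource="@xml/device_filter" />
</activity>
```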

Create a USB VideoSource

To connect Vision with a USB camera, implement a custom VideoSource and pass it to VisionManager.create. VisionManager will call the attach method of this VideoSource when it's ready to process frames, so you can save this listener and start feeding it frames.

In the attach callback, create a USBMonitor object and call its register method to trigger the USB camera launch process.

Note

You should call all USB camera related methods from a background thread to avoid delays on the main thread.

// VideoSource implementation that connects to a USB camera and feeds frames to VisionManager.
private VideoSource usbVideoSource = new VideoSource() {
    // VisionManager will attach videoSourceListener after VisionManager.create is called.
    // Here we open the USB camera connection and continue it via onDeviceConnectListener callbacks.
    // NOTE: this method is called from the same thread that VisionManager.create is called from.
    @Override
    public void attach(@NonNull VideoSourceListener videoSourceListener) {
        if (!backgroundHandlerThread.isAlive()) {
            backgroundHandlerThread.start();
        }
        backgroundHandler = new Handler(backgroundHandlerThread.getLooper());
        backgroundHandler.post(() -> {
            // Init and register USBMonitor.
            synchronized (UsbVideoSourceActivity.this) {
                usbVideoSourceListener = videoSourceListener;
                usbMonitor = new USBMonitor(UsbVideoSourceActivity.this, onDeviceConnectListener);
                usbMonitor.register();
            }
        });
    }

    // VisionManager will detach the listener after VisionManager.destroy is called.
    // Here we close the USB camera connection.
    // NOTE: this method is called from the same thread that VisionManager.destroy is called from.
    @Override
    public void detach() {
        backgroundHandler.post(() -> {
            synchronized (UsbVideoSourceActivity.this) {
                if (usbMonitor != null) {
                    usbMonitor.unregister();
                    usbMonitor.destroy();
                }
                if (uvcCamera != null) {
                    uvcCamera.stopPreview();
                    releaseCamera();
                }
                usbVideoSourceListener = null;
            }
        });

        backgroundHandlerThread.quitSafely();
    }
};

USB camera life cycle

To create USBMonitor in the previous step, you need an instance of OnDeviceConnectListener. This object will handle callbacks that send USB connection state updates.

  1. First, request permission to connect to the camera in the onAttach callback (this call triggers a system dialog describing the request).
  2. After the dialog is confirmed, an onConnect callback follows, where the connection itself is started.
  3. When the camera is disconnected, an onDisconnect callback allows you to release the camera and deallocate the resources it used.

private USBMonitor.OnDeviceConnectListener onDeviceConnectListener = new USBMonitor.OnDeviceConnectListener() {
    @Override
    public void onAttach(UsbDevice device) {
        synchronized (UsbVideoSourceActivity.this) {
            usbMonitor.requestPermission(device);
        }
    }

    @Override
    public void onConnect(
            UsbDevice device,
            USBMonitor.UsbControlBlock ctrlBlock,
            boolean createNew
    ) {
        backgroundHandler.post(() -> {
            synchronized (UsbVideoSourceActivity.this) {
                releaseCamera();
                initializeCamera(ctrlBlock);
            }
        });
    }

    @Override
    public void onDetach(UsbDevice device) {
    }

    @Override
    public void onCancel(UsbDevice device) {
    }

    @Override
    public void onDisconnect(UsbDevice device, USBMonitor.UsbControlBlock ctrlBlock) {
        backgroundHandler.post(() -> {
            synchronized (UsbVideoSourceActivity.this) {
                releaseCamera();
            }
        });
    }
};

Create and release UVCCamera

The function that creates the camera instance and configures it contains several steps:

  1. Create a UVCCamera object and set the preview size parameters via UVCCamera.setPreviewSize.
  2. Create an external OpenGL ES texture and set the preview to use this texture wrapped in a SurfaceTexture object. An external surface fits here because the raw video stream from the camera does not need to be displayed on the screen. Instead, frames go directly to the Vision SDK and, after processing, VisionView displays the segmentation results.
  3. Set a frame callback with UVCCamera.setFrameCallback that retrieves individual frames and feeds them to the Vision SDK via usbVideoSourceListener.

The code for camera initialization is:

private void initializeCamera(USBMonitor.UsbControlBlock ctrlBlock) {
    uvcCamera = new UVCCamera();
    uvcCamera.open(ctrlBlock);
    uvcCamera.setPreviewSize(
            CAMERA_FRAME_SIZE.getImageWidth(),
            CAMERA_FRAME_SIZE.getImageHeight(),
            UVCCamera.FRAME_FORMAT_YUYV
    );

    SurfaceTexture surfaceTexture = new SurfaceTexture(createExternalGlTexture());
    surfaceTexture.setDefaultBufferSize(
            CAMERA_FRAME_SIZE.getImageWidth(),
            CAMERA_FRAME_SIZE.getImageHeight()
    );
    // Start preview to the external GL texture.
    // NOTE: this is necessary for the callback passed to [UVCCamera.setFrameCallback]
    // to be triggered afterwards.
    uvcCamera.setPreviewTexture(surfaceTexture);
    uvcCamera.startPreview();

    // Set the callback that feeds frames from the USB camera to the Vision SDK.
    uvcCamera.setFrameCallback(
            (frame) -> usbVideoSourceListener.onNewFrame(
                    new VideoSourceListener.FrameHolder.ByteBufferHolder(frame),
                    ImageFormat.RGBA,
                    CAMERA_FRAME_SIZE
            ),
            UVCCamera.PIXEL_FORMAT_RGBX
    );
}

Lastly, create an external OpenGL ES texture.

Note

You should call this function from a thread with a GL context.

private int createExternalGlTexture() {
    int[] textures = new int[1];
    GLES20.glGenTextures(1, textures, 0);
    int texId = textures[0];
    GLES20.glBindTexture(GLES11Ext.GL_TEXTURE_EXTERNAL_OES, texId);
    GLES20.glTexParameterf(
            GLES11Ext.GL_TEXTURE_EXTERNAL_OES, GLES20.GL_TEXTURE_MIN_FILTER,
            GLES20.GL_LINEAR
    );
    GLES20.glTexParameterf(
            GLES11Ext.GL_TEXTURE_EXTERNAL_OES, GLES20.GL_TEXTURE_MAG_FILTER,
            GLES20.GL_LINEAR
    );
    GLES20.glTexParameteri(
            GLES11Ext.GL_TEXTURE_EXTERNAL_OES, GLES20.GL_TEXTURE_WRAP_S,
            GLES20.GL_CLAMP_TO_EDGE
    );
    GLES20.glTexParameteri(
            GLES11Ext.GL_TEXTURE_EXTERNAL_OES, GLES20.GL_TEXTURE_WRAP_T,
            GLES20.GL_CLAMP_TO_EDGE
    );
    return texId;
}

To release the UVCCamera and associated resources, call its close and destroy methods:

private void releaseCamera() {
    if (uvcCamera != null) {
        uvcCamera.close();
        uvcCamera.destroy();
        uvcCamera = null;
    }
}

Create VisionManager with the custom USB VideoSource

Now that you have implemented a new VideoSource, you can use it to initialize VisionManager:

@Override
protected void onStart() {
    super.onStart();
    startVisionManager();
}

private void startVisionManager() {
    if (allPermissionsGranted() && !visionManagerWasInit) {
        VisionManager.create(usbVideoSource);
        VisionManager.setModelPerformance(
                new ModelPerformance.On(ModelPerformanceMode.FIXED, ModelPerformanceRate.HIGH.INSTANCE)
        );
        visionView.setVisionManager(VisionManager.INSTANCE);
        VisionManager.start();

        visionManagerWasInit = true;
    }
}

Finished product

You've launched the Vision SDK with an external USB camera. The complete activity is below:

UsbVideoSourceActivity.java
package com.mapbox.vision.examples;

import android.graphics.SurfaceTexture;
import android.hardware.usb.UsbDevice;
import android.opengl.GLES11Ext;
import android.opengl.GLES20;
import android.os.Handler;
import android.os.HandlerThread;

import androidx.annotation.NonNull;

import com.mapbox.vision.VisionManager;
import com.mapbox.vision.mobile.core.models.frame.ImageFormat;
import com.mapbox.vision.mobile.core.models.frame.ImageSize;
import com.mapbox.vision.performance.ModelPerformance;
import com.mapbox.vision.performance.ModelPerformanceMode;
import com.mapbox.vision.performance.ModelPerformanceRate;
import com.mapbox.vision.video.videosource.VideoSource;
import com.mapbox.vision.video.videosource.VideoSourceListener;
import com.mapbox.vision.view.VisionView;
import com.serenegiant.usb.USBMonitor;
import com.serenegiant.usb.UVCCamera;

/**
 * Example showing how the Vision SDK can work with an external USB camera.
 * The [UVCCamera](https://github.com/saki4510t/UVCCamera) library is used to connect to the USB camera itself;
 * frames from the camera are then fed to the Vision SDK.
 */
public class UsbVideoSourceActivity extends BaseActivity {

    private static final ImageSize CAMERA_FRAME_SIZE = new ImageSize(1280, 720);

    private VisionView visionView;

    private HandlerThread backgroundHandlerThread = new HandlerThread("VideoDecode");
    private Handler backgroundHandler;

    private boolean visionManagerWasInit = false;

    /**
     * Vision SDK will attach this listener to get frames and camera parameters from the USB camera.
     */
    private VideoSourceListener usbVideoSourceListener;

    /**
     * VideoSource implementation that connects to a USB camera and feeds frames to VisionManager.
     */
    private VideoSource usbVideoSource = new VideoSource() {
        /**
         * VisionManager will attach [videoSourceListener] after [VisionManager.create] is called.
         * Here we open the USB camera connection and proceed via [onDeviceConnectListener] callbacks.
         *
         * NOTE: this method is called from the same thread that [VisionManager.create] is called from.
         */
        @Override
        public void attach(@NonNull VideoSourceListener videoSourceListener) {
            if (!backgroundHandlerThread.isAlive()) {
                backgroundHandlerThread.start();
            }
            backgroundHandler = new Handler(backgroundHandlerThread.getLooper());
            backgroundHandler.post(() -> {
                // Init and register USBMonitor.
                synchronized (UsbVideoSourceActivity.this) {
                    usbVideoSourceListener = videoSourceListener;
                    usbMonitor = new USBMonitor(UsbVideoSourceActivity.this, onDeviceConnectListener);
                    usbMonitor.register();
                }
            });
        }

        /**
         * VisionManager will detach the listener after [VisionManager.destroy] is called.
         * Here we close the USB camera connection.
         *
         * NOTE: this method is called from the same thread that [VisionManager.destroy] is called from.
         */
        @Override
        public void detach() {
            backgroundHandler.post(() -> {
                synchronized (UsbVideoSourceActivity.this) {
                    if (usbMonitor != null) {
                        usbMonitor.unregister();
                        usbMonitor.destroy();
                    }
                    if (uvcCamera != null) {
                        uvcCamera.stopPreview();
                        releaseCamera();
                    }
                    usbVideoSourceListener = null;
                }
            });

            backgroundHandlerThread.quitSafely();
        }
    };

    private USBMonitor usbMonitor;
    private UVCCamera uvcCamera;

    @Override
    protected void initViews() {
        setContentView(R.layout.activity_main);
        visionView = findViewById(R.id.vision_view);
    }

    @Override
    protected void onPermissionsGranted() {
        startVisionManager();
    }

    @Override
    protected void onStart() {
        super.onStart();
        startVisionManager();
    }

    @Override
    protected void onStop() {
        super.onStop();
        stopVisionManager();
    }

    @Override
    protected void onResume() {
        super.onResume();
        visionView.onResume();
    }

    @Override
    protected void onPause() {
        super.onPause();
        visionView.onPause();
    }

    private void startVisionManager() {
        if (allPermissionsGranted() && !visionManagerWasInit) {
            VisionManager.create(usbVideoSource);
            VisionManager.setModelPerformance(
                    new ModelPerformance.On(ModelPerformanceMode.FIXED, ModelPerformanceRate.HIGH.INSTANCE)
            );
            visionView.setVisionManager(VisionManager.INSTANCE);
            VisionManager.start();

            visionManagerWasInit = true;
        }
    }

    private void stopVisionManager() {
        if (visionManagerWasInit) {
            VisionManager.stop();
            VisionManager.destroy();

            visionManagerWasInit = false;
        }
    }

    private USBMonitor.OnDeviceConnectListener onDeviceConnectListener = new USBMonitor.OnDeviceConnectListener() {
        @Override
        public void onAttach(UsbDevice device) {
            synchronized (UsbVideoSourceActivity.this) {
                usbMonitor.requestPermission(device);
            }
        }

        @Override
        public void onConnect(
                UsbDevice device,
                USBMonitor.UsbControlBlock ctrlBlock,
                boolean createNew
        ) {
            backgroundHandler.post(() -> {
                synchronized (UsbVideoSourceActivity.this) {
                    releaseCamera();
                    initializeCamera(ctrlBlock);
                }
            });
        }

        @Override
        public void onDetach(UsbDevice device) {
        }

        @Override
        public void onCancel(UsbDevice device) {
        }

        @Override
        public void onDisconnect(UsbDevice device, USBMonitor.UsbControlBlock ctrlBlock) {
            backgroundHandler.post(() -> {
                synchronized (UsbVideoSourceActivity.this) {
                    releaseCamera();
                }
            });
        }
    };

    private void releaseCamera() {
        if (uvcCamera != null) {
            uvcCamera.close();
            uvcCamera.destroy();
            uvcCamera = null;
        }
    }

    private void initializeCamera(USBMonitor.UsbControlBlock ctrlBlock) {
        uvcCamera = new UVCCamera();
        uvcCamera.open(ctrlBlock);
        uvcCamera.setPreviewSize(
                CAMERA_FRAME_SIZE.getImageWidth(),
                CAMERA_FRAME_SIZE.getImageHeight(),
                UVCCamera.FRAME_FORMAT_YUYV
        );

        SurfaceTexture surfaceTexture = new SurfaceTexture(createExternalGlTexture());
        surfaceTexture.setDefaultBufferSize(
                CAMERA_FRAME_SIZE.getImageWidth(),
                CAMERA_FRAME_SIZE.getImageHeight()
        );
        // Start preview to the external GL texture.
        // NOTE: this is necessary for the callback passed to [UVCCamera.setFrameCallback]
        // to be triggered afterwards.
        uvcCamera.setPreviewTexture(surfaceTexture);
        uvcCamera.startPreview();

        // Set the callback that feeds frames from the USB camera to the Vision SDK.
        uvcCamera.setFrameCallback(
                (frame) -> usbVideoSourceListener.onNewFrame(
                        new VideoSourceListener.FrameHolder.ByteBufferHolder(frame),
                        ImageFormat.RGBA,
                        CAMERA_FRAME_SIZE
                ),
                UVCCamera.PIXEL_FORMAT_RGBX
        );
    }

    /**
     * Create an external OpenGL texture for [uvcCamera].
     */
    private int createExternalGlTexture() {
        int[] textures = new int[1];
        GLES20.glGenTextures(1, textures, 0);
        int texId = textures[0];
        GLES20.glBindTexture(GLES11Ext.GL_TEXTURE_EXTERNAL_OES, texId);
        GLES20.glTexParameterf(
                GLES11Ext.GL_TEXTURE_EXTERNAL_OES, GLES20.GL_TEXTURE_MIN_FILTER,
                GLES20.GL_LINEAR
        );
        GLES20.glTexParameterf(
                GLES11Ext.GL_TEXTURE_EXTERNAL_OES, GLES20.GL_TEXTURE_MAG_FILTER,
                GLES20.GL_LINEAR
        );
        GLES20.glTexParameteri(
                GLES11Ext.GL_TEXTURE_EXTERNAL_OES, GLES20.GL_TEXTURE_WRAP_S,
                GLES20.GL_CLAMP_TO_EDGE
        );
        GLES20.glTexParameteri(
                GLES11Ext.GL_TEXTURE_EXTERNAL_OES, GLES20.GL_TEXTURE_WRAP_T,
                GLES20.GL_CLAMP_TO_EDGE
        );
        return texId;
    }
}

Next steps

Use Vision AR for Android with the implemented USB VideoSource. For a complete example using Vision AR, see the Basic AR navigation example.
