
Events

Events are a way to be informed of the actions the assistant takes in response to user prompts. When the user prompts the assistant to do something, the AI triggers an action and responds with an event. For example, you can ask the assistant to play music or stop navigation, as well as more advanced cases such as changing the temperature in the car. Mapbox continues to build out more events and refine the existing ones.

There are several MapGPT events that can be observed with the SDK using the `observeMapGptEvents` API:

Body

Defines the event specifics, which vary based on the event type. All events mentioned below are extensions of `Body` and inherit its properties.

| Property | Type | Description |
| --- | --- | --- |
| `id` | long | A unique, increasing ID for the event. The client can use this to reference the last received event ID when reconnecting to the service. |
| `timestamp` | long | Creation time of the event as a Unix timestamp. |
| `chunkId` | string | Describes which conversational chunk the event is tied to, in the format `{date}@{id}`, where the `chunkPrefix` should be considered the ID of the AI response and `chunkOffset` is the logical order in the response stream. |
| `chunkPrefix` | string | Represents the ID of the AI response. |
| `chunkOffset` | string | Represents the logical order in the response stream. |
| `isSupplement` | boolean | Identifies events that were not generated in direct response to a user query but retroactively or spontaneously by the assistant. |
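As a sketch of how the chunk fields relate, the `chunkId` can be split into its prefix and offset parts. The `ChunkRef` class and `parseChunkId` helper below are hypothetical illustrations, not SDK types:

```kotlin
// Hypothetical helper, not part of the SDK: splits a chunkId of the
// form "{date}@{id}" into its prefix (the AI response ID) and its
// offset (the logical order in the response stream).
data class ChunkRef(val prefix: String, val offset: String)

fun parseChunkId(chunkId: String): ChunkRef {
    val at = chunkId.lastIndexOf('@')
    require(at > 0) { "chunkId must be in {date}@{id} format" }
    return ChunkRef(
        prefix = chunkId.substring(0, at),
        offset = chunkId.substring(at + 1),
    )
}
```

This mirrors the table above: `chunkPrefix` and `chunkOffset` are the two halves of `chunkId`.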

Conversation

Defines a conversation chunk. A single chunk is typically a sentence in a longer AI response.

| Property | Type | Description |
| --- | --- | --- |
| `data` | object | Contents related to conversation events. |
| `data.conversationId` | string | A unique string for this conversation. |
| `data.content` | string | The sentence in a response. |
| `data.initial` | boolean | Whether this is the first chunk in the response stream. |
| `data.confirmation` | boolean | Whether this chunk is a confirmation of a user action. |
| `data.final` | boolean | Whether this is the last chunk in the response stream. |
| `data.maxTokens` | boolean | Whether the conversation has hit the profile's response token limit. |
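Because a single chunk is typically one sentence, a client that wants the full response text can accumulate chunks using the `initial` and `final` flags. The `ConversationChunk` class and `ResponseAssembler` below are a minimal sketch using plain data classes, not SDK types:

```kotlin
// Hypothetical simplified chunk mirroring the Conversation event's
// data fields (not the SDK type).
data class ConversationChunk(
    val content: String,
    val initial: Boolean,
    val final: Boolean,
)

// Accumulates streamed chunks into the full AI response text.
class ResponseAssembler {
    private val builder = StringBuilder()

    /** Returns the complete response once the final chunk arrives, else null. */
    fun onChunk(chunk: ConversationChunk): String? {
        if (chunk.initial) builder.clear()       // new response stream begins
        if (builder.isNotEmpty()) builder.append(' ')
        builder.append(chunk.content)
        return if (chunk.final) builder.toString() else null
    }
}
```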

Entity

Defines entities, such as POIs or song names, extracted from a conversation or otherwise generated using this format.

| Property | Type | Description |
| --- | --- | --- |
| `data` | array | List of contents related to entities. |
| `data.place` | object | Defines a POI entity. |
| `data.place.name` | string | The identified name of the entity, such as a restaurant name. |
| `data.place.geocoded` | object | Reverse-geocoded place information. |
| `data.card` | object | Defines a card entity. |
| `data.card.components` | array | List of `CardComponent`. |
| `data.card.anchoredComponent` | object | A `CardComponent` of type `CardComponent.Image` describing the light and dark URIs for the data provider, if applicable. |
| `data.card.payload` | object | Additional payload associated with the card, returned as a callback upon interaction. |
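Since each entry in the `data` array is either a place or a card, a client typically branches on the entity kind. The sealed hierarchy below is a hypothetical simplification for illustration, not the SDK's model:

```kotlin
// Hypothetical simplified entity items (not SDK types): each entry in
// the Entity event's data array is either a POI place or a card.
sealed interface EntityItem
data class PlaceEntity(val name: String) : EntityItem
data class CardEntity(val payload: Map<String, String>) : EntityItem

// Produces a human-readable summary for each entity in the list.
fun describe(items: List<EntityItem>): List<String> = items.map { item ->
    when (item) {
        is PlaceEntity -> "POI: ${item.name}"
        is CardEntity -> "Card with payload keys: ${item.payload.keys.joinToString()}"
    }
}
```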

PlayMusic

Defines an event to play music.

| Property | Type | Description |
| --- | --- | --- |
| `data` | object | Contents related to the play music event. |
| `data.provider` | string | The music provider. |
| `data.uri` | string | Provider-specific URI of the track to be played. |
| `data.song` | string | Name of the song. |
| `data.artist` | string | Name of the artist. |

PauseMusic

Defines an event to pause music that is now playing.

ResumeMusic

Defines an event to resume music that was paused before.
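The three music events above naturally drive a small playback state machine. The reducer below is a hypothetical sketch using simplified event types, not the SDK's event classes:

```kotlin
// Hypothetical playback events mirroring PlayMusic / PauseMusic /
// ResumeMusic (not SDK types).
sealed interface MusicEvent
data class PlayMusic(val song: String, val artist: String) : MusicEvent
object PauseMusic : MusicEvent
object ResumeMusic : MusicEvent

data class PlayerState(val nowPlaying: String? = null, val paused: Boolean = false)

// Applies an incoming music event to the current playback state.
fun reduce(state: PlayerState, event: MusicEvent): PlayerState = when (event) {
    is PlayMusic -> PlayerState(nowPlaying = "${event.song} by ${event.artist}", paused = false)
    PauseMusic -> state.copy(paused = true)
    ResumeMusic -> state.copy(paused = false)
}
```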

StartNavigation

Defines an event to start navigation to a destination. The destination can be a POI name or a favorite such as "home" or "work".

| Property | Type | Description |
| --- | --- | --- |
| `data` | object | Contents related to the start navigation event. |
| `data.location` | string | Name of the place. |
| `data.geocoded` | string | Reverse-geocoded place information. |
| `data.favorite` | string | Key identifying the location if it is a favorite. |

To issue commands such as "Navigate me to home" or "Navigate me to work", you need to include the corresponding capabilities along with the query.


```kotlin
// Create a navigate-to-home capability middleware
class NavigateToHomeMiddleware :
    MapGptCapabilitiesMiddleware<NavigateToHomeMiddleware.NavigateToHomeContext>,
    CoroutineMiddleware<NavigateToHomeMiddleware.NavigateToHomeContext>() {

    inner class NavigateToHomeContext(
        mapGptCoreContext: MapGptCoreContext,
    ) : MapGptCoreContext by mapGptCoreContext

    override fun <Parent : MiddlewareContext> provideContext(parent: Parent): NavigateToHomeContext {
        return NavigateToHomeContext(mapGptCoreContext = parent as MapGptCoreContext)
    }

    private val _capabilities = MutableStateFlow(setOf(NavigateToFavoriteCapability("home")))
    override val capabilities: Flow<Set<MapGptCapability>> = _capabilities.asStateFlow()

    override fun onAttached(middlewareContext: NavigateToHomeContext) {
        super.onAttached(middlewareContext)
        SharedLog.d(TAG) { "onAttached" }
    }

    override fun onDetached(middlewareContext: NavigateToHomeContext) {
        super.onDetached(middlewareContext)
        SharedLog.d(TAG) { "onDetached" }
    }

    private companion object {
        private const val TAG = "NavigateToHomeMiddleware"
    }
}
```

```kotlin
// Create a navigate-to-work capability middleware
class NavigateToWorkMiddleware :
    MapGptCapabilitiesMiddleware<NavigateToWorkMiddleware.NavigateToWorkContext>,
    CoroutineMiddleware<NavigateToWorkMiddleware.NavigateToWorkContext>() {

    inner class NavigateToWorkContext(
        mapGptCoreContext: MapGptCoreContext,
    ) : MapGptCoreContext by mapGptCoreContext

    override fun <Parent : MiddlewareContext> provideContext(parent: Parent): NavigateToWorkContext {
        return NavigateToWorkContext(mapGptCoreContext = parent as MapGptCoreContext)
    }

    private val _capabilities = MutableStateFlow(setOf(NavigateToFavoriteCapability("work")))
    override val capabilities: Flow<Set<MapGptCapability>> = _capabilities.asStateFlow()

    override fun onAttached(middlewareContext: NavigateToWorkContext) {
        super.onAttached(middlewareContext)
        SharedLog.d(TAG) { "onAttached" }
    }

    override fun onDetached(middlewareContext: NavigateToWorkContext) {
        super.onDetached(middlewareContext)
        SharedLog.d(TAG) { "onDetached" }
    }

    private companion object {
        private const val TAG = "NavigateToWorkMiddleware"
    }
}
```

```kotlin
// Pass this to `capabilitiesRepository`
private val capabilitiesRepository = MapGptServiceCapabilitiesRepository(
    mapGptCore = mapGptCore,
    capabilityProvider = MapGptCapabilitiesProvider(
        scope = viewModelScope,
        capabilitiesServices = setOf(NavigateToHomeMiddleware(), NavigateToWorkMiddleware()),
    ),
)
```

```kotlin
// Send the capability with the user request
val request = MapGptStreamingRequest(
    prompt = prompt,
    context = userContext,
    capabilities = capabilitiesRepository.capabilityIds(),
)
mapGptService.postPromptsForStreaming(request)
```

AddWaypoint

Defines an event to add a waypoint to an existing route.

| Property | Type | Description |
| --- | --- | --- |
| `data` | object | Contents related to the waypoint event. |
| `data.index` | integer | Index at which the waypoint should be added to the route. |
| `data.location` | string | Name of the place. |
| `data.geocoded` | string | Reverse-geocoded place information. |
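A client handling this event inserts the named place into the current route at `data.index`. The helper below is a hypothetical sketch over a simplified route model (a list of place names), not an SDK call; it clamps the index so an out-of-range value still produces a valid route:

```kotlin
// Hypothetical route model (not an SDK type): inserts a waypoint at the
// index carried by the AddWaypoint event, clamped to the route's bounds.
fun addWaypoint(route: List<String>, index: Int, location: String): List<String> {
    val i = index.coerceIn(0, route.size)
    return route.subList(0, i) + location + route.subList(i, route.size)
}
```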

StopNavigation

Defines an event to stop navigation.

NoResponse

Defines an action containing no verbal response for the user query.

StopListening

Defines an action instructing MapGPT to close the microphone and stop capturing input.

Observing events

See the Events API guide for more information on how to observe other events offered by the SDK. There are many data points to observe while navigating to a destination.

observeMapGptEvents

```kotlin
class MainActivity : AppCompatActivity() {

    override fun onCreate(savedInstanceState: Bundle?) {
        super.onCreate(savedInstanceState)
        setContentView(R.layout.activity_main)

        lifecycleScope.launch {
            repeatOnLifecycle(Lifecycle.State.STARTED) {
                Dash.controller.observeMapGptEvents()
                    .collect { /* DashMapGptEvent */ event ->
                        // Do something with the MapGptEvent
                        processMapGptEvents(event)
                    }
            }
        }
    }

    fun processMapGptEvents(event: DashMapGptEvent) {
        when (event) {
            is DashMapGptEvent.Activated ->
                Toast.makeText(
                    this,
                    "New conversation with the assistant has started",
                    Toast.LENGTH_SHORT
                ).show()
            is DashMapGptEvent.PlayingMusic ->
                Toast.makeText(
                    this,
                    "App is playing the song with track identified by ${event.trackId}",
                    Toast.LENGTH_SHORT
                ).show()
            is DashMapGptEvent.PausedMusic ->
                Toast.makeText(
                    this,
                    "App has paused the music that was playing",
                    Toast.LENGTH_SHORT
                ).show()
            is DashMapGptEvent.ResumedMusic ->
                Toast.makeText(
                    this,
                    "App has resumed the music that was paused",
                    Toast.LENGTH_SHORT
                ).show()
            is DashMapGptEvent.StartedNavigation ->
                Toast.makeText(
                    this,
                    "App has started navigation to your destination",
                    Toast.LENGTH_SHORT
                ).show()
            is DashMapGptEvent.StoppedNavigation ->
                Toast.makeText(
                    this,
                    "App has stopped navigation",
                    Toast.LENGTH_SHORT
                ).show()
            is DashMapGptEvent.SetHvac ->
                Toast.makeText(
                    this,
                    "App has set the vehicle temperature to ${event.temperature} ${event.unit}",
                    Toast.LENGTH_SHORT
                ).show()
        }
    }
}
```

When the assistant renders data visually in the form of cards, the end user can tap a card to perform certain actions. In cases where the data displayed on the UI represents a custom payload, the user's tap on the card can be observed using `observeMapGptCardEvents`.

observeMapGptCardEvents

```kotlin
class MainActivity : AppCompatActivity() {

    override fun onCreate(savedInstanceState: Bundle?) {
        super.onCreate(savedInstanceState)
        setContentView(R.layout.activity_main)

        lifecycleScope.launch {
            repeatOnLifecycle(Lifecycle.State.STARTED) {
                Dash.controller.observeMapGptCardEvents()
                    .collect { /* DashMapGptCardEvent */ event ->
                        // Do something with the DashMapGptCardEvent
                        processMapGptCardEvents(event)
                    }
            }
        }
    }

    fun processMapGptCardEvents(event: DashMapGptCardEvent) {
        when (event) {
            is DashMapGptCardEvent.OnMapGptCardClicked ->
                Toast.makeText(
                    this,
                    "User tapped on the card rendering data identified by ${event.payload}",
                    Toast.LENGTH_SHORT
                ).show()
        }
    }
}
```