Events
Events inform you of the actions the assistant takes in response to user prompts. When the user prompts the assistant to do something, the AI triggers an action and responds with an event. For example, you can ask the assistant to play music, stop navigation, or handle more advanced requests such as changing the temperature in the car. Mapbox continues to build out more events and refine the existing ones.
There are several MapGPT events that can be observed with the SDK using the observeMapGptEvents API:
Body
Defines the event specifics, which vary based on the event type. All events described below are extensions of Body and inherit its properties.
Properties | Type | Description |
---|---|---|
id | long | A unique increasing ID field for the event. This can be used by the client to reference the last received event ID when reconnecting to the service. |
timestamp | long | Creation time of the event as a Unix timestamp. |
chunkId | string | Identifies the conversational chunk the event is tied to, in the format {date}@{id}, where chunkPrefix is treated as an ID of the AI response and chunkOffset is the logical order within the response stream. |
chunkPrefix | string | Represents an ID of the AI response. |
chunkOffset | string | Represents logical order in the response stream. |
isSupplement | boolean | Whether the event was generated retroactively or spontaneously by the assistant rather than in direct response to a user query. |
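As a rough illustration, the shared Body properties can be modeled as a plain Kotlin class. The EventBody type and its chunkId parsing below are assumptions for illustration, not SDK types.
// Illustrative model of the shared Body properties (not an SDK type).
data class EventBody(
    val id: Long,              // unique, increasing event ID
    val timestamp: Long,       // creation time as a Unix timestamp
    val chunkId: String,       // "{date}@{id}", e.g. "2024-05-01@3"
    val isSupplement: Boolean, // true when not generated in direct response to a query
) {
    // Derive the documented chunkPrefix and chunkOffset from chunkId.
    val chunkPrefix: String get() = chunkId.substringBefore('@')
    val chunkOffset: String get() = chunkId.substringAfter('@')
}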
Conversation
Defines a conversation chunk. A single chunk is typically a sentence in a longer AI response.
Properties | Type | Description |
---|---|---|
data | object | Contents related to conversation events. |
data.conversationId | string | A unique string for this conversation. |
data.content | string | The sentence in a response. |
data.initial | boolean | Whether this is the first chunk in the response stream. |
data.confirmation | boolean | Whether this chunk is a confirmation of a user action. |
data.final | boolean | Whether this is the last chunk in the response stream. |
data.maxTokens | boolean | Whether the conversation has hit the profile's response token limit. |
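Because a response arrives as a stream of chunks, a client typically concatenates them using the initial and final flags. The sketch below assumes a simple ConversationChunk shape mirroring the table above; it is illustrative, not an SDK type.
// Illustrative chunk shape mirroring the Conversation table (not an SDK type).
data class ConversationChunk(
    val conversationId: String,
    val content: String,
    val initial: Boolean,
    val final: Boolean,
)

class ConversationAssembler {
    private val buffer = StringBuilder()

    // Returns the full response once the final chunk arrives, otherwise null.
    fun onChunk(chunk: ConversationChunk): String? {
        if (chunk.initial) buffer.clear() // a new response stream has started
        buffer.append(chunk.content).append(' ')
        return if (chunk.final) buffer.toString().trim() else null
    }
}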
Entity
Defines entities, such as POIs or song names, that are extracted from a conversation or otherwise generated, using this format.
Properties | Type | Description |
---|---|---|
data | array | List of contents related to entities. |
data.place | object | Defines a POI entity. |
data.place.name | string | An identified name of the entity such as restaurant name. |
data.place.geocoded | object | Reverse geocoded place information. |
data.card | object | Defines a card entity. |
data.card.components | array | List of CardComponent . |
data.card.anchoredComponent | object | A CardComponent of type CardComponent.Image describing the light and dark URIs for the data provider, if applicable. |
data.card.payload | object | Additional payload associated with the card, delivered as a callback upon interaction. |
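For example, a client that only needs the POI names could walk the entity list and read each place entry. The shapes below mirror the table above and are assumptions for illustration, not SDK types.
// Illustrative entity shapes mirroring the Entity table (not SDK types).
data class PlaceEntity(val name: String)
data class EntityItem(val place: PlaceEntity? = null)

// Collect the names of all place entities, skipping card-only entries.
fun placeNames(data: List<EntityItem>): List<String> =
    data.mapNotNull { it.place?.name }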
PlayMusic
Defines an event to play music.
Properties | Type | Description |
---|---|---|
data | object | Contents related to play music event. |
data.provider | string | Music provider associated with the track. |
data.uri | string | Provider-specific URI of the track to be played. |
data.song | string | Name of the song. |
data.artist | string | Name of the artist. |
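One way a client might act on this payload is to hand the provider-specific URI to whichever app can open it. The PlayMusicData shape below mirrors the table above and is an assumption for illustration, not an SDK type.
import android.content.Context
import android.content.Intent
import android.net.Uri

// Illustrative payload shape mirroring the PlayMusic table (not an SDK type).
data class PlayMusicData(
    val provider: String,
    val uri: String,
    val song: String,
    val artist: String,
)

// Forward the provider-specific track URI to an app that can play it.
fun handlePlayMusic(context: Context, data: PlayMusicData) {
    val intent = Intent(Intent.ACTION_VIEW, Uri.parse(data.uri))
        .addFlags(Intent.FLAG_ACTIVITY_NEW_TASK)
    context.startActivity(intent)
}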
PauseMusic
Defines an event to pause the music that is currently playing.
ResumeMusic
Defines an event to resume music that was previously paused.
StartNavigation
Defines an event to start navigation to a destination. The destination can be a POI name or a favorite such as "home" or "work".
Properties | Type | Description |
---|---|---|
data | object | Contents related to start navigation event. |
data.location | string | Name of the place. |
data.geocoded | string | Reverse geocoded place information. |
data.favorite | string | Key to identify the location if it is a favorite. |
To issue commands such as "Navigate me to home" or "Navigate me to work", you need to include the corresponding capabilities along with the query.
// Create a navigate-to-home capability middleware
class NavigateToHomeMiddleware :
    MapGptCapabilitiesMiddleware<NavigateToHomeMiddleware.NavigateToHomeContext>,
    CoroutineMiddleware<NavigateToHomeMiddleware.NavigateToHomeContext>() {

    inner class NavigateToHomeContext(mapGptCoreContext: MapGptCoreContext) :
        MapGptCoreContext by mapGptCoreContext

    override fun <Parent : MiddlewareContext> provideContext(parent: Parent): NavigateToHomeContext {
        return NavigateToHomeContext(mapGptCoreContext = parent as MapGptCoreContext)
    }

    private val _capabilities = MutableStateFlow(setOf(NavigateToFavoriteCapability("home")))
    override val capabilities: Flow<Set<MapGptCapability>> = _capabilities.asStateFlow()

    override fun onAttached(middlewareContext: NavigateToHomeContext) {
        super.onAttached(middlewareContext)
        SharedLog.d(TAG) { "onAttached" }
    }

    override fun onDetached(middlewareContext: NavigateToHomeContext) {
        super.onDetached(middlewareContext)
        SharedLog.d(TAG) { "onDetached" }
    }

    private companion object {
        private const val TAG = "NavigateToHomeMiddleware"
    }
}

// Create a navigate-to-work capability middleware
class NavigateToWorkMiddleware :
    MapGptCapabilitiesMiddleware<NavigateToWorkMiddleware.NavigateToWorkContext>,
    CoroutineMiddleware<NavigateToWorkMiddleware.NavigateToWorkContext>() {

    inner class NavigateToWorkContext(mapGptCoreContext: MapGptCoreContext) :
        MapGptCoreContext by mapGptCoreContext

    override fun <Parent : MiddlewareContext> provideContext(parent: Parent): NavigateToWorkContext {
        return NavigateToWorkContext(mapGptCoreContext = parent as MapGptCoreContext)
    }

    private val _capabilities = MutableStateFlow(setOf(NavigateToFavoriteCapability("work")))
    override val capabilities: Flow<Set<MapGptCapability>> = _capabilities.asStateFlow()

    override fun onAttached(middlewareContext: NavigateToWorkContext) {
        super.onAttached(middlewareContext)
        SharedLog.d(TAG) { "onAttached" }
    }

    override fun onDetached(middlewareContext: NavigateToWorkContext) {
        super.onDetached(middlewareContext)
        SharedLog.d(TAG) { "onDetached" }
    }

    private companion object {
        private const val TAG = "NavigateToWorkMiddleware"
    }
}

// Register the middlewares with the capabilities repository
private val capabilitiesRepository = MapGptServiceCapabilitiesRepository(
    mapGptCore = mapGptCore,
    capabilityProvider = MapGptCapabilitiesProvider(
        scope = viewModelScope,
        capabilitiesServices = setOf(NavigateToHomeMiddleware(), NavigateToWorkMiddleware())
    ),
)

// Send the capability IDs with the user request
val request = MapGptStreamingRequest(
    prompt = prompt,
    context = userContext,
    capabilities = capabilitiesRepository.capabilityIds(),
)
mapGptService.postPromptsForStreaming(request)
AddWaypoint
Defines an event to add a waypoint to an existing route.
Properties | Type | Description |
---|---|---|
data | object | Contents related to waypoint event. |
data.index | integer | Index at which the waypoint should be added to the route. |
data.location | string | Name of the place. |
data.geocoded | string | Reverse geocoded place information. |
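A client would typically insert the waypoint into its current list of stops at the given index before rebuilding the route. The AddWaypointData shape and the stop list below are assumptions for illustration; the actual route update goes through your navigation APIs.
// Illustrative payload shape mirroring the AddWaypoint table (not an SDK type).
data class AddWaypointData(
    val index: Int,       // position at which to insert the waypoint
    val location: String, // name of the place
)

// Insert the waypoint into a locally held list of stop names.
fun addWaypoint(stops: MutableList<String>, data: AddWaypointData) {
    // Clamp the index so an out-of-range value still appends safely.
    val safeIndex = data.index.coerceIn(0, stops.size)
    stops.add(safeIndex, data.location)
}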
StopNavigation
Defines an event to stop navigation.
NoResponse
Defines an action containing no verbal response to the user query.
StopListening
Defines an action ordering MapGPT to close the microphone and stop capturing input.
Observing events
See the Events API guide for more information on how to observe other events offered by the SDK. There are many data points to observe while navigating to a destination.
observeMapGptEvents
class MainActivity : AppCompatActivity() {

    override fun onCreate(savedInstanceState: Bundle?) {
        super.onCreate(savedInstanceState)
        setContentView(R.layout.activity_main)
        lifecycleScope.launch {
            repeatOnLifecycle(Lifecycle.State.STARTED) {
                Dash.controller.observeMapGptEvents()
                    .collect { /* DashMapGptEvent */ event ->
                        // Do something with the DashMapGptEvent
                        processMapGptEvents(event)
                    }
            }
        }
    }

    fun processMapGptEvents(event: DashMapGptEvent) {
        when (event) {
            is DashMapGptEvent.Activated ->
                Toast.makeText(
                    this,
                    "New conversation with the assistant has started",
                    Toast.LENGTH_SHORT
                ).show()
            is DashMapGptEvent.PlayingMusic ->
                Toast.makeText(
                    this,
                    "App is playing the track identified by ${event.trackId}",
                    Toast.LENGTH_SHORT
                ).show()
            is DashMapGptEvent.PausedMusic ->
                Toast.makeText(
                    this,
                    "App has paused the music that was playing",
                    Toast.LENGTH_SHORT
                ).show()
            is DashMapGptEvent.ResumedMusic ->
                Toast.makeText(
                    this,
                    "App has resumed the music that was paused",
                    Toast.LENGTH_SHORT
                ).show()
            is DashMapGptEvent.StartedNavigation ->
                Toast.makeText(
                    this,
                    "App has started navigation to your destination",
                    Toast.LENGTH_SHORT
                ).show()
            is DashMapGptEvent.StoppedNavigation ->
                Toast.makeText(
                    this,
                    "App has stopped navigation",
                    Toast.LENGTH_SHORT
                ).show()
            is DashMapGptEvent.SetHvac ->
                Toast.makeText(
                    this,
                    "App has set the vehicle temperature to ${event.temperature} ${event.unit}",
                    Toast.LENGTH_SHORT
                ).show()
        }
    }
}
When the assistant renders data visually in the form of cards, the end user can tap a card to perform certain actions. In cases where the data displayed on the UI represents a custom payload, the user's tap on the card can be observed using observeMapGptCardEvents.
observeMapGptCardEvents
class MainActivity : AppCompatActivity() {

    override fun onCreate(savedInstanceState: Bundle?) {
        super.onCreate(savedInstanceState)
        setContentView(R.layout.activity_main)
        lifecycleScope.launch {
            repeatOnLifecycle(Lifecycle.State.STARTED) {
                Dash.controller.observeMapGptCardEvents()
                    .collect { /* DashMapGptCardEvent */ event ->
                        // Do something with the DashMapGptCardEvent
                        processMapGptCardEvents(event)
                    }
            }
        }
    }

    fun processMapGptCardEvents(event: DashMapGptCardEvent) {
        when (event) {
            is DashMapGptCardEvent.OnMapGptCardClicked ->
                Toast.makeText(
                    this,
                    "User tapped on the card rendering data identified by ${event.payload}",
                    Toast.LENGTH_SHORT
                ).show()
        }
    }
}