In the VmonsterRoom, several events of type Event may occur. These events are essential for handling interactions and monitoring the state of the session. The following are the key events:
Triggered when an attempt is made to connect the AI Avatar Stream. It occurs immediately upon calling the join() function.
Occurs when the AI Avatar Stream connection is successfully completed.
Triggered when the AI Avatar Stream connection is terminated.
Triggered when the AI Avatar begins speaking. It allows you to determine that the AI Avatar is currently speaking.
Occurs when the AI Avatar sends a message. It enables you to review messages sent by the AI Avatar.
Triggered when the AI Avatar stops speaking. It allows you to determine that the AI Avatar is no longer speaking.
Triggered when the AI Avatar video track is connected. The video MediaStream is passed to the callback.
Triggered when the AI Avatar audio track is connected. The audio MediaStream is passed to the callback.
Triggered repeatedly while the user’s audio is unmuted and they are speaking. It sends speech-to-text (STT) results one by one to the callback as STTData.
STTDataEventType
“transcript” – The finalized transcription of the user’s speech audio.
“start_of_speech” – Indicates the start of the user speaking. This means the Voice Activity Detection (VAD) system has detected the beginning of human speech.
“end_of_speech” – Indicates the end of the user speaking. This means the Voice Activity Detection (VAD) system has detected the cessation of human speech.
Occurs when real-time STT cannot be performed successfully.
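As a rough illustration, the sketch below shows how an STT callback might branch on the STTDataEventType values listed above. The field names used here (eventType, text) are assumptions for the sake of the example and are not confirmed by this page.

```ts
// Minimal sketch of an STT callback. The STTData field names below
// (eventType, text) are assumed, not taken from the SDK reference.
type STTDataEventType = "transcript" | "start_of_speech" | "end_of_speech";

interface STTData {
  eventType: STTDataEventType;
  text?: string; // assumed: present only for "transcript" results
}

function handleSttData(data: STTData): void {
  switch (data.eventType) {
    case "start_of_speech":
      console.log("VAD: user started speaking");
      break;
    case "transcript":
      console.log("User said:", data.text);
      break;
    case "end_of_speech":
      console.log("VAD: user stopped speaking");
      break;
  }
}
```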
To attach callbacks to these events, use the on() method.
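For orientation only, here is a minimal sketch of registering callbacks with on() before calling join(). The event name strings used below (e.g. "connected", "videoTrack") are placeholders, since this page does not list the actual event identifiers; check the SDK reference for the real names.

```ts
// Minimal sketch, assuming a VmonsterRoom-like object that exposes on() and join().
// The event name strings are placeholders, not the SDK's confirmed identifiers.
interface RoomLike {
  on(event: string, callback: (...args: unknown[]) => void): void;
  join(): void;
}

function setupRoom(room: RoomLike): void {
  // Hypothetical event names, for illustration only.
  room.on("connected", () => console.log("AI Avatar Stream connected"));
  room.on("disconnected", () => console.log("AI Avatar Stream terminated"));
  room.on("videoTrack", (stream) => {
    // The video MediaStream is passed to the callback; attach it to a <video> element.
    const video = document.querySelector<HTMLVideoElement>("#avatar-video");
    if (video) video.srcObject = stream as MediaStream;
  });

  // The connection-attempt event fires immediately after join() is called.
  room.join();
}
```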
Learn how to add callback functions to Events