AvatarChatbot Application¶
The AvatarChatbot service can be deployed on either Intel Gaudi2 AI Accelerators or Intel Xeon Scalable Processors.
AI Avatar Workflow¶
The AI Avatar example is implemented using both megaservices and the component-level microservices defined in GenAIComps. The flow chart below shows the information flow between different megaservices and microservices for this example.
Deploy AvatarChatbot Service¶
The AvatarChatbot service can be deployed on either Intel Gaudi2 AI Accelerator or Intel Xeon Scalable Processor.
Deploy AvatarChatbot on Gaudi¶
Refer to the Gaudi Guide for instructions on deploying AvatarChatbot on Gaudi and on setting up a UI for the application.
Deploy AvatarChatbot on Xeon¶
Refer to the Xeon Guide for instructions on deploying AvatarChatbot on Xeon.
Supported Models¶
ASR¶
The default model is openai/whisper-small. All models in the Whisper family are also supported, such as openai/whisper-large-v3, openai/whisper-medium, openai/whisper-base, openai/whisper-tiny, etc.
To replace the model, edit the compose.yaml and add a command line passing the name of the model you want to use:

```yaml
services:
  whisper-service:
    ...
    command: --model_name_or_path openai/whisper-tiny
```
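The edited configuration takes effect once the service container is recreated. A minimal sketch, assuming the edited compose.yaml is in the current directory and the service is named whisper-service as above:

```shell
# Recreate only the ASR container so it picks up the new model argument;
# other services in the compose file are left running untouched.
docker compose up -d --force-recreate whisper-service

# Verify the container is up and running with the new configuration
docker compose ps whisper-service
```

On the first start with a new model name, the service downloads the model weights, so allow some extra startup time.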
TTS¶
The default model is microsoft/SpeechT5. Replacing the model is not currently supported; more commercially licensed models will be added in the future.
Animation¶
The default models are Rudrabha/Wav2Lip and TencentARC/GFPGAN. Replacing the models is not currently supported; more commercially licensed models, such as OpenTalker/SadTalker, will be added in the future.