# AvatarChatbot Application
The AvatarChatbot example demonstrates how to implement an AI avatar chatbot using OPEA component-level microservices.
## Architecture
The AvatarChatbot example is implemented using both megaservices and the component-level microservices defined in GenAIComps. The flow chart below shows the information flow between the megaservices and microservices in this example.
This AvatarChatbot use case runs an AI avatar chatbot across multiple hardware platforms. Examples are currently provided for Intel Gaudi2, Intel Xeon Scalable processors, and AMD ROCm, and we invite contributions from other hardware vendors to expand the OPEA ecosystem.
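To give a feel for how such a pipeline is wired together, the sketch below shows how an OPEA megaservice is typically composed from remote microservices with the GenAIComps `ServiceOrchestrator`. It is a minimal illustration only: the service names, hosts, ports, endpoints, and the exact pipeline stages are assumptions and do not necessarily match the AvatarChatbot implementation.

```python
# Minimal sketch of composing a megaservice from remote microservices with
# GenAIComps. All hosts, ports, endpoints, and stage names below are
# illustrative assumptions, not the actual AvatarChatbot wiring.
from comps import MicroService, ServiceOrchestrator, ServiceType

megaservice = ServiceOrchestrator()

# Each MicroService points at an already-running component-level service.
asr = MicroService(
    name="asr", host="asr-service", port=9099,
    endpoint="/v1/audio/transcriptions",
    use_remote_service=True, service_type=ServiceType.ASR,
)
llm = MicroService(
    name="llm", host="llm-service", port=9000,
    endpoint="/v1/chat/completions",
    use_remote_service=True, service_type=ServiceType.LLM,
)
tts = MicroService(
    name="tts", host="tts-service", port=9088,
    endpoint="/v1/audio/speech",
    use_remote_service=True, service_type=ServiceType.TTS,
)

# Register the services and declare the information flow between them.
megaservice.add(asr).add(llm).add(tts)
megaservice.flow_to(asr, llm)
megaservice.flow_to(llm, tts)
```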
## Deployment Options
The table below lists the available deployment options and their implementation details for different hardware platforms.
| Platform     | Deployment Method | Link |
| ------------ | ----------------- | ---- |
| Intel Xeon   | Docker compose    |      |
| Intel Gaudi2 | Docker compose    |      |
| AMD ROCm     | Docker compose    |      |
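Once a Docker Compose deployment is up, the megaservice is reachable over HTTP. The snippet below is a hedged sketch of exercising it from a client; the host, port, route, and request payload shape are assumptions and should be checked against the deployment guide for your platform.

```python
# Hedged client sketch: send a spoken question (base64-encoded WAV) to the
# AvatarChatbot megaservice endpoint. The host, port (3009), route
# (/v1/avatarchatbot), and payload fields are assumptions based on the
# common OPEA pattern, not guaranteed values -- verify them in the
# platform-specific deployment guide.
import base64
import requests

HOST_IP = "localhost"          # replace with the host running the stack
ENDPOINT = f"http://{HOST_IP}:3009/v1/avatarchatbot"  # assumed port/route

with open("sample_question.wav", "rb") as f:
    audio_b64 = base64.b64encode(f.read()).decode("utf-8")

payload = {"audio": audio_b64, "max_tokens": 64}  # assumed request schema

resp = requests.post(ENDPOINT, json=payload, timeout=600)
resp.raise_for_status()
print(resp.text)  # expected to reference the generated avatar video
```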
## Validated Configurations
| Deploy Method  | LLM Engine | LLM Model                 | Hardware    |
| -------------- | ---------- | ------------------------- | ----------- |
| Docker Compose | TGI        | Intel/neural-chat-7b-v3-3 | Intel Gaudi |
| Docker Compose | TGI        | Intel/neural-chat-7b-v3-3 | Intel Xeon  |
| Docker Compose | TGI        | Intel/neural-chat-7b-v3-3 | AMD ROCm    |
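All of the validated configurations above serve Intel/neural-chat-7b-v3-3 through TGI (Text Generation Inference). A quick way to confirm that the LLM backend is healthy, independently of the rest of the pipeline, is to call TGI's generate API directly; the host and port below are assumptions and depend on how the compose file maps the TGI container.

```python
# Hedged sanity check against the TGI service backing the chatbot.
# TGI exposes a /generate route; the host and mapped port (8008 here)
# are assumptions -- use whatever port the compose file publishes.
import requests

TGI_URL = "http://localhost:8008/generate"  # assumed host/port mapping

payload = {
    "inputs": "What is an AI avatar chatbot?",
    "parameters": {"max_new_tokens": 32},
}

resp = requests.post(TGI_URL, json=payload, timeout=120)
resp.raise_for_status()
print(resp.json()["generated_text"])
```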