GenAI Microservices
GenAI microservices leverage a service composer to assemble a mega-service tailored for real-world enterprise AI applications. All of the microservices are containerized, enabling cloud-native deployment. Check out how these microservices are used in GenAI Examples.
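The composition idea can be sketched in plain Python. The names below (`ServiceComposer`, `add`, `flow_to`) are illustrative assumptions for this sketch, not the actual GenAIComps API:

```python
# Toy sketch of a service composer: each microservice is modeled as a
# callable stage, and the composer wires stages into one mega-service.
# ServiceComposer, add, flow_to are illustrative names, not the real API.

class ServiceComposer:
    def __init__(self):
        self.services = {}   # service name -> callable stage
        self.flows = {}      # service name -> downstream service name

    def add(self, name, fn):
        self.services[name] = fn
        return self  # allow chaining

    def flow_to(self, src, dst):
        self.flows[src] = dst
        return self

    def run(self, entry, payload):
        # Follow the flow graph from the entry service until no successor.
        name = entry
        while name is not None:
            payload = self.services[name](payload)
            name = self.flows.get(name)
        return payload

# Example: embedding -> retriever -> LLM, as in a RAG-style mega-service.
composer = (ServiceComposer()
            .add("embedding", lambda q: {"query": q, "vector": [0.1, 0.2]})
            .add("retriever", lambda d: {**d, "docs": ["doc1"]})
            .add("llm", lambda d: f"answer using {d['docs'][0]}"))
composer.flow_to("embedding", "retriever").flow_to("retriever", "llm")

print(composer.run("embedding", "What is OPEA?"))  # → answer using doc1
```

In the real system each stage would be a containerized service reached over HTTP rather than an in-process callable, but the flow-graph composition is the same shape.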
We’re building this microservices documentation from content in the GenAIComps GitHub repository.
Microservices Table of Contents
Agent Microservice
ASR Microservice
Chat History Microservice
Cores Microservice
Dataprep Microservice
- Dataprep Microservice
- Dataprep Microservice with Milvus
- Dataprep Microservice for Multimodal Data with Redis
- Dataprep Microservice with Neo4J
- Dataprep Microservice with PGVector
- Dataprep Microservice with Pinecone
- Dataprep Microservice with Qdrant
- Dataprep Microservice with Redis
- Dataprep Microservice with VDMS
- Multimodal Dataprep Microservice with VDMS
Embeddings Microservice
- Embeddings Microservice
- Build Mosec Endpoint Docker Image
- Embedding Server
- Multimodal Embeddings Microservice
- Multimodal CLIP Embeddings Microservice
- Embedding Generation Prediction Guard Microservice
- 🚀 Start Microservice with Docker
- 🚀 Consume Embeddings Service
- Embeddings Microservice with Langchain TEI
- Embeddings Microservice with Llama Index TEI
Feedback Management Microservice
Finetuning Microservice
Guardrails Microservice
Intent Detection Microservice
Knowledge Graphs Microservice
LLMs Microservice
- TGI FAQGen LLM Microservice
- Document Summary TGI Microservice
- LLM Microservice
- LLM Native Microservice
- Introduction
- Get Started
- Consume the Prediction Guard Microservice
- TGI LLM Microservice
- vLLM Endpoint Service
- VLLM-Ray Endpoint Service
- LM-Eval Microservice