# Docker Images

A list of released OPEA Docker images is available on https://hub.docker.com/. It contains all relevant images from the GenAIExamples, GenAIComps, and GenAIInfra projects. Expect more publicly available images in future releases.

Take ChatQnA as an example. ChatQnA is a chatbot application service based on the Retrieval-Augmented Generation (RAG) architecture. It consists of multiple microservice images: opea/embedding-tei, opea/retriever-redis, opea/reranking-tei, opea/llm-tgi, opea/dataprep-redis, opea/chatqna, opea/chatqna-ui, and optionally opea/chatqna-conversation-ui. Other services are similar; see the corresponding README for details.
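As a quick illustration, the ChatQnA images listed above can be pulled directly from Docker Hub. This is only a sketch: the `latest` tag is an assumption, and a pinned release tag from Docker Hub can be substituted.

```bash
# Pull the ChatQnA microservice images from Docker Hub.
# The "latest" tag is an assumption; substitute a specific release tag if needed.
docker pull opea/embedding-tei:latest
docker pull opea/retriever-redis:latest
docker pull opea/reranking-tei:latest
docker pull opea/llm-tgi:latest
docker pull opea/dataprep-redis:latest
docker pull opea/chatqna:latest
docker pull opea/chatqna-ui:latest
# Optional conversational UI
docker pull opea/chatqna-conversation-ui:latest
```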

## Example Images

| Example Images | Dockerfile | Description |
| -------------- | ---------- | ----------- |
| opea/audioqna | Link | The Docker image serves as the AudioQnA gateway, using language modeling to generate answers to user queries: audio input is converted to text, and text-to-speech (TTS) converts the answers back to speech for interaction. |
| opea/chatqna | Link | The Docker image serves as the ChatQnA gateway, interacting with users by understanding their questions and providing relevant answers. |
| opea/chatqna-ui | Link | The Docker image serves as the ChatQnA UI entry point, facilitating interaction with users for question answering. |
| opea/chatqna-conversation-ui | Link | The Docker image provides a React-based user interface for chat-based Q&A. It supports interaction with users and continuing conversations, with history stored in the browser's local storage. |
| opea/chatqna-guardrails | Link | The Docker image encapsulates ChatQnA's LLM service to secure model inputs and outputs. Guardrails proactively prevents the model from interacting with insecure content and signals in time to stop insecure behavior. |
| opea/chatqna-without-rerank | Link | The Docker image serves as the ChatQnA-without-rerank gateway, providing the ChatQnA service without reranking for Xeon customers who cannot run the rerank service on HPUs but need high performance and accuracy. |
| opea/chatqna-no-wrapper | Link | The Docker image serves as the chatqna-no-wrapper gateway with no_wrapper benchmark support. |
| opea/chatqna-no-wrapper-without-rerank | Link | The Docker image serves as the chatqna-no-wrapper-without-rerank gateway with no_wrapper benchmark support. |
| opea/codegen | Link | The Docker image serves as the CodeGen gateway, providing a service that automatically creates source code from a higher-level representation. |
| opea/codegen-ui | Link | The Docker image serves as the CodeGen UI entry point, facilitating interaction with users for automatically generating code from a user's description. |
| opea/codegen-react-ui | Link | The Docker image provides a React-based user interface for CodeGen. It generates the appropriate code based on the current user input. |
| opea/codetrans | Link | The Docker image serves as the CodeTrans gateway, providing a service that converts source code written in one programming language into an equivalent version in another programming language. |
| opea/codetrans-ui | Link | The Docker image serves as the CodeTrans UI entry point, facilitating interaction with users for translating code from one programming language to another. |
| opea/doc-index-retriever | Link | The Docker image serves as the DocIndexRetriever gateway. It uses different methods to match user queries against a set of free-text records. |
| opea/docsum | Link | The Docker image serves as the DocSum gateway, providing a service that captures the main points and essential details of the original text. |
| opea/docsum-ui | Link | The Docker image serves as the DocSum UI entry point, facilitating interaction with users for document summarization. |
| opea/docsum-react-ui | Link | The Docker image provides a React-based user interface for document summarization. Users can upload a file or paste text and click "Generate Summary" to get a condensed summary; the view automatically scrolls to the bottom of the summary. |
| opea/faqgen | Link | The Docker image serves as the FaqGen gateway, automatically generating comprehensive, natural-sounding Frequently Asked Questions (FAQs) from documents, legal texts, customer inquiries, and other sources. |
| opea/faqgen-ui | Link | The Docker image serves as the FaqGen UI entry point for easy interaction with users, generating FAQs from pasted question text. |
| opea/faqgen-react-ui | Link | The Docker image provides a React-based user interface for FAQ generation. It generates FAQs from uploaded files or pasted text. |
| opea/multimodalqna | Link | The Docker image serves as the MultimodalQnA gateway, dynamically fetching the most relevant multimodal information (frames, transcripts, and/or subtitles) from the user's video collection to answer queries. |
| opea/multimodalqna-ui | Link | The Docker image serves as the MultimodalQnA UI entry point for easy interaction with users. Answers to questions are generated from videos uploaded by users. |
| opea/productivity-suite-react-ui-server | Link | The Docker image provides a React-based user interface for the Productivity Suite application. It supports interaction through uploaded documents and user inputs. |
| opea/searchqna | Link | The Docker image serves as the SearchQnA gateway, providing a service that retrieves accurate and relevant answers to user queries from a knowledge base or dataset. |
| opea/searchqna-ui | Link | The Docker image serves as the SearchQnA UI entry point, facilitating interaction with users for question answering. |
| opea/translation | Link | The Docker image serves as the Translation gateway, providing a language translation service. |
| opea/translation-ui | Link | The Docker image serves as the Translation UI entry point, facilitating interaction with users for language translation. |
| opea/videoqna | Link | The Docker image serves as the VideoQnA gateway, interacting with users by retrieving videos based on user prompts. |
| opea/videoqna-ui | Link | The Docker image serves as the VideoQnA UI entry point, facilitating interaction with users and retrieving videos based on user prompts. |
| opea/visualqna | Link | The Docker image serves as the VisualQnA gateway, outputting answers in natural language based on a combination of images and questions. |
| opea/visualqna-ui | Link | The Docker image serves as the VisualQnA UI portal, facilitating interaction with users and outputting answers in natural language based on a combination of images and questions from the user. |
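The example (gateway) images above are normally not run in isolation; each GenAIExamples application ships a Docker Compose file that starts the gateway, its UI, and the supporting microservices together. The sketch below illustrates that flow only; the compose file path is a placeholder, and the exact path and required environment variables are given in each example's README.

```bash
# Sketch only: the compose file location varies per example and hardware target;
# see the corresponding README in GenAIExamples for the exact path and variables.
cd GenAIExamples/ChatQnA/docker_compose   # placeholder path
docker compose up -d
```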

## Microservice Images

| Microservice Images | Dockerfile | Description |
| ------------------- | ---------- | ----------- |
| opea/agent-langchain | Link | The Docker image exposes the OPEA agent microservice for GenAI application use. |
| opea/asr | Link | The Docker image exposes the OPEA Audio-Speech-Recognition (ASR) microservice for GenAI application use. |
| opea/chathistory-mongo-server | Link | The Docker image exposes the OPEA Chat History microservice based on a MongoDB database, designed to allow users to store, retrieve, and manage chat conversations. |
| opea/dataprep-milvus | Link | The Docker image exposes the OPEA dataprep microservice based on the Milvus vector database for GenAI application use. |
| opea/dataprep-multimodal-vdms | Link | The Docker image exposes the OPEA dataprep microservice based on multimodal VDMS for GenAI application use. |
| opea/dataprep-multimodal-redis | Link | The Docker image exposes the OPEA dataprep microservice based on multimodal Redis for GenAI application use. |
| opea/dataprep-on-ray-redis | Link | The Docker image exposes the OPEA dataprep microservice based on the Redis vector database and optimized Ray for GenAI application use. |
| opea/dataprep-pgvector | Link | The Docker image exposes the OPEA dataprep microservice based on the pgvector vector database for GenAI application use. |
| opea/dataprep-pinecone | Link | The Docker image exposes the OPEA dataprep microservice based on the Pinecone vector database for GenAI application use. |
| opea/dataprep-qdrant | Link | The Docker image exposes the OPEA dataprep microservice based on the Qdrant vector database for GenAI application use. |
| opea/dataprep-redis | Link | The Docker image exposes the OPEA dataprep microservice based on the Redis vector database and the LangChain framework for GenAI application use. |
| opea/dataprep-redis-llama-index | Link | The Docker image exposes the OPEA dataprep microservice based on the Redis vector database and the LlamaIndex framework for GenAI application use. |
| opea/dataprep-vdms | Link | The Docker image exposes the OPEA dataprep microservice based on the VDMS vector database for GenAI application use. |
| opea/embedding-langchain-mosec | Link | The Docker image exposes the OPEA Mosec embedding microservice based on the LangChain framework for GenAI application use. |
| opea/embedding-langchain-mosec-endpoint | Link | The Docker image exposes the OPEA Mosec embedding endpoint microservice based on the LangChain framework for GenAI application use. |
| opea/embedding-multimodal-clip | Link | The Docker image exposes the OPEA multimodal CLIP-based embedding microservice for GenAI application use. |
| opea/embedding-multimodal | Link | The Docker image exposes the OPEA multimodal embedding microservice for GenAI application use. |
| opea/embedding-multimodal-bridgetower | Link | The Docker image exposes the OPEA multimodal embedding microservice based on BridgeTower for GenAI application use. |
| opea/embedding-multimodal-bridgetower-gaudi | Link | The Docker image exposes the OPEA multimodal embedding microservice based on BridgeTower for GenAI application use on Gaudi. |
| opea/embedding-tei | Link | The Docker image exposes the OPEA embedding microservice built on the TEI Docker image for GenAI application use. |
| opea/embedding-tei-llama-index | Link | The Docker image exposes the OPEA embedding microservice built on the TEI Docker image and the LlamaIndex framework for GenAI application use. |
| opea/feedbackmanagement | Link | The Docker image exposes the OPEA feedback management microservice based on a MongoDB database for GenAI application use. |
| opea/finetuning | Link | The Docker image exposes the OPEA fine-tuning microservice for GenAI application use. |
| opea/finetuning-gaudi | Link | The Docker image exposes the OPEA fine-tuning microservice for GenAI application use on Gaudi. |
| opea/gmcrouter | Link | The Docker image serves as one of the key parts of the OPEA GenAI Microservice Connector (GMC): it routes traffic among the microservices defined in GMC. |
| opea/gmcmanager | Link | The Docker image serves as one of the key parts of the OPEA GenAI Microservice Connector (GMC): it is the controller manager that handles the GMC CRD. |
| opea/guardrails-tgi | Link | The Docker image exposes the OPEA guardrail microservice, providing content review for GenAI application use. |
| opea/guardrails-toxicity-detection | Link | The Docker image exposes the OPEA guardrail microservice, providing toxicity detection for GenAI application use. |
| opea/guardrails-pii-detection | Link | The Docker image exposes the OPEA guardrail microservice, providing PII detection for GenAI application use. |
| opea/llm-docsum-tgi | Link | The Docker image provides a document summarization microservice built with the HuggingFace Text Generation Inference (TGI) framework. The microservice accepts a document as input and generates a summary. |
| opea/llm-faqgen-tgi | Link | The Docker image provides a FAQ generation microservice built with the HuggingFace Text Generation Inference (TGI) framework. The microservice accepts a document as input and generates FAQs. |
| opea/llm-native | Link | The Docker image exposes the OPEA LLM microservice based on a native backend for GenAI application use. |
| opea/llm-ollama | Link | The Docker image exposes the OPEA LLM microservice based on Ollama for GenAI application use. |
| opea/llm-tgi | Link | The Docker image exposes the OPEA LLM microservice built on the TGI Docker image for GenAI application use. |
| opea/llm-vllm | Link | The Docker image exposes the OPEA LLM microservice built on the vLLM Docker image for GenAI application use. |
| opea/llm-vllm-hpu | Link | The Docker image exposes the OPEA LLM microservice built on the vLLM Docker image for GenAI application use on Gaudi. |
| opea/llm-vllm-llamaindex | Link | The Docker image exposes the OPEA LLM microservice built on the vLLM Docker image and the LlamaIndex framework for GenAI application use. |
| opea/llm-vllm-llamaindex-hpu | Link | The Docker image exposes the OPEA LLM microservice built on the vLLM Docker image and the LlamaIndex framework for GenAI application use on Gaudi. |
| opea/llm-vllm-ray | Link | The Docker image exposes the Ray-based OPEA LLM microservice built on the vLLM Docker image for GenAI application use. |
| opea/llm-vllm-ray-hpu | Link | The Docker image exposes the Ray-based OPEA LLM microservice built on the vLLM Docker image for GenAI application use on Gaudi. |
| opea/llava-hpu | Link | The Docker image exposes the OPEA microservice running LLaVA as a large visual model (LVM) service for GenAI application use on Gaudi. |
| opea/lvm-tgi | Link | The Docker image provides a large visual model (LVM) microservice built with the HuggingFace Text Generation Inference (TGI) framework. The microservice accepts document input and generates an answer to a question. |
| opea/lvm-llava | Link | The Docker image exposes the OPEA microservice running LLaVA as a large visual model (LVM) server for GenAI application use. |
| opea/lvm-llava-svc | Link | The Docker image exposes the OPEA microservice running LLaVA as a large visual model (LVM) service for GenAI application use. |
| opea/lvm-video-llama | Link | The Docker image exposes the OPEA microservice running Video-Llama as a large visual model (LVM) for GenAI application use. |
| opea/nginx | Link | The Docker image exposes the OPEA Nginx microservice for GenAI application use. |
| opea/promptregistry-mongo-server | Link | The Docker image exposes the OPEA Prompt Registry microservice based on a MongoDB database, designed to store and retrieve users' preferred prompts. |
| opea/reranking-videoqna | Link | The Docker image exposes the OPEA reranking microservice for reranking the results of VideoQnA use cases for GenAI application use. |
| opea/reranking-fastrag | Link | The Docker image exposes the OPEA reranking microservice based on fastRAG for GenAI application use. |
| opea/reranking-langchain-mosec | Link | The Docker image exposes the OPEA Mosec reranking microservice based on the LangChain framework for GenAI application use. |
| opea/reranking-langchain-mosec-endpoint | Link | The Docker image exposes the OPEA Mosec reranking endpoint microservice based on the LangChain framework for GenAI application use. |
| opea/reranking-tei | Link | The Docker image exposes the OPEA reranking microservice built on the TEI Docker image for GenAI application use. |
| opea/retriever-milvus | Link | The Docker image exposes the OPEA retrieval microservice based on the Milvus vector database for GenAI application use. |
| opea/retriever-multimodal-redis | Link | The Docker image exposes the OPEA retrieval microservice based on the multimodal Redis vector database for GenAI application use. |
| opea/retriever-pathway | Link | The Docker image exposes the OPEA retrieval microservice with Pathway for GenAI application use. |
| opea/retriever-pgvector | Link | The Docker image exposes the OPEA retrieval microservice based on the pgvector vector database for GenAI application use. |
| opea/retriever-pinecone | Link | The Docker image exposes the OPEA retrieval microservice based on the Pinecone vector database for GenAI application use. |
| opea/retriever-qdrant | Link | The Docker image exposes the OPEA retrieval microservice based on the Qdrant vector database for GenAI application use. |
| opea/retriever-redis | Link | The Docker image exposes the OPEA retrieval microservice based on the Redis vector database for GenAI application use. |
| opea/retriever-redis-llamaindex | Link | The Docker image exposes the OPEA retriever microservice based on LlamaIndex for GenAI application use. |
| opea/retriever-vdms | Link | The Docker image exposes the OPEA retriever microservice based on the Visual Data Management System (VDMS) for GenAI application use. |
| opea/speecht5 | Link | The Docker image exposes the OPEA SpeechT5 service for GenAI application use. |
| opea/speecht5-gaudi | Link | The Docker image exposes the OPEA SpeechT5 service on Gaudi2 for GenAI application use. |
| opea/tei-gaudi | Link | The Docker image, powered by HuggingFace Text Embedding Inference (TEI) on Gaudi2, deploys and serves embedding models. |
| opea/vectorstore-pathway | Link | The Docker image exposes the OPEA vectorstore microservice with Pathway for GenAI application use. |
| opea/video-llama-lvm-server | Link | The Docker image exposes the OPEA microservice running Video-Llama as a large visual model (LVM) server for GenAI application use. |
| opea/tts | Link | The Docker image exposes the OPEA Text-To-Speech (TTS) microservice for GenAI application use. |
| opea/vllm | Link | The Docker image, powered by the vllm-project, deploys and serves vLLM models. |
| opea/vllm-openvino | Link | The Docker image, powered by the vllm-project, deploys and serves vLLM models with the OpenVINO framework. |
| opea/web-retriever-chroma | Link | The Docker image exposes the OPEA web retriever microservice based on the Chroma vector database for GenAI application use. |
| opea/whisper | Link | The Docker image exposes the OPEA Whisper service for GenAI application use. |
| opea/whisper-gaudi | Link | The Docker image exposes the OPEA Whisper service on Gaudi2 for GenAI application use. |
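Any single microservice image in the table can also be pulled and run on its own for testing. The sketch below uses opea/whisper purely as an example; the tag, port mapping, and proxy environment variables are placeholders, and the authoritative values for each image are documented in its microservice README in GenAIComps.

```bash
# Minimal sketch: pull and run one microservice image locally.
# The tag, port mapping, and proxy variables below are placeholders;
# consult the microservice's README for the values it actually expects.
docker pull opea/whisper:latest
docker run -d --name whisper-service \
  -p 7066:7066 \
  -e http_proxy="$http_proxy" \
  -e https_proxy="$https_proxy" \
  opea/whisper:latest
```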