# Prompt Template Microservice

The Prompt Template microservice dynamically generates system and user prompts from structured inputs and document context. It is designed for use in LLM pipelines, customizing prompt formatting with reranked documents, conversation history, and user queries.

## Getting Started

### 🚀1. Start Prompt Template Microservice with Python (Option 1)

To start the Prompt Template microservice, first install the required Python packages.

#### 1.1. Install Requirements

```bash
pip install -r requirements.txt
```

#### 1.2. Start Microservice

```bash
python opea_prompt_template_microservice.py
```

### 🚀2. Start Prompt Template Microservice with Docker (Option 2)

#### 2.1. Build the Docker Image:

Use the docker build command below to create the image:

```bash
cd ../../../
docker build -t opea/prompt-template:latest -f comps/prompt_template/src/Dockerfile .
```

Please note that the build may take a while to complete.

#### 2.2. Run the Docker Container:

```bash
docker run -d --name="prompt-template-microservice" \
  -p 7900:7900 \
  --net=host \
  --ipc=host \
  opea/prompt-template:latest
```

### 3. Verify the Prompt Template Microservice

#### 3.1. Check Status

```bash
curl http://localhost:7900/v1/health_check \
  -X GET \
  -H 'Content-Type: application/json'
```

#### 3.2. Send a Request

##### 3.2.1. Default Template Generation

Generates the prompt using the default template:

**Example Input**

```bash
curl -X POST -H "Content-Type: application/json" -d @- http://localhost:7900/v1/prompt_template <
```
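Conceptually, the service fills a prompt template with the reranked documents, conversation history, and user query it receives. The sketch below illustrates that idea in plain Python; the template text, function name, and input shapes are illustrative assumptions, not the service's actual defaults or API.

```python
# Illustrative sketch of prompt assembly from reranked documents,
# conversation history, and a user query. The template and function
# name here are hypothetical, not the microservice's real defaults.
DEFAULT_TEMPLATE = (
    "### Context:\n{context}\n\n"
    "### Conversation history:\n{history}\n\n"
    "### Question: {question}\n"
    "### Answer:"
)

def build_prompt(documents, history, question, template=DEFAULT_TEMPLATE):
    """Render a single prompt string from structured inputs."""
    context = "\n".join(doc.strip() for doc in documents)
    history_text = "\n".join(f"{role}: {text}" for role, text in history)
    return template.format(
        context=context, history=history_text, question=question
    )

prompt = build_prompt(
    documents=["Deep learning uses multi-layer neural networks."],
    history=[("user", "Hi"), ("assistant", "Hello! How can I help?")],
    question="What is deep learning?",
)
print(prompt)
```

The actual microservice performs this rendering server-side and returns the finished prompt over HTTP, so LLM pipelines do not need to hard-code template logic.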