# OPEA Release Notes v1.5
We are excited to announce the release of OPEA version 1.5, which includes significant contributions from the open-source community.
More information about how to get started with OPEA v1.5 can be found on the [Getting Started](https://opea-project.github.io/latest/index.html) page. All project source code is maintained in the [opea-project organization](https://github.com/opea-project). To pull Docker images, please access the [Docker Hub](https://hub.docker.com/u/opea). For instructions on deploying Helm Charts, please refer to the [guide](https://github.com/opea-project/GenAIInfra/tree/v1.5/helm-charts#readme).
## Table of Contents
- [OPEA Release Notes v1.5](#opea-release-notes-v15)
  - [Table of Contents](#table-of-contents)
  - [What's New in OPEA v1.5](#whats-new-in-opea-v15)
    - [GenAI Examples](#genai-examples)
    - [GenAI Microservices](#genai-microservices)
    - [Productization](#productization)
  - [Validated Hardware](#validated-hardware)
  - [Validated Software](#validated-software)
  - [Full Changelogs](#full-changelogs)
  - [Contributors](#contributors)
    - [Contributing Organizations](#contributing-organizations)
    - [Individual Contributors](#individual-contributors)
## What's New in OPEA v1.5
This release includes new features, optimizations, and user-focused updates.
### GenAI Examples
- Browser-use Agent: a new use case to empower anyone to automate repetitive web tasks. It controls your web browser to perform tasks like visiting websites and extracting data. ([GenAIExamples#2312](https://github.com/opea-project/GenAIExamples/pull/2312))
- Arbitration Post Hearing Assistant Application: a new use case designed to process and summarize post-hearing transcripts or arbitration-related documents. ([GenAIExamples#2309](https://github.com/opea-project/GenAIExamples/pull/2309))
- Polylingua Translation Service: a new use case providing multilingual translation. ([GenAIExamples#2298](https://github.com/opea-project/GenAIExamples/pull/2298))
- OpenAI-Compatible Endpoint Support: ChatQnA now supports OpenAI API-compatible endpoints (see the sketch after this list). ([GenAIExamples#2091](https://github.com/opea-project/GenAIExamples/pull/2091))
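
OpenAI API compatibility means a deployed ChatQnA instance can be queried with standard OpenAI client tooling. Below is a minimal sketch using the OpenAI Python client; the base URL, port, API key, and model name are illustrative assumptions rather than values fixed by this release, so substitute the ones from your own deployment.

```python
# Minimal sketch: querying a ChatQnA deployment through its OpenAI-compatible
# endpoint with the standard OpenAI Python client.
# The base URL, port, and model name below are illustrative assumptions;
# replace them with the values from your own deployment.
from openai import OpenAI

client = OpenAI(
    base_url="http://localhost:8888/v1",  # assumed ChatQnA gateway address
    api_key="not-needed",                 # local deployments typically ignore the key
)

response = client.chat.completions.create(
    model="chatqna",                      # placeholder model identifier
    messages=[{"role": "user", "content": "What is OPEA?"}],
)
print(response.choices[0].message.content)
```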
### GenAI Microservices
- Text2Query: a specialized, independent service designed to translate natural language queries into structured query languages (an illustrative invocation sketch follows this list). ([GenAIComps#1931](https://github.com/opea-project/GenAIComps/pull/1931))
- Arbitration Post-Hearing: a new microservice for arbitration post-hearing processing with LLM-based entity extraction. ([GenAIComps#1938](https://github.com/opea-project/GenAIComps/pull/1938))
- FunASR/Paraformer: Added a FunASR toolkit-based backend to the ASR microservice to support Paraformer, a non-autoregressive end-to-end speech recognition model. ([GenAIComps#1914](https://github.com/opea-project/GenAIComps/pull/1914))
- LLM Scaler: Boosted LLM/LVM performance on Intel® Arc™ GPUs via llm-scaler-vllm v0.10.0-b4. ([GenAIComps#1941](https://github.com/opea-project/GenAIComps/pull/1941))
- openEuler OS Support: Enabled openEuler OS support for OPEA components. ([GenAIComps#1813](https://github.com/opea-project/GenAIComps/pull/1813), [GenAIComps#1875](https://github.com/opea-project/GenAIComps/pull/1875), [GenAIComps#1879](https://github.com/opea-project/GenAIComps/pull/1879), [GenAIComps#1913](https://github.com/opea-project/GenAIComps/pull/1913))
- MCP Compliance: Enabled MCP (Model Context Protocol) server support for selected OPEA components. ([GenAIComps#1849](https://github.com/opea-project/GenAIComps/pull/1849), [GenAIComps#1855](https://github.com/opea-project/GenAIComps/pull/1855))
- OPEA Store: Enhanced data access using OPEA Store for ChatHistory, FeedbackManagement, and PromptRegistry. ([GenAIComps#1916](https://github.com/opea-project/GenAIComps/pull/1916))
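
As an illustration of how a component such as Text2Query is typically exercised, the sketch below issues a single REST call. The host, port, route, and payload field names are assumptions made for illustration only; consult the component's README in GenAIComps for the actual request contract.

```python
# Minimal sketch: calling a Text2Query-style microservice over REST to turn a
# natural-language question into a structured query.
# The host, port, route, and payload fields are illustrative assumptions;
# check the Text2Query component README for the actual contract.
import requests

payload = {
    "input_text": "List the five customers with the highest total orders",  # assumed field name
    "db_type": "postgres",                                                  # assumed field name
}

resp = requests.post("http://localhost:9090/v1/text2query", json=payload, timeout=30)
resp.raise_for_status()
print(resp.json())  # expected to contain the generated structured query
```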
### Productization
- Enhanced Monitoring: Added monitoring for 8 key GenAI Examples (a metrics-probe sketch follows this list). ([GenAIExamples#2316](https://github.com/opea-project/GenAIExamples/pull/2316), [GenAIExamples#2318](https://github.com/opea-project/GenAIExamples/pull/2318), [GenAIExamples#2319](https://github.com/opea-project/GenAIExamples/pull/2319), [GenAIExamples#2322](https://github.com/opea-project/GenAIExamples/pull/2322))
- GenAIStudio: Added support for drag-and-drop creation of fine-tuning applications. ([GenAIStudio#74](https://github.com/opea-project/GenAIStudio/pull/74), [GenAIStudio#75](https://github.com/opea-project/GenAIStudio/pull/75))
- One-click Deployment: Enabled openEuler OS support for one-click deployment. ([GenAIExamples#2267](https://github.com/opea-project/GenAIExamples/pull/2267))
- Documentation Refinement: Refined the READMEs of all components so readers can easily locate documentation tailored to deployment, customization, and hardware.
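
For the monitoring additions, a quick way to confirm that a monitored example is exporting metrics is to probe its metrics endpoint. The sketch below assumes a Prometheus-style `/metrics` route on an assumed host and port; the real scrape targets are documented with each example's monitoring setup.

```python
# Minimal sketch: spot-checking a Prometheus-style metrics endpoint exposed by
# a monitored GenAI example service.
# The host, port, and /metrics route are illustrative assumptions; consult the
# monitoring docs of the specific example for the real scrape targets.
import requests

resp = requests.get("http://localhost:8888/metrics", timeout=10)
resp.raise_for_status()

# Print a few metric lines to confirm the exporter is up.
for line in resp.text.splitlines()[:10]:
    print(line)
```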
## Validated Hardware
- Intel® Gaudi® AI Accelerators (2nd Gen)
- Intel® Xeon® Scalable processors (3rd Gen)
## Validated Software
- Docker version 28.5.1
- Docker Compose version v2.40.3
- Intel® Gaudi® software and drivers [v1.22.1](https://docs.habana.ai/en/v1.22.1/Installation_Guide/)
- TEI v1.7
- TGI v2.4.0 (Xeon), v2.3.1 (Gaudi)
- Ubuntu 22.04
- vLLM v0.10.1 (Xeon), opea/vllm-gaudi:1.22.0 (Gaudi)
## Full Changelogs
- GenAIExamples: [v1.4...v1.5](https://github.com/opea-project/GenAIExamples/compare/v1.4...v1.5)
- GenAIComps: [v1.4...v1.5](https://github.com/opea-project/GenAIComps/compare/v1.4...v1.5)
- GenAIInfra: [v1.4...v1.5](https://github.com/opea-project/GenAIInfra/compare/v1.4...v1.5)
- GenAIEval: [v1.4...v1.5](https://github.com/opea-project/GenAIEval/compare/v1.4...v1.5)
- GenAIStudio: [v1.4...v1.5](https://github.com/opea-project/GenAIStudio/compare/v1.4...v1.5)
- docs: [v1.4...v1.5](https://github.com/opea-project/docs/compare/v1.4...v1.5)
## Contributors
This release would not have been possible without the contributions of the following organizations and individuals.
### Contributing Organizations
- `Bud`: Polylingua Translation Service, Components as MCP Servers.
- `Intel`: Development and improvements to GenAI examples, components, infrastructure, evaluation, and studio.
- `openEuler`: openEuler OS support.
- `Zensar`: Arbitration Post Hearing Assistant.
### Individual Contributors
For a comprehensive list of individual contributors, please refer to the [Full Changelogs](#full-changelogs) section.