Adapt example code for guardrails refactor #1360

Merged: 9 commits, Jan 8, 2025
1 change: 1 addition & 0 deletions .github/workflows/_run-docker-compose.yml
@@ -134,6 +134,7 @@ jobs:
SERVING_TOKEN: ${{ secrets.SERVING_TOKEN }}
IMAGE_REPO: ${{ inputs.registry }}
IMAGE_TAG: ${{ inputs.tag }}
+opea_branch: "lvl/refator_guardrails"
example: ${{ inputs.example }}
hardware: ${{ inputs.hardware }}
test_case: ${{ matrix.test_case }}
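For context, the `opea_branch` value added above is what the example test scripts consume when cloning GenAIComps (they fall back to `main` otherwise, as the test-script hunk further down shows). A minimal sketch of that flow, assuming the workflow exports the variable into the job environment:

```bash
# Hypothetical local reproduction of what the CI job does with opea_branch.
# The fallback mirrors the test script's `git checkout "${opea_branch:-"main"}"`;
# the clone location is an assumption.
export opea_branch="lvl/refator_guardrails"

git clone https://github.com/opea-project/GenAIComps.git
cd GenAIComps && git checkout "${opea_branch:-main}" && cd ..
```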
2 changes: 1 addition & 1 deletion .github/workflows/pr-docker-compose-e2e.yml
@@ -4,7 +4,7 @@
name: E2E test with docker compose

on:
-pull_request_target:
+pull_request:
branches: ["main", "*rc"]
types: [opened, reopened, ready_for_review, synchronize] # added `ready_for_review` since draft is skipped
paths:
@@ -22,6 +22,7 @@ jobs:
run: |
cd ..
git clone https://github.com/opea-project/GenAIComps.git
+cd GenAIComps && git checkout lvl/refator_guardrails

- name: Check for Missing Dockerfile Paths in GenAIComps
run: |
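The checkout above has to land on the refactor branch before the "Check for Missing Dockerfile Paths" step runs, because the guardrails Dockerfile has moved. A quick way to confirm the relocated path exists in the checked-out tree (illustrative only; the actual workflow step may check this differently):

```bash
# Verify the relocated guardrails Dockerfile is present after checkout.
# The path comes from the build.yaml and README changes in this PR.
cd ../GenAIComps
test -f comps/guardrails/src/guardrails/Dockerfile \
  && echo "guardrails Dockerfile found" \
  || echo "guardrails Dockerfile missing"
```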
4 changes: 2 additions & 2 deletions ChatQnA/docker_compose/intel/hpu/gaudi/README.md
@@ -92,7 +92,7 @@ docker build --no-cache -t opea/dataprep-redis:latest --build-arg https_proxy=$h
To fortify AI initiatives in production, Guardrails microservice can secure model inputs and outputs, building Trustworthy, Safe, and Secure LLM-based Applications.

```bash
-docker build -t opea/guardrails-tgi:latest --build-arg https_proxy=$https_proxy --build-arg http_proxy=$http_proxy -f comps/guardrails/llama_guard/langchain/Dockerfile .
+docker build -t opea/guardrails:latest --build-arg https_proxy=$https_proxy --build-arg http_proxy=$http_proxy -f comps/guardrails/src/guardrails/Dockerfile .
```

### 4. Build MegaService Docker Image
@@ -168,7 +168,7 @@ If Conversation React UI is built, you will find one more image:

If Guardrails docker image is built, you will find one more image:

-- `opea/guardrails-tgi:latest`
+- `opea/guardrails:latest`

## 🚀 Start MicroServices and MegaService

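Once the renamed `opea/guardrails` image is built and the Gaudi compose stack is up, a quick smoke test can confirm the microservice answers on its published port. This mirrors the request the updated E2E test later in this PR sends; the host value and headers are assumptions for a local deployment:

```bash
# Send a policy-violating prompt to the guardrails endpoint (port 9090 per the
# Gaudi compose file); a healthy service should report the violated policies.
curl -s http://localhost:9090/v1/guardrails \
  -X POST \
  -H "Content-Type: application/json" \
  -d '{"text":"How do you buy a tiger in the US?"}'
```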
@@ -51,8 +51,8 @@ services:
ipc: host
command: --model-id ${GURADRAILS_MODEL_ID} --max-input-length 1024 --max-total-tokens 2048
guardrails:
-image: ${REGISTRY:-opea}/guardrails-tgi:${TAG:-latest}
-container_name: guardrails-tgi-gaudi-server
+image: ${REGISTRY:-opea}/guardrails:${TAG:-latest}
+container_name: guardrails-gaudi-server
ports:
- "9090:9090"
ipc: host
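With the service and container renamed, bringing up just the guardrails piece and checking the renamed container's log is a reasonable sanity check. A sketch, assuming it is run from the directory holding this compose file:

```bash
# Start only the guardrails microservice (plus any dependencies compose
# resolves) and follow the renamed container's log.
docker compose up -d guardrails
docker logs -f guardrails-gaudi-server
```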
6 changes: 3 additions & 3 deletions ChatQnA/docker_image_build/build.yaml
@@ -95,12 +95,12 @@ services:
dockerfile: comps/dataprep/pinecone/langchain/Dockerfile
extends: chatqna
image: ${REGISTRY:-opea}/dataprep-pinecone:${TAG:-latest}
-guardrails-tgi:
+guardrails:
build:
context: GenAIComps
-dockerfile: comps/guardrails/llama_guard/langchain/Dockerfile
+dockerfile: comps/guardrails/src/guardrails/Dockerfile
extends: chatqna
-image: ${REGISTRY:-opea}/guardrails-tgi:${TAG:-latest}
+image: ${REGISTRY:-opea}/guardrails:${TAG:-latest}
vllm:
build:
context: vllm
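After the rename, the service key passed to `docker compose build` changes from `guardrails-tgi` to `guardrails`, which is also what the updated Gaudi test script uses in its service list. For illustration:

```bash
# Build only the renamed guardrails image from the updated build.yaml.
# Assumes GenAIComps has already been cloned into this directory on the
# refactor branch, as the test script does before building.
cd ChatQnA/docker_image_build
docker compose -f build.yaml build guardrails --no-cache
```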
@@ -758,7 +758,7 @@ spec:
runAsUser: 1000
seccompProfile:
type: RuntimeDefault
image: "opea/guardrails-tgi:latest"
image: "opea/guardrails:latest"
imagePullPolicy: Always
ports:
- name: guardrails-usvc
@@ -688,7 +688,7 @@ spec:
runAsUser: 1000
seccompProfile:
type: RuntimeDefault
image: "opea/guardrails-tgi:latest"
image: "opea/guardrails:latest"
imagePullPolicy: Always
ports:
- name: guardrails-usvc
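Both Kubernetes manifests now point at `opea/guardrails:latest`. A simple way to confirm no manifest or compose file still references the retired image name (purely illustrative):

```bash
# Search the working tree for stale references to the old image/service name.
grep -rn "guardrails-tgi" . && echo "stale references found" || echo "clean"
```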
4 changes: 2 additions & 2 deletions ChatQnA/tests/test_compose_guardrails_on_gaudi.sh
@@ -19,7 +19,7 @@ function build_docker_images() {
git clone https://github.com/opea-project/GenAIComps.git && cd GenAIComps && git checkout "${opea_branch:-"main"}" && cd ../

echo "Build all the images with --no-cache, check docker_image_build.log for details..."
-service_list="chatqna-guardrails chatqna-ui dataprep-redis retriever-redis guardrails-tgi nginx"
+service_list="chatqna-guardrails chatqna-ui dataprep-redis retriever-redis guardrails nginx"
docker compose -f build.yaml build ${service_list} --no-cache > ${LOG_PATH}/docker_image_build.log

docker pull ghcr.io/huggingface/tgi-gaudi:2.0.6
@@ -136,7 +136,7 @@ function validate_microservices() {
"${ip_address}:9090/v1/guardrails" \
"Violated policies" \
"guardrails" \
"guardrails-tgi-gaudi-server" \
"guardrails-gaudi-server" \
'{"text":"How do you buy a tiger in the US?"}'
}

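To reproduce what CI exercises with these changes, the whole Gaudi guardrails E2E script can be run with `opea_branch` pointing at the refactor branch (it defaults to `main` otherwise). A sketch; the Hugging Face token, proxy settings, and other environment the script expects are omitted here:

```bash
# Run the updated ChatQnA guardrails E2E test against the refactor branch.
export opea_branch="lvl/refator_guardrails"
cd ChatQnA/tests
bash test_compose_guardrails_on_gaudi.sh
```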
6 changes: 3 additions & 3 deletions docker_images_list.md
@@ -65,9 +65,9 @@ Take ChatQnA for example. ChatQnA is a chatbot application service based on the
| [opea/finetuning-gaudi](https://hub.docker.com/r/opea/finetuning-gaudi) | [Link](https://github.com/opea-project/GenAIComps/blob/main/comps/finetuning/Dockerfile.intel_hpu) | The docker image exposed the OPEA Fine-tuning microservice for GenAI application use on the Gaudi |
| [opea/gmcrouter](https://hub.docker.com/r/opea/gmcrouter) | [Link](https://github.com/opea-project/GenAIInfra/blob/main/microservices-connector/Dockerfile.manager) | The docker image served as one of key parts of the OPEA GenAI Microservice Connector(GMC) to route the traffic among the microservices defined in GMC |
| [opea/gmcmanager](https://hub.docker.com/r/opea/gmcmanager) | [Link](https://github.com/opea-project/GenAIInfra/blob/main/microservices-connector/Dockerfile.router) | The docker image served as one of key parts of the OPEA GenAI Microservice Connector(GMC) to be controller manager to handle GMC CRD |
-| [opea/guardrails-tgi](https://hub.docker.com/r/opea/guardrails-tgi) | [Link](https://github.com/opea-project/GenAIComps/blob/main/comps/guardrails/llama_guard/langchain/Dockerfile) | The docker image exposed the OPEA guardrail microservice to provide content review for GenAI application use |
-| [opea/guardrails-toxicity-detection](https://hub.docker.com/r/opea/guardrails-toxicity-detection) | [Link](https://github.com/opea-project/GenAIComps/blob/main/comps/guardrails/toxicity_detection/Dockerfile) | The docker image exposed the OPEA guardrail microservice to provide toxicity detection for GenAI application use |
-| [opea/guardrails-pii-detection](https://hub.docker.com/r/opea/guardrails-pii-detection) | [Link](https://github.com/opea-project/GenAIComps/blob/main/comps/guardrails/pii_detection/Dockerfile) | The docker image exposed the OPEA guardrail microservice to provide PII detection for GenAI application use |
+| [opea/guardrails](https://hub.docker.com/r/opea/guardrails) | [Link](https://github.com/opea-project/GenAIComps/blob/main/comps/guardrails/src/guardrails/Dockerfile) | The docker image exposed the OPEA guardrail microservice to provide content review for GenAI application use |
+| [opea/guardrails-toxicity-detection](https://hub.docker.com/r/opea/guardrails-toxicity-detection) | [Link](https://github.com/opea-project/GenAIComps/blob/main/comps/guardrails/src/toxicity_detection/Dockerfile) | The docker image exposed the OPEA guardrail microservice to provide toxicity detection for GenAI application use |
+| [opea/guardrails-pii-detection](https://hub.docker.com/r/opea/guardrails-pii-detection) | [Link](https://github.com/opea-project/GenAIComps/blob/main/comps/guardrails/src/pii_detection/Dockerfile) | The docker image exposed the OPEA guardrail microservice to provide PII detection for GenAI application use |
| [opea/llm-docsum-tgi](https://hub.docker.com/r/opea/llm-docsum-tgi) | [Link](https://github.com/opea-project/GenAIComps/blob/main/comps/llms/summarization/tgi/langchain/Dockerfile) | This docker image is designed to build a document summarization microservice using the HuggingFace Text Generation Inference(TGI) framework. The microservice accepts document input and generates a document summary. |
| [opea/llm-faqgen-tgi](https://hub.docker.com/r/opea/llm-faqgen-tgi) | [Link](https://github.com/opea-project/GenAIComps/blob/main/comps/llms/faq-generation/tgi/langchain/Dockerfile) | This docker image is designed to build a frequently asked questions microservice using the HuggingFace Text Generation Inference(TGI) framework. The microservice accepts document input and generates a FAQ. |
| [opea/llm-textgen](https://hub.docker.com/r/opea/llm-textgen) | [Link](https://github.com/opea-project/GenAIComps/blob/main/comps/llms/src/text-generation/Dockerfile) | The docker image exposed the OPEA LLM microservice upon TGI docker image for GenAI application use |
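The published image list now advertises `opea/guardrails` instead of `opea/guardrails-tgi`. Pulling the renamed image is a quick check that the Docker Hub entry in the table resolves, assuming the image has already been published under that name:

```bash
# Pull the renamed guardrails image referenced in the updated table.
docker pull opea/guardrails:latest
```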