
Evaluating Oracle OCI Generative AI Models
You can set up Openlayer tests to evaluate your Oracle OCI Generative AI models in both development and monitoring.
Development
In development mode, Openlayer becomes a step in your CI/CD pipeline, and your tests are evaluated automatically whenever they are triggered by certain events. Openlayer tests often rely on your AI system's outputs on a validation dataset. As discussed in the Configuring output generation guide, you have two options:
- provide a way for Openlayer to run your AI system on your datasets, or
- before pushing, generate the model outputs yourself and push them alongside your artifacts.
If you choose the first option, Openlayer needs credentials to call the Oracle OCI Generative AI service on your behalf: set the environment variables OCI_USER_ID, OCI_FINGERPRINT, OCI_TENANCY_ID, OCI_REGION, and OCI_KEY_FILE, or configure your OCI config file appropriately.
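As a minimal Python sketch, you can assemble and validate an OCI config from those environment variables with Oracle's official oci SDK. How each variable maps onto the standard OCI config fields is an assumption based on their names; adjust to your setup.

```python
import os

import oci

# Build an OCI config dict from the environment variables listed above.
# Alternatively, oci.config.from_file() reads ~/.oci/config directly.
config = {
    "user": os.environ["OCI_USER_ID"],
    "fingerprint": os.environ["OCI_FINGERPRINT"],
    "tenancy": os.environ["OCI_TENANCY_ID"],
    "region": os.environ["OCI_REGION"],
    "key_file": os.environ["OCI_KEY_FILE"],
}

# Raises if a required field is missing or malformed.
oci.config.validate_config(config)
```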
Monitoring
To use the monitoring mode, you must set up a way to publish the requests your AI system receives to the Openlayer platform. This process is streamlined for Oracle OCI Generative AI models. To set it up, follow the steps in the Python code snippet below.
See full Python example
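A rough sketch of that setup is shown below. It assumes trace_oci_genai() (described next) is importable from openlayer.lib and wraps the OCI inference client the way Openlayer's other tracers wrap their clients, and that the OPENLAYER_API_KEY and OPENLAYER_INFERENCE_PIPELINE_ID environment variables are set so the wrapper knows where to publish requests. Defer to the full Python example for the exact API.

```python
import oci
from openlayer.lib import trace_oci_genai  # assumed import path

# 1. Create the OCI Generative AI inference client as usual.
config = oci.config.from_file()
client = oci.generative_ai_inference.GenerativeAiInferenceClient(
    config=config,
    service_endpoint="https://inference.generativeai.us-chicago-1.oci.oraclecloud.com",
)

# 2. Wrap the client so each call is traced and published to Openlayer
#    (assumed to follow the same wrap-the-client pattern as other Openlayer tracers).
client = trace_oci_genai(client)

# 3. Use the wrapped client exactly as before.
models = oci.generative_ai_inference.models
chat_details = models.ChatDetails(
    compartment_id="ocid1.compartment.oc1..example",  # placeholder OCID
    serving_mode=models.OnDemandServingMode(model_id="cohere.command-r-plus"),  # example model ID
    chat_request=models.CohereChatRequest(message="Hello!", max_tokens=100),
)
response = client.chat(chat_details)
```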
Token Estimation: Some Oracle OCI Generative AI models do not return usage details (such as total tokens processed) in their responses. When this happens, Openlayer can estimate token counts using a rule of thumb (string length divided by 3/4). The trace_oci_genai() function accepts an optional estimate_tokens parameter:
- estimate_tokens=True (default): estimates token counts when they are not provided in the OCI response
- estimate_tokens=False: returns None for token fields when they are not available in the response
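For example, with the same assumed wrapper call as in the sketch above:

```python
# Keep token fields as None instead of estimating them when OCI omits usage data.
client = trace_oci_genai(client, estimate_tokens=False)
```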
If the Oracle OCI Generative AI call is just one of the steps of your AI
system, you can use the code snippets above together with
tracing. In this case, your Oracle OCI calls get added
as a step of a larger trace. Refer to the Tracing guide
for details.
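For instance, a sketch of that nesting, assuming Openlayer's trace() decorator is importable from openlayer.lib (see the Tracing guide for the exact API):

```python
from openlayer.lib import trace  # assumed import path; see the Tracing guide

@trace()
def answer_question(question: str) -> str:
    """Calls the wrapped OCI client from the snippet above inside a larger trace."""
    chat_details.chat_request.message = question
    response = client.chat(chat_details)  # recorded as a child step of this trace
    return response.data.chat_response.text  # response shape may vary by model
```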
Benefits of Oracle OCI Integration
By integrating Openlayer with Oracle OCI Generative AI, you get:
- Comprehensive Observability: Monitor your Oracle-hosted models with detailed metrics and traces
- Cost Tracking: Track usage and costs across your Oracle OCI Generative AI deployments
- Performance Monitoring: Monitor latency, token usage, and model performance
- Quality Assurance: Run automated tests to ensure your models maintain quality standards
- Easy Setup: Simple integration with just a few lines of code
Supported Oracle OCI Models
The integration supports all Oracle OCI Generative AI models, including:
- Cohere Command models (command-r-plus, command-r, etc.)
- Meta Llama models
- Other models available through Oracle OCI Generative AI service
Make sure your OCI configuration is properly set up with the necessary
permissions to access the Generative AI service in your compartment.

