AI Providers

Deploy your own LLMs on JFrog ML or integrate with your preferred AI providers

Overview

JFrog ML lets you deploy open-source LLMs from the JFrog ML Model Library, or integrate with external AI providers such as OpenAI.

When managing, testing, and deploying your prompts in production, you can seamlessly use models deployed on JFrog ML or connect to your preferred AI providers.

📘

Available Integrations

Currently supported integrations include JFrog ML Model Library and OpenAI. Additional integrations will be provided in future releases.

OpenAI Integration

To integrate JFrog ML prompts with OpenAI, navigate to Settings -> Integrations -> OpenAI and paste your OpenAI API key from the OpenAI platform.

This enables you to use your OpenAI account directly within the JFrog ML platform. Once connected, you can experiment in the prompt playground or invoke your prompts programmatically with the Prompt SDK from any type of deployment, as shown in the sketch below.
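For example, fetching a registered prompt and invoking it from Python might look like the following minimal sketch. The module path, the PromptManager class, the method names, and the prompt name used here are illustrative assumptions; check the Prompt SDK reference for the exact API.

```python
# Minimal sketch: invoking a registered prompt with the Python Prompt SDK.
# Names below (PromptManager, get_prompt, invoke, "product-description") are
# assumptions for illustration; verify against the Prompt SDK reference.
from qwak.llmops.prompt.manager import PromptManager

# Create a client for the prompt registry
prompt_manager = PromptManager()

# Fetch the latest version of a prompt managed in JFrog ML
prompt = prompt_manager.get_prompt(name="product-description")  # hypothetical prompt name

# Render the prompt with variables and invoke the model configured for it
# (for example, an OpenAI model connected through the integration above)
response = prompt.invoke(variables={"product": "wireless headphones"})

print(response)
```

Because the prompt and its target model are resolved from the registry at invocation time, the same code works whether the prompt is backed by a model deployed on JFrog ML or by an external provider such as OpenAI.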


What’s Next