OpenAI-compatible APIs
Today, we're excited to share our new AI Chat and Analytics APIs that make it easier than ever to create your own custom support or conversational experiences with Inkeep.
The APIs are designed to follow the format of OpenAI's Chat Completions API. This allows you to use existing tools for developing LLM applications while getting all of the benefits of Inkeep's service: managed content ingestion, grounded answers, and analytics for your team.
AI Chat API
Using the OpenAI JavaScript SDK with Inkeep's AI Chat API only requires changing the `baseURL` and using `inkeep-qa-expert` as the model.
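As a minimal sketch, here's what the request looks like with plain `fetch` against the OpenAI-compatible endpoint (the OpenAI SDK sends the same body once its `baseURL` is pointed at Inkeep). The base URL below is an assumption; check the API reference for the exact value.

```javascript
// Build an OpenAI-style chat completions request for Inkeep.
// NOTE: the base URL is assumed; confirm it in Inkeep's docs.
const INKEEP_BASE_URL = "https://api.inkeep.com/v1";

const chatRequest = {
  model: "inkeep-qa-expert", // Inkeep's QA model
  messages: [
    { role: "user", content: "How do I get started with Inkeep?" },
  ],
};

async function askInkeep(apiKey) {
  const res = await fetch(`${INKEEP_BASE_URL}/chat/completions`, {
    method: "POST",
    headers: {
      Authorization: `Bearer ${apiKey}`,
      "Content-Type": "application/json",
    },
    body: JSON.stringify(chatRequest),
  });
  const data = await res.json();
  return data.choices[0].message.content;
}
```

Call `askInkeep(process.env.INKEEP_API_KEY)` to get the grounded answer back as a string, just as you would with any chat-completions response.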
Optionally, you can register a `provideLinks` tool to get back a list of the sources that were used to generate the answer. Check out this repo for a full example.
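A rough sketch of what that tool registration can look like; the exact JSON schema comes from the linked repo, so treat the field names here as illustrative.

```javascript
// Sketch: register a "provideLinks" tool so the model can return its
// sources as a tool call. Treat this schema as illustrative, not exact.
const provideLinksTool = {
  type: "function",
  function: {
    name: "provideLinks",
    description: "Provide the links used to generate the answer.",
    parameters: {
      type: "object",
      properties: {
        links: {
          type: "array",
          items: {
            type: "object",
            properties: {
              url: { type: "string" },
              title: { type: "string" },
            },
          },
        },
      },
    },
  },
};

// Pass it alongside a normal chat request:
const requestWithTools = {
  model: "inkeep-qa-expert",
  messages: [{ role: "user", content: "How do I get started with Inkeep?" }],
  tools: [provideLinksTool],
};
```

When the model uses the tool, the links arrive as a tool call in the response, ready to render alongside the answer.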
Analytics API
Log a Conversation
Our new Analytics API enables you to log any OpenAI-compatible conversation to Inkeep's Analytics service.
You can log the results of the previous conversation with a single API call.
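A minimal sketch of that call, assuming a plain `fetch` client; the base URL and the `type: "openai"` field are assumptions, so check the Analytics API reference for the exact schema.

```javascript
// Sketch: log an OpenAI-format conversation to Inkeep Analytics.
// The base URL and payload fields below are assumptions.
const conversationPayload = {
  type: "openai", // the messages follow the OpenAI chat format
  messages: [
    { role: "user", content: "How do I get started with Inkeep?" },
    { role: "assistant", content: "You can start by creating a project..." },
  ],
};

async function logConversation(apiKey) {
  const res = await fetch("https://api.analytics.inkeep.com/conversations", {
    method: "POST",
    headers: {
      Authorization: `Bearer ${apiKey}`,
      "Content-Type": "application/json",
    },
    body: JSON.stringify(conversationPayload),
  });
  return res.json(); // the logged conversation record
}
```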
Any conversation logged to this endpoint benefits from Inkeep's rich dashboard, analytics, and reporting features.
Submit Feedback
You can also use the `/feedback` endpoint to log end-user feedback.
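A rough sketch, again assuming a fetch-based client; the field names, IDs, and base URL here are assumptions, so consult the API reference for the exact shape.

```javascript
// Sketch: submit end-user feedback for a logged message via the
// /feedback endpoint. Field names are illustrative, not exact.
const feedbackPayload = {
  type: "positive", // e.g. from a 👍 click ("negative" for 👎)
  messageId: "msg_123", // hypothetical ID from the logged conversation
};

async function submitFeedback(apiKey) {
  const res = await fetch("https://api.analytics.inkeep.com/feedback", {
    method: "POST",
    headers: {
      Authorization: `Bearer ${apiKey}`,
      "Content-Type": "application/json",
    },
    body: JSON.stringify(feedbackPayload),
  });
  return res.json();
}
```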
With this, you can create and monitor your own feedback mechanism, such as 👍/👎 buttons.
Other primitives
The Inkeep Analytics API also provides all the primitives you need to create and monitor your LLM chat application:
- An `/events` endpoint to log analytics for custom user interactions like `answer_copied` or `chat_shared`.
- A `GET /conversations` endpoint to fetch existing conversations. You can leverage this to add "Chat history" or "Share chat" functionality to your chat experience.
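As one more sketch, logging a custom event might look like the following; the payload fields and base URL are assumptions, so check the Analytics API reference for the exact schema.

```javascript
// Sketch: log a custom "answer_copied" interaction to the /events
// endpoint. Field names and the base URL are illustrative.
const eventPayload = {
  type: "answer_copied",
  conversationId: "conv_123", // hypothetical ID of a logged conversation
};

async function logEvent(apiKey) {
  const res = await fetch("https://api.analytics.inkeep.com/events", {
    method: "POST",
    headers: {
      Authorization: `Bearer ${apiKey}`,
      "Content-Type": "application/json",
    },
    body: JSON.stringify(eventPayload),
  });
  return res.json();
}
```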
Get Started
Build your own chat experience today by leveraging the Chat and Analytics APIs.