Launch week
August 9, 2024

OpenAI-compatible APIs

Nick Gomez

Today, we're excited to share our new AI Chat and Analytics APIs that make it easier than ever to create your own custom support or conversational experiences with Inkeep.

The APIs are designed to follow the OpenAI Chat Completions API format. This allows you to use your existing tools for developing LLM applications while getting all the benefits of Inkeep's service: managed content ingestion, grounded answers, and analytics for your team.

AI Chat API

Here's a quick example using the OpenAI JavaScript SDK with Inkeep's AI Chat API. We just need to change the `baseURL` and use `inkeep-qa-expert` as the model.

```javascript
import OpenAI from 'openai';

const client = new OpenAI({
  baseURL: 'https://api.inkeep.com/v1/',
  apiKey: process.env.INKEEP_API_KEY,
});

// Initial chat history
const chat = {
  messages: [
    { role: 'user', content: 'How do I get started with Inkeep?' },
  ],
};

const res = await client.chat.completions.create({
  model: 'inkeep-qa-expert',
  messages: chat.messages,
});

// Add the assistant's reply to the conversation history
chat.messages.push({
  role: 'assistant',
  content: res.choices[0].message.content,
});
```

Optionally, you can register a provideLinks tool to get back a list of sources that were used to generate the answer. Check out this repo for a full example.
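As an illustration, here is a minimal sketch of registering such a tool with the OpenAI SDK and reading the links back out of the response. The exact tool schema and link fields shown here are assumptions; the repo above has the authoritative version.

```javascript
// Hedged sketch: the provideLinks schema below is an assumption, not the
// official definition — see the linked repo for the real one.
const provideLinksTool = {
  type: 'function',
  function: {
    name: 'provideLinks',
    description: 'Returns the sources used to generate the answer.',
    parameters: {
      type: 'object',
      properties: {
        links: { type: 'array', items: { type: 'object' } },
      },
    },
  },
};

// Pull the links out of an assistant message, if the tool was called.
function extractLinks(message) {
  const call = (message.tool_calls ?? []).find(
    (c) => c.function?.name === 'provideLinks',
  );
  return call ? JSON.parse(call.function.arguments).links ?? [] : [];
}

// Usage: pass the tool alongside your messages, then inspect the reply.
// const res = await client.chat.completions.create({
//   model: 'inkeep-qa-expert',
//   messages: chat.messages,
//   tools: [provideLinksTool],
// });
// const links = extractLinks(res.choices[0].message);
```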

Analytics API

Log a Conversation

Our new Analytics API enables you to log any OpenAI-compatible conversation to Inkeep's Analytics service.

Let's use it on the results of the previous conversation:

```javascript
const loggedConvRes = await fetch('https://api.analytics.inkeep.com/conversations', {
  method: 'POST',
  headers: {
    'Authorization': 'Bearer YOUR_API_KEY',
    'Content-Type': 'application/json',
  },
  body: JSON.stringify({
    type: 'openai',
    messages: chat.messages,
  }),
});
const loggedConv = await loggedConvRes.json();
```

Any conversation logged to this endpoint benefits from Inkeep's rich dashboard, analytics, and reporting features.

Submit Feedback

Now let's use the /feedback endpoint to log end-user feedback:

```javascript
const feedbackData = {
  type: 'positive',
  messageId: loggedConv.messages.at(-1).id,
  reasons: [{ label: 'accurate_code_snippet', details: 'The code worked perfectly.' }],
};

await fetch('https://api.analytics.inkeep.com/feedback', {
  method: 'POST',
  headers: {
    'Authorization': 'Bearer YOUR_API_KEY',
    'Content-Type': 'application/json',
  },
  body: JSON.stringify(feedbackData),
});
```

With this, you can create and monitor your own feedback mechanisms, such as 👍/👎 buttons.
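For example, a minimal sketch of wiring thumbs buttons onto the payload above (the handler names and UI wiring are assumptions, not part of the API):

```javascript
// Hedged sketch: map a thumbs up/down click onto the /feedback payload.
function buildFeedback(messageId, thumbsUp) {
  return {
    type: thumbsUp ? 'positive' : 'negative',
    messageId,
  };
}

// A click handler would then POST the payload, e.g.:
async function submitFeedback(messageId, thumbsUp) {
  return fetch('https://api.analytics.inkeep.com/feedback', {
    method: 'POST',
    headers: {
      'Authorization': 'Bearer YOUR_API_KEY',
      'Content-Type': 'application/json',
    },
    body: JSON.stringify(buildFeedback(messageId, thumbsUp)),
  });
}
```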

Other primitives

The Inkeep Analytics API also provides all the primitives you need to create and monitor your LLM chat application:

  • An /events endpoint to log analytics for custom user interactions like answer_copied or chat_shared.
  • A GET /conversations endpoint to fetch existing conversations. You can leverage this to add "Chat history" or "Share chat" functionality to your chat experience.
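As a sketch, logging a custom event might look like the following; the payload fields are assumptions modeled on the `/feedback` call above, so check the API reference for the exact names.

```javascript
// Hedged sketch: the event payload shape is an assumption.
function buildEvent(type, messageId) {
  return { type, messageId };
}

async function logEvent(type, messageId) {
  return fetch('https://api.analytics.inkeep.com/events', {
    method: 'POST',
    headers: {
      'Authorization': 'Bearer YOUR_API_KEY',
      'Content-Type': 'application/json',
    },
    body: JSON.stringify(buildEvent(type, messageId)),
  });
}

// e.g. logEvent('answer_copied', loggedConv.messages.at(-1).id);
```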

Get Started

Build your own chat experience today by leveraging the Chat and Analytics APIs.

Follow Inkeep product updates