Bring your own AI
Pick your model.
Pay your provider. Once.
Connect your own Anthropic, OpenAI, Azure OpenAI, or any OpenAI-compatible provider, including self-hosted models. One AI bill, your compliance terms, no markup. Or skip AI entirely — the knowledge base works either way.
AI locked to one provider is AI on someone else's terms.
Most knowledge platforms with AI built in lock you to whatever provider they picked. The model is whatever they negotiated. The price is whatever they charge you on top of what they pay. The data flows through their infrastructure, under their terms.
That's fine if their choices line up with yours. It's not fine if your compliance team requires a specific provider, your finance team is already paying for an enterprise AI commitment, or your security team can't allow data to leave your tenant.
You should be able to bring your own AI. Or no AI at all.
Five ways to wire up AI
From "we'll handle it" to "we promise we never see the request." Pick what fits.
Default
Anthropic (Default)
Uses our built-in Anthropic key. Nothing to configure. AI usage draws from your plan's AI credit pool. Best for teams that just want it to work.
All plans.
BYO
Anthropic (Custom)
Connect your own Anthropic API key and pick a specific Claude model. Your contract with Anthropic, your billing, no markup from us.
Business and Enterprise.
BYO
OpenAI
Connect your own OpenAI API key. Use any GPT model OpenAI offers. Your OpenAI contract, your billing, your data processing terms with OpenAI.
Business and Enterprise.
BYO
Azure OpenAI
Connect your Azure-hosted GPT deployment. Same models as OpenAI, but governed by your Azure agreement and Microsoft's regional data residency commitments. Common in regulated industries.
Business and Enterprise.
BYO
Custom / Self-hosted
Any OpenAI-compatible API endpoint. Self-hosted vLLM or LM Studio, on-premises deployments of Llama or Mistral, internal AI gateways. For when data isn't allowed to leave your network at all.
Business and Enterprise.
No AI
None — AI off entirely
If your team can't or won't use AI, switch it off per workspace. The knowledge base, search, training, widgets, and audit trail all work fully without AI features.
All plans.
What you get with BYO
Pick your model
Claude Sonnet, Claude Opus, GPT-4, GPT-5 nano, Llama running on your own GPUs. Whatever your provider supports, you can pick.
One AI bill
If you already pay Anthropic, OpenAI, or Azure for AI elsewhere in your business, BYO means you stop double-paying. AI usage flows through your existing contract.
No markup from us
When you bring your own key, you pay your provider directly at their published rates. We don't sit in the middle adding a margin.
Your compliance terms
Data processing is governed by your contract with the provider — your DPA, your data residency, your retention policy. Useful when compliance has signed off on a specific provider only.
Self-hosted supported
For organisations that can't allow data to leave their network at all, point KnowledgeScout at an internal OpenAI-compatible endpoint. Inference happens on your infrastructure, never ours.
No vendor lock-in
Models change quickly. Switch providers in your settings without losing any of your knowledge base, training, or article history. We don't trap you in one model's ecosystem.
Or use no AI at all.
AI is one feature of the knowledge base, not a prerequisite. Search, articles, training, widgets, version history, read acknowledgements, audit trails: the whole platform works completely without AI in the loop.
Toggle AI off per workspace and the chatbot, agentic loop, and AI Writer disappear. Toggle it back on later if compliance signs off, or never. No greyed-out features, no nag prompts, no penalty for opting out.
Who this is for
Companies with existing AI agreements
If your finance team already pays for OpenAI Enterprise or has an Azure OpenAI commitment, BYO means you funnel knowledge base usage into the same contract instead of opening a second one.
Regulated industries with provider requirements
If your compliance team has approved Azure OpenAI specifically, or banned a particular provider, BYO keeps your AI usage on approved infrastructure.
Privacy-conscious or air-gapped teams
If data isn't allowed to leave your network, point KnowledgeScout at an internal OpenAI-compatible endpoint. Inference stays on your infrastructure.
Teams that want a specific model
Some teams have strong preferences (Claude for reasoning, GPT for breadth, Llama for cost). BYO means you pick exactly what you want, not what we picked.
Why KnowledgeScout's BYO is different
Five real options, including self-hosted
Most "BYO key" features mean "BYO key for the one provider we support." Ours covers Anthropic, OpenAI, Azure OpenAI, and any OpenAI-compatible endpoint. You're not boxed in.
No middleman markup
Your AI usage flows direct to your provider. We don't sit in the inference path adding a margin or re-billing you. The cost is what your provider charges, nothing more.
Drafts go to human review, regardless of provider
No matter which AI you wire up, drafts created by the AI Writer or AI agents always land in a human review queue. Nothing publishes silently. The "AI optional + drafts to review" model is the same across every provider.
Switch providers without rebuilding anything
Models evolve quickly. Switch from Anthropic to OpenAI to a self-hosted Llama in your settings, without losing any of your knowledge base, training history, or audit trail. The substrate is yours; the AI is interchangeable.
Common questions
Which AI providers are supported?
Five options. Anthropic (Default) uses our built-in Anthropic key with no configuration needed. Anthropic (Custom) lets you connect your own Anthropic API key and pick a specific model. OpenAI lets you connect your own OpenAI key and use any GPT model. Azure OpenAI lets you connect an Azure-hosted deployment. Custom / Self-hosted accepts any OpenAI-compatible API, including locally-hosted models. You pick the one that fits your compliance, cost, and model preferences.
Do I have to use AI at all?
No. AI features are opt-in per workspace. The knowledge base — search, articles, version history, audit trails, training, widgets — works completely without AI. If your organisation has AI restrictions or your team prefers no AI in the loop, leave it off. You can turn it on later if it suits, and off again if it doesn't.
How is my data processed when I bring my own provider?
When you connect your own provider key, AI requests for your tenant flow directly to that provider under your existing contract with them. KnowledgeScout adds no processing of its own in the middle — we route the request to your provider, your provider returns the response, and we pass it back. Your data processing relationship is between you and the provider, governed by their terms and any DPA you have with them.
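A minimal sketch of that routing step, with hypothetical names standing in for KnowledgeScout's actual internals (the provider endpoints shown are the providers' public API URLs; everything else is illustrative):

```python
# Hypothetical sketch of BYO pass-through routing -- illustrative only,
# not KnowledgeScout's real implementation.

def resolve_endpoint(workspace: dict) -> tuple[str, dict]:
    """Pick the inference endpoint and auth headers from a workspace's
    provider settings; the request body itself is forwarded unchanged."""
    provider = workspace["provider"]
    key = workspace["api_key"]
    if provider == "anthropic":
        return "https://api.anthropic.com/v1/messages", {"x-api-key": key}
    if provider == "openai":
        return ("https://api.openai.com/v1/chat/completions",
                {"Authorization": f"Bearer {key}"})
    if provider == "custom":
        # Self-hosted: any OpenAI-compatible base URL the customer supplies.
        return (workspace["base_url"].rstrip("/") + "/chat/completions",
                {"Authorization": f"Bearer {key}"})
    raise ValueError(f"unknown provider: {provider}")

url, headers = resolve_endpoint({"provider": "custom",
                                 "api_key": "sk-local",
                                 "base_url": "http://llm.internal:8000/v1"})
```

The point of the sketch is that the only thing that changes per workspace is where the request goes and how it authenticates; the content of the request is passed through as-is.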
Do I save money by using BYO?
Usually yes. With the default Anthropic provider, AI usage draws from your plan's AI credit pool, which we pay Anthropic for and meter to you. With BYO, you pay your provider directly at their published rates, with no markup from us. If you already have an AI agreement (corporate Anthropic, OpenAI Enterprise, or Azure OpenAI commitment), you avoid double-paying.
What's the difference between OpenAI and Azure OpenAI?
OpenAI runs the models on OpenAI's own infrastructure and is governed by OpenAI's terms. Azure OpenAI runs the same models on Microsoft Azure infrastructure under your existing Azure agreement and Microsoft's data residency commitments. Many regulated industries and enterprises pick Azure for the governance story, even though the model is the same.
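The difference shows up concretely in the request shape. A hedged sketch — the resource, deployment, and api-version values below are placeholders, not KnowledgeScout settings:

```python
# Illustrative comparison of the two providers' request shapes.

def openai_request(model: str, key: str):
    # OpenAI: one global endpoint; the model is chosen in the request body.
    url = "https://api.openai.com/v1/chat/completions"
    headers = {"Authorization": f"Bearer {key}"}
    body = {"model": model, "messages": []}
    return url, headers, body

def azure_openai_request(resource: str, deployment: str, key: str):
    # Azure OpenAI: the endpoint is scoped to your own Azure resource and
    # deployment, so residency and governance follow your Azure agreement;
    # the model is implied by the deployment, not named in the body.
    url = (f"https://{resource}.openai.azure.com/openai/deployments/"
           f"{deployment}/chat/completions?api-version=2024-06-01")
    headers = {"api-key": key}
    body = {"messages": []}
    return url, headers, body
```

Same model weights, but the Azure request never leaves the resource and region you provisioned, which is the governance story regulated teams care about.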
Can I use a locally-hosted or self-hosted model?
Yes. The Custom / Self-hosted option accepts any OpenAI-compatible API endpoint. That covers self-hosted vLLM, LM Studio, Ollama with the right wrapper, on-premises deployments of open-weight models like Llama or Mistral, and other OpenAI-compatible gateways. Useful when your data isn't allowed to leave your network.
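In practice, "OpenAI-compatible" just means the endpoint accepts the standard chat-completions payload. A minimal sketch, where the internal URL and model name are placeholder values for whatever your own server hosts:

```python
import json

# Placeholder values: a self-hosted vLLM server inside your network and
# whichever open-weight model it happens to serve.
BASE_URL = "http://llm.internal:8000/v1"

payload = {
    "model": "meta-llama/Llama-3.1-8B-Instruct",
    "messages": [
        {"role": "system", "content": "Answer only from the knowledge base."},
        {"role": "user", "content": "How do I reset a user's password?"},
    ],
    "temperature": 0.2,
}

# POSTing this JSON to f"{BASE_URL}/chat/completions" is all that
# "OpenAI-compatible" requires; vLLM and LM Studio both accept this shape.
body = json.dumps(payload)
```

If your gateway or model server can parse that request and return the matching response format, it will work as the Custom / Self-hosted provider.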
Which plans support BYO AI?
Business and Enterprise. Bring-your-own keys are included as part of those plans, with no separate fee. Startup uses the default Anthropic provider with our managed AI credits, which is enough for most small teams getting started.
Your AI choice. Your bill. Your terms.
Five provider options on Business and Enterprise. AI completely off if that's what you need. The substrate stays the same either way.