
Bring Your Own Keys: The Complete API Key Guide for HammerLockAI

HammerLock Research Desk · 5 min read

Most AI products make a simple proposition: pay us, we handle everything. Your subscription covers the model access, the compute, the infrastructure. You get a clean interface, a monthly bill, and no visibility into what's happening underneath.

HammerLockAI's Bring Your Own Keys (BYOK) model is a different proposition: connect your own accounts directly to the providers you choose, pay them at their published rates, and route everything through HammerLockAI's interface and privacy layer on your device, with nothing passing through our infrastructure.

This guide covers how BYOK works, when to use it, how to configure each provider, and how it interacts with HammerLockAI's privacy architecture.

Why BYOK Matters

When you use a bundled credits model — paying a platform for AI access — your data routes through that platform's infrastructure before reaching the underlying model provider. The platform makes an API call on your behalf, which means your query passes through their servers.

With BYOK, the routing is direct: HammerLockAI's runtime makes API calls from your device to the provider using your credentials. There's no intermediary server handling your data. The provider sees a call from your API key, not from a shared platform account.

Combined with HammerLockAI's PII anonymization layer — which scrubs personal identifiers before any query leaves your device — BYOK gives you a configuration where your data travels from your device, anonymized, directly to the provider of your choice, and the response comes back to your device, where it's encrypted and stored locally.

That's a meaningfully different data flow than a typical AI platform.
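The flow above can be sketched in a few lines. This is an illustration only: the scrub patterns, placeholder tokens, and provider endpoint below are assumptions for the sketch, not HammerLockAI's actual anonymization implementation.

```python
import re

# Illustrative PII patterns -- a real anonymization layer covers far more.
EMAIL = re.compile(r"[\w.+-]+@[\w-]+\.[\w.]+")
PHONE = re.compile(r"\b\d{3}[-.]\d{3}[-.]\d{4}\b")

def scrub(text: str) -> str:
    """Replace common personal identifiers before the query leaves the device."""
    text = EMAIL.sub("[EMAIL]", text)
    text = PHONE.sub("[PHONE]", text)
    return text

def build_request(query: str, api_key: str) -> dict:
    """Assemble a direct provider call -- no intermediary server in the path."""
    return {
        "url": "https://api.openai.com/v1/chat/completions",   # provider's own endpoint
        "headers": {"Authorization": f"Bearer {api_key}"},     # your key, not a pooled one
        "body": {"messages": [{"role": "user", "content": scrub(query)}]},
    }
```

The key point is structural: the request is built and sent from your device, so the provider sees your key and an already-scrubbed query, and no platform server ever holds either.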

Supported Providers and Key Setup

HammerLockAI supports BYOK for every major cloud AI provider. Here's where to get keys for each:

OpenAI (GPT-4o, GPT-4o-mini): platform.openai.com → API → API Keys → Create new secret key. Set usage limits in the Billing section to avoid surprises. GPT-4o is the full-capability model; GPT-4o-mini is faster and cheaper for simpler queries.

Anthropic (Claude Sonnet): console.anthropic.com → API Keys → Create Key. Anthropic's Claude Sonnet models are especially strong for reasoning-intensive tasks — legal analysis, structured research, complex writing. If you're using HammerLockAI's Counsel or Analyst agents for high-stakes work, an Anthropic key is worth having.

Google (Gemini Flash, Gemini Pro): aistudio.google.com → Get API key. Gemini Flash is Google's fastest model; Gemini Pro handles longer context and more complex tasks. Google's models are notably strong for document-heavy workflows.

Groq: console.groq.com → API Keys → Create API Key. Groq runs open-weight models (Llama 3.3 70B and others) on custom hardware optimized for inference speed. It's often the fastest provider in a racing scenario — useful when latency matters more than raw capability.

Mistral: console.mistral.ai → API Keys. Mistral Small is a strong, efficient model; Mistral Large handles complex tasks. Mistral runs on European infrastructure, which matters if data residency is part of your compliance requirements.

DeepSeek: platform.deepseek.com → API Keys. DeepSeek's models are competitive on reasoning benchmarks at lower cost than OpenAI or Anthropic. Worth including for general-purpose queries where cost efficiency matters.

Entering Keys in HammerLockAI

Keys are stored locally in your encrypted vault — they never leave your device and are never transmitted to HammerLock's servers. In the operator console, navigate to Settings → API Keys, enter each key for the providers you want to enable, and save. Keys are encrypted using the same AES-256 mechanism that protects your conversations and vault.

You can enable as many or as few providers as you want. BYOK and bundled credits can coexist: HammerLockAI will use your own keys when configured for a provider and fall back to bundled credits for providers where you haven't supplied a key.
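The BYOK-or-credits fallback described above amounts to a simple preference rule. A minimal sketch, assuming a `pick_auth` helper and provider names of my own invention (the real runtime's internals may differ):

```python
def pick_auth(provider: str, own_keys: dict[str, str]) -> tuple[str, str]:
    """Prefer the user's own key for a provider; fall back to bundled credits."""
    if provider in own_keys:
        return ("byok", own_keys[provider])       # direct call with your key
    return ("credits", "pooled")                  # routed via bundled credits

# Example: keys configured for two of the six providers.
keys = {"anthropic": "sk-ant-example", "groq": "gsk-example"}
mode, _ = pick_auth("anthropic", keys)   # -> "byok"
mode, _ = pick_auth("openai", keys)      # -> "credits"
```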

Cost Comparison: Credits vs. BYOK

HammerLockAI's Pro plan includes 1,000 monthly cloud AI credits, with Booster and Power packs available if you need more. Credits use HammerLock's pooled API access, routed through our infrastructure.

BYOK bypasses credits entirely. You pay providers directly at their published token rates. For high-volume use — multiple research sessions per day, frequent long-form document analysis — BYOK is almost always cheaper than credits. For occasional use, credits are simpler.

A rough calibration: if you're spending more than a few hours per week in active HammerLockAI sessions, run the math on direct provider costs against your credit consumption. At current API pricing, heavy professional users typically find BYOK significantly cheaper at the Pro usage level.
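To make that calibration concrete, here is the back-of-envelope math. Every number in it — token volume, blended rate, tokens per credit — is a hypothetical assumption for illustration, not HammerLock or provider pricing; substitute your own figures from current rate cards.

```python
# Back-of-envelope comparison with illustrative numbers only.
monthly_tokens = 5_000_000            # assumed heavy professional usage
blended_rate_per_m = 2.50             # hypothetical blended $/1M tokens via BYOK
byok_cost = monthly_tokens / 1_000_000 * blended_rate_per_m   # $12.50/month

tokens_per_credit = 5_000             # hypothetical credit value
credits_needed = monthly_tokens / tokens_per_credit           # 1,000 credits
```

Under these assumed numbers, the same workload that exhausts a full month's credit allotment costs a few dollars at direct API rates — which is why the break-even check is worth running against your real usage.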

The other advantage of BYOK: no monthly credit cycle. You're not trying to manage a credit budget or worrying about overage. Your API accounts are always-on, capacity is whatever your account limits allow, and the bill arrives from the provider at the end of the month.

BYOK in the Racing Architecture

When you have multiple API keys configured, HammerLockAI's provider racing works across all of them. A query can race simultaneously across your OpenAI key, your Anthropic key, your Groq key, and any other configured providers. The fastest response wins; the others are cancelled.

This means your BYOK configuration directly affects racing performance. More keys mean more providers in each race, which lowers the expected latency of the winning response on any given query. For maximum performance, configure all six supported providers with your own keys.

Rate limits apply per account. If your OpenAI account is rate-limited, the race is won by another provider — automatic failover at the account level, not just the provider level.
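The racing pattern itself is straightforward to sketch: launch one call per configured provider, keep the first to finish, cancel the rest. The provider names and latencies below are stand-ins (a simulated `asyncio.sleep` replaces the real HTTP call), and this is a simplified model of the runtime, not its actual code:

```python
import asyncio

async def call_provider(name: str, latency: float) -> str:
    await asyncio.sleep(latency)   # stands in for the real API call
    return name

async def race(providers: dict[str, float]) -> str:
    """Race all configured providers; first completed response wins."""
    tasks = [asyncio.create_task(call_provider(n, lat)) for n, lat in providers.items()]
    done, pending = await asyncio.wait(tasks, return_when=asyncio.FIRST_COMPLETED)
    for task in pending:
        task.cancel()              # the stragglers are cancelled
    return done.pop().result()

winner = asyncio.run(race({"openai": 0.08, "groq": 0.02, "anthropic": 0.05}))
# winner is whichever provider responded first -- here, the lowest-latency one
```

A rate-limited account simply never finishes first (or errors out), so it drops out of the race and another configured key wins, which is the account-level failover described above.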

The Ollama Option: Zero-Cost Local Models

Every BYOK configuration should include Ollama as the local fallback layer. Ollama itself is free and open-source. Models — Llama 3.1, Mistral, Phi, Gemma — are downloaded once and run on your hardware at zero marginal cost per query.

For work that doesn't require the capabilities of frontier cloud models (drafting, brainstorming, summarization of moderate complexity), local Ollama models handle the load at no cost. Cloud providers — whether BYOK or credits — get invoked when you need maximum capability.

A well-configured BYOK setup might route 40–60% of queries to local Ollama, 20–30% to a fast provider like Groq, and the remainder to capability-heavy providers like GPT-4o or Claude Sonnet. The result is a cost structure significantly lower than relying on cloud APIs for everything.
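A routing split like that reduces to a tiering rule. The sketch below is one possible heuristic, with model names, tiers, and keywords that are my own assumptions; a real router would classify queries far more carefully:

```python
def route(task: str) -> str:
    """Pick a tier: local by default, fast cloud for latency, frontier for capability."""
    heavy = {"legal", "analysis", "research"}   # capability-sensitive work
    fast = {"chat", "lookup"}                   # latency-sensitive work
    words = set(task.lower().split())
    if words & heavy:
        return "claude-sonnet"    # frontier cloud model, highest per-query cost
    if words & fast:
        return "groq-llama"       # fast cloud provider
    return "ollama-llama3"        # local model, zero marginal cost
```

With a default-to-local rule like this, the bulk of routine queries never touch a paid API at all, which is where the cost savings in the split above come from.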

API Key Security

Your API keys represent real money and real access. A few practices worth following:

Set spending limits on every provider account. OpenAI, Anthropic, and most others allow you to cap monthly spend at the account level. Set limits that reflect realistic usage — don't leave accounts with no cap.

Rotate keys periodically. If you suspect a key has been compromised, regenerate it immediately from the provider's console and update it in HammerLockAI. Revocation is instant.

Understand that HammerLockAI stores your keys in your local encrypted vault. The encryption password you set at install is the only thing standing between anyone with access to your device and your keys. Use a strong password.

The Bottom Line

BYOK is for professionals who want maximum control over their AI stack. Direct provider relationships, direct billing, no data routing through intermediaries, full access to the racing and failover architecture across all configured accounts, and a cost structure that scales efficiently with heavy use.

If you're using HammerLockAI as a serious professional tool — not a casual assistant — BYOK is the configuration that makes it perform like one.


Need help configuring a specific provider? Reach us at info@hammerlockai.com

HammerLockAI is built on a fork of OpenClaw, the open-source agentic AI runtime. View the source on GitHub →