Enterprise NodePad plans let you connect your own model provider credentials instead of relying on NodePad’s shared infrastructure. You can supply API keys for providers like Anthropic, OpenAI, or Google, or point NodePad directly at your own inference endpoint. Either way, your data flows through your own accounts and infrastructure — not through NodePad’s.
Bring-your-own-keys and custom inference endpoints are enterprise-only features. They are not available on the cloud beta tier. Contact sales to discuss an enterprise plan.

Why bring your own keys

Keep data in your infrastructure

Requests route through your own API accounts. NodePad never stores prompts or responses on its servers when you use your own keys.

Use your negotiated pricing

If your organization has volume pricing with Anthropic, OpenAI, or Google, your NodePad usage bills against those negotiated rates — not NodePad's retail rates.

Air-gapped environments

Connect a local Ollama instance or an internal vLLM deployment. NodePad can operate without any outbound internet traffic to third-party model providers.

Vendor flexibility

Switch providers or endpoints without changing how you use NodePad. Your canvas, threads, and workflows stay intact.

Supported inference endpoints

In addition to standard provider API keys, NodePad supports the following self-hosted and cloud inference backends:
Ollama lets you run open-weight models locally on your own machine or internal server. Point NodePad at your Ollama instance’s address and it appears as a selectable model provider across your canvas. Suitable for fully air-gapped or on-premises deployments.
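Before adding an Ollama instance in NodePad, it can help to confirm the instance is reachable from the machine that will make the calls. The sketch below queries Ollama's standard `/api/tags` route, which lists the models installed locally; the default host and port are assumptions — substitute the address you plan to enter in NodePad.

```python
import json
import urllib.request

def ollama_tags_url(base_url: str) -> str:
    """Build the URL for Ollama's model-listing endpoint from a base URL."""
    return base_url.rstrip("/") + "/api/tags"

def list_ollama_models(base_url: str = "http://localhost:11434") -> list[str]:
    """Return the names of the models an Ollama instance has installed.

    Requires a running Ollama instance at base_url.
    """
    with urllib.request.urlopen(ollama_tags_url(base_url), timeout=5) as resp:
        data = json.load(resp)
    return [m["name"] for m in data.get("models", [])]

# Example (needs a live Ollama server):
#   list_ollama_models("http://ollama.internal:11434")
```

If this call succeeds, the same address should work as the endpoint value in NodePad's model settings.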

How to configure your keys or endpoint

1. Open workspace settings

From any canvas, open the workspace settings panel using the menu in the top-right corner of the NodePad interface.
2. Go to the Models tab

Select the Models tab in the settings panel. You’ll see the current model configuration for your workspace.
3. Add your credentials

Enter your API key for the provider you want to connect, or supply the base URL and any required authentication details for your self-hosted endpoint.
4. Test the connection

Use the Test connection button to confirm NodePad can reach your endpoint and authenticate successfully before saving.
5. Set as default (optional)

Once connected, you can set your custom endpoint as the workspace default. All new threads will use it unless overridden at the message level.
Your API keys are encrypted at rest and never exposed to other workspace members. Workspace admins can see which providers are connected but cannot view the key values themselves.
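For self-hosted endpoints, you can sanity-check the base URL and key yourself before entering them. The sketch below hits the OpenAI-compatible `/v1/models` route with a Bearer token — the convention servers like vLLM expose — which is the same kind of call a connection test makes. The path and auth scheme are assumptions about your endpoint; adjust both if your deployment differs.

```python
import json
import urllib.request

def models_request(base_url: str, api_key: str) -> urllib.request.Request:
    """Build an authenticated request against the OpenAI-compatible
    /v1/models model-listing route."""
    url = base_url.rstrip("/") + "/v1/models"
    return urllib.request.Request(
        url, headers={"Authorization": f"Bearer {api_key}"}
    )

def check_endpoint(base_url: str, api_key: str) -> list[str]:
    """Return model IDs if the endpoint is reachable and the key is accepted.

    Raises urllib.error.URLError / HTTPError on network or auth failure.
    """
    req = models_request(base_url, api_key)
    with urllib.request.urlopen(req, timeout=10) as resp:
        data = json.load(resp)
    return [m["id"] for m in data.get("data", [])]

# Example (needs a live endpoint; hostname is illustrative):
#   check_endpoint("https://vllm.internal.example", "sk-...")
```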

Air-gapped deployment

If your security requirements prevent NodePad from reaching the internet entirely, combine bring-your-own-keys with NodePad’s self-hosted deployment option. You run NodePad inside your own infrastructure and point it at a local Ollama or vLLM instance — no external network calls at any layer. For more on self-hosting NodePad, see the self-hosting guide.
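One way to picture a fully offline stack is a Docker Compose file that runs NodePad beside a local Ollama service on a private network. This is an illustrative sketch only: the NodePad image name and environment variable are assumptions, not NodePad's actual deployment artifacts — the self-hosting guide documents the real ones.

```yaml
# Illustrative only: the nodepad image name and MODEL_ENDPOINT
# variable are assumptions; see the self-hosting guide.
services:
  nodepad:
    image: nodepad/self-hosted:latest   # hypothetical image name
    environment:
      # Point NodePad at the Ollama service on the internal network;
      # no route to the public internet is required.
      MODEL_ENDPOINT: http://ollama:11434
    depends_on:
      - ollama
  ollama:
    image: ollama/ollama
    volumes:
      - ollama-models:/root/.ollama    # model weights persist here
volumes:
  ollama-models:
```

Because both services resolve each other by service name on Compose's internal network, no DNS lookup or request ever leaves the host.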