When you use a browser AI tool, you are giving it access to some of the most sensitive data in your digital life. Your emails, your bank accounts, your health records, your private messages, your search history — all of it passes through your browser. Most browser AI tools ask you to trust that they will not misuse this access. We think that is the wrong approach. You should not have to trust us. Tensor is designed so that trust is unnecessary because we never see your data in the first place.
Privacy is not a feature we bolted on after the fact. It is the foundational architectural decision that shapes every aspect of how Tensor works. This post explains our privacy model in detail so you can evaluate it for yourself.
No Cloud, No Servers, No Middleman
Tensor does not have a backend. There is no Tensor server receiving your data, no Tensor database storing your conversations, no Tensor API processing your requests. The extension runs entirely in your browser, and all data stays in your browser's local storage.
When you send a message to an AI model through Tensor, the request goes directly from your browser to the AI provider's API. Tensor is not a proxy. The HTTP request originates from the extension's service worker and is sent directly to api.openai.com, api.anthropic.com, generativelanguage.googleapis.com, or whichever provider you have configured. We never see the request content, the response, or even the fact that a request was made.
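To make the routing concrete, here is a minimal sketch of a per-provider endpoint resolver. The `Provider` type and `resolveEndpoint` helper are illustrative names, not Tensor's actual code; the URLs are the providers' public chat-completion endpoints.

```typescript
// Illustrative sketch: requests go straight from the extension's
// service worker to the provider's own API host. There is no proxy
// hop in between, so no intermediary can observe the traffic.
type Provider = "openai" | "anthropic" | "google";

function resolveEndpoint(provider: Provider, model: string): string {
  switch (provider) {
    case "openai":
      return "https://api.openai.com/v1/chat/completions";
    case "anthropic":
      return "https://api.anthropic.com/v1/messages";
    case "google":
      // Gemini's generateContent endpoint is scoped to a model name.
      return `https://generativelanguage.googleapis.com/v1beta/models/${model}:generateContent`;
  }
}
```

The service worker would then call `fetch(resolveEndpoint(provider, model), ...)` directly, so the browser's own network stack carries the request to the provider.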
This is fundamentally different from most AI tools. Most competitors route your requests through their own servers so they can add value (caching, fine-tuning, analytics), manage billing (per-request charges), and collect data (for model training or business intelligence). We chose to forgo all of those capabilities in exchange for a guarantee that is simple and absolute: your data never touches our infrastructure.
Your API Keys, Your Relationship
Tensor requires you to bring your own API keys from the AI providers you want to use. This means you have a direct billing relationship with OpenAI, Anthropic, Google, DeepSeek, or whichever provider you choose. You pay them directly. You control your usage limits. You can revoke access at any time.
Your API keys are stored in your browser's local storage, encrypted with a key derived from a password you set. They are never transmitted anywhere except to the AI provider they belong to. Not to us, not to analytics services, not to any third party. If you inspect Tensor's network activity with Chrome DevTools, you will see zero requests to any Tensor-owned domain.
We considered building a "managed key" option where users could pay us and we would handle API access. We rejected it because it would require us to store your keys on our servers, route your requests through our infrastructure, and have access to your conversation content. The convenience was not worth the privacy cost.
Local Storage Architecture
Every piece of data Tensor generates is stored locally in your browser. Here is what we store and where:
- Conversations: Stored in IndexedDB. Each conversation is a collection of messages with timestamps and metadata. No conversation data is ever sent to Tensor.
- Personal Context: Stored in chrome.storage.local. Your name, preferences, and context information. Never leaves your browser.
- Workflows and Orchestrations: Stored in IndexedDB. Step definitions, variable mappings, and execution history. All local.
- Agent Configurations: Stored in IndexedDB. Schedules, conditions, action plans, and execution logs. All local.
- Settings and Preferences: Stored in chrome.storage.local. Theme, font size, default model, hotkeys. All local.
- API Keys: Stored in chrome.storage.local, encrypted at rest. Only decrypted when making API requests.
You can export all your data at any time as a JSON file for backup or migration. You can delete all data with one click in settings. When you uninstall Tensor, Chrome automatically deletes all extension storage, leaving no trace behind.
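As a rough sketch of what a local export could look like, the snippet below assembles the stored data into a single versioned JSON document. The `ExportBundle` shape and `buildExport` helper are hypothetical; Tensor's actual export format may differ.

```typescript
// Hypothetical export bundle: everything is gathered from local
// storage (IndexedDB + chrome.storage.local) and serialized to JSON
// so the user can download it as a file.
interface ExportBundle {
  version: number;
  exportedAt: string;                 // ISO-8601 timestamp
  conversations: unknown[];           // from IndexedDB
  workflows: unknown[];               // from IndexedDB
  agents: unknown[];                  // from IndexedDB
  settings: Record<string, unknown>;  // from chrome.storage.local
}

function buildExport(data: Omit<ExportBundle, "version" | "exportedAt">): string {
  const bundle: ExportBundle = {
    version: 1,
    exportedAt: new Date().toISOString(),
    ...data,
  };
  return JSON.stringify(bundle, null, 2);
}
```

Because the bundle is plain JSON, importing it back is just parsing the file and writing each section into the corresponding local store.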
The Shield Module: Tracker Detection
Privacy is not just about what Tensor does with your data. It is also about protecting you from what others do. Tensor includes a built-in Shield module that detects and blocks over 80 categories of trackers, fingerprinters, and data collectors as you browse.
Shield operates at the network request level, analyzing outgoing requests against a continuously updated database of known tracking domains and patterns. It detects:
- Third-party tracking pixels from ad networks and analytics services
- Browser fingerprinting scripts that collect hardware, font, and canvas data to uniquely identify you
- Cross-site tracking cookies used to follow your browsing activity across websites
- Session replay services that record your mouse movements, clicks, and keystrokes on websites
- Data broker beacons that collect and sell your browsing profile
- Social media widgets that track you even when you do not interact with them
- Hidden iframes and invisible image tags used for covert data collection
Shield is not a simple blocklist. It uses pattern matching and heuristic analysis to catch novel tracking techniques that pure domain-based blockers miss. When Shield blocks a tracker, it logs the event so you can see exactly what was blocked on each page. In our testing, the average news website triggers between 15 and 40 tracker blocks per page load.
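The combination of blocklist and heuristics can be sketched as follows. The domain list and URL patterns here are tiny illustrative samples, not Shield's actual database or rules.

```typescript
// Minimal sketch of two-stage tracker detection: a domain blocklist
// plus heuristic URL patterns for techniques a pure blocklist misses.
// Both lists are illustrative samples only.
const TRACKER_DOMAINS = new Set([
  "doubleclick.net",
  "google-analytics.com",
  "facebook.net",
]);

const TRACKER_PATTERNS: RegExp[] = [
  /\/pixel(\.gif|\.png)?\?/i,    // tracking pixels carrying query payloads
  /\/collect\?.*\b(uid|cid)=/i,  // analytics-style collect endpoints with IDs
  /fingerprint/i,                // fingerprinting script names
];

function isTracker(url: string): boolean {
  const { hostname, pathname, search } = new URL(url);
  // Stage 1: blocklist match on the domain and its subdomains.
  for (const domain of TRACKER_DOMAINS) {
    if (hostname === domain || hostname.endsWith("." + domain)) return true;
  }
  // Stage 2: heuristic patterns over the path and query string.
  return TRACKER_PATTERNS.some((p) => p.test(pathname + search));
}
```

A request matching either stage would be cancelled and logged, which is what feeds the per-page block counts described above.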
How We Differ From Cloud-Based Alternatives
Let us be concrete about what the privacy differences mean in practice.
When you use a cloud-based browser AI tool: Your browsing activity, page contents, form entries, and conversation history are sent to the company's servers. They can see what websites you visit, what you search for, what emails you write, what forms you fill out, and what questions you ask. Most tools' privacy policies allow them to use this data for model training, product improvement, and in some cases, marketing and advertising. Even if a company promises not to misuse your data, they still have access to it, which means a data breach, a rogue employee, or a change in business strategy could expose it.
When you use Tensor: None of that data ever exists on any server we control. We cannot misuse your data because we do not have it. We cannot suffer a data breach of your information because we do not store it. A rogue employee cannot access your conversations because they do not exist on our systems. Our business model does not depend on data monetization because we have no data to monetize.
The only third party that sees your data is the AI provider you choose to use, and that is a relationship you control directly through your own API key and the provider's own terms of service.
The Offline Option
For users who want maximum privacy, Tensor supports fully offline AI through Ollama integration. Ollama runs open-source language models locally on your machine, which means your prompts and responses never leave your computer. Not even the AI provider sees your data.
Running models locally requires more hardware resources than using a cloud API, but modern machines handle it well. A laptop with 16 GB of RAM can run models like Llama 3 8B or Mistral 7B at reasonable speeds. For users with discrete GPUs, inference is fast enough to feel comparable to cloud APIs.
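To show how little changes when going fully local, here is a sketch of a chat request aimed at Ollama's `/api/chat` endpoint, which listens on localhost port 11434 by default. The `buildOllamaRequest` helper is an illustrative name, not Tensor's code.

```typescript
// Sketch of a fully local chat request. The URL is a loopback
// address, so the prompt and response never leave the machine.
interface OllamaChatRequest {
  model: string;
  messages: { role: "system" | "user" | "assistant"; content: string }[];
  stream: boolean;
}

function buildOllamaRequest(
  model: string,
  prompt: string,
): { url: string; body: OllamaChatRequest } {
  return {
    url: "http://localhost:11434/api/chat",
    body: {
      model, // e.g. a locally pulled Llama 3 8B or Mistral 7B
      messages: [{ role: "user", content: prompt }],
      stream: false,
    },
  };
}
```

Sending it is the same `fetch(url, { method: "POST", body: JSON.stringify(body) })` call used for cloud providers; only the destination differs.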
The offline option is particularly valuable for users working with sensitive professional data — lawyers reviewing client documents, doctors discussing patient cases, financial advisors analyzing portfolios — where even sending data to a trusted API provider may violate regulatory requirements.
What We Do Collect
In the interest of full transparency, here is the complete list of data Tensor collects: nothing. We do not run analytics. We do not track installs. We do not collect crash reports. We do not phone home. We do not have telemetry.
We know this makes it harder to build the product. We cannot see which features are popular, where users get stuck, or what errors they encounter. We rely entirely on user feedback through our Discord community and GitHub issues. This is a trade-off we accept because we believe privacy should be a default, not a setting you have to hunt for and toggle on.
Our Commitment
Privacy-first architecture is not something we can undo without rebuilding the entire product. There is no server infrastructure to add telemetry to. There is no data pipeline to start logging to. Our privacy guarantees are enforced by architecture, not policy. Policies can change; architecture cannot change overnight.
We built Tensor this way because we believe that browser AI is too sensitive to be built any other way. Your browser is your most intimate digital space, and the AI that operates within it should be accountable to you and you alone.