What Cloud AI Providers Do With Your Data

Before you paste that subcontract into ChatGPT or upload bid documents to Claude, you should know where that information goes and who can see it.

Construction companies are already prime ransomware targets. Adding AI tools to your workflow means understanding another layer of data exposure.

The Short Answer

Free and consumer plans at most AI providers may use your conversations to train future models. Your data gets mixed into a massive dataset that improves the AI for everyone—including your competitors.

Business and API plans typically don't train on your data. You pay more, you get more privacy.

ChatGPT (OpenAI)

Free and Plus ($20/month) plans: OpenAI may use your conversations to train models unless you opt out. Go to Settings → Data Controls → "Improve the model for everyone" and turn it off.

Even with training disabled, OpenAI retains conversations for 30 days for safety monitoring. They can review flagged content.

Team ($25/user/month) and Enterprise plans: Your data is not used for training. Retention policies are configurable. These plans include admin controls and audit logs.

Claude (Anthropic)

As of September 2025, Anthropic changed its policy. Free and Pro plans now ask whether you want to allow training. If you accept, conversations can be used to improve models and retained for up to five years. If you decline, retention drops to about 30 days.

Check your settings at Settings → Privacy → "Model Improvement" to see what you've agreed to.

Business and API plans: Data is not used for training. Retention is minimal by default, and enterprise customers can negotiate specific terms.

Gemini (Google)

Consumer Gemini conversations may be used for training and reviewed by humans. Google's privacy policy covers Gemini alongside other Google services—it's broad.

If you're using Gemini through Google Workspace with an enterprise agreement, different terms apply. Check with your Google admin or reseller for specifics.

What This Means for Contractors

Don't paste sensitive information into consumer AI plans without understanding the implications. Bid numbers, client financials, proprietary methods, employee information—all of it could theoretically end up in training data.

For general writing help—drafting emails, explaining concepts, formatting documents—consumer plans are probably fine, as long as the content itself isn't sensitive.

For anything involving competitive information, contracts, or client data, either use business-tier plans with clear privacy terms, use the API with training disabled, or keep that work off cloud AI entirely.
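One lightweight safeguard before pasting anything into a consumer plan is to scrub obvious identifiers first. The sketch below uses Python's standard regex library; the patterns (dollar figures, emails, US phone numbers) and the `redact` function are illustrative assumptions, not a complete redaction solution—real use would need project-specific rules for client names, job numbers, and addresses.

```python
import re

# Illustrative patterns only; extend with project-specific rules.
PATTERNS = {
    "[AMOUNT]": re.compile(r"\$[\d,]+(?:\.\d{2})?"),             # dollar figures
    "[EMAIL]": re.compile(r"[\w.+-]+@[\w-]+\.[A-Za-z]{2,}"),     # email addresses
    "[PHONE]": re.compile(r"\b\d{3}[-.\s]\d{3}[-.\s]\d{4}\b"),   # US phone numbers
}

def redact(text: str) -> str:
    """Replace sensitive-looking tokens with placeholders before
    sending text to a cloud AI service."""
    for placeholder, pattern in PATTERNS.items():
        text = pattern.sub(placeholder, text)
    return text

prompt = "Our bid is $1,250,000. Questions? Call 555-867-5309 or email pm@example.com."
print(redact(prompt))
# → Our bid is [AMOUNT]. Questions? Call [PHONE] or email [EMAIL].
```

A pass like this won't catch everything—proprietary methods or context clues can still leak—so treat it as a seatbelt, not a substitute for choosing the right plan.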

API vs. Consumer Plans

Using the API (paying per use rather than a monthly subscription) generally comes with better privacy terms. Both OpenAI and Anthropic state that API inputs are not used for training by default.

The tradeoff: API access requires some technical setup, and costs scale with usage instead of a flat monthly fee. For typical contractor usage, API costs run $5-25/month depending on volume.
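To see how per-use pricing lands in that range, here's a back-of-envelope estimator. The per-million-token rates and usage numbers are placeholder assumptions for illustration only—check your provider's current pricing page for real figures.

```python
def estimate_monthly_cost(requests_per_day: int,
                          input_tokens: int,
                          output_tokens: int,
                          input_rate_per_m: float = 3.00,    # ASSUMED $/1M input tokens
                          output_rate_per_m: float = 15.00,  # ASSUMED $/1M output tokens
                          workdays: int = 22) -> float:
    """Rough monthly API bill from token volume; rates are illustrative."""
    monthly_input = requests_per_day * input_tokens * workdays
    monthly_output = requests_per_day * output_tokens * workdays
    cost = (monthly_input / 1e6 * input_rate_per_m
            + monthly_output / 1e6 * output_rate_per_m)
    return round(cost, 2)

# Example: 20 requests per workday, ~1,500 tokens in and ~500 out per request
print(estimate_monthly_cost(20, 1500, 500))
# → 5.28
```

Even heavy daily use stays in the single or low double digits per month at rates like these, which is why the API is often cheaper than a per-seat subscription for light users.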

Local AI Options

If you need to search sensitive documents without any data leaving your network, local AI tools exist. AnythingLLM and similar software run on your own computer. The AI model lives on your machine, documents stay on your machine, nothing goes to the cloud.

The tradeoff: local models are less capable than cloud models, and setup requires more technical comfort. But for searching bid documents or contracts, they work well enough.

Practical Guidelines

For most daily AI use—email drafting, general questions, formatting help—consumer plans are fine. Just don't paste anything you wouldn't email to a stranger.

For document review, contract analysis, or anything involving competitive or client information, use business plans or API access with training disabled.

For highly sensitive work—active bids, litigation documents, personnel matters—consider whether cloud AI is appropriate at all, or use local alternatives.

Check your settings. Both ChatGPT and Claude have privacy toggles buried in settings menus. Make sure you know what you've agreed to.


Meta Description: What happens to your data when you use ChatGPT, Claude, or Gemini. Privacy policies explained for contractors who handle sensitive project information.