Microsoft Advisory — Azure OpenAI Contract Terms

Microsoft AI Services Terms: What Legal Teams Need to Watch

Azure OpenAI Service offers cutting-edge generative AI capabilities under Microsoft’s cloud, but its contracts come with nuanced terms that enterprises must scrutinise. This advisory highlights key clauses and considerations — data retention and privacy commitments, pricing models, and customer obligations — that IT, procurement, finance, and legal decision-makers should review when evaluating or negotiating Azure OpenAI agreements.

📅 August 2025 · ⏱ Advisory Guide · ✍️ Fredrik Filipsson
📖 This guide is part of our Microsoft advisory series. See also: How to Negotiate Azure OpenAI with Microsoft · Azure OpenAI Pricing Explained · Microsoft Contract Negotiation Service
30 Days: Default Data Retention Window
Per Token: Consumption-Based Pricing
EU Zones: Data Residency Options Available
No Training: Your Data Not Used for AI Training

Data Privacy and “No Training” Commitments

Microsoft’s “no training” clause is a centrepiece of its Azure OpenAI terms. In plain language, Microsoft promises it will not use your prompts, files, or outputs to train or improve its own AI models. Your data remains isolated in your Azure tenant — it is not shared with OpenAI (the third party) or other customers. This assurance addresses a major concern for enterprises: it prevents proprietary information or IP from inadvertently becoming part of Microsoft’s AI knowledge base.

For example, if a law firm feeds in confidential contracts or a manufacturer inputs trade secrets, those will not be fed back into the model’s learning. Legal teams should nonetheless get this promise in writing via the product terms or Data Protection Addendum to ensure it is contractually enforceable.

🔴 Critical Nuance: “No Training” Does Not Mean Zero Data Usage

Microsoft retains certain data for up to 30 days for abuse detection and troubleshooting. If the system flags content as potentially violating the Azure OpenAI Code of Conduct (hate speech, self-harm, violence, etc.), Microsoft personnel can review prompt/response snippets. Sensitive information could be seen by a human reviewer under specific conditions.

Many customers hear “we don’t train on your data” and assume no one at Microsoft ever accesses their data. In reality, your data is not used to improve the AI, but it might be briefly stored and inspected for policy violations.

Negotiation tip: If your organisation handles highly confidential or regulated data, address this upfront. Microsoft has an internal process for opting out of content logging (sometimes called modified abuse monitoring). Large enterprise customers — especially those with dedicated account reps — can apply for an exception so that prompts and outputs are not retained at all. Pushing for this in your Azure OpenAI agreement (or via an addendum) can mitigate the confidentiality risk, essentially closing the loophole that allows human review.

However, note that opting out may disable certain safety features; Microsoft will require justification and may grant it only to managed customers with sufficient oversight. At a minimum, ensure the contract specifies what data Microsoft can retain and for how long, and includes strict confidentiality obligations for any data stored or viewed for support purposes. Legal teams should verify that Microsoft’s standard privacy and security commitments (in the Data Protection Addendum) apply to Azure OpenAI, treating any customer-provided content as “Customer Data” with all associated protections.

Data Residency and Sovereignty Concerns

Global enterprises often need to know where Azure OpenAI will process and store their data. Microsoft’s terms indicate that your data at rest resides in the Azure region/geo you select, but there are important caveats.

⚠️ Data Residency Watch Points

If data sovereignty is a deal-breaker, negotiate a custom clause confirming your data will remain within specified locations (at least for data at rest, and ideally for processing as well). Inquire about Azure OpenAI availability in sovereign clouds (Azure Government, etc.) — these may lag behind in features and require separate contracts.

A practical approach to the audit gap: request additional assurances or documentation from Microsoft. Ask them to map Azure OpenAI’s controls to your required compliance frameworks, or include it in any existing on-site audit your company already negotiates for broader Azure services. Obtain all relevant compliance reports and factor those into your risk assessment.

Pricing and Cost Management: Avoiding Surprises

Azure OpenAI follows a consumption-based pricing model. There are no per-user licences for the service itself; instead, you pay for what you use (typically per 1,000 tokens processed or per hour for certain model deployments). This usage-based model offers flexibility, but it can lead to unpredictable costs if usage spikes or is not well managed.
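To make the token economics concrete, here is a minimal cost-forecasting sketch. The per-1,000-token rates below are illustrative placeholders, not Microsoft's published prices; substitute your actual or negotiated rate card before relying on the numbers.

```python
# Illustrative sketch of consumption-based cost forecasting.
# The per-1,000-token rates below are placeholders, NOT Microsoft's
# published prices -- substitute your negotiated rate card.

RATE_PER_1K_TOKENS = {          # USD per 1,000 tokens (hypothetical)
    ("gpt-4", "input"): 0.03,
    ("gpt-4", "output"): 0.06,
    ("gpt-35-turbo", "input"): 0.0015,
    ("gpt-35-turbo", "output"): 0.002,
}

def monthly_cost(model: str, input_tokens: int, output_tokens: int) -> float:
    """Estimate monthly spend for one deployment from token counts."""
    cost_in = input_tokens / 1000 * RATE_PER_1K_TOKENS[(model, "input")]
    cost_out = output_tokens / 1000 * RATE_PER_1K_TOKENS[(model, "output")]
    return cost_in + cost_out

# Example: 50M input + 10M output tokens on GPT-4 in one month
print(f"${monthly_cost('gpt-4', 50_000_000, 10_000_000):,.2f}")
```

Even a rough model like this shows how quickly per-token charges compound across business units, which is why a usage forecast is the starting point for any pricing negotiation.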

💡 Get Cost Clarity

Each model (GPT-4, GPT-3.5, Embeddings) has its own rate. GPT-4 is significantly more expensive per output token than earlier models — a fact that translates into hefty bills if usage scales across multiple business units.

📊 Enterprise Rates Available

Treat Azure OpenAI like any other strategic Azure service: seek volume-based discounts or credits. Custom rate cards, tiered pricing, and percentage-off consumption rates are negotiable.

🔗 Bundle with Your EA

Incorporate Azure OpenAI into your overall Azure commitment (EA). Usage draws down against prepaid credits at a discounted rate. Insist it counts toward any existing Azure monetary commitment.

Consumption Approach Comparison

Approach | Cost Basis | Negotiation Leverage | Lock-In Risk
Pay-as-you-go | Standard published rates per unit (tokens, transactions); pay only for actual usage each month. | Low leverage by default: list price applies, but you can stop or reduce usage anytime. | Low contractual lock-in, though integration creates practical lock-in once apps rely on Azure OpenAI.
Azure commit (pre-paid) | Commit to a certain Azure spend (including OpenAI) over 1–3 years; usage draws from this at discounted rates. | High leverage if you commit big: negotiate volume discounts, bonus credits, or tiered rates. | Medium lock-in: financially committed to the spend (or lose credit value); ensure terms allow adjusting if consumption changes.
Multi-year custom rates | Fixed pricing or caps for Azure OpenAI over a multi-year term, often as an EA amendment. | High leverage if Azure OpenAI is a centrepiece: secure a price lock or special pricing for new AI features. | Medium-to-high lock-in: stable pricing, but tied for the term; include exit clauses or a renewal review.

Avoiding lock-in: Keep contract terms as flexible as possible. Limit terms (1-year pilot or coterminous with EA renewal). Include a “benchmark and adjust” clause — at the one-year mark, both parties review usage and pricing in light of new offerings or competitive alternatives. Also, verify that no contract language restricts you from using alternative AI solutions alongside Azure’s — maintain the ability to pivot to another service or bring in an additional provider.

"The single most important contract lever for Azure OpenAI is integrating it into your existing Enterprise Agreement. When Azure OpenAI spend draws down against pre-committed Azure credits at a negotiated discount, you achieve both financial efficiency and contractual protection — extending your EA’s price caps, liability terms, and data protection commitments to the AI service. Enterprises that treat Azure OpenAI as a standalone click-through agreement consistently pay more and receive fewer protections."

Fredrik Filipsson, Co-Founder, Redress Compliance

Contractual Risks and Customer Obligations

Azure OpenAI might be cutting-edge technology, but at its core it is a Microsoft cloud service — meaning your enterprise will have a set of responsibilities and risks to manage under the contract.

Acceptable Use and Content Standards

Customers must abide by Microsoft’s Responsible AI guidelines and Code of Conduct. Your users cannot deliberately generate prohibited content (hate speech, extreme violence, unlawful material) and should not use the AI for disallowed tasks. Microsoft’s terms give it the right to suspend or terminate the service for misuse. Incorporate these usage rules into your internal AI governance policies and ensure all employees accessing Azure OpenAI are aware of the boundaries.

Output Use and Intellectual Property

Under Microsoft’s terms, you typically own your input and output — Microsoft does not claim rights to the content you or the AI create. However, ownership does not equal safety. The AI’s output might include content resembling existing copyrighted or patented material. Microsoft’s contracts disclaim liability for such scenarios, and the agreement states the service is provided “as-is”, with no warranty that output will not infringe IP.

For legal teams, this is a red flag area: if your business relies on AI-generated content, you must have an internal review or quality control step. Do not assume the AI is correct or legally clean. Additionally, Microsoft is unlikely to indemnify you for AI outputs, so your company may need to secure its own insurance or indemnities.

⚠️ Four Key Customer Obligations to Watch

Recommendations

1. Integrate Azure OpenAI into Your Enterprise Agreement

Treat Azure OpenAI as a core service, not a niche add-on. Folding it under your main Microsoft agreement ensures pre-negotiated protections (liability caps, data protection terms) and volume pricing. Do not accept a lightweight click-through agreement.

2. Demand Clarity on Data Handling

Insist on clear, written commitments about data usage, retention, and location. If your industry requires it, negotiate an addendum for jurisdiction-specific processing and storage. Pursue the logging opt-out for highly sensitive data. Obtain a confidentiality clause covering any human reviews.

3. Leverage Volume for Discounts

If you anticipate substantial AI usage, approach Microsoft with a usage forecast and request a custom pricing proposal. Push for token volume discounts, free usage credits for pilots, or a fixed rate for committed spend. Microsoft has flexibility here.

4. Keep Terms Short and Flexible

Avoid multi-year inflexible commitments for rapidly evolving technology. Align with EA renewal cycles and include “escape hatches” such as opt-out rights after an initial phase. Ensure you can renegotiate if Microsoft releases new models or pricing drops.

5. Prepare an Internal Usage Policy

Define what data types employees can or cannot input (no PII without approval, no client confidential text). Set guidelines on vetting AI outputs. Establish incident response processes. This keeps you in line with Microsoft’s terms and guards against misuse.
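An internal usage policy like this can be backed by a simple technical guardrail. The sketch below is a naive prompt pre-screen; the regex patterns and category names are illustrative assumptions, and a real deployment would use dedicated data-loss-prevention tooling rather than hand-rolled patterns.

```python
import re

# Naive illustration of a prompt pre-screen for an internal usage policy.
# The patterns below are simplistic examples, not production-grade DLP --
# enterprises would typically use a dedicated data-loss-prevention service.

BLOCKED_PATTERNS = {
    "email address": re.compile(r"\b[\w.+-]+@[\w-]+\.[\w.]+\b"),
    "US SSN": re.compile(r"\b\d{3}-\d{2}-\d{4}\b"),
    "credit card": re.compile(r"\b(?:\d[ -]?){13,16}\b"),
}

def screen_prompt(prompt: str) -> list[str]:
    """Return the policy categories a prompt appears to violate."""
    return [name for name, pat in BLOCKED_PATTERNS.items() if pat.search(prompt)]

violations = screen_prompt("Contact jane.doe@example.com re: SSN 123-45-6789")
print(violations)  # both the email and SSN patterns match
```

Wiring a check like this in front of the API call gives the policy teeth: flagged prompts can be blocked or routed for approval before any data leaves the tenant.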

6. Monitor Regulatory and Contract Changes

Assign someone to continuously monitor AI regulations and Microsoft’s terms of use. Cloud AI policy is in flux — Microsoft may update terms to address new laws (EU AI Act). Quickly assess changes and amend your agreement or usage approach accordingly.

7. Engage Legal & Security Early in AI Projects

Involve legal, compliance, and cybersecurity teams from day one. They can flag contract issues (HIPAA BAA, export controls) and help configure the service safely. Early cross-functional input strengthens your negotiating position.

Checklist: 5 Actions to Take

Azure OpenAI Contract Action Plan

Frequently Asked Questions

How is Azure OpenAI Service priced, and can we negotiate the rates?
Azure OpenAI is priced on a pay-as-you-go model, charging based on usage (per 1,000 tokens processed or per image generated). There are no upfront licence fees. Enterprise customers can negotiate on pricing — if you anticipate high usage, request volume discounts or a custom rate card from Microsoft. Leverage your Enterprise Agreement so that any Azure spend commitments or discounts apply to Azure OpenAI. Always run the maths on projected usage and discuss it with Microsoft. Also, ensure your contract has price protection — a clause that locks your Azure OpenAI rates for the term of your agreement or guarantees you receive any public price reductions.
Will Microsoft see or retain our data when we use Azure OpenAI?
Microsoft will not use your data to train its models or share it with other customers. Your prompts and the AI’s outputs are considered Customer Data protected under Microsoft’s standard privacy terms and the data processing addendum. By default, Azure OpenAI does log interactions for up to 30 days for abuse detection. If something triggers a red flag, that prompt/response might be reviewed by Microsoft personnel. If you cannot accept even that level of retention, you can request an exemption (available for certain large customers) so that logging is turned off entirely. Treat AI input like any external communication: share on a need-to-know basis and scrub any personal or secret info that is not necessary for the task.
Who owns the content that the AI generates? Are there IP risks?
Generally, you own the outputs that Azure OpenAI produces for you, just as you own the data you input. Microsoft does not claim ownership over your prompts or results. However, ownership does not guarantee the content is free from IP issues. The AI may generate text or code similar to existing works, and Microsoft disclaims responsibility and typically does not indemnify you if a third party claims infringement. Important AI-generated content should undergo an IP review or plagiarism check. Use Azure OpenAI outputs as a starting point or draft, and have human experts vet them. Contractually, assume your company is responsible for content it deploys, even if AI helped create it.
What obligations do we have for compliant and ethical use?
When you sign up for Azure OpenAI, you agree to acceptable use policies and responsible AI principles. Key obligations include: not using the AI to generate prohibited content, not circumventing content filters, not using outputs to mislead people without disclosure, and securing access to the service (protecting API keys, controlling who can use it). Violating these obligations can lead to suspension or termination. Create a compliance checklist for Azure OpenAI usage, ensure use cases are reviewed by legal/compliance, restrict deployment to trained staff, and periodically audit usage. If Microsoft provides Code of Conduct training resources, have your team complete them.
Are we locked into using Azure OpenAI long-term?
Azure OpenAI is consumption-based — if you stop, you simply stop calling the service and incur no new costs. However, if you negotiated special pricing or committed to certain spend, you may have a financial lock-in (forfeit discounts or face penalties). Beyond contracts, there is practical lock-in once applications and users rely on Azure OpenAI. To maintain flexibility, design systems with abstraction layers allowing the AI component to be swapped. Microsoft does not force exclusivity, so you are free to use other AI platforms. Negotiate terms that allow easy exit at renewal points, and keep architecture flexible. A termination or transition assistance clause in your master agreement can be helpful.
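The abstraction-layer idea above can be sketched in a few lines. All class and function names here are hypothetical, not any vendor's actual SDK; the point is that application code depends only on a narrow interface, so the backing provider can be swapped.

```python
# Hypothetical abstraction layer so the AI backend can be swapped without
# touching application code. Class and method names here are illustrative
# assumptions, not any vendor's actual SDK.

from typing import Protocol

class ChatBackend(Protocol):
    def complete(self, prompt: str) -> str: ...

class AzureOpenAIBackend:
    """Adapter that would wrap the Azure OpenAI SDK in a real system."""
    def complete(self, prompt: str) -> str:
        return f"[azure-openai] {prompt}"  # placeholder for a real API call

class LocalModelBackend:
    """Drop-in alternative -- e.g. a self-hosted open-weight model."""
    def complete(self, prompt: str) -> str:
        return f"[local-model] {prompt}"   # placeholder for a real API call

def summarise(doc: str, backend: ChatBackend) -> str:
    # Application code depends only on the ChatBackend interface,
    # so pivoting providers is a configuration change, not a rewrite.
    return backend.complete(f"Summarise: {doc}")

print(summarise("Q3 licence report", AzureOpenAIBackend()))
```

Designing to an interface like this is what keeps the practical lock-in described above a commercial decision rather than an engineering constraint.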

Need Help with Azure OpenAI Contract Terms?

Whether you’re negotiating a new Azure OpenAI agreement, reviewing an existing contract, or implementing governance for your AI deployment — our Microsoft advisory specialists help enterprises navigate Microsoft’s AI services terms, negotiate commercially favourable protections, and avoid hidden cost traps.

📅 Book a Free Consultation · Explore Microsoft Advisory →

Related Resources

Service: Microsoft Advisory Services · Service: Microsoft Contract Negotiation · Service: Microsoft EA Optimisation · Service: GenAI Negotiation Services · Article: How to Negotiate Azure OpenAI with Microsoft · Article: Azure OpenAI Pricing Explained · Case Studies: GenAI Negotiation Case Studies · Hub: Microsoft Licensing Knowledge Hub

Fredrik Filipsson

Co-Founder, Redress Compliance

Fredrik Filipsson brings over 20 years of experience in enterprise software licensing and contract negotiation. As AI adoption accelerates across the enterprise, Redress Compliance’s vendor-independent advisory helps organisations navigate Microsoft’s Azure OpenAI contract terms — reviewing data handling clauses, negotiating commercially favourable pricing, and implementing governance frameworks that protect enterprise interests.

View all articles by Fredrik →
← Back to Microsoft Knowledge Hub