Microsoft AI Licensing for Copilot and Azure OpenAI

Microsoft’s new AI-powered “Copilot” offerings – from GitHub Copilot to Microsoft 365 Copilot to Azure OpenAI services – promise to boost productivity and innovation.

CIOs and CTOs, however, must carefully navigate the licensing and pricing of these tools for enterprise use.

This guide provides an overview of each offering, including its costs and contract models, as well as key considerations such as compliance and value, to help technology leaders effectively integrate OpenAI-powered solutions into their organizations.

Read Microsoft 365 Copilot Licensing Strategy for Enterprises.

Overview: Microsoft’s New AI Offerings

Microsoft has introduced several AI “Copilot” services that integrate OpenAI’s advanced models into everyday enterprise workflows:

  • GitHub Copilot – An AI coding assistant in developers’ IDEs that suggests code and answers questions.
  • Microsoft 365 Copilot – An AI assistant embedded in Office apps (Word, Excel, Teams, etc.) that helps generate content, analyze data, and summarize information.
  • Azure OpenAI Service – A cloud service providing API access to OpenAI’s GPT-3.5, GPT-4, and other models, allowing businesses to build custom AI applications and integrations.

All these leverage OpenAI’s GPT models, but they differ in licensing model and enterprise integration. Microsoft’s strategy is to offer AI assistance “everywhere”: in coding, in productivity apps, and as a platform, and it charges for each in a different way (per user or by usage).

The following sections break down each offering’s licensing and pricing for enterprise customers.

GitHub Copilot for Developers: Licensing and Pricing

GitHub Copilot is an AI pair programmer that assists software developers.

For enterprises, it’s available in two subscription tiers:

  • Copilot Business – $19 per user per month. This plan provides core code completions and AI-assisted development features in IDEs, along with additional enterprise features such as command-line integration, security filtering, and intellectual property (IP) indemnity for generated code. It’s billed per seat every month (no long-term commitment required).
  • Copilot Enterprise – $39 per user per month. This higher tier (requires GitHub Enterprise Cloud) includes everything in Business plus advanced capabilities: Copilot Chat (an AI chatbot that can answer questions about your private codebase and documentation), pull request summaries, and code analysis tools. The Enterprise plan provides greater data privacy (no training on your private code) and enhanced support. It also carries the same IP indemnification assurances.

In short, Copilot Business (at $19 per user per month) includes core coding assistance and security safeguards, while the Enterprise tier (at $39) adds chat, codebase indexing, and deeper integration for large organizations.

Licensing model:

GitHub Copilot is licensed as a SaaS subscription outside the usual Microsoft volume licensing programs. Enterprises can purchase Copilot seats through GitHub’s platform (which can integrate with your Single Sign-On and GitHub Enterprise management).

Billing is based on the number of users assigned a Copilot seat each month. There is no usage-based fee – users get unlimited code suggestions, subject to fair-use rate limits.

However, GitHub does introduce the concept of “premium requests” for certain features (like iterative chat or code generation tasks), with a generous allowance per user to cover normal use.

No separate contract is required beyond accepting GitHub’s terms (although enterprise agreements may reference it if GitHub is part of your Microsoft agreement).

Read GitHub Copilot for Business: Licensing.

Considerations:

  • Free trials and special licenses: GitHub offers Copilot Free, which includes limited features for evaluation, as well as free access for verified students and open-source project maintainers. However, for enterprise use, plan on using the paid versions – there’s no unlimited free trial available for organizations.
  • IP and legal: Microsoft provides a Copilot Copyright Commitment for paid Copilot users – essentially indemnifying customers against legal claims if the AI suggestions inadvertently include copyrighted code or content. This commitment (effective since late 2023) is a major benefit for enterprises worried about code ownership and compliance. It applies only to paid versions (Copilot Business/Enterprise), giving CIOs peace of mind that Microsoft will defend and cover any copyright infringement claims arising from Copilot’s output (provided users follow the intended use and don’t deliberately misuse the tool).
  • Data privacy: By default, Copilot Enterprise will not use your private repository code or prompts to train the public AI models. Your code stays within your organization’s tenant on GitHub. This addresses sensitive data concerns – a key factor for highly regulated industries.
  • Integration with contracts: GitHub Copilot may not be automatically included in your Microsoft Enterprise Agreement (EA) – it often requires a separate subscription. However, Microsoft sales teams can coordinate deals if you’re adding Copilot alongside other Microsoft products. Large customers have successfully negotiated discounts on Copilot seats when bundling them in big agreements, especially as Microsoft is keen to drive Copilot adoption. Be sure to engage your Microsoft/GitHub account reps; volume discounts or extended pilot programs may be available for substantial seat counts.

Microsoft 365 Copilot for Enterprise Productivity

Microsoft 365 Copilot brings generative AI into Office 365/M365 apps to assist knowledge workers with drafting emails, creating documents, summarizing meetings, analyzing spreadsheets, and more.

Unlike GitHub Copilot, this is licensed as an add-on for Microsoft 365 enterprise subscriptions:

  • Price: $30 per user per month (billed annually) for the Microsoft 365 Copilot add-on. This is a flat fee per licensed user, in addition to your existing Microsoft 365 subscription cost. Notably, Microsoft requires an annual commitment for Copilot licenses; you cannot pay on a month-to-month basis or toggle it on and off freely. Enterprises must plan to invest $360 per user per year for Copilot, making this a significant budget consideration.
  • Prerequisites: Only certain licensing tiers qualify. Currently, users must have a base Microsoft 365 E3 or E5 license (or an equivalent Office 365 E3 or E5) to be eligible for Copilot. Initially, when Copilot launched, Microsoft targeted only enterprise plans and enforced a minimum purchase requirement of 300 seats. During 2024, that seat minimum was removed, and Copilot became available to smaller organizations and business plans – but a “Microsoft 365” subscription of a certain level is still required. In practice, this means if you’re an SMB on Business Standard/Premium, you can access Copilot (Microsoft has indicated Business Premium and Standard users are eligible). However, you still pay the same $30 per user add-on. Education (A3/A5) and Government (G5) licenses may have their own Copilot availability timelines and pricing, which enterprises in those sectors should confirm separately.
  • No free trial: Unlike some Microsoft services, there is no free trial for M365 Copilot. Organizations must purchase the add-on to test it in their environment. Because there is no formal trial program, many enterprises begin with a small pilot group of paid licenses and expand from there.

What you get:

With the Copilot add-on active, users can see the Copilot AI assistant in various apps, including Word, Excel, PowerPoint, Outlook, and Teams.

It can generate first drafts of documents or emails, create PowerPoint slides from Word docs, summarize lengthy threads or meetings in Teams, and answer natural-language questions about your business data (leveraging Microsoft Graph to retrieve information the user has permission to access).

Essentially, it combines OpenAI’s GPT-4 with your enterprise data (files, emails, calendars, etc.) in a controlled and secure manner.

Read Azure OpenAI GPT-4 Cost Strategy for CIOs.

Cost context:

$30 per user per month is a premium price – for perspective, an M365 E5 license (which includes Office apps, security, etc.) is approximately $57 per user per month. Adding Copilot brings the total cost to roughly $87 for an E5 user (an increase of more than 50%), and it nearly doubles the cost of an E3 user (E3 at ~$36 plus $30, about $66 in total).
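For budgeting at scale, a quick back-of-the-envelope model helps. The sketch below is a minimal Python example using the indicative list prices above; the seat counts and license mix are hypothetical, and negotiated prices will differ.

```python
# Rough M365 Copilot budget estimate using the indicative list prices above (USD).
# Seat counts, license mix, and prices are assumptions; negotiated rates will differ.
COPILOT_ADDON = 30.0                      # per user per month, annual commitment
BASE_PRICES = {"E3": 36.0, "E5": 57.0}    # approximate per-user monthly list prices

def annual_copilot_cost(seats_by_plan: dict) -> dict:
    """Return yearly base, Copilot add-on, and total cost for a given license mix."""
    base = sum(BASE_PRICES[plan] * seats for plan, seats in seats_by_plan.items()) * 12
    addon = sum(seats_by_plan.values()) * COPILOT_ADDON * 12
    return {"base": base, "copilot_addon": addon, "total": base + addon}

# Example: a 500-seat deployment split across E3 and E5 users.
print(annual_copilot_cost({"E3": 300, "E5": 200}))
# The add-on alone is 500 * $30 * 12 = $180,000 per year, before any base licensing.
```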

CIOs must evaluate whether the productivity gains offset this steep cost per user. Microsoft’s positioning is that Copilot dramatically amplifies individual productivity; however, organizations should identify use cases (e.g., drafting communications, data analysis) and perhaps measure the outcomes of pilot users to justify the ROI.

In some cases, enterprises may choose to license Copilot only for specific roles or departments where it generates the most value (at least initially).

Licensing and integration:

Microsoft 365 Copilot is simply an add-on license in your volume licensing agreement or cloud subscription, similar to how you’d add a security or voice calling add-on.

For enterprise agreement customers, you can include Copilot licenses in your EA renewal or true-up.

Microsoft has been eager to promote Copilot adoption, so you may find them somewhat flexible on terms if it means a sizable Copilot deployment (for example, aligning Copilot’s term with your EA term, or providing promotional pricing for early adopters).

Still, Microsoft initially gave minimal discounts on Copilot due to high demand and buzz. Now (heading into 2025), there are reports that negotiation is possible, especially where Copilot sales have slowed; some enterprises have secured incentives such as discounted Copilot pricing or funding for deployment services.

Ensure you ask about any Copilot adoption programs or pilot offers.

Data privacy and compliance:

When users employ M365 Copilot, all data stays within your Microsoft 365 tenant boundary. Copilot has access to data that the user already has access to (it respects existing permissions).

Prompts and AI processing occur in Microsoft’s cloud, but Microsoft assures that your content is not used to train the underlying AI model – it is used transiently to generate responses for that user and then discarded.

Copilot also has built-in guardrails: it uses a “grounding” technique to fetch relevant enterprise data via Microsoft Graph and applies filters to avoid disallowed content.

From a compliance standpoint, Microsoft 365 Copilot inherits the robust compliance certifications of M365 (such as GDPR, ISO 27001, SOC, etc.), and admins can control Copilot’s availability and monitor its use. There are also auditing capabilities to log Copilot’s actions for governance.

A related AI offering:

It’s worth noting that Microsoft introduced Bing Chat Enterprise in 2023, an AI chat service (based on GPT-4) included at no extra cost with Microsoft 365 E3/E5 and Business plans. Bing Chat Enterprise enables users to ask web and work-related questions through a chat interface, with commercial data protections in place.

However, it is separate from M365 Copilot – it doesn’t integrate into Office apps or directly access your private documents. Think of Bing Chat Enterprise as a more secure, standalone ChatGPT-style tool, whereas Microsoft 365 Copilot is deeply integrated into your Office workflow.

The free inclusion of Bing Chat Enterprise enhances the value of existing licenses.

Still, the full Copilot experience, which works alongside you in Word, Excel, and Teams with your business data, requires the paid add-on.

CIOs can use Bing Chat Enterprise as a stepping stone (since it’s already available if you have E3/E5), but to truly transform productivity in document-heavy workflows, the M365 Copilot license is needed.

Enterprise considerations:

  • Deployment: Enabling Copilot is technically straightforward (assign licenses, and it lights up in apps), but success depends on user readiness. Plan enablement programs to ensure employees understand how to use Copilot effectively and responsibly (e.g., crafting effective prompts, reviewing AI outputs). This helps maximize value from the investment.
  • Security: Ensure your organization’s data governance policies are up to date. Users might inadvertently prompt Copilot with sensitive info – establish guidelines on what not to ask AI and remind users that existing security controls (DLP, retention, etc.) still apply to Copilot interactions. Microsoft has provided admin controls to disable Copilot features for certain users or tighten content filtering if needed.
  • Licensing scope: You do not have to buy Copilot for all M365 users – it can be purchased for a subset. Many enterprises start with specific departments (e.g., a pilot group in HR or finance) before scaling. Microsoft’s 300-user minimum is no longer in place, so even smaller firms or specific pilot groups can proceed without this barrier. Just remember the one-year term commitment per user.

Read Negotiating Azure OpenAI Credits in New Enterprise Agreements.

Azure OpenAI Service: Pay-as-You-Go AI Platform

While GitHub and M365 Copilot are packaged solutions, Azure OpenAI Service offers building blocks for custom AI.

It gives enterprises direct access to OpenAI’s models (GPT-3.5, GPT-4, Embeddings, DALL-E image generation, etc.) through Azure’s cloud platform.

The licensing and pricing here are quite different:

  • Consumption-based pricing: Azure OpenAI is billed by usage (per model invocation, measured in tokens) rather than per user. Organizations create an Azure OpenAI resource in their Azure subscription, then pay for the API calls their applications make to the model. Pricing varies by model (a worked cost estimate follows this list):
    • GPT-4 (8k context) – roughly $0.03 per 1,000 tokens for prompt input and $0.06 per 1,000 tokens for generated output.
    • GPT-4 (32k context) – about double the 8k price (e.g., $0.06 input, $0.12 output per 1k tokens).
    • GPT-3.5 Turbo – around $0.002 per 1,000 tokens (both input and output), significantly cheaper but less capable.
    • Embeddings – ~$0.0001 per 1,000 tokens using the Ada model for vector embeddings.
      (These figures are indicative; Microsoft updates pricing periodically. Azure’s pricing page often lists costs per 1,000 or 1M tokens, depending on the model.)
  • Azure subscription required: Usage is metered and charged against your Azure account. If your company has an Azure Enterprise Agreement or a Cloud Solution Provider arrangement, Azure OpenAI charges are included in your Azure consumption. This means you might fund it through existing Azure credits or commitments. There’s no separate “license cost” – you pay only for what you use (plus any Azure infrastructure you choose to pair with it).
  • Access and quotas: Initially, Azure OpenAI required an application and Microsoft’s approval to use (to ensure responsible AI use cases). By 2024, most models had become generally available; however, some advanced features may still be gated by Microsoft’s compliance checks. Once enabled, you have default rate limits (e.g., number of tokens per minute) which can be increased by request if you have a legitimate high-volume use case.
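To see how token-based billing adds up, here is a minimal sketch that estimates per-request and monthly costs using the indicative GPT-4 (8k) rates above; the request volumes and token counts are hypothetical.

```python
# Indicative Azure OpenAI prices (USD per 1,000 tokens) taken from the figures above.
# Check the Azure pricing page for current rates before relying on these numbers.
PRICE_PER_1K = {
    "gpt-4-8k":     {"input": 0.03,  "output": 0.06},
    "gpt-35-turbo": {"input": 0.002, "output": 0.002},
}

def request_cost(model: str, prompt_tokens: int, completion_tokens: int) -> float:
    """Estimate the cost of a single API call from its token counts."""
    price = PRICE_PER_1K[model]
    return (prompt_tokens / 1000) * price["input"] + (completion_tokens / 1000) * price["output"]

# Hypothetical workload: 2,000 GPT-4 calls per day, ~1,500 prompt and ~500 completion tokens each.
per_call = request_cost("gpt-4-8k", 1500, 500)   # about $0.075 per call
monthly = per_call * 2000 * 30                   # about $4,500 per month
print(f"per call: ${per_call:.3f}, per month: ${monthly:,.0f}")
```

Running the same hypothetical workload on GPT-3.5 Turbo would come to roughly $240 per month at these rates, which is why model selection matters so much for high-volume scenarios.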

Use cases:

Azure OpenAI is ideal if you want to build your own AI copilots or integrate AI into your software, such as creating a customer service chatbot, an internal knowledge base Q&A bot, or using GPT to analyze proprietary data. It provides you with full flexibility to utilize AI beyond the pre-built Office or GitHub contexts.

Many enterprises use Azure OpenAI to develop domain-specific copilots (for instance, a “Sales Proposal Copilot” that generates proposals using your templates and data, or a custom agent integrated into your IT helpdesk).

Microsoft also offers Copilot Studio, a toolkit for building custom copilots and agents on top of Azure OpenAI; it carries its own licensing (including prepaid capacity packs that can be purchased for expected usage) in addition to the underlying Azure OpenAI consumption.

Cost management: The pay-as-you-go model requires vigilance:

  • Costs can scale with usage. A single complex GPT-4 query might cost only a few cents, but if thousands of employees or a high-traffic app use it daily, expenses add up quickly. Unlike the fixed $30 per user for M365 Copilot, Azure OpenAI costs are variable. This is a double-edged sword: you only pay for actual usage (no cost for idle users or during off-hours), but budgeting is trickier, and unexpected spikes can occur.
  • Enterprises should implement monitoring and quotas. Azure provides cost alerts and the ability to cap spending or throttle usage, and a simple application-side budget guard (sketched after this list) adds a further safety net. It’s wise to start with limited access (maybe a pilot project) and measure average costs. Over time, you might negotiate an Azure consumption commitment that includes expected OpenAI usage to get better overall cloud discounts.
  • In some cases, Microsoft might offer private pricing or reservations for huge AI workloads. If you plan to use millions of tokens per day, engage Microsoft – high-volume customers could potentially get custom pricing agreements or grants (especially if your use showcases Azure’s capabilities).
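Azure Cost Management budgets and alerts are the first line of defense; a simple guard inside your own application can also stop a runaway integration before the bill arrives. The sketch below is illustrative only – the daily limit and reset logic are assumptions, not an Azure feature.

```python
import datetime

class TokenBudget:
    """Naive per-day token budget for an internal AI application.
    This is an application-side safeguard sketch, not an Azure API."""

    def __init__(self, daily_token_limit: int):
        self.daily_token_limit = daily_token_limit
        self.used_today = 0
        self.day = datetime.date.today()

    def charge(self, tokens: int) -> None:
        today = datetime.date.today()
        if today != self.day:                    # reset the counter each day
            self.day, self.used_today = today, 0
        if self.used_today + tokens > self.daily_token_limit:
            raise RuntimeError("Daily token budget exceeded; request blocked")
        self.used_today += tokens

budget = TokenBudget(daily_token_limit=2_000_000)   # hypothetical team-level quota
budget.charge(1_800)                                # record the tokens of each API call
```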

Read Azure OpenAI Data Privacy and Compliance for Enterprise AI Deployments.

Integration and compliance:

Azure OpenAI runs in Azure’s cloud environment, meaning you benefit from Azure’s compliance standards (which are generally extensive – HIPAA, GDPR, FedRAMP High for certain regions, etc.).

Data submitted to the Azure OpenAI APIs is not used to train the public models and is retained only temporarily for service operations (with options to disable even that logging).

This addresses data confidentiality: using Azure’s offering is arguably more secure for proprietary data than calling OpenAI’s API directly, since Microsoft contractually assures data isolation.

Additionally, you can choose the region your Azure OpenAI instance is deployed in to meet data residency requirements.

Microsoft also provides tools, such as Azure AI Content Safety, that can be used in conjunction with OpenAI to filter out harmful or sensitive outputs as needed.

Enterprise considerations:

  • Developer effort: Unlike Copilot in M365 or GitHub, Azure OpenAI is not a plug-and-play solution for end users – your development teams will need to integrate the AI models into applications or workflows (a minimal API call is sketched after this list). This provides maximum flexibility, but CIOs should ensure they have or can acquire the necessary AI development skills. Microsoft has been rolling out sample solutions and Copilot Studio to simplify building custom copilots.
  • Licensing and costs in contracts: Azure OpenAI usage will be factored into your overall Azure bill. If you have an enterprise agreement with an Azure spending commitment (often called a MACC – Microsoft Azure Consumption Commitment), high OpenAI consumption can consume those credits quickly. It’s wise to forecast this when negotiating cloud commitments. Some enterprises negotiate specific discounts or funding for AI projects – Microsoft wants Azure to be seen as the go-to platform for AI, so it may offer incentives (like free credits or expert assistance) for strategic AI initiatives. Always ask your account team if any AI adoption offers are available.
  • Alternative routes: A strategic consideration is whether to use Azure OpenAI versus OpenAI’s direct API or other AI providers. Many enterprises choose Azure for the enterprise-grade security, integration with Azure services, and simplified billing (especially if they already spend heavily on Azure). Using Azure does mean paying Azure’s cloud rates, however, and if Microsoft’s terms, pricing, or model availability don’t meet your needs, you retain the option to negotiate directly with OpenAI for an enterprise API license or to consider other model providers. This can be a bargaining chip in negotiations with Microsoft.
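For a sense of what that integration work looks like, here is a minimal sketch of calling an Azure OpenAI deployment with the openai Python SDK (v1.x); the endpoint, deployment name, and API version are placeholders for your own resource.

```python
import os
from openai import AzureOpenAI  # pip install "openai>=1.0"

# Endpoint, key, deployment name, and API version below are placeholders;
# substitute the values from your own Azure OpenAI resource.
client = AzureOpenAI(
    azure_endpoint="https://<your-resource>.openai.azure.com/",
    api_key=os.environ["AZURE_OPENAI_API_KEY"],
    api_version="2024-02-01",
)

response = client.chat.completions.create(
    model="my-gpt4-deployment",  # the deployment name you created, not the raw model name
    messages=[{"role": "user", "content": "Summarize the key terms of our licensing FAQ in three bullets."}],
    max_tokens=300,
)
print(response.choices[0].message.content)
print("tokens used:", response.usage.total_tokens)  # feed this into your cost tracking
```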

Enterprise Licensing Considerations and Best Practices

Adopting generative AI at scale raises several cross-cutting considerations beyond just price per user or token.

CIOs/CTOs should keep the following in mind as they plan and negotiate:

  • Contractual Protections & Compliance: Ensure that whichever Copilot service you deploy comes with proper contractual safeguards. Microsoft’s Copilot Copyright Commitment (effective Oct 2023) extends their standard IP indemnification to AI services. This means if a Copilot’s output inadvertently infringes someone’s IP, Microsoft will defend you and cover legal costs, as long as you’re using the service as intended. This is a critical protection – make sure your Microsoft Product Terms or agreements explicitly include it for Copilot and Azure OpenAI. Additionally, review how data is handled in each service: confirm that no customer data will be used to improve Microsoft’s or OpenAI’s models, unless you opt into a feature such as custom model training. The agreements should state that the data is your property and is only used transiently to provide the service. For highly regulated industries, also seek clarity on aspects such as audit trails (e.g., “Can we audit what prompts and responses our users are getting?” – Microsoft 365 Copilot does have audit logs for admin oversight).
  • Security and Privacy Configuration: From day one, involve your CISO and compliance teams. Configure admin controls to suit your policies – for example, you might disable Copilot’s ability to generate emails if you worry about it spamming external contacts, or use sensitivity labels to prevent Copilot from exposing confidential files in responses. In Azure OpenAI, utilize features such as private networking (VNET) and Managed Identity to ensure your AI application is not exposed to the public internet and has controlled access. Microsoft’s tools allow a secure implementation, but it’s up to the enterprise to turn on those knobs.
  • Pilot Programs to Validate Value: Before committing enterprise-wide, run a structured pilot. Select a group of users or a specific use case, and measure the impact (in terms of time saved, quality improvements, and user satisfaction). This data will help build a business case for the spend. It will also inform how you scale licensing: you might find only certain departments get high value, guiding you to allocate licenses strategically rather than to everyone.
  • Cost Optimization: Treat these AI services with the same rigor as any cloud service – optimize and review regularly. For M365 Copilot, monitor license utilization: are all assigned users using the features meaningfully? If not, consider reassigning licenses to those who will benefit from them. Since it’s annual, you may have to wait for renewal to reduce quantities, but plan ahead. For Azure OpenAI, implement FinOps practices by tracking which projects or teams are driving usage and setting internal chargebacks or budgets to encourage the efficient use of tokens. There may be opportunities to utilize lower-cost models (e.g., using GPT-3.5 for simple tasks and reserving GPT-4 for the most complex queries) – such tuning can drastically reduce costs without incurring significant performance loss.
  • Negotiation Tips: Microsoft’s sales approach in 2024-2025 heavily features AI – they see Copilot as a flagship. Use this to your advantage. For example, if you’re renewing your Microsoft Enterprise Agreement, mention interest in Copilot or Azure OpenAI as part of a broader digital transformation, but be clear that cost is a concern. Microsoft has been known to offer incremental discounts or add-ons if it helps land a bigger deal. However, be cautious about overcommitting: only negotiate Copilot licenses in volume if you have a solid adoption plan. Microsoft reps might push for enterprise-wide adoption – don’t be afraid to start smaller. Also consider timing: if you don’t urgently need Copilot, you might hold off until your next renewal to see if pricing improves or if competitors (Google, etc.) influence Microsoft to adjust pricing.
  • Staying Informed: The AI product landscape is evolving rapidly. Microsoft could introduce new bundles or adjust pricing (for instance, a future “Copilot E5” bundle or usage-based plans). Keep an eye on announcements and the roadmap. Ensure your contracts have flexibility to accommodate changes – e.g., add-on riders that allow you to switch to a different AI licensing model if introduced. Additionally, monitor user feedback and success stories in your industry; this can guide you on where AI copilots deliver tangible value, allowing you to prioritize those areas.

Recommendations

  • Start with Clear Use Cases: Identify specific scenarios (such as coding assistance, marketing content generation, or data analysis automation) where these AI tools can address a pain point. This focus ensures you invest in Copilot where it matters and can measure success.
  • Pilot and Phase Rollouts: Launch a pilot program for any Copilot service before rolling it out enterprise-wide. Use pilot results (productivity metrics, user feedback) to refine your adoption strategy and justify the investment to stakeholders.
  • Negotiate as a Bundle: If you’re adding Copilot licenses, try to align it with your Microsoft Enterprise Agreement or renewal discussions. Leverage your total Microsoft spend – Microsoft may provide better pricing or incentives if Copilot is part of a broader commitment. Don’t hesitate to ask for trial periods, discount tiers, or additional value-added services (such as training) during your negotiation.
  • Train and Enable Users: Budget for user training and change management. These AI copilots introduce new ways of working; employees will get the most value if they understand how to use prompts effectively and the limitations of AI. Consider internal “champions” or workshops to accelerate adoption and share best practices across the company.
  • Monitor Usage and Costs Ongoing: Implement dashboards or reports for Copilot usage. For Microsoft 365 Copilot, monitor how often it’s invoked and which features are used – this can inform license true-up decisions. For Azure OpenAI, set up cost alerting and monthly reviews of token consumption per project. Early monitoring will catch runaway usage or highlight if expected use (and ROI) isn’t materializing.
  • Review Security/Compliance Settings: Before enabling any Copilot, have IT security review the configuration to ensure it meets the necessary security and compliance requirements. Enable available controls, such as tenant-wide content filters for Copilot (to prevent sensitive data exposure), ensure MFA and Conditional Access policies cover Copilot access, and for Azure OpenAI solutions, place them behind proper security measures (virtual networks, role-based access). Update your data privacy notices or policies as needed to cover employee use of AI tools at work.
  • Leverage Vendor Commitments: Ensure you take advantage of Microsoft’s commitments, such as the copyright and security guarantees. Document internally that Microsoft will cover certain liabilities associated with the use of these tools. This will reassure your legal department and executives. Also, stay informed on improvements Microsoft makes (e.g., new compliance certifications, regional data centers for Copilot, etc.) and enable them when available.
  • Compare Alternatives Periodically: While Microsoft’s offerings are leading in integration, keep an eye on alternatives (like Google’s AI features or OpenAI’s offerings). Even if you stick with Microsoft, knowing the competitive landscape can provide leverage in negotiations and ensure you’re getting the best value. If Microsoft’s AI pricing remains unfavorable, you should have a backup plan – whether that’s negotiating harder or exploring other AI platforms for certain use cases.
  • Plan for Scaling Up (or Down): As AI proves its value, demand may surge. Be prepared to scale up usage – this may involve budgeting for additional Copilot licenses next year or increased Azure OpenAI spending. Conversely, if adoption is slower, have an exit or scale-down plan to avoid paying for unused capacity. Flexibility is key: try not to commit yourself to more than necessary until you’re confident in sustained usage.

FAQ (Frequently Asked Questions)

Q1. What are the prerequisites to deploy Microsoft 365 Copilot in our organization?
A: Microsoft 365 Copilot is sold as an add-on to Microsoft 365 Enterprise subscriptions. You will need to have either Microsoft 365 E3 or E5 (or Office 365 E3/E5) licenses for your users. The Copilot add-on ($30 per user per month) can then be purchased for these users. There’s no standalone Copilot license; it’s always layered on top of an existing Microsoft 365 plan. Additionally, as of now, you must commit to a 12-month term for the Copilot licenses. From a technical standpoint, ensure your environment meets requirements: users must be in Entra ID (Azure AD), using the latest Office apps (the “Microsoft 365 Apps for Enterprise” version), and some features (like Copilot in Outlook or Teams) may require using the latest client or Outlook Web. Microsoft initially had a 300-user minimum, but that has been removed, so you can start with fewer users if needed (even smaller businesses can enable Copilot as long as they have the proper base subscription).

Q2. How is GitHub Copilot licensed for enterprises? Is it part of GitHub Enterprise?
A: GitHub Copilot is a separate subscription from GitHub Enterprise, though they complement each other. If your company uses GitHub Enterprise Cloud for source control, you can add Copilot seats for your developers. The two enterprise plans are Copilot Business ($19 per user per month) and Copilot Enterprise ($39 per user per month). These are not automatically included in a standard GitHub Enterprise license; they are add-ons that you opt into and pay for per user. In practice, an enterprise can mix and match: for example, purchase 100 Copilot Business seats for some teams and 50 Copilot Enterprise seats for advanced use cases that need the extra features, all under the same Enterprise account. Billing is performed monthly based on the actual number of assigned seats. If you currently allow developers to use Copilot individually (Copilot Pro at $10 per month via personal accounts), you might want to consolidate under the enterprise plan for better administrative control and to take advantage of enterprise features (SSO, organization-wide policy settings, and the important IP indemnification that comes only with the business/enterprise tier). It’s not part of a Microsoft 365 agreement; it’s managed in GitHub’s billing, though you can coordinate through Microsoft if needed. Also note: Copilot Enterprise requires that you have GitHub Enterprise Cloud (it leverages organization-wide features available in that platform).

Q3. Will Copilot or Azure OpenAI use our data (code, documents, prompts) to train AI models?
A: No, not by default. Microsoft has been very explicit that for its enterprise AI services, your data is not used to improve the foundational AI models. For Microsoft 365 Copilot, any data retrieved (emails, files) or prompts you enter are only used to generate the response for your user session – that information isn’t fed into OpenAI’s training pipeline. Similarly, GitHub Copilot (enterprise versions) does not take your private code and add it to OpenAI’s training set. Azure OpenAI Service is hosted by Microsoft and does not send your data to OpenAI at all; you can additionally request to disable Microsoft’s temporary storage of prompts and completions (used for abuse monitoring). Essentially, Microsoft acts as a buffer: you get the power of OpenAI’s model, but Microsoft’s enterprise agreements ensure your data remains confidential. Of course, normal data handling within your tenant or account still applies – for instance, Microsoft 365 Copilot might cache some results in your SharePoint if it creates a document, or Azure OpenAI might temporarily log requests for troubleshooting unless you opt out – but none of that is used to train or tweak the AI’s weights. Always review the specific privacy and data handling terms for each service, and configure settings (like disabling log retention in Azure OpenAI) if you have strict requirements. In general, though, these services are designed to meet corporate privacy standards.

Q4. How can we effectively manage costs for Azure OpenAI Service, particularly as usage increases?
A: Cost management for Azure OpenAI is crucial because billing is usage-based. Start by setting a budget or spending limit in your Azure subscription for the OpenAI resource – Azure allows you to set quotas (for example, token-per-minute limits on deployments) and to configure budget alerts on monthly spend. Use Azure Cost Management tools to monitor in near real-time how many tokens are being consumed and the associated costs. It’s wise to break out Azure OpenAI usage by application or department (you can use separate resource instances or tagging) so you can attribute costs internally. If you anticipate heavy usage, consider the following strategies:

  • Model selection: Use cheaper models when appropriate. For instance, you might use GPT-3.5 for high-volume queries and reserve GPT-4 for cases truly needing its advanced reasoning (a simple routing sketch follows this list). Over thousands of calls, this drastically cuts costs.
  • Prompt efficiency: Optimize your prompts and the max tokens you allow in responses. Long prompts and long responses cost more. Developers can employ techniques like caching intermediate results or using shorter context windows to reduce token usage.
  • Scaling plans with Microsoft: If your usage is extremely high, consider discussing pricing options with Microsoft. In some cases, high-volume customers can negotiate rates, or you may be able to commit to a certain monthly spend for a discount (similar to Azure reserved instances or capacity planning, although this is not yet a standard practice for OpenAI – it’s worth inquiring about).
  • Auto-shutdown/test vs prod: Treat your AI experiments like any cloud resource – shut down or limit usage in non-production environments. It’s easy for experimentation to run wild with many test prompts. Use separate API keys for development and testing with low quotas.
  • Regularly review the cost reports. If you encounter unexpected spikes (e.g., one app integration calling the API far more frequently than intended), address them promptly – perhaps by adding rate limiting in the app or adjusting the logic.
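As a simple illustration of the model-selection point above, the sketch below routes requests between a cheaper and a more capable deployment; the classification rule and deployment names are hypothetical, and real routing logic would be tuned to your own workload.

```python
from openai import AzureOpenAI  # assumes a client configured as in the earlier sketch

# Hypothetical deployment names for a low-cost and a premium model.
CHEAP_DEPLOYMENT = "gpt-35-turbo-prod"
PREMIUM_DEPLOYMENT = "gpt-4-prod"

def pick_deployment(prompt: str) -> str:
    """Very naive router: keep short, routine requests on the cheaper model
    and reserve GPT-4 for long or explicitly analytical ones."""
    needs_reasoning = len(prompt) > 2000 or "analyze" in prompt.lower()
    return PREMIUM_DEPLOYMENT if needs_reasoning else CHEAP_DEPLOYMENT

def ask(client: AzureOpenAI, prompt: str, max_tokens: int = 300) -> str:
    response = client.chat.completions.create(
        model=pick_deployment(prompt),
        messages=[{"role": "user", "content": prompt}],
        max_tokens=max_tokens,  # capping output length also caps output cost
    )
    return response.choices[0].message.content
```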

By proactively monitoring and iterating, you can keep Azure OpenAI costs predictable and manageable. Unlike per-user licensing, you have the flexibility to scale down usage if an initiative is paused, ensuring you only pay for the value received.

Q5. What legal or compliance issues should we be aware of when using Copilots in our business (e.g., liability for AI outputs)?
A: The primary legal concerns revolve around intellectual property, data protection, and accuracy:

  • Intellectual Property: If an AI-generated output (code, text, image) unintentionally copies from copyrighted material, there’s a risk of IP infringement. Microsoft’s Copilot services now include indemnification for this scenario, meaning Microsoft will assume liability for copyright claims arising from Copilot’s suggestions, as long as you’ve paid for the service and follow the usage guidelines. This is a strong backstop for enterprises. However, you still need internal policies: developers should review AI-generated code for open-source license compliance (Copilot tries not to output licensed code verbatim and even warns if it does), and content creators should fact-check and ensure AI-produced text isn’t directly plagiarizing. The indemnity covers you for legal damages, but you still want to prevent problematic outputs proactively.
  • Data Protection and Privacy: When employees use Copilot, they might input sensitive data (for example, an email thread with personal data for summarization). Such use must comply with privacy regulations, such as GDPR or HIPAA. Microsoft has built these services to be enterprise-compliant (data is not stored long-term or used externally). Still, you should update your employee guidelines to cover AI usage – e.g., instruct staff not to paste classified information into any prompt unless necessary. Additionally, consider whether certain data categories should be excluded from AI processing. Tools like Microsoft Purview’s DLP can sometimes detect and block sensitive info from being sent to Copilot, adding a layer of protection.
  • Hallucinations and Accuracy: Copilot and GPT models can occasionally produce incorrect or fabricated results (so-called “AI hallucinations”). From a compliance perspective (and an ethical one), you don’t want business decisions or customer communications based solely on unverified AI output. Make it clear that users must treat Copilot’s output as a draft or suggestion. For critical functions (such as financial reports or medical information generation), include human review and verification steps. This isn’t a direct legal licensing issue, but it’s a risk area. If, for instance, Copilot erroneously summarizes a contract and a decision is made based on that, the enterprise bears the responsibility. Training users and having checkpoints mitigates this risk.
  • Regulatory Compliance: Some industries have specific regulations (e.g., SEC rules for finance communications, or data residency laws). Ensure that using Copilot doesn’t violate any of these. For example, if your country requires that certain data never leave the country, make sure your Copilot service respects that boundary (Microsoft 365 Copilot will use the same region your tenant is in; Azure OpenAI can be deployed in a chosen region). For auditability, Microsoft 365 Copilot interactions can be logged. You may need to retain these logs if required by auditors to demonstrate what AI is suggesting in regulated processes.

In summary, from a legal standpoint, Microsoft’s licensing covers many bases (including IP indemnity and GDPR-compliant processing), but enterprises must use these tools responsibly.

Engage your legal and compliance team early to update internal policies regarding AI-generated content, just as you would for any third-party service handling company data.

Read more about our Microsoft Optimization Services.

Author: Fredrik Filipsson

Fredrik Filipsson is the co-founder of Redress Compliance, a leading independent advisory firm specializing in Oracle, Microsoft, SAP, IBM, and Salesforce licensing. With over 20 years of experience in software licensing and contract negotiations, Fredrik has helped hundreds of organizations—including numerous Fortune 500 companies—optimize costs, avoid compliance risks, and secure favorable terms with major software vendors. Fredrik built his expertise over two decades working directly for IBM, SAP, and Oracle, where he gained in-depth knowledge of their licensing programs and sales practices. For the past 11 years, he has worked as a consultant, advising global enterprises on complex licensing challenges and large-scale contract negotiations.
