Salesforce Storage and API Usage Management

Author's Note: This chapter serves as a CIO playbook for managing Salesforce storage and API usage, written from the perspective of an independent licensing advisor. It offers an advisory overview of understanding limits, monitoring usage, optimization strategies, and negotiation tips. CIOs across industries can use these insights to avoid overage costs and ensure smooth Salesforce operations.

Introduction

Salesforce's powerful cloud platform comes with inherent limits on data storage and API calls. As organizations scale their CRM usage, it's easy to bump against these caps, incurring unexpected costs or performance issues if not managed proactively.

Storage limits constrain how much data and file content you can host in your Salesforce org, while API call limits cap how often external systems can interface with Salesforce programmatically. Both are crucial for CIOs to monitor and optimize.

This chapter provides a comprehensive guide to navigating Salesforce's storage and API usage constraints, with practical strategies to stay within entitlements and advice on negotiating for more capacity upfront. The goal is to help enterprise CIOs avoid surprises, such as sudden $100,000+ storage bills or halted integrations, by treating storage and API governance as an integral part of Salesforce management.

Understanding Salesforce Storage Limits

Data Storage vs. File Storage: Salesforce categorizes storage into two types: Data Storage for database records (e.g., Accounts, Contacts, Cases, custom object records) and File Storage for attachments, documents, Salesforce Files, Content, and Chatter feed files.

Each category has separate allocations and overage costs. Data storage covers the information in fields of records, typically with a minimum of 2 KB per record, regardless of how many fields are filled (some record types, like Email Messages, may consume more). File storage is consumed by binary files, such as PDFs, images, and spreadsheets, stored in Salesforce.

Default Storage Allotments: Salesforce editions include a baseline amount of storage plus per-user additions:

  • Data Storage Allocation: Enterprise and Professional editions typically come with a 10 GB base data storage allocation per organization, plus an additional amount per user (e.g., 20 MB of extra data storage for each standard user license). Higher-tier editions like Performance and Unlimited provide more per-user data capacity (e.g., 120 MB per user for those editions). For example, an Enterprise org with 100 users would have 10 GB + 100 × 20 MB = roughly 12 GB of data storage. Essentials (and some lower editions) have smaller starting allocations, often around 1 GB, for data storage.
  • File Storage Allocation: Most editions come with a 10 GB base file storage allocation per organization, plus additional storage per user. Enterprise and above typically add 2 GB of file storage per user license. Some lower editions provide around 612 MB per user (approximately 0.6 GB each). Using the same example, 100 Enterprise users would grant 10 GB + 100 × 2 GB = 210 GB of file storage. (File storage per user is much larger than data storage since files tend to require more space.)
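
To make the arithmetic concrete, here is a toy calculator for the allotment formulas above. This is a sketch in Python; the per-edition figures are the approximations quoted in this chapter, not authoritative values, so check your own contract:

```python
def storage_entitlements(users, data_mb_per_user=20, file_gb_per_user=2,
                         base_data_gb=10, base_file_gb=10):
    """Approximate storage allotments using the Enterprise figures above."""
    data_gb = base_data_gb + users * data_mb_per_user / 1024  # MB -> GB
    file_gb = base_file_gb + users * file_gb_per_user
    return data_gb, file_gb

data_gb, file_gb = storage_entitlements(100)
print(f"100 Enterprise users: ~{data_gb:.1f} GB data, ~{file_gb:.0f} GB file")
# -> roughly 12 GB of data storage and 210 GB of file storage, as above
```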

Overage Costs: If you exceed these included amounts, Salesforce charges a premium for extra storage. Additional data storage costs about $125 per month for each 500 MB block. In other words, 1 GB of extra data space costs roughly $250 per month, an extremely high price compared to commodity cloud storage (for perspective, Amazon AWS charges only about $0.023 per GB per month for standard storage).

Additional file storage is much cheaper by comparison, roughly $5 per month per 1 GB. These are recurring monthly charges, not one-time fees. A key point for CIOs is that Salesforce intentionally prices storage high to discourage using the CRM as a general-purpose data warehouse. The platform provides just enough storage for CRM needs, expecting customers to archive or offload data that exceeds reasonable CRM usage.

Example: 10 GB of data storage (roughly the default for an Enterprise org) can hold about 5 million standard records (at ~2 KB each). That may sound like plenty, but a large enterprise with heavy Salesforce usage (cases, leads, transactions, etc.) or a long history can accumulate this over a few years. Likewise, 10 GB of file storage could be consumed by about 2,000 PDF attachments of 5 MB each. It's easy to see how a growing org might approach these limits. One SaaS company noted that adding ~1 TB of files to Salesforce per year would theoretically cost them on the order of $250,000 per month by year one in overage fees, an astronomical sum, making a compelling case for external storage solutions.

What Happens When You Reach Storage Limits: Reaching the storage cap doesn't immediately stop your organization, but it will prevent additional data from being created or uploaded. Salesforce will send an email alert to admins when data storage usage reaches critical levels (e.g., 100% of entitlement).

At that point, your options are either to delete or archive some data to free up space, or purchase additional storage (in those expensive blocks) to increase the limit. File storage reaching its limit similarly blocks new file uploads. Running out of storage can disrupt business (e.g., being unable to save new records or attachments), so proactive management is essential.

Salesforce API Call Limits

In addition to storage constraints, Salesforce imposes limits on API calls, the programmatic requests made by external applications, integrations, or scripts to read or write data via Salesforce's APIs. These limits ensure that no single organization overconsumes shared resources in the multi-tenant environment.

Daily API Request Limits: Salesforce defines an API call allowance for each organization per 24-hour rolling period. Unlike storage, API usage limits scale primarily with the number and type of user licenses and edition you have, rather than a fixed amount per org. For most production orgs, the formula is roughly:

  • Enterprise & Professional Editions: Each full Salesforce user license typically contributes around 1,000 calls per day to the pool, on top of a base org allocation (commonly 100,000 calls per day under the current formula, with a minimum of 15,000 calls per 24 hours for very small orgs). For instance, an Enterprise org with 200 standard users would have a limit of around 300,000 calls per day (base 100k plus 200 × 1k). Professional Edition (if API access is enabled) uses a similar calculation, at 1,000 calls per user.
  • Unlimited & Performance Editions: Higher-tier editions come with much more generous limits. A Performance or Unlimited license often provides around 5,000 calls per user per day, on top of the same base allocation (commonly 100k). Thus, 200 Unlimited users could yield on the order of 1.1 million calls per day (base 100k plus 200 × 5k). Salesforce also offers API call add-on packs for purchase in some cases, which can increase the 24-hour cap if an org needs more than its licenses provide.
  • Developer Edition/Sandboxes: Developer organizations (and trial organizations) have a flat limit (e.g., 15,000 calls per 24 hours in total) since they are for testing purposes. Full-copy sandboxes often have very high limits (e.g., 5 million) to facilitate testing at scale.

It's important to note that the API call limit is org-wide: all calls from all integration users count against the same pool. Salesforce counts any call to the SOAP, REST, Bulk, or other platform APIs against this limit, with a few specific exceptions (such as outbound Apex callouts to external services) that do not count towards it. Both incoming calls (e.g., middleware querying Salesforce) and outgoing calls from Salesforce (e.g., Apex making a callout to another API) can be subject to limits; however, the daily cap discussed here refers to inbound use of the Salesforce APIs by clients.

Concurrent API Calls: In addition to the daily volume limit, Salesforce enforces a limit on concurrent long-running API requests. Specifically, if more than 25 API calls are executing in parallel (lasting 20 seconds or longer) in a production organization, additional calls will be rejected until the concurrency level drops.

This typically affects scenarios such as multiple heavy data load jobs or complex queries being executed simultaneously. For shorter requests, hitting the concurrency ceiling is less common, but it's a consideration if you have many simultaneous integrations. (Developer Edition orgs have a lower concurrent limit of 5.) A simple client-side guard is sketched below.
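
For illustration, here is a minimal client-side guard. This is a sketch, assuming a Python client using the requests library; the cap of 20 is an arbitrary safety margin below the 25 concurrent long-running call limit:

```python
import threading

import requests

# Allow at most 20 in-flight Salesforce calls, leaving headroom under the
# platform's limit of 25 concurrent long-running requests.
SF_CONCURRENCY = threading.BoundedSemaphore(20)

def call_salesforce(url, headers):
    with SF_CONCURRENCY:  # blocks until a slot frees up
        return requests.get(url, headers=headers, timeout=30)
```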

There are also separate limits for specific APIs (e.g., the Bulk API allows up to 15,000 batches per rolling 24 hours). The Streaming API has limits on events and subscribers; those are beyond our scope here, but worth noting if you use those channels.

What Happens When API Limits Are Exceeded: If an organization exceeds the allowed API calls in 24 hours, Salesforce will begin to block further API requests until the usage falls back under the limit. Each hour, usage from exactly 24 hours ago "rolls off". In practical terms, your integrations may start failing to connect; the APIs will return errors (such as a REQUEST_LIMIT_EXCEEDED error code) indicating the limit has been reached. Salesforce may allow some leeway for small overages during spikes, but there is a hard cap in place to protect the platform.

This means if you hit the daily maximum, all integrations, from customer mobile apps to middleware to data sync jobs, could be halted for hours. Hitting the concurrency limit will similarly result in some calls being rejected with an error if too many run at once.

In either case, critical business processes can be disrupted. For example, if an e-commerce site canโ€™t create new leads in Salesforce due to API limit errors, it could impact sales follow-ups. Therefore, monitoring and managing API consumption is as important as managing data storage for a Salesforce-driven enterprise.
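
As a defensive pattern, integrations can back off and retry when they hit the cap. A minimal sketch, assuming Python with the requests library and the REST API's REQUEST_LIMIT_EXCEEDED error code; tune the wait times to your own middleware conventions:

```python
import time

import requests

def get_with_backoff(url, headers, max_retries=5):
    """Retry with exponential backoff when the org's daily API limit is hit."""
    for attempt in range(max_retries):
        resp = requests.get(url, headers=headers, timeout=30)
        if "REQUEST_LIMIT_EXCEEDED" in resp.text:
            # Usage rolls off hourly, so waiting (rather than hammering the
            # API) is the only productive response.
            time.sleep(min(60 * 2 ** attempt, 3600))
            continue
        resp.raise_for_status()
        return resp
    raise RuntimeError("Salesforce API limit still exceeded after retries")
```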

Monitoring Usage with Salesforce Tools and Analytics

Keeping a close eye on storage and API consumption is the first step in proactive management. Salesforce provides native tools to monitor these limits, and third-party solutions can enhance visibility.

Storage Usage Monitoring: Salesforce's Storage Usage page, located in Setup, breaks down both data and file storage usage. CIOs should ensure administrators regularly review this page to see how close the org is to its limits and which objects consume the most space. The page itemizes storage by object and file type, helping identify large contributors (e.g., a custom object with millions of records, or massive email attachments).

Salesforce can also display Big Objects usage here (Big Objects are a special storage mechanism discussed later). Setting up reports or dashboards based on this data allows trending over time, e.g., storage used last month vs. this month, to catch growth patterns. Native Salesforce Optimizer reports (or the newer org health features) may also flag unusually high storage usage.

API Usage Monitoring: For API calls, Salesforce offers a few native tracking methods:

  • System Overview: In Setup's System Overview (or Company Information in Classic), you can see a summary of "API Requests, Last 24 Hours," which shows how many API calls have been used in the past 24 hours and the maximum allowed. This is a quick health check to see if you're nearing the daily cap. For example, it might show "API Requests, Last 24 Hours: 14,000 (out of 15,000)", indicating you are at 93% of your limit.
  • Usage-Based Entitlements: The Company Information page also lists "API Usage (Last 7 Days)" and even a monthly API call usage metric. This helps you understand longer-term consumption trends, which is useful for capacity planning. If you have 450,000 API calls allowed per 30 days (as an example contract entitlement) and you see 400,000 used, that's a warning sign.
  • Event Log Files / Monitoring API: If your organization has Salesforce's Event Monitoring (part of Shield or a paid add-on), you can get detailed logs of API calls. Each API call can be recorded as an event, including details such as the user, URI called, and timestamp. These logs can be exported and analyzed (e.g., in Splunk or other SIEM tools) to identify which integrations or users are making the most calls.
  • Custom Monitoring via API: Ironically, you can use an API to monitor the API. Salesforce provides a REST endpoint (/services/data/v<version>/limits) that returns the current usage and limits for APIs. An external script can poll this periodically to trigger alerts (e.g., send an email if more than 80% of the daily API limit is used); a minimal polling sketch follows this list. Salesforce also has an API usage report that can be created in the Reports tab (the "API calls made within the last 7 days" report). However, certain internal calls, such as those from the Salesforce mobile app, may not appear or count against the limit.
  • Third-Party API Monitoring: Many integration platforms (such as MuleSoft and Boomi) or IT monitoring systems can track Salesforce API responses and volumes. CIOs can leverage these to set up alerts; for instance, if an integration receives an API limit exceeded error, have it notify IT immediately. Some AppExchange tools also specialize in org monitoring, providing admin dashboards for API usage and even forecasting when you might hit limits.
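
As a minimal sketch of the custom-monitoring approach above (assuming you already hold an OAuth access token and your instance URL; the 80% threshold and the alert action are placeholders to adapt):

```python
import os

import requests

INSTANCE_URL = os.environ["SF_INSTANCE_URL"]  # e.g. https://yourorg.my.salesforce.com
ACCESS_TOKEN = os.environ["SF_ACCESS_TOKEN"]  # obtained via your OAuth flow
THRESHOLD = 0.80  # alert at 80% of the daily allowance

def check_api_limits():
    resp = requests.get(
        f"{INSTANCE_URL}/services/data/v60.0/limits",
        headers={"Authorization": f"Bearer {ACCESS_TOKEN}"},
        timeout=30,
    )
    resp.raise_for_status()
    daily = resp.json()["DailyApiRequests"]
    used = daily["Max"] - daily["Remaining"]
    usage = used / daily["Max"]
    print(f"Daily API requests: {used:,} of {daily['Max']:,} ({usage:.0%})")
    if usage >= THRESHOLD:
        # Replace with your real alerting channel (email, Slack, PagerDuty...).
        print("WARNING: API usage above threshold; notify the integration team")

if __name__ == "__main__":
    check_api_limits()
```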

Tracking Storage Growth: In addition to the Salesforce Setup pages, consider implementing internal processes to track storage, such as a monthly report of total records by object, or using Salesforce's Weekly Export service to measure data size.

Some third-party apps can analyze your org for "data bloat," identifying large fields, old records, or big attachments that consume space. Salesforce's Data Export (local backup) files can reveal the total size of each object's data extract, which is another way to gauge where the bulk is.

Why Monitor? Proactive monitoring allows you to take action before limits become a crisis. For example, suppose you notice that API call usage is growing 10% month-over-month and is already at 70% of the limit. You can then investigate what's driving it (perhaps a new integration or changed user behavior) and mitigate it before the cap is hit.

Likewise, monitoring storage trends might reveal that a marketing automation integration is generating thousands of campaign records weekly, providing an opportunity to archive old campaigns and free up space. Leading organizations often set internal thresholds (e.g., alerting when more than 85% of storage is used, or when daily API usage exceeds 90% at any point) so the IT team can react immediately.

Strategies for Managing and Optimizing Storage Usage

Storage management in Salesforce requires a balance between retaining critical data for users and offloading or deleting what you don't need in the live system.

Below are strategies to optimize storage usage and avoid excessive costs:

1. Data Archiving and Lifecycle Management: Implement a data retention policy for Salesforce records. Not all data needs to be stored forever in the primary organization. Identify data that is older or seldom accessed (e.g., cases closed more than 5 years ago, inactive customer records, or completed orders from a long time ago) and archive it outside of core Salesforce. Archiving can be done in several ways:

  • Big Objects: Salesforce Big Objects are a native feature designed to store extremely large volumes of data, including billions of records, outside of standard object storage limits. Big Objects don't count against your normal data storage entitlements. They are ideal for storing historical data that you rarely query in real time. For example, you might move old opportunity records or log data into a custom Big Object. Caveat: Big Objects have limitations; they require defining an index, and you can only query them via async SOQL or custom code, not standard reports. But as an archival repository accessible within Salesforce's platform, they are powerful. Use Big Objects when you want to keep data "in Salesforce" logically but don't need it in everyday operations. Ensure your team understands the access and performance trade-offs.
  • External Archives (Off-Platform): A common approach is to extract and store data in an external database or data lake (e.g., AWS S3, Azure, on-premises SQL) once it reaches a certain age threshold. You can periodically export old records (along with their attachments) and then delete them from Salesforce to free up space. If needed, users can access the archived data via an external system or on demand via integrations. Some third-party solutions (e.g., archival tools like DataArchiva and OwnBackup Archiver) facilitate this by automating the transfer of Salesforce records to external storage, while preserving referential integrity and providing a user interface to fetch archived records on demand. Data archiving solutions can significantly reduce costs by moving data to cloud storage, which costs cents per GB, versus keeping it in Salesforce at $250/GB.
  • Salesforce Connect (External Objects): If you need real-time access to archived data in Salesforce, consider using External Objects via Salesforce Connect. This feature allows you to link Salesforce to an external database table, so the data remains in the external system but appears as a read-only object in Salesforce. For instance, instead of storing 50 million order records in Salesforce, you could keep them in an AWS database and use an External Object to allow users to retrieve or report on an order when needed. External Objects don't consume Salesforce storage since the data is not actually in the org. The trade-off is slightly slower access (data is fetched on the fly) and the need for the external system to be available. This approach works well for large datasets that users only occasionally need to reference through Salesforce.
  • Native Data Retention Tools: Leverage Salesforce features like Auto-Archiving of activities. Salesforce automatically archives events and tasks after a certain period, so they no longer count against storage, although they can still be reported on for some time. Similarly, consider using analytical snapshots or summary objects to retain key metrics and archive the raw detail. For example, store only annual aggregates of old data in Salesforce and remove the transactional details.
  • Backup then Delete: Ensure you back up data before deletion (using Salesforce's Weekly Export or a third-party backup) so you have the information offline. Then delete records that are no longer needed. This straightforward approach (export then delete) can free space; a minimal sketch follows this list. For example, some orgs routinely delete cases or logs older than X years after backing them up. Deletion should follow corporate data retention policies and compliance requirements, being mindful of any records that must be retained for regulatory reasons.
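
Here is a minimal "export then delete" sketch, assuming Python with the requests library, an existing OAuth token, and closed Cases older than a cutoff as the example policy; real jobs should verify backups before deleting and use the Bulk API for volume:

```python
import csv

import requests

API = "v60.0"
FIELDS = ["Id", "CaseNumber", "Subject", "ClosedDate"]

def archive_old_cases(instance_url, token, cutoff="2019-01-01T00:00:00Z"):
    headers = {"Authorization": f"Bearer {token}"}
    soql = (f"SELECT {', '.join(FIELDS)} FROM Case "
            f"WHERE IsClosed = true AND ClosedDate < {cutoff}")
    url = f"{instance_url}/services/data/{API}/query/?q={requests.utils.quote(soql)}"

    # Page through all matching records.
    rows = []
    while url:
        resp = requests.get(url, headers=headers, timeout=60)
        resp.raise_for_status()
        page = resp.json()
        rows.extend(page["records"])
        url = instance_url + page["nextRecordsUrl"] if not page["done"] else None

    # 1) Back up to CSV (in practice: push to durable, off-platform storage).
    with open("archived_cases.csv", "w", newline="") as f:
        writer = csv.DictWriter(f, fieldnames=FIELDS)
        writer.writeheader()
        for r in rows:
            writer.writerow({k: r.get(k) for k in FIELDS})

    # 2) Delete only after the backup is verified. Per-record deletes burn
    #    one API call each; switch to the Bulk API for large volumes.
    for r in rows:
        requests.delete(
            f"{instance_url}/services/data/{API}/sobjects/Case/{r['Id']}",
            headers=headers, timeout=30,
        ).raise_for_status()
```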

2. File Storage Optimization: Files, including attachments and documents, can consume storage quickly, especially when users attach large presentations, images, or scans to records. Strategies to manage file storage:

  • External File Repositories: Rather than storing files in Salesforce, integrate Salesforce with an external content management system (CMS) or cloud storage. For example, users could upload heavy files to SharePoint, Google Drive, Box, or AWS S3 and then store the link or reference in Salesforce. Salesforce offers Files Connect, which allows certain external repositories to appear within Salesforce. This way, the file is effectively offloaded (no $5/GB charges), but users can still access it through the CRM interface. Third-party tools like XfilesPro (the source of the earlier cost data) specialize in seamlessly linking Salesforce to external file storage.
  • Attachment Offloading: If a formal integration is not in place, an admin can periodically extract and purge old attachments. Many organizations find that email attachments synced from Outlook, or email-to-case attachments, accumulate. You might set a policy that any attachment older than 3 years is moved out. A script or tool can download these files to a server or cloud bucket, then remove them from Salesforce (see the sketch after this list). Store a metadata record or URL in case users need to retrieve the file later.
  • File Compression or Formats: In cases where files must remain in Salesforce, encourage efficient file usage, e.g., upload PDFs (usually smaller) instead of raw Word documents, compress images, and so on. This is a minor factor, but it helps in the long run.
  • Content Document Management: If using Salesforce CRM Content or Files, ensure users understand versioning; multiple versions of a file each consume storage. Train users to delete old versions or avoid unnecessary versions of large files. There are also platform limits (such as a maximum of 30 million files and a 2 GB per-file upload limit) to be aware of in massive content use cases.
  • Chatter and Email Attachment Settings: Salesforce has settings to restrict file size uploads and to decide whether emails sent through Salesforce include attachments as Salesforce Files or just links. Tuning these settings can prevent inadvertent storage bloat. For instance, using Enhanced Email, where attachments are stored as Files, might double-store them as both email messages and files; you could opt to store one or the other.
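
For the attachment-offloading idea above, a hedged sketch (assuming Python with requests and an OAuth token; the ContentDocumentId comes from your own selection query, and production code should verify the copy and record an external URL on the related record before deleting):

```python
import requests

API = "v60.0"

def offload_file(instance_url, token, content_document_id, dest_path):
    headers = {"Authorization": f"Bearer {token}"}

    # Locate the latest version of the document.
    soql = ("SELECT Id FROM ContentVersion WHERE ContentDocumentId = "
            f"'{content_document_id}' AND IsLatest = true")
    r = requests.get(f"{instance_url}/services/data/{API}/query/",
                     params={"q": soql}, headers=headers, timeout=60)
    r.raise_for_status()
    version_id = r.json()["records"][0]["Id"]

    # Download the binary content (VersionData) to local or cloud storage.
    data = requests.get(
        f"{instance_url}/services/data/{API}/sobjects/ContentVersion/"
        f"{version_id}/VersionData",
        headers=headers, timeout=300)
    data.raise_for_status()
    with open(dest_path, "wb") as f:
        f.write(data.content)

    # Remove the file from Salesforce only after the copy is verified.
    requests.delete(
        f"{instance_url}/services/data/{API}/sobjects/ContentDocument/"
        f"{content_document_id}",
        headers=headers, timeout=60,
    ).raise_for_status()
```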

3. Clean Up Data Regularly: Prevention is better than a cure. Establish governance where admins (or a data stewardship team) routinely clean up the org:

  • Remove or reduce debug logs, archived platform events, or other technical records that consume storage (some technical logs may count towards data storage if not automatically purged).
  • Use data cleansing to delete duplicate records (no reason to store the same customer twice).
  • Trim unnecessary fields: for example, large text fields used for one-time data loads can be cleared if they are no longer needed.
  • Watch out for managed packages or AppExchange apps that store a lot of custom object data; if an app is uninstalled, ensure its custom objects and records are removed, as they can linger and consume space.

4. Monitor High-Growth Elements: Use the monitoring methods discussed to identify whatโ€™s driving storage growth. Sometimes, a particular integration is creating hundreds of thousands of records (e.g., a logging or tracking object). Once identified, you can decide if that data needs to be in a different place. Often, logging data is sent to an external system, leaving Salesforce to handle core CRM data.

By applying these strategies, companies can often stay within their included storage, or at least minimize the need for costly add-on purchases. A combination of archiving, deletion, and external storage will yield the best results: keep Salesforce lean and performant by retaining only what is actively useful on the platform. As SalesforceBen's advice succinctly put it: delete what you don't need, buy extra storage only when necessary, or better yet, archive to cheaper storage to "trim the bloat" while keeping data accessible.

Strategies for Managing and Optimizing API Usage

Given the importance of integrations and connected applications, API calls are the lifeblood of Salesforce's connectivity to other systems. Managing API usage is about ensuring vital integrations stay within limits and perform well.

Here are strategies to optimize API consumption and avoid hitting those caps:

1. Profile and Understand Your API Usage: Start by getting visibility into which systems or processes consume the most API calls. For example, you might find that a nightly ETL job accounts for 50% of daily calls, or that a custom mobile app used by field agents is polling Salesforce frequently. By mapping out usage, you can target optimizations to the biggest consumers.

Salesforce's Event Log or API usage reports (as mentioned) can help attribute calls to users or integrations. Often, each integration uses a distinct user account. Checking "API Calls Made by User" can pinpoint, for example, that the "SAP interface user" made 30,000 calls yesterday, whereas others made 5,000. This guides where to focus.

2. Optimize Integration Design (Reduce Call Frequency): A common cause of API overuse is chatty integrations, which make excessive calls individually rather than in bulk. Work with your integration architects to apply best practices:

  • Batch and Bulk APIs: Instead of inserting or updating records one by one through the SOAP/REST API, use the Bulk API or batch operations. The Bulk API allows you to send large data batches (up to 10,000 records per batch) in a single request. This dramatically cuts down the call count when loading large volumes. For example, inserting 100,000 records via REST in batches of 100 requires 1,000 calls, whereas the Bulk API could do it in approximately 10 batches (calls). Use the Bulk API for migrations, periodic data syncs, or any scenario with high volumes. Keep in mind that the Bulk API has its own limits (up to 15,000 batches per day, shared between Bulk v1 and v2), but that's usually plenty, and the efficiency gains are huge.
  • Composite API: Salesforce offers a Composite REST API that allows multiple subrequests in a single call. For example, you can retrieve several related records in one round trip or perform a series of actions in a single request. Using composite patterns reduces the number of round-trip calls. For instance, a mobile app that needs to load an account and its contacts can make one composite call that fetches both together instead of separate API calls (see the sketch after this list). Fewer calls mean less chance of hitting limits (and better performance).
  • Caching and Data Sync: If external systems need frequent access to Salesforce data, consider caching commonly used data on the external side to avoid constant API reads. For example, suppose an e-commerce site frequently queries product info from Salesforce. In that case, it might cache that information in its database and refresh it via API periodically (or when changes occur) instead of on every request. Similarly, use a publish/subscribe model (see the next point) to push data to systems when it changes, rather than having them poll Salesforce repeatedly.
  • Optimize Queries: Craft SOQL queries and API requests to retrieve exactly what's needed, no more. Retrieving large datasets or too many fields not only consumes API calls but can also slow the responses (and potentially hit other limits, such as heap size or query row limits). Ensure that integrations use selective queries (with filters) that leverage efficient indexes in Salesforce to avoid long-running requests. This helps prevent hitting the 20-second threshold that flags a call as long-running and counts it toward the 25-concurrent limit. The faster each API call completes, the less chance that many pile up concurrently.
  • Throttling and Queueing: Implement client-side throttling in your integration middleware or API clients. For example, if you approach, say, 90% of the daily limit, the integration can slow down non-critical transactions or queue them for later to avoid hitting the hard cap. Many integration tools have throttle settings precisely for Salesforce usage governance. At minimum, design integrations to handle "API limit exceeded" errors gracefully, e.g., back off and retry after some time rather than continuously hitting a locked door.
  • Use the Right API for the Job: Salesforce offers various APIs, including REST, SOAP, Bulk, GraphQL, Streaming, and event-based channels. Use streaming or event-driven approaches for scenarios that would otherwise require constant polling. For example, instead of an external system polling Salesforce every 5 minutes to check for updated records (which costs API calls even if nothing changed), you can use Platform Events or Change Data Capture to have Salesforce push a notification/event when a change happens. The external system subscribes to these events (using CometD or a similar technology, which has its own limits but is generally more efficient for such use cases) and thus avoids polling entirely. This event-driven integration can significantly reduce API usage for high-frequency check-ins.
  • Parallelism vs. Serial: While you want to reduce calls, sometimes making calls concurrently improves throughput. However, be mindful of the 25-concurrent limit. If you have multiple parallel integration threads, limit them to no more than 25 active calls to Salesforce at a time; in a multi-threaded client, a semaphore or connection pool capped at 25 (or lower, for safety) ensures you don't exceed the guideline. If a job involves very long-running queries, consider running them sequentially or breaking the query into smaller chunks so each completes faster and avoids tying up a slot for more than 20 seconds.
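
A minimal Composite API sketch for the account-plus-contacts example above (assuming Python with requests and an OAuth token; the whole composite request consumes a single API call):

```python
import requests

API = "v60.0"

def fetch_account_with_contacts(instance_url, token, account_id):
    body = {
        "compositeRequest": [
            {
                "method": "GET",
                "url": f"/services/data/{API}/sobjects/Account/{account_id}",
                "referenceId": "account",
            },
            {
                "method": "GET",
                "url": (f"/services/data/{API}/query/?q=SELECT+Id,Name+FROM"
                        f"+Contact+WHERE+AccountId='{account_id}'"),
                "referenceId": "contacts",
            },
        ]
    }
    resp = requests.post(
        f"{instance_url}/services/data/{API}/composite",
        json=body,
        headers={"Authorization": f"Bearer {token}"},
        timeout=30,
    )
    resp.raise_for_status()
    return resp.json()  # two results, one API call
```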

3. Govern and Prioritize API Consumers: Not all API usage is equally critical. A CIO should establish governance where mission-critical integrations get priority on API usage, and non-critical ones might be limited or scheduled during off-peak hours:

  • Schedule Batch Integrations Wisely: If you have heavy data sync jobs, such as nightly bulk data loads or morning batch updates, schedule them for times when interactive use is low. The API limit is a rolling 24-hour window, but you also don't want, for example, a nightly job consuming 80% of your calls and leaving too little for daytime integrations. Distribute large jobs over time, e.g., break a nightly 100k-record job into chunks that run hourly throughout the day, if feasible, to smooth usage.
  • Integration User Separation: Use separate integration user accounts for different systems. While this doesn't increase the overall limit, it helps isolate and monitor usage per integration. If one integration user consistently shows heavy usage, you can take targeted action (such as optimization or, if necessary, moving it to a different org). Also, Salesforce imposes some per-user concurrency limits (for example, a single user can only have 25 concurrent queries), so splitting integrations across multiple users can help avoid a bottleneck on one account.
  • Prioritize API Access: If you’re nearing limits, decide which services can continue and which can be paused. For instance, internal batch exports might be paused if more important real-time partner integrations need the capacity. Such decisions should be part of an integration policy, much like managing network bandwidth.

4. Consider API Capacity in Licensing Decisions: If your business relies heavily on integrations (for example, a digital ecosystem with many API calls from customer apps or an IoT scenario that feeds data into Salesforce), factor this into your Salesforce edition and licensing choices. It might be more cost-effective to choose a higher edition or additional licenses that increase API limits, rather than constantly working around the limits or buying emergency add-ons. For example, if you're on Enterprise Edition with 200 users making around 200,000 calls a day and growth is projected, upgrading to Unlimited (with 5,000 calls per user) could give roughly a 5x boost in API capacity, albeit at a higher per-user cost. This trade-off needs analysis: sometimes the premium for a higher edition is less than the cost of losing business to API constraints or building complex workarounds. (We will cover negotiating for API capacity in the next section.)

5. Leverage Middleware and Integration Platforms: An enterprise iPaaS (Integration Platform as a Service) or middleware can act as a buffer and brain for API management:

  • Many integration platforms can aggregate and transform data, reducing the need for calls. For instance, the middleware could combine data from multiple Salesforce API calls and present it in a single payload to downstream systems, rather than having those systems individually call Salesforce.
  • Middleware can also cache data and implement retries or backoffs if Salesforce returns limit errors.
  • If you have multiple systems that need the same data, the middleware can make a single Salesforce call and distribute the result, instead of each system calling Salesforce separately.
  • Essentially, a smart integration layer helps avoid redundant API traffic and centralizes control over how and when Salesforce is invoked.

6. Monitor API Usage Continuously: As mentioned in monitoring, ensure that someone (or an automated system) is constantly tracking API consumption. Sudden spikes can indicate a malfunctioning integration (e.g., a loop that continuously calls the API due to a bug). Catching that early not only prevents hitting limits but also allows for quick remediation of the underlying issue. IT operations teams should include Salesforce API usage in their system health dashboards.

By applying these API optimization strategies, organizations can often stay well under their daily limits even as integration needs grow. The key themes are efficiency (doing more with fewer calls) and governance (controlling who and what can call, and when). This ensures your Salesforce org's connectivity scales without service interruptions or the need for constant manual intervention.

Negotiating for Better Storage and API Entitlements

A crucial aspect often overlooked until it's too late is the opportunity to negotiate higher limits or more favorable terms upfront, during your Salesforce licensing agreement discussions or renewals. Salesforce is known for monetizing additional usage; as one advisor put it, Salesforce can "nickel-and-dime" you on things like support, storage, and API calls if they are not pre-negotiated. A savvy CIO will anticipate future needs and address them in the contract, rather than paying list prices later under duress.

1. Assess and Project Needs Before Negotiations: Before renewing or making an initial purchase, analyze your data growth trends and integration roadmap. If you suspect that in 2-3 years you'll have double the data, or new systems that will heavily utilize the API, quantify that. For example, if you have 500 GB of historical data that eventually needs to reside in or be accessible to Salesforce, that's far above the default storage, and something must be done. Or if a new mobile app launch will dramatically increase API calls, estimate those volumes. Armed with this data, you can approach Salesforce for solutions, either through product choices or contract terms.

2. Negotiate Extra Storage at a Discount (or Free): It's often possible to get additional storage bundled at a reduced cost as part of a large deal. Rather than waiting and buying 500 MB at a time for $125, ask Salesforce sales if they can include, say, an extra 50 GB of data storage in the agreement for a flat fee or as a goodwill add-on. They may not give it for free, but even a 50% discount or a one-time fee (instead of a recurring one) would save you significant money in the long term. At minimum, negotiate the rate: get a lower $/MB rate locked in for any future overages. It's better to have that in writing now than later when you have no leverage. Salesforce reps have some flexibility, especially if they know storage could be a blocker for your adoption. Call out your storage concerns in negotiations; many customers forget and later get hit with unexpected "extra data storage" charges.

3. Address API Limits Upfront: Standard API limits are usually sufficient for typical CRM usage, but if you know you'll push the envelope, discuss it. There are a few avenues:

  • See if Salesforce will formally increase the API limit for your org. They generally tie it to licenses (more licenses = more calls), but large customers have sometimes negotiated special entitlements. Salesforce does offer an "API Calls Add-On" pack in some pricing sheets, which might, for example, allow an extra X thousand calls per day for a fee. Find out whether that's available and get pricing. As a CIO, compare the cost of that add-on vs. adding more user licenses to boost the API pool (some customers have cleverly bought cheap platform licenses just to raise the limit, as community experts have hinted).
  • If considering a higher edition primarily for API needs, use that as a bargaining chip: "We might need to upgrade to Unlimited for API limits, but that's a big cost jump; can Salesforce offer a middle ground, like an API increase on Enterprise Edition for a fee?" This signals your need without committing to the highest-cost option immediately.
  • Ensure the contract language allows flexibility, e.g., if you consistently hit limits, will Salesforce proactively alert you and work on a solution, or could they throttle you without notice? It's rare, but being clear on expectations is part of good vendor management.

4. Co-term and Bundle Wisely: If you are negotiating multiple Salesforce products (such as Sales Cloud and Marketing Cloud), consider their interplay. Sometimes storing data outside core Salesforce (in, say, Heroku or Marketing Cloud's data extensions) can be an alternative, so leverage the entire Salesforce stack in negotiations. But be cautious with co-terming contracts where it would weaken your negotiating leverage. You might be able to secure a concession on storage if you're also buying additional products or higher support levels.

5. Don't Forget Support Plans and Hidden Costs: While beyond the scope of storage/API specifically, note that premium support is another area where Salesforce might upsell (20% of the net cost for Premier support by default). Ensure you negotiate support and other "hidden" cost items (such as sandboxes) at the same time. Sometimes Salesforce might trade off, e.g., include extra storage if you agree to a certain support level, or vice versa. As a CIO, weigh these trade-offs strategically.

6. Document Everything: If a Salesforce account executive verbally assures you, "Oh, if you hit API limits, just call me and we'll sort it out," get that in writing (e.g., in an addendum or at least an email). Contractual protection is far better than relying on goodwill later. Negotiation is the time to lock in any promises.

In summary, anticipate growth in your Salesforce usage and use your buying power at renewal or purchase time to get what you need. A common pitfall is signing a deal that focuses only on user license cost and later finding out that you must urgently pay for more storage or API capacity at list prices.

By handling it upfront, CIOs can avoid scrambling and budget overruns. Salesforce is often amenable to these discussions if approached with data; after all, they want you to expand usage, just on their terms. Show them you've done the math, and negotiate terms that better align with your business needs. As Redress Compliance advises, don't be afraid to address storage and API needs in negotiations: if you're near limits, bring it up now (and ideally get some concessions) rather than after you sign.

Summary: Tools & Strategies for Salesforce Storage/API Management

The following summary covers the key tools and strategies discussed, with the purpose, cost, and best use case for each:

  • Salesforce Storage Usage Page (Setup). Purpose: view current data and file storage consumption by object and file type; alerts admins nearing limits. Cost: included. When to use: ongoing monitoring; check monthly or when deploying data-heavy projects to catch growth trends early.
  • System Overview (API Usage). Purpose: quick view of API calls used in the last 24 hours vs. the limit; helps gauge integration load. Cost: included. When to use: daily or weekly admin oversight, or whenever an integration seems slow, to check whether the API limit is a factor.
  • Company Information (Usage-Based Entitlements). Purpose: lists API usage over the last 7 days and monthly entitlement metrics, giving a longer-term view of consumption. Cost: included. When to use: trend analysis and reports; good for monthly CIO reviews of Salesforce utilization and capacity planning.
  • Event Monitoring (API Events). Purpose: detailed log of API calls (user, time, operation); requires Shield or an add-on. Cost: add-on license. When to use: granular usage analysis, e.g., in highly regulated or large orgs with many integrations, to identify usage by system.
  • Third-Party Monitoring Tools. Purpose: AppExchange dashboards for storage, or external APM tools; offer alerts, visuals, and possibly predictive analytics. Cost: varies (license or subscription). When to use: proactive alerts and convenience, if your team needs automated warnings (email/SMS) or cannot frequently check usage manually; also useful in complex integration architectures to centralize monitoring.
  • Data Archiving (External). Purpose: offload old data to an external database or files; frees up Salesforce storage significantly. Cost: development effort or an archiving tool, plus cheap per-GB external storage. When to use: when data volume is growing fast or a large history exists, e.g., after a few years of operations or a post-merger data import, archive closed transactions older than X years; also when Salesforce storage costs become notable in the budget.
  • Big Objects (Salesforce). Purpose: store huge volumes within the Salesforce platform without standard storage cost; keeps data accessible in the org (via API/async query). Cost: included up to a certain capacity; may require an add-on for very large volumes. When to use: Salesforce-native archiving, e.g., logging millions of events or storing historical records occasionally queried by analytics, when you prefer to keep data logically in CRM without counting against limits.
  • External File Storage Integration (e.g., Files Connect, S3, SharePoint). Purpose: save files/attachments in an external system while keeping them accessible from Salesforce records. Cost: may be included (Files Connect to SharePoint/OneDrive for some editions) or require a license (Files Connect for Google Drive, etc.), plus external storage costs. When to use: when file storage usage is high; ideal for document-heavy processes (contracts, product images); also if users regularly attach files larger than 50 MB (Salesforce supports up to ~2 GB each, but offloading is better).
  • Data Cleanup (Manual or Automated). Purpose: delete unnecessary or duplicate data, clear large fields, etc. Cost: staff time, or a tool cost if using a deduplication app. When to use: periodically (quarterly/yearly), especially after major projects or unusual storage jumps; also before purchasing extra storage, to ensure no junk data remains to clean out.
  • Bulk API for Data Loads. Purpose: handles high-volume data operations in batches, reducing call count. Cost: included (calls count toward API limits but are far more efficient per record). When to use: migrations and integrations moving more than 10k records at a time; also scheduled jobs like nightly syncs; whenever a process would otherwise make many individual record calls.
  • Composite API / GraphQL (if available). Purpose: combine multiple operations in one API call. Cost: included. When to use: chatty integrations or mobile apps, e.g., retrieving an account and related contacts in one call instead of several; great for reducing round trips in custom apps.
  • Platform Events / Change Data Capture. Purpose: push-based integration in which Salesforce publishes events on data changes, reducing the need for API polling. Cost: a limited event volume is included; higher volumes need an add-on. When to use: near-real-time sync to external systems without constant polling, e.g., keeping an external database updated by subscribing to events instead of polling every few minutes.
  • Middleware/Integration Hub. Purpose: an intermediary system (e.g., MuleSoft, Boomi, Informatica) that can queue, cache, transform, and smartly distribute calls. Cost: significant (separate software), but adds enterprise integration benefits. When to use: many integrations or complex workflows; an iPaaS is justified for large enterprises connecting Salesforce with ERP, e-commerce, and more, and helps prevent inefficient API usage by centralizing data exchange logic.
  • Throttling & Governor Policies. Purpose: policies or mechanisms to limit API call rates from clients, either custom or via middleware features. Cost: development time, or included in an integration tool. When to use: to prevent runaway processes from exceeding limits, e.g., enforcing that a partner system pulls data only hourly; include rate limiting in the design of every new integration.
  • License-Based Scale-Out (adding licenses or upgrading edition). Purpose: increase API and storage limits by adding user licenses (each adds storage and API capacity) or moving to an edition with higher allowances. Cost: license/upgrade cost (significant, though it may be needed for other reasons too). When to use: as a last resort or for strategic planning; if growth will outpace archiving and optimization, factor it into license counts or edition choice. Moving to Unlimited Edition largely solves API limit worries, but at high cost; weigh it against other approaches.
  • Negotiated Add-Ons (Contract). Purpose: pre-arranged extra storage or API call packs at negotiated rates. Cost: as per contract (ideally discounted). When to use: when renewing or signing Salesforce contracts; if identified as a need, negotiate upfront rather than react later, ensuring cost predictability and adequate capacity.

This summary should serve as a quick reference for CIOs and Salesforce platform owners deciding on the right approach as needs evolve. In practice, a combination of these tactics will be used, for example, enabling event-driven integrations to reduce API usage while simultaneously launching a data archiving project to control storage growth.

Recommendations for CIOs

In light of the above insights, here are actionable recommendations for CIOs to effectively manage Salesforce storage and API usage:

  • 1. Establish Ongoing Monitoring and Governance: Treat storage and API usage as key operational metrics. Have your Salesforce admin team (or Center of Excellence) review usage dashboards regularly. Set up alerts when usage approaches, say, 80% of the limit. Make it someone's responsibility to report on this in IT ops meetings. Early warning allows controlled remediation instead of emergency firefighting.
  • 2. Implement a Data Lifecycle Policy: Don't let Salesforce become a dumping ground for historical or irrelevant data. Define how long different data types should be retained in the live system (e.g., cases closed more than 5 years ago archived, leads with no activity for more than 3 years deleted, etc.). Enforce these policies with scheduled jobs or archiving tools. Communicate to business stakeholders that data will be archived after a certain period; this sets the expectation that Salesforce isn't an infinite storage solution. This policy will save costs and keep the organization efficient.
  • 3. Invest in Archiving Solutions: If your org has sizable data growth, evaluate archiving solutions (third-party or in-house using Big Objects or external databases). The investment in an archiving tool or project can pay for itself by avoiding Salesforce storage purchases. Archiving also improves Salesforce performance by reducing the volume of data the platform has to handle in reports and searches. Ensure that any chosen solution meets compliance requirements (e.g., data retention laws) and is secure.
  • 4. Optimize Integrations (Work Smarter, Not Harder): Work closely with your enterprise architects and integration developers to ensure all new Salesforce integrations follow best practices for efficiency. Conduct an integration architecture review for any project that will interface with Salesforce. This review should cover minimizing API calls, using bulk APIs, and respecting limits. For existing integrations, consider an audit or tuning exercise; often, a legacy integration can be refactored to make fewer calls (for instance, upgrading an old SOAP integration to a Bulk API process). Aim to reduce redundant data transfers and calls.
  • 5. Use Middleware and Caching Where Appropriate: As your Salesforce usage expands into an ecosystem of multiple systems, consider implementing integration middleware if it isn’t already in place. This adds a platform to manage, but can greatly streamline data flows and offload a lot of processing from Salesforce. Even simple measures, like caching reference data (such as products and currencies), can cut API traffic dramatically. Ensure solutions like these are part of your integration roadmap.
  • 6. Engage with Salesforce Early about Limit Increases: If you foresee a legitimate need to exceed standard limits (such as a one-time data load or a seasonal API spike), talk to your Salesforce account team proactively. In some cases, Salesforce can grant temporary limit increases or advise on strategies. For example, if a marketing campaign is expected to double API usage for a week, giving Salesforce a heads-up might allow them to accommodate it (or at least confirm the expected behavior). Don't wait until you're at 100% of your limit and everything stops; manage stakeholders by forecasting and communicating needs.
  • 7. Negotiate Contractual Safeguards: When renewing or expanding your Salesforce agreement, use the opportunity to address storage and API requirements. As noted, it's much cheaper and easier to get these included upfront than to buy piecemeal later. Even if you don't need additional capacity on day one, locking in an option or a discounted rate can protect your budget. In negotiations, be explicit: "We anticipate needing roughly X GB more storage in two years; what can Salesforce do for us as part of this renewal to cover that?" Likewise: "Our integrations are growing; can we get assurance of higher API limits or an add-on now?" Getting these in the contract prevents future headaches and shows due diligence.
  • 8. Educate Users and Stakeholders: Often, storage and API issues are exacerbated by user behaviors or misunderstandings. Guide your Salesforce user base on best practices, e.g., avoid uploading extremely large files unless necessary, and delete obsolete reports and dashboards (which can cause storage overhead). For the API, ensure that your development teams are aware of the limits and design accordingly. This might include training for external partners or vendors who connect to your Salesforce org. A well-informed community will create solutions that work within constraints rather than unknowingly breaking them.
  • 9. Plan for Scalability in Architecture: Align your Salesforce data and integration architecture with the principle of scalability. This might mean modularizing data (using external systems for data that doesn't need to be stored in Salesforce) and designing integration patterns that can scale horizontally, such as using queuing mechanisms, if the volume surges. By planning with headroom in mind, you reduce the risk of hitting a ceiling. Essentially, incorporate storage and API limit checks into your design review checklist for any major Salesforce-related build.
  • 10. Keep the Salesforce Account Team Informed: Finally, maintain an open line of communication with Salesforce. They have a vested interest in your successful use of the platform (which drives renewals and upsells). If you are transparent about your data growth and integration plans, a good account team will help find solutions, whether through technical guidance or packaging the right add-ons, before issues arise. Asking Salesforce about the roadmap can also be useful (e.g., are there upcoming features to ease handling large data volumes?). In essence, treat Salesforce as a partner in capacity planning, not just a vendor who sends an invoice when you run out of space.

By following these recommendations, CIOs can turn what is often seen as a "limit" into a manageable aspect of their Salesforce strategy.

With diligent monitoring, smart use of technology, and proactive vendor management, you can ensure your Salesforce org scales to meet business demands without unwelcome surprises or budget overruns. In the end, the goal is to keep Salesforce running smoothly for both users and integrated processes, delivering value to the business continuously, with constraints well under control.

Author
  • Fredrik Filipsson has 20 years of experience in Oracle license management, including nine years working at Oracle and 11 years as a consultant, assisting major global clients with complex Oracle licensing issues. Before his work in Oracle licensing, he gained valuable expertise in IBM, SAP, and Salesforce licensing through his time at IBM. In addition, Fredrik has played a leading role in AI initiatives and is a successful entrepreneur, co-founding Redress Compliance and several other companies.
