What the Oracle LMS Collection Tool Is — and Why It Matters
The Oracle LMS Collection Tool is a bundle of scripts — shell scripts, SQL queries, and system-level data collectors — that Oracle provides during a licence audit. The scripts are designed to run on every server in your environment where Oracle software is installed, collecting comprehensive data about what Oracle products are present, how they are configured, what features have been used, and the hardware characteristics of the underlying infrastructure.
Originally developed by Oracle's Licence Management Services (LMS) division, the tool is now maintained by Oracle's Global Licensing and Advisory Services (GLAS) organisation. Despite Oracle's rebranding from "audit" to "advisory," the tool's purpose is unchanged: to produce the evidence Oracle uses to identify compliance gaps and generate licence revenue. In our experience across 500+ Oracle audit engagements, the LMS Collection Tool output is the single most consequential document in any Oracle audit — it determines the starting position from which all compliance negotiations proceed.
Oracle's standard audit clause — updated in recent years — explicitly requires customers to run Oracle's "data measurement tools" and provide the results. Refusing to run the LMS scripts is not a practical option under most Oracle contracts. However, you have the right to review the output before submitting it to Oracle, to run the scripts during business-appropriate maintenance windows, to use Oracle's masking tool to anonymise sensitive identifiers, and to challenge any findings that result from incorrect script interpretation. Understanding these rights is as important as understanding the scripts themselves.
What the LMS Scripts Collect: Product-by-Product Breakdown
The LMS Collection Tool is not a single script but a modular toolkit with specialised collectors for each Oracle product family. Understanding what each module collects is essential for anticipating what Oracle will see — and what compliance claims will follow.
| Script Module | Products Covered | Data Collected | Why It Matters for Compliance |
|---|---|---|---|
| Database Collection | Oracle Database (all editions), Database Options, Management Packs | Installed edition and version; all options and features installed (whether licensed or not); DBA_FEATURE_USAGE_STATISTICS (cumulative feature usage history); physical CPU count, core count, socket count; virtualisation configuration; cluster membership; listener and TNS configuration | Identifies every database option and management pack ever used — even once, even briefly. This is the source of 60–70% of audit findings by value. |
| Middleware Collection | WebLogic Server, Tuxedo, SOA Suite, Forms, Reports, HTTP Server | WebLogic domain configurations; server instances and cluster membership; feature usage (JMS, Coherence, clustering); deployed applications; JVM parameters and classpath entries | Detects Enterprise Edition features used on Standard Edition licences; identifies WebLogic installations that may not have been recognised as requiring licences |
| Application Collection | E-Business Suite, PeopleSoft, JD Edwards, Siebel, Hyperion | Application version and modules deployed; user accounts and role assignments; concurrent session data; module-specific usage metrics | Counts application users against licensed user quantities; identifies modules in use that may not be licensed |
| Hardware / OS Collection | All platforms (Linux, Solaris, AIX, Windows, HP-UX) | Operating system version; physical and logical CPU topology (sockets, cores, threads); memory; hostname; virtualisation hypervisor and configuration (for detecting VMware/Hyper-V/KVM) | Provides the hardware data Oracle uses to calculate processor licence requirements — core counts and core factors. Virtualisation detection triggers the soft/hard partitioning analysis. |
| Java Discovery | Oracle Java SE (JDK/JRE) | All Java installations by version and distribution (Oracle vs OpenJDK); installation paths; Java process inventory; application server Java configurations | Identifies Oracle Java SE installations that trigger the Universal Subscription requirement; distinguishes between Oracle Java (licensed) and OpenJDK (free) |
| VMware / Virtualisation | VMware vSphere, Hyper-V, KVM, OVM | Cluster membership; host inventory; physical core counts per host; VM-to-host mapping; vMotion configuration; resource pool assignments | Oracle uses this data to determine whether the entire cluster must be licensed (soft partitioning) or only the allocated partition (hard partitioning). This is the single highest-value data point in most audits. |
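The hardware and virtualisation rows above feed directly into Oracle's processor-licence arithmetic: physical cores multiplied by the published core factor (0.5 for most Intel x86 processors), rounded up, and, under the soft-partitioning policy, summed across every host in the cluster. The sketch below illustrates that calculation; the function names and cluster figures are invented for illustration and are not Oracle's actual tooling:

```python
import math

def processor_licences(physical_cores: int, core_factor: float) -> int:
    """Oracle processor licences = physical cores x core factor, rounded up."""
    return math.ceil(physical_cores * core_factor)

def cluster_licences_soft_partitioning(hosts: list, core_factor: float) -> int:
    """Under Oracle's soft-partitioning policy, every physical core on every
    host in a shared VMware/Hyper-V cluster is counted, regardless of which
    host the Oracle VM is actually pinned to."""
    total_cores = sum(h["cores"] for h in hosts)
    return processor_licences(total_cores, core_factor)

# Illustrative cluster: four 2-socket hosts, 32 physical cores each, Intel x86 (factor 0.5)
cluster = [{"host": f"esx{i}", "cores": 32} for i in range(1, 5)]
print(cluster_licences_soft_partitioning(cluster, 0.5))  # prints: 64
```

At $47,500 list per processor for Database EE, the gap between licensing one pinned 32-core host (16 licences) and the whole cluster (64 licences) is roughly $2.3M at list, which is why the VM-to-host mapping rows are the highest-value data points in the collection.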
The Six Most Common LMS Script Findings — and Their Cost Impact
Across hundreds of Oracle audit engagements, the same categories of findings appear repeatedly. Understanding these patterns allows you to assess your own exposure before Oracle does.
| Finding Category | What the Scripts Detect | Typical Cost Impact | Frequency in Audits |
|---|---|---|---|
| 1. Database Options & Management Packs | DBA_FEATURE_USAGE_STATISTICS shows usage of separately licensed features — Partitioning, Advanced Compression, Advanced Security, Diagnostics Pack, Tuning Pack, OLAP, Spatial, Label Security, Data Mining | $500K–$10M+ (each option $5,000–$23,000 per processor × entire database estate) | Very High — present in 70–80% of audits |
| 2. Virtualisation Under-Licensing | VMware/Hyper-V cluster data showing Oracle VMs in shared clusters where Oracle asserts all physical cores across all hosts require licensing | $1M–$20M+ (licensing entire clusters at $47,500/processor for Database EE) | Very High — present in 60–70% of audits involving virtualisation |
| 3. Processor/Core Miscounting | Physical CPU topology showing more cores than the customer licensed — wrong core factor applied, hyper-threading confusion, multi-socket servers under-counted | $500K–$3M (per-processor shortfall × product list price) | High — present in 40–50% of audits |
| 4. Unlicensed Oracle Installations | Oracle Database, WebLogic, or other products installed on servers with no corresponding licence entitlement — test/dev instances, shadow IT installations, acquired company systems | $200K–$5M+ (full licence cost plus back-support at 22%/year) | High — present in 50–60% of audits |
| 5. Java SE Installations | Oracle Java SE (JDK/JRE) deployed across servers and desktops without Universal Subscription coverage | $500K–$2M+/year (employee-based subscription at $6.75–$15/employee/month) | Growing — increasingly targeted by Oracle's dedicated Java compliance team |
| 6. Named User Plus Under-Counting | Application access analysis showing more users accessing Oracle through multiplexing layers than the customer has licensed, or NUP counts below the per-processor minimum | $200K–$3M ($950/NUP for Database EE × undercounted users) | Medium — present in 30–40% of audits |
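Finding 6 reduces to a simple rule: Database EE carries a minimum of 25 Named User Plus licences per processor, so the billable count is the greater of the real user population (counted through any multiplexing layer) and that floor. A minimal sketch, with invented figures:

```python
NUP_MIN_PER_PROC = 25   # Database EE minimum Named User Plus per processor
NUP_LIST_PRICE = 950    # USD list price per NUP for Database EE

def required_nup(actual_users: int, processors: int) -> int:
    """Billable NUP = max(real users incl. multiplexed access, per-processor floor)."""
    return max(actual_users, processors * NUP_MIN_PER_PROC)

# 8-processor server, 150 NUP licensed, but 410 users reach it through a web tier
shortfall = required_nup(410, 8) - 150
print(shortfall, shortfall * NUP_LIST_PRICE)  # prints: 260 247000
```

Note that the floor cuts both ways: even a lightly used 8-processor server requires 200 NUP, which is how "NUP counts below the per-processor minimum" becomes a finding on systems with few users.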
The DBA_FEATURE_USAGE_STATISTICS Trap
Oracle's database maintains a system view called DBA_FEATURE_USAGE_STATISTICS that records cumulative feature usage. This view is the single most dangerous data source in Oracle audits. It records every database option and management pack feature that has ever been accessed — even once, even accidentally, even years ago. A DBA who ran a single AWR report three years ago triggered the Diagnostics Pack usage counter. A developer who tested table partitioning in a sandbox created a Partitioning flag. A backup script that used RMAN compression activated Advanced Compression. Once the usage is recorded, it cannot be removed or reset (the counter persists across database restarts and upgrades). Oracle's position is absolute: if the feature was used, a licence is required. The LMS database script queries this view directly and reports every non-zero usage entry. This single view generates more audit revenue for Oracle than any other data source.
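The LMS script's logic against this view is simple to reason about: select every row where the detected-usage counter is non-zero, regardless of recency or current use. The sketch below models that filter over invented rows; the real script runs SQL directly against DBA_FEATURE_USAGE_STATISTICS, and the field names here merely echo the view's columns for illustration:

```python
# Simplified stand-in for rows from DBA_FEATURE_USAGE_STATISTICS.
# Rows and dates are invented; the filter logic is the point.
rows = [
    {"name": "Partitioning (user)",           "detected_usages": 1,  "currently_used": False, "last_usage": "2022-03-14"},
    {"name": "Automatic Workload Repository",  "detected_usages": 37, "currently_used": True,  "last_usage": "2025-01-02"},
    {"name": "Advanced Compression",           "detected_usages": 12, "currently_used": True,  "last_usage": "2025-01-02"},
    {"name": "OLAP - Analytic Workspaces",     "detected_usages": 0,  "currently_used": False, "last_usage": None},
]

# What Oracle's analysts look for: any non-zero usage, ever.
flagged = [r for r in rows if r["detected_usages"] > 0]

for r in flagged:
    status = "still in use" if r["currently_used"] else f"last used {r['last_usage']}"
    print(f"{r['name']}: {r['detected_usages']} detected usage(s), {status}")
```

Note the Partitioning row: a single detected usage three years ago, no longer in use, still flagged. Under Oracle's reading, that one row is a licensable event across the processor footprint of the database.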
How Oracle Interprets LMS Script Output: The GLAS Analysis Process
Understanding Oracle's internal analysis process helps you anticipate how script output will be used against you — and where the interpretation can be challenged.
Step 1 — Data ingestion: Oracle's GLAS analysts import the script output into internal analysis tools that automatically map installations to products, features to licence requirements, and hardware to processor counts. The tools apply Oracle's published Core Factor Table and partitioning policy to calculate licence requirements.
Step 2 — Entitlement comparison: GLAS compares the calculated licence requirement against your documented entitlements (purchase orders, ordering documents, ULA certificates). Any gap between "deployed" and "entitled" becomes a compliance finding.
Step 3 — Aggressive interpretation: Oracle consistently interprets ambiguities in its favour. Feature usage that was accidental is treated as intentional. VMware clusters are licensed at maximum scope. NUP counts include every possible indirect access path. Core counts use the most expansive reading of the hardware topology. This is not an accusation — it is Oracle's documented audit methodology. Their compliance team's incentive structure rewards revenue generation, and the initial findings report reflects this.
Step 4 — Preliminary findings report: Oracle presents findings as a list of compliance gaps with associated licence cost (typically at list price). This document is the opening position of a negotiation, not a final determination. In our experience, the preliminary findings overstate actual exposure by 30–60% due to double-counting, incorrect entitlement attribution, and aggressive metric interpretation.
Oracle's preliminary findings report is a commercial document, not a legal ruling. Every line item can be challenged, every interpretation can be disputed, and every calculation can be verified independently. Organisations that accept the preliminary findings at face value — which Oracle's audit process is designed to encourage — typically pay 2–3× more than organisations that conduct a systematic rebuttal. The LMS script data is factual; Oracle's interpretation of that data is negotiable.
Reviewing LMS Script Output Before Submission: Your Rights and Process
You have the right — and the obligation to your organisation — to review LMS script output before submitting it to Oracle. This review serves two purposes: verifying the data is accurate and understanding what compliance claims Oracle will derive from it.
| Review Area | What to Check | Common Issues Found |
|---|---|---|
| Hardware accuracy | Verify physical core counts, socket counts, and CPU model against your CMDB and physical asset records | Scripts sometimes detect logical processors (hyper-threads) rather than physical cores; virtualised environments may report host-level hardware inaccurately; decommissioned servers included |
| Installation scope | Confirm every Oracle installation detected is still active and within audit scope; identify decommissioned, retired, or test-only instances | Powered-off servers, decommissioned databases, disaster recovery standby instances, and retired test environments frequently appear in script output as "active" installations |
| Feature usage context | Review DBA_FEATURE_USAGE_STATISTICS entries; determine whether usage was intentional, accidental, or artefactual | Features activated by Oracle's own internal processes (e.g., Spatial Geometry Validation triggered by DBMS_STATS); features used once in testing years ago; features enabled by default that the customer never intentionally used |
| Virtualisation data | Verify cluster boundaries, host assignments, and VM-to-host pinning; confirm whether Oracle VMs are isolated on dedicated hosts or part of shared clusters | Scripts may not accurately capture VM isolation controls (affinity rules, dedicated hosts); cluster boundaries may be misrepresented if the VMware data collection was incomplete |
| Entitlement documentation | Prepare complete entitlement evidence before submission: all ordering documents, purchase orders, support renewal records, ULA certification letters, OEM agreements | Missing entitlement documentation is the most common reason Oracle's findings overstate actual exposure — if you cannot prove you own a licence, Oracle assumes you do not |
| Data masking | Use Oracle's provided masking tool to anonymise hostnames, IP addresses, usernames, and other sensitive identifiers in the script output | Unmasked data may reveal internal project names, employee identifiers, or infrastructure topology that you prefer not to share; masking does not affect the compliance-relevant data |
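Oracle supplies its own masking utility with the collection tool; the principle it implements can be shown in a few lines. This sketch (not Oracle's tool) replaces hostnames with stable pseudonyms, so cross-references in the output, such as VM-to-host mappings, remain internally consistent while the real identifiers are withheld:

```python
import hashlib

def mask(identifier: str, salt: str = "audit-2025") -> str:
    """Deterministic pseudonym: the same input always yields the same token,
    so relationships between masked records survive across all output files."""
    digest = hashlib.sha256((salt + identifier).encode()).hexdigest()[:8]
    return f"HOST-{digest}"

print(mask("erp-prod-db01") == mask("erp-prod-db01"))  # prints: True  (stable across files)
print(mask("erp-prod-db01") == mask("erp-prod-db02"))  # prints: False (distinct hosts stay distinct)
```

The salt matters: without it, anyone with a list of likely hostnames could reverse the pseudonyms by hashing candidates and comparing.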
Proactive Self-Assessment: Using LMS Scripts Before Oracle Does
The most effective Oracle audit defence strategy is not reactive — it is proactive. The LMS Collection Tool is available to Oracle customers through My Oracle Support (MOS). You can download and run the scripts yourself at any time, without triggering an audit or notifying Oracle. This transforms the LMS tool from an audit weapon into a compliance management asset.
How proactive self-assessment works: Download the current version of the LMS Collection Tool from MOS (patch/download section). Run the scripts across your entire Oracle estate — databases, middleware, applications, and underlying hardware. Analyse the output using the same methodology Oracle would: map installations to products, calculate processor requirements, identify feature usage, and compare against your entitlement documentation. Identify and remediate every compliance gap before Oracle has the opportunity to discover it.
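The comparison at the heart of self-assessment is a set difference in both directions: deployments exceeding entitlements are compliance gaps, entitlements exceeding deployments are shelfware. A minimal sketch of that reconciliation, with invented product positions and quantities in processor licences:

```python
# Illustrative positions: deployed (from script output) vs entitled (from the archive).
deployed = {"Database EE": 64, "Partitioning": 64, "Diagnostics Pack": 64, "WebLogic EE": 16}
entitled = {"Database EE": 64, "Partitioning": 32, "WebLogic EE": 32, "Tuxedo": 8}

# Gaps: products where deployment exceeds entitlement (remediate or buy).
gaps = {p: deployed[p] - entitled.get(p, 0)
        for p in deployed if deployed[p] > entitled.get(p, 0)}

# Shelfware: products where entitlement exceeds deployment (reharvest or drop support).
shelfware = {p: entitled[p] - deployed.get(p, 0)
             for p in entitled if entitled[p] > deployed.get(p, 0)}

print("Remediate or buy:", gaps)          # prints: {'Partitioning': 32, 'Diagnostics Pack': 64}
print("Shelfware to reharvest:", shelfware)  # prints: {'WebLogic EE': 16, 'Tuxedo': 8}
```

Note the Diagnostics Pack line: deployed with zero entitlement is the classic DBA_FEATURE_USAGE_STATISTICS finding, and catching it yourself means you can disable the feature and document the remediation rather than negotiate the purchase.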
| Self-Assessment Benefit | Description | Financial Impact |
|---|---|---|
| Remediate before audit | Identify unlicensed features, disable them, and document the remediation before Oracle initiates an audit | Eliminates the finding entirely — $0 exposure vs $500K–$10M+ if Oracle discovers it |
| Negotiate from strength | Enter any audit with complete knowledge of your compliance position; no surprises, no panic, no rushed concessions | Reduces audit settlement by 40–70% vs organisations that discover gaps during the audit |
| Optimise licence spend | Self-assessment reveals over-licensed products (shelfware) alongside under-licensed products — enabling reallocation and support cost reduction | Typical savings: 10–20% of total Oracle support spend through shelfware identification |
| Demonstrate governance | Regular self-assessment demonstrates proactive compliance management — reducing the likelihood of Oracle targeting your organisation for a formal audit | Reduces audit frequency and audit-related disruption (typically 500–2,000 hours of internal staff time per audit) |
Audit Response Framework: Step-by-Step Defence
8-Step Oracle Audit Response Playbook
Step 1 — Receive and Review the Audit Notification
Oracle typically provides 45 days' notice. Verify the contractual basis for the audit (which clause, which agreement). Confirm the products and business units in scope. Do not agree to an expanded scope beyond what the contract permits. Engage independent licensing advisory immediately for audits with estimated exposure exceeding $500K.
Step 2 — Assemble Your Internal Audit Response Team
Assign roles: audit project manager (typically SAM manager or IT governance lead), DBA lead (for database script execution and output review), infrastructure lead (for hardware and virtualisation data), procurement/licensing lead (for entitlement documentation), legal counsel (for contractual scope and audit clause interpretation). Establish a communication protocol: all Oracle audit communications flow through a single point of contact — never allow Oracle auditors to communicate directly with DBAs or infrastructure staff without oversight.
Step 3 — Prepare Your Entitlement Documentation
Before running any scripts, compile a complete entitlement archive: every Oracle ordering document, licence agreement, purchase order, support renewal record, ULA certification letter, OEM certificate, and prior audit settlement document. This archive is your primary defence. If you cannot document an entitlement, Oracle will assume it does not exist. Reconcile the archive against Oracle's records in My Oracle Support to identify any discrepancies.
Step 4 — Run the LMS Scripts Under Controlled Conditions
Schedule script execution during maintenance windows to minimise production impact. Run the scripts on all in-scope servers — do not omit systems, as Oracle will identify gaps. Use Oracle's provided masking tool to anonymise sensitive identifiers. Capture complete output including all log files and error reports. Store a complete copy of all output internally before submitting anything to Oracle.
Step 5 — Review All Script Output Before Submission
This is the most critical step. Analyse the output yourself (or with independent advisory support) before sending it to Oracle. Identify every compliance gap the data will reveal. For each gap, determine whether it is genuine (requiring remediation or licence purchase) or disputable (decommissioned servers, accidental feature usage, incorrect hardware detection, entitlements not yet matched). Prepare your rebuttal evidence for every disputable finding before Oracle raises it.
Step 6 — Remediate What You Can Before Submitting
For genuine compliance gaps involving features you do not need, disable the features and document the remediation with timestamps, change records, and before/after configuration evidence. Uninstall Oracle software from decommissioned or retired servers. Terminate user accounts that should no longer have access. Every gap you remediate before submission is a finding Oracle cannot make. Remediation is legitimate operational management — not evidence destruction.
Step 7 — Submit Output with Accompanying Context
When submitting the LMS script output, include a cover document that provides context Oracle's automated analysis will not capture: decommissioned servers in the output that are no longer active, disaster recovery instances that qualify for Oracle's DR licensing provisions, feature usage entries that were triggered by Oracle's own internal processes (not customer-initiated), virtualisation isolation controls that limit Oracle's licensing scope, and any other facts that prevent a finding from being valid.
Step 8 — Challenge the Preliminary Findings Systematically
When Oracle presents its findings, respond with a structured rebuttal addressing every line item. For each finding: state Oracle's claim, present your counter-evidence (specific documentation, configuration records, entitlement proof), and state your adjusted position. Never negotiate from Oracle's number — always negotiate from yours. The rebuttal typically reduces the initial claim by 30–50% through technical analysis alone, before any commercial negotiation begins.