@luisdelatorre012
Created May 28, 2025 13:38
SOX report from O3

SOX Compliance and Production Change Approvals: Real vs. “Rubber Stamp”

Internal Control Expectations for Change Management

Under Sarbanes-Oxley (SOX) Section 404, companies must maintain effective internal control over financial reporting – which includes IT general controls like change management. Both management and auditors (guided by PCAOB/AICPA standards) evaluate whether changes to production systems (especially those impacting financial data) are properly controlled. In practice, change management is expected to be “a systematic and standardised approach to ensuring all changes to the IT environment are appropriate, authorised and preserve the integrity of… programs and data”. The widely-used COSO internal control framework reinforces that principle: it highlights control activities such as proper authorizations/approvals and segregation of duties as fundamental to mitigating risks. In short, SOX compliance demands that moving any update into a production environment be subject to meaningful, documented approval by the right personnel to prevent unauthorized or bad changes from undermining financial reporting.

Audit standards echo these expectations. Under PCAOB Auditing Standard 2201, auditors test both the design and operating effectiveness of controls over financial reporting. A control that exists on paper but is perfunctory (“rubber stamped”) in operation will not pass muster. For example, auditors expect to see controls over program changes that include “review and testing of planned changes” and “approval of these changes prior to implementation into the production environment”. If such a control is only nominal – say, an approval step exists but anyone can bypass it or no real review happens – auditors may deem it a design deficiency or an operational failure. In PCAOB inspections, change-management controls are frequently scrutinized; indeed, insufficient change controls (and related IT issues) are a common weakness leading to internal control deficiencies. Thus, SOX 404 audits place heavy emphasis on change approvals: they must be substantive and well-documented, or the company risks an adverse assessment of its internal controls.

“Real” (Substantive) Approvals vs. “Rubber Stamp” Approvals

Not all approvals are created equal. A real, substantive approval is one where the approver genuinely evaluates the proposed change and its impact before giving the green light. It entails an independent reviewer (with appropriate authority and knowledge) examining the change ticket or release: verifying that the change was properly tested, assessing any risks or potential impact on financial reporting, and confirming that required procedures (like QA or user acceptance testing) were completed. Such an approval leaves a trail of evidence – comments or sign-offs indicating what was reviewed, attachments like test results or code review notes, meeting minutes from a change control board discussion, etc. Crucially, the approver in a substantive process is not simply the person who developed or deployed the change; effective change control builds in segregation of duties so that no single individual can unilaterally make and approve a production change. This aligns with COSO and audit guidance that duties be separated or otherwise controlled to prevent self-approval or conflicts of interest. In short, a substantive approval means the change was truly reviewed through a meaningful process – the signature or electronic approval is backed by actual due diligence.

By contrast, a “rubber stamp” approval is merely formalistic. It refers to situations where the approval step exists in name but not in spirit – the approver might click “Approve” without any real scrutiny, or approvals are given as a perfunctory, automatic habit. Often, rubber-stamp approvals involve inadequate review time or lack of inquiry; they may even be “bulk” approvals of many changes at once with no detailed consideration. In some cases the approver is the same person who executed the change (violating segregation of duties), or a manager who signs off on everything presented by the team without probing. The tell-tale signs of a rubber stamp approval include: a lack of documentation showing what was reviewed, approvals happening after a change is already in production (or simply to “check the box”), and no record of questions, comments, or any rejection ever occurring. As one SOX commentary notes, “a simple signature isn’t sufficient” to prove a control’s effectiveness – auditors now expect documentation that dives into the details of what the reviewer did to decide on the approval. In other words, an approver’s sign-off should represent an informed decision, not just a timestamp. If all an organization can show the auditor is an approval stamp with no further evidence, it raises skepticism that the control might be a rubber stamp.

Example: A controller’s sign-off on a bank reconciliation used to be enough evidence of review; today, auditors demand more. Audit guidance for management-review controls says the documentation should enable a “third party, an auditor, to follow the same process as the controller” – including how the reviewer investigated differences and resolved issues, not just that they signed their name. In the IT change context, this means the approval record should reflect what the approver checked (e.g. linking to test results or indicating they assessed the impact) rather than just an “Approved” status. If the approval is done in a meeting, having minutes that record the discussion, any concerns raised, and the conclusion is considered strong evidence of a substantive review. A rubber stamp approval, lacking those elements, fails to demonstrate such diligence.

How Audit Guidance Evaluates Approval Controls

Auditors and regulators look closely at both the design of the change-approval control and how well it’s executed. Key audit frameworks and standards that shape these evaluations include:

  • SOX Section 404 and SEC/PCAOB Guidance: Management must assess, and the external auditor must independently test, the effectiveness of internal controls over financial reporting. Under these requirements, an approval control for moving changes to production is expected to address the risk of unauthorized or faulty changes. Auditors use a top-down, risk-based approach (per PCAOB AS 2201) focusing on significant applications and related IT general controls. A change-management control that mitigates risk to financial systems (e.g. an ERP or financial reporting system) is typically considered “key.” Auditors will examine if the control is properly defined (e.g. requires pre-implementation approval by appropriate persons, ensures adequate testing and segregation of duties) and if it’s consistently performed. They gather evidence of approvals on a sample of changes and assess the thoroughness of those approvals. If, for example, a company claims that “all production changes for System X require manager approval,” the auditor will expect to see that in practice no change moved to production without documented manager sign-off beforehand, and that the sign-offs included review of test results or other criteria.

  • PCAOB & AICPA Standards on Evidence and Documentation: Audit standards emphasize obtaining sufficient appropriate evidence that controls are working. In practical terms, this means an auditor will look for an audit trail: Who approved the change and when? Is there evidence they actually reviewed supporting information? For instance, PCAOB inspectors have faulted audit firms for not thoroughly testing change-management controls – one report described the control requiring review/testing of changes and approval prior to production, but found the audit firm’s testing insufficient. This illustrates that auditors are expected to verify both the existence of approvals and their substance. An empty approval (no associated documentation or analysis) may lead the auditor to conclude the control is not operating effectively. The AICPA’s guidance for SOC 1 (which often aligns with SOX for service organizations) likewise expects controls to ensure “all changes are authorized and auditable and that unauthorized changes are investigated”. In other words, there should be no unexplained changes bypassing approval, and every approval should leave evidence.

  • COSO Framework Principles: COSO’s 2013 Integrated Framework (often used for SOX 404 compliance) includes Principle 11, “Select and develop general control activities over technology to support objectives.” This implies companies should have IT process controls like change approvals to uphold financial reporting integrity. COSO also stresses that policies and procedures must enforce management’s directives. A policy that “all production deployments must be approved by [role] and documented” is one thing; to be effective, there must be a procedure ensuring that happens and that the approval isn’t a rubber stamp. COSO Principles 10 and 12 emphasize that control activities should mitigate risks to acceptable levels and be executed through formal procedures. In practice, an auditor will evaluate if the change-approval procedure is formalized (e.g. in a change management policy) and if it’s understood and followed by the organization. A control environment that treats approvals as a mere formality would conflict with COSO’s expectation of a culture of effective control activities.

Enforcement: If auditors find the approval control is ineffective, the consequences can be serious. A consistent pattern of insufficient change approvals can be deemed a material weakness in internal control, especially if it affects critical financial systems. For example, if multiple production changes were deployed without evidence of proper approval or testing, auditors may conclude there is a risk of material error or fraud that the controls failed to prevent. In one audit office’s review, “no evidence of approval of IT program changes prior to releasing changes to production” was a common deficiency noted at 32% of agencies. When such issues are found, auditors will report them to the audit committee, and the company must typically remediate immediately (tightening the process, re-training staff, etc.) to avoid an adverse SOX 404 opinion. PCAOB inspection reports also highlight when auditors themselves miss these problems – underscoring how critical regulators consider change-approval controls. In short, there is high scrutiny on final production approvals: both management and auditors are expected to ensure those approvals are real, substantive, and prevent “rogue” changes.

Common Pitfalls and Audit Findings for Change Approvals

Audit and compliance reports have revealed recurring patterns that lead to findings of ineffective production change controls. Some of the most common issues include:

  • Lack of Documented Pre-Approval: The change was deployed to production with no recorded approval before the fact. Auditors frequently flag situations where a change ticket has no manager sign-off or where approvals were done only informally (e.g. via hallway conversation) and not documented. The Audit Office of NSW, for example, found that in many cases there was “no evidence of approval of IT program changes prior to releasing changes to production” – a clear control failure. This is essentially an unauthorized change. Any unauthorized change in a SOX-relevant system is a red flag: industry guidance notes that “anything above zero” unauthorized changes is unacceptable, since it means the change management process can be circumvented.

  • “Rubber Stamping” by Approvers: Even when approvals are formally in place, auditors often find they are perfunctory. Signs of this include identical timestamped approvals for multiple changes in a batch, approvers who cannot describe what they reviewed, or lack of any notes or supporting files in the approval record. A signature or click without substance can lead to audit comments that the review control is not effective. As one compliance source put it, rubber-stamp approvals involve rushing approvals… without adequate review. This pattern might be revealed if, say, every change submitted by IT gets approved within minutes with no questions – suggesting the approver is not truly scrutinizing anything. Auditors may test this by interviewing approvers or looking for evidence of rejected changes (an absence of any rejections can imply that approval is a rubber stamp formality).

  • Inadequate Segregation of Duties (SOD): A classic pitfall is when the person who develops and deploys a change is also the one “approving” it. This defeats the purpose of an independent check. Audit findings often cite “inappropriate segregation of duties over developing and releasing IT program changes” as a cause of control breakdown. For example, a developer with direct access to move code to production might also mark the change as approved under their manager’s credentials, or a small IT team might let a single engineer both test and approve their own work. Without compensating controls (like a separate monitoring review), this is viewed as a design flaw. Proper SOD means an approver should be organizationally and technically separate from the implementer – if that’s not feasible (e.g. in very small teams), companies need alternative controls such as peer reviews or after-the-fact monitoring to satisfy auditors.

  • Poor or Missing Testing/QA Evidence: Auditors commonly find that changes were approved without sufficient testing evidence attached. An effective approval should verify the change was tested in a QA environment or had a code review. If approvers are not actually checking test results (or if testing is informal), the approval doesn’t effectively mitigate the risk of a faulty change. Audit reports list “poorly tested or inappropriate changes” as a risk when change controls are weak. A pattern like a high number of production issues or rollbacks can indicate to auditors that approvals weren’t substantive (perhaps changes were rubber-stamped despite inadequate testing). One listed “indicator of poor change management” is a low change success rate – e.g. many changes needing to be backed out – which points to insufficient review/testing before deployment.

  • Out-of-Date or Bypassed Procedures: Another finding is when the formal change management policy exists but is not followed consistently. For instance, an organization might have a written requirement for approvals, but in practice emergency changes are made without approval and later “regularized,” or certain systems/projects were given informal exemption from the process. The Audit Office report noted some agencies had change-management policies that were outdated or not enforced. Auditors will raise this as a process gap – the control isn’t effective if people circumvent it. Similarly, if documentation says approvals should include specific steps (risk assessment, security review, etc.) but these steps are routinely skipped, it’s a finding.

  • Lack of Audit Trail and Monitoring: If a company cannot easily produce evidence for each change (who approved it, when, and what was reviewed), it’s likely to draw an audit comment. Some companies fail to reconcile deployment logs with change requests, so unauthorized changes go unnoticed. Best practice is to periodically review all production changes to ensure each has an associated approved change ticket; absence of this monitoring can lead to undetected exceptions. Auditors often ask, “How do you know an unapproved change hasn’t been snuck into production?” If the answer relies solely on trust and not on a log/audit trail, that’s a weakness. A noted deficiency is “lack of formal process to review log of system changes” – meaning nobody checks if changes outside the normal process occurred.
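As a sketch, the reconciliation described above – tying every entry in a production deployment log back to an approved change ticket – could look like the following. The `Deployment` record and `find_unauthorized` function are illustrative, not from any specific ITSM tool:

```python
from dataclasses import dataclass
from typing import Optional

@dataclass(frozen=True)
class Deployment:
    """One entry from a production deployment log (illustrative schema)."""
    change_id: Optional[str]  # ticket the deployer referenced, if any
    target: str               # system the change was deployed to

def find_unauthorized(deployments: list[Deployment],
                      approved_tickets: set[str]) -> list[Deployment]:
    """Return deployments that cannot be tied to an approved change ticket.

    For a SOX-relevant system the expected result is an empty list:
    'anything above zero' unauthorized changes means the normal change
    process can be circumvented.
    """
    return [d for d in deployments
            if d.change_id is None or d.change_id not in approved_tickets]
```

Run periodically (e.g. quarterly), any non-empty result is an exception to investigate – which is exactly the answer auditors want to the question “how do you know an unapproved change hasn’t been snuck into production?”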

These pitfalls frequently result in audit findings classified as control deficiencies or material weaknesses (depending on severity and pervasiveness). The common theme is that the approval control fails to do its job – either because it wasn’t properly structured or because in practice it was ignored or superficial. Companies that experience these findings often have to implement remediation plans (e.g. re-training approvers, implementing new tools, or even restating ICFR conclusions if a material weakness is identified).

Best Practices for Compliant, Audit-Ready Change Approval Processes

To avoid the above issues and ensure final approvals to production are SOX-compliant, organizations can follow a number of best practices. These practices align with guidance from audit firms, COSO principles, and industry frameworks (like ITIL or COBIT) and are routinely recommended in publications by Big 4 firms and others. Key best practices include:

  • Formalize the Change Control Process: Establish a written policy and procedure for change management that clearly defines how changes move from development to production. This policy should require documented approval by authorized personnel before deployment for all normal (non-emergency) changes. It should also outline roles (requestor, implementer, tester, approver), criteria for approval, and handling of emergency changes. Keeping this policy up to date and reflective of current systems is important – auditors expect policies to match practice. A well-defined process sets the foundation for consistent, audit-ready approvals.

  • Enforce Segregation of Duties: Structure the workflow so that no single individual can develop, approve, and deploy a change end-to-end. At minimum, the person who gives final production approval should be independent of the person who developed the change. Many companies use a Change Advisory Board (CAB) or at least a peer-review system to achieve this. In automated DevOps pipelines, tools can require a code reviewer or QA tester (not the code author) to sign off before merge/deployment. Microsoft’s own change controls, for example, “enforce the principle of separation of duties” by requiring code reviews by someone other than the developer and preventing anyone from approving their own code deployments. This greatly reduces the risk of a rubber-stamp self-approval and is looked upon favorably by auditors. If limited staffing makes strict SOD tricky, implement compensating controls (e.g. a periodic supervisory review of all changes made by a single admin).
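A minimal automated check for the SOD rule – assuming a change record that tracks who developed, deployed, and approved it (hypothetical field names) – might be:

```python
def violates_sod(change: dict) -> bool:
    """True if the final approver also developed or deployed the change,
    i.e. the supposedly independent check is really a self-approval."""
    approver = change["approver"]
    return approver in (change["developer"], change["deployer"])
```

A rule like this can run as a workflow gate (blocking the approval) or as a detective control over last quarter’s change tickets.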

  • Require Sufficient Testing and QA Evidence: An approval isn’t meaningful if the change hasn’t been tested. Best practice is to mandate evidence of testing (unit tests, QA test cases, user acceptance test results) before a change reaches the approver. The approver should verify this evidence. Many organizations incorporate a checklist: e.g. “Has this change passed QA? Attach test results or sign-off from QA lead.” Some use gating in release pipelines that automatically ensure tests ran and passed. Ensuring user acceptance testing (UAT) or other appropriate testing is completed for any change affecting financial reporting is critical; agencies are advised to “perform user acceptance testing before system upgrades and program changes are deployed”. The approver’s review should include confirming that testing was successful and that any issues were addressed. This practice makes the approval more than a rubber stamp – it ties it to quality assurance.
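A pipeline gate along these lines can be sketched with a simple evidence checklist (the `REQUIRED_EVIDENCE` items are a hypothetical policy, not a standard):

```python
# Hypothetical policy: evidence a ticket must carry before it may be approved.
REQUIRED_EVIDENCE = {"qa_test_results", "uat_signoff", "code_review"}

def ready_for_approval(attachments: set[str]) -> tuple[bool, set[str]]:
    """Check a ticket's attachments against the required evidence.

    Returns (ok, missing) so an automated gate can block the ticket
    from reaching the approver until the checklist is complete.
    """
    missing = REQUIRED_EVIDENCE - attachments
    return (not missing, missing)
```

Blocking incomplete tickets before they reach the approver removes one common cause of rubber-stamping: approvals granted because the missing evidence was never visible.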

  • Maintain a Detailed Audit Trail (Documentation): Every change request/ticket should contain a record of the approval and the basis for approval. In practice, this means using an IT service management tool (e.g. ServiceNow, JIRA, etc.) or a source control workflow to track changes. The tool should capture who approved, when, and ideally retain any comments or attachments the approver reviewed. Microsoft notes that their service teams “use ticketing or source control tools to document evidence of approval and track all changes” – this is a strong practice. The evidence might include: the change description, testing evidence, impact analysis, and implementation plan. The approver might add a note like “Reviewed test results from 10/5; all cases passed. No segregation of duties conflicts. Approved for deployment.” This level of detail provides a clear narrative for auditors. Meeting minutes can serve as additional evidence for changes approved in CAB meetings – recording who attended, what was discussed, and the decision. The goal is that an auditor, looking at the documentation, can understand exactly what the approver evaluated and see that it was a thoughtful review (not just a blank checkbox). Good documentation differentiates a substantive approval from a rubber stamp.

  • Ensure Timely, Prior Approvals (No After-the-Fact Sign-offs): Best practice is that approvals happen before the change is deployed to production, not afterwards. The workflow should ideally prevent deployment until approval is given (many organizations implement technical controls for this in their deployment pipelines or change management systems). Emergency changes can be a challenge – the process should define what constitutes an emergency and still require some level of authorization (even if verbal with a subsequent formal approval within 24 hours). Auditors are wary of processes that routinely allow changes and obtain approval “later,” as this undermines the preventative nature of the control. Companies with strong compliance cultures make it clear that deploying a change without prior approval is a policy violation except in true emergency scenarios. Maintaining zero tolerance for unauthorized changes is often emphasized: “a tone at the top that clearly communicates intolerance of unauthorized changes is fundamental”. This attitude helps ensure everyone respects the approval gate.
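The timing rule – approval strictly before deployment for normal changes, and a bounded catch-up window for emergencies – is easy to encode as a detective check over change records. The 24-hour window below is the hypothetical policy described above, not a regulatory requirement:

```python
from datetime import datetime, timedelta

def approval_timing_ok(approved_at: datetime, deployed_at: datetime,
                       emergency: bool = False) -> bool:
    """Normal changes: formal approval must precede deployment.
    Emergency changes: approval may follow deployment, but only within
    a 24-hour window (illustrative policy)."""
    if not emergency:
        return approved_at < deployed_at
    return approved_at <= deployed_at + timedelta(hours=24)
```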

  • Use Automated Workflows and Tools: Leverage tools to make the approval process reliable and auditable. For instance, use version control and CI/CD pipeline settings that require a pull-request approval before code merges to a production branch. Or use ITIL-based change management modules where a change ticket cannot be closed/deployed without an electronic approval from a designated approver’s account. Automation can enforce that all required fields (like risk impact, test evidence) are filled out before it even goes to approver, thus preventing rubber-stamp due to missing info. Some companies also integrate change monitoring tools that detect any change in production and match it to an approved ticket (for example, file integrity monitoring or deployment logs tied into change management). This not only prevents rogue changes but also produces evidence for auditors that no change goes untracked. While not strictly required by SOX, such tools greatly simplify compliance by providing confidence that all changes are captured and approved.

  • Training and Clear Criteria for Approvers: Simply assigning an approver isn’t enough; organizations should train those approvers on how to perform a proper review. Provide guidance or a checklist of what to consider: e.g. verify that the change ticket has a complete description, that appropriate testing was done, that a back-out plan is in place, that the impact was assessed, and that the change aligns with any relevant policies or controls (for example, making sure it doesn’t open a security hole). Approvers should know they are expected to question and even reject changes that don’t meet criteria. A “rubber stamp” culture often arises when approvers don’t feel responsible for quality – training and accountability can counteract that. Some companies incorporate metrics (like tracking how many changes were rejected or sent back for rework) to ensure approvers are truly engaged. It’s also wise to set up approval limits or scopes – for instance, high-impact or high-risk changes require a higher level of approval or multiple approvers (e.g. both IT and business owner sign-off), whereas trivial changes might be pre-approved by policy. Aligning the rigor of approval with the risk of the change helps maintain a substantive review where it matters most.

  • Continuous Monitoring and Internal Audits: To remain “audit-ready,” companies often implement their own periodic checks on the change process. For example, an internal audit or compliance team might sample a few changes each quarter to verify approvals were done correctly and with evidence. They might compare system deployment logs to the change tickets to ensure no changes slipped through. If any discrepancies or shortcuts are found, they can be corrected before the external auditor finds them. Additionally, tracking key metrics – such as number of emergency changes, percentage of changes with missing approvals, etc. – can highlight if the process is weakening. A downward trend in “unauthorized changes” and a healthy rejection rate (i.e. approvers sometimes sending changes back for more work) can indicate the approval control is functioning properly. As guidance from the IIA suggests, effective change management often correlates with fewer incidents and a more stable environment, whereas ineffective change control (with rubber-stamp approvals) can lead to “unmanaged changes” and more frequent issues. Monitoring outcomes thus helps demonstrate the control’s effectiveness.
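The monitoring metrics mentioned above are straightforward to compute once changes are tracked in a structured way. A sketch, assuming each change record carries boolean `unauthorized`, `rejected`, and `emergency` flags (an illustrative schema):

```python
def change_metrics(changes: list[dict]) -> dict[str, float]:
    """Compute the health indicators discussed above: rates of
    unauthorized, rejected, and emergency changes over a period."""
    total = len(changes)
    if total == 0:
        return {"unauthorized_rate": 0.0, "rejection_rate": 0.0,
                "emergency_rate": 0.0}
    return {
        "unauthorized_rate": sum(c["unauthorized"] for c in changes) / total,
        "rejection_rate": sum(c["rejected"] for c in changes) / total,
        "emergency_rate": sum(c["emergency"] for c in changes) / total,
    }
```

An unauthorized rate pinned at zero alongside a non-zero rejection rate is the profile of a functioning approval gate; a zero rejection rate over many changes is the statistical fingerprint of a rubber stamp.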

Implementing these best practices yields a change management process that can withstand auditor scrutiny. For instance, a company might use a workflow tool where every production change ticket shows an approved-by manager name and timestamp, has attachments of test results, and links to code review records. The manager’s approval notes might say, “Tests passed in QA, peer-reviewed by Jane Doe, no open issues – approved” with date. An auditor sampling such tickets would find clear evidence that each change was reviewed and authorized in a substantive way. Furthermore, if the auditor inquires about any change, the company can produce the related ticket and documentation in minutes (demonstrating strong control and documentation). Compare that to a company with a weaker process – where approvals are just recorded in an email or not at all, and it’s hard to tie deployments to approvals. In that case, the auditor is likely to report a deficiency.

In summary, “real and substantive” approvals are those backed by robust process: independent review, thorough documentation, and genuine scrutiny. They fulfill the intent of SOX 404 by preventing improper changes from sneaking into production and potentially affecting financial data integrity. “Rubber stamp” approvals, in contrast, are a control illusion – they give the appearance of compliance but not the substance, and auditors (as well as savvy management teams) will quickly see through them. By adhering to the best practices above – from enforcing segregation of duties to documenting the approval rationale – companies can ensure their production change approvals are both SOX-compliant and operationally effective. This not only satisfies auditors and avoids findings, but ultimately contributes to more reliable systems and financial reporting, which is the very goal of SOX.

Sources:

  • Committee of Sponsoring Organizations (COSO) – Internal Control–Integrated Framework (2013), principles on control activities and technology.
  • Sarbanes-Oxley Act §404 and PCAOB Auditing Standard 2201 – auditor requirements to evaluate internal control design/operation.
  • Audit Office of New South Wales – Internal Controls and Governance 2021 report, noting common IT change control deficiencies (lack of SOD, no approval evidence, etc.).
  • KnowledgeLeader/Protiviti – Auditing Technology Changes guidance emphasizing authorization/testing of all changes and zero tolerance for unauthorized changes.
  • ZenGRC (recapping PCAOB views) – explanation that a simple sign-off is not sufficient evidence of a management review; detailed documentation of the review process is required.
  • Core Security – discussion of “rubber stamping” approvals as bulk or rushed approvals without adequate review, and the risks thereof.
  • KPMG Internal Controls Over Financial Reporting Handbook (2023) – examples of change management controls requiring management approvals and testing prior to production.
  • Microsoft Service Assurance – description of Azure/M365 change management using ticketing systems to document approval evidence and enforce separation of duties in code changes.
  • AuditBoard (B. Guzzi) – advice on SDLC vs. change controls, highlighting that auditors look for unauthorized/unapproved changes and properly functioning changes as key indicators (suggesting when change controls are unreliable, auditors dig deeper).
  • MetricStream GRC insights – summary of common SOX IT control weaknesses, including change management issues, and mapping to COBIT/SOX frameworks.