IT Dependency (ITD) Validation Process Best Practices

While Access Provisioning, Access Deprovisioning, and Change Management are areas of Sarbanes-Oxley (SOX) management testing that get a lot of attention, IT Dependency (ITD) validation for completeness and accuracy is often a misunderstood or missed area of SOX testing by management. In our consulting work at Cooper-Richards with clients managing risk and compliance across many industries, we have noticed different requirements for ITD testing across the major external audit firms. While there is not a consistent approach, we have identified similarities that will help get you closer to meeting the compliance requirements and expectations of your external auditor.

ITD testing has come into sharper focus at the external audit firms over the last several years as a result of feedback they receive from the Public Company Accounting Oversight Board (PCAOB) inspection process. We have also seen more key controls fail SOX testing for the year solely because of a lack of ITD testing evidence. This is a tough outcome: management may feel the control is operating effectively, but if evidence of the completeness and accuracy of the information used in the performance of the control is not provided or properly tested, the control will fail, resulting in a control deficiency.

This article will explain what an ITD is and walk you through an ITD validation process with key considerations for the four main types of reports. 

What Is an IT Dependency (ITD)?

ITD is a term used by many external and internal auditors; the same concept is also known as Information Used in the Execution of Controls (IUC) and Information Provided by the Entity (IPE). For any ITD, the control performer needs to articulate how they are comfortable that the information they use in the operation of their control is reliable, complete, and accurate.

Completeness ensures that all relevant transactions that occurred (or relevant information) are captured once and only once. Accuracy ensures that transactions/information are reported at the correct amount in the appropriate account, reflecting appropriate timing.
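
As a small illustration of the "once and only once" condition, a duplicate check along the lines of the sketch below can flag records captured more than once; the FIN.JournalEntry table and JournalID column are hypothetical names used only for the example.

    -- Hypothetical duplicate check: any JournalID appearing more than
    -- once in the population violates "captured once and only once".
    SELECT JournalID, COUNT(*) AS TimesCaptured
    FROM FIN.JournalEntry
    GROUP BY JournalID
    HAVING COUNT(*) > 1;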

If an audit reveals inaccurate or incomplete data supporting the performance of your control, the control will likely fail, potentially exposing the company to deficiencies in its control environment.

How Do You Validate an IT Dependency (ITD)? A Step-by-Step Approach

Following a standard ITD validation process enables management to take ownership of ITD quality by understanding exactly how the underlying data supports and benefits their control activities. This also provides a sustainable process to manage the ongoing reliability of the data and impacted controls.

Step 1. Identify & Classify All ITDs

The first step is to identify and inventory all ITDs that support the SOX key controls and classify them into one of the four primary categories: 

  1. Standard Report: A standard or canned report is designed by the application provider and comes preconfigured and/or predefined with the application package; typically, this cannot be modified by end-users.   
  2. Third-Party Supported Standard Report: A standard report supported by a service provider.
  3. Custom Report: A standard report that has been modified, or a new report built or configured by IT or the end-user to meet the end-user's specific reporting needs.
  4. Ad-Hoc Query: The end-user can enter a set of parameters to generate a report or query from the system on an ad hoc or recurring basis; the output is typically exported to a spreadsheet.

Step 2. Define a Process for Testing Based on Category

Once the inventory of key ITDs is complete, the next step is to define a testing process; the testing steps will vary based on the category the ITD falls into, as defined above. Below are possible test approaches by category.

1. Standard Report

Validate that a report is a standard report, including verification that there were no changes to the report since system implementation. Evidence to support a standard report can include screenshots from vendor-supplied manuals or online help indicating the report is a standard report. Evidence of the last modified date and who performed any modification should also be captured. Finally, testing the IT general controls (ITGCs) is necessary to support that the report continues to function as intended.

When using a standard report, it is best practice to include screenshots from the source system that include the following information, which will give the auditors what they need to determine whether the report is standard or custom:

  • The system, menu or transaction code, and full name of the report. 
  • Any parameters entered to run the report (e.g., date range, company code, account number, etc.).
  • If the report is exported to Excel or another tool, provide evidence from the source system of the number of records produced to verify completeness. If a record count is not available, provide screenshots of the first and last pages of the report from the source system to verify completeness.

2. Third-Party Supported Standard Report

When using a report out of a hosted solution, it is important to obtain screenshots showing the system, menu path, and full name of the report. In addition, ensure a request has been made to obtain a copy of the provider's SOC report (preferably a SOC 1 Type 2). Management should review the SOC report to understand the procedures taken to ensure completeness and accuracy of the report used. If the report is not covered in the SOC report, the control performer and the auditors may need to perform further testing procedures, depending on the type of report and the external auditor's requirements.

3. Custom Report or Ad-Hoc Query (subject to ITGCs)

Test the completeness and accuracy of the report logic, and test the ITGCs to support the continued reliability of the report. When using an ad hoc query tool such as SQL or a third-party reporting tool, where output is typically exported to Excel for analysis, capture screenshots that include:

  • The query or parameters used. For example, take a screenshot of SQL Server Management Studio (or whichever tool was used to build the query), ensuring the full query statement is visible. If necessary, copy and save the query to a Word document so it can be viewed in full.
  • The source of the data used to build the report. For SQL, the screenshot should clearly show the database and table(s) used.
  • Completeness and accuracy of the original data. Once data is exported to Excel it can be modified; as a result, the Control Performer must perform procedures every time to ensure completeness and accuracy, with screenshots and reconciliations back to the original data (see the sketch after this list). Completeness and accuracy can also be supported by exporting the data to a PDF and to Excel at the same time, providing a copy of the data that cannot be modified.
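
As an illustration of the first and third bullets above, the sketch below pairs a hypothetical ad hoc query with a record count captured from the source at the time of export; the AR.Invoice table and its columns are assumed names for the example, not a real schema.

    -- Hypothetical ad hoc query supporting a control over Q4 invoices.
    -- Save the full statement with the control evidence, not just a
    -- truncated screenshot.
    SELECT InvoiceNumber, CustomerName, InvoiceDate, Amount
    FROM AR.Invoice
    WHERE InvoiceDate BETWEEN '2023-10-01' AND '2023-12-31';

    -- Record count captured from the source at the same time as the
    -- export, so the row count of the Excel extract can be reconciled
    -- back to it.
    SELECT COUNT(*) AS SourceRecordCount
    FROM AR.Invoice
    WHERE InvoiceDate BETWEEN '2023-10-01' AND '2023-12-31';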

4. Custom Report or Ad-Hoc Query (not subject to ITGCs)

Test the completeness and accuracy of the report logic each time the report is used to support the performance of the control, including, but not limited to, verifying the input parameters used to generate it. Below are two possible methods of testing.

Method 1: Code Review – Assess the underlying technical programming of the report. This requires management and/or the tester to have sufficient knowledge of the programming language used to create the report to be able to explain the report logic line by line.
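
As a sketch of what a line-by-line review might look like, the commented query below represents a hypothetical custom report of invoices over an approval threshold; every table, column, and threshold value is illustrative rather than taken from any particular system.

    -- Hypothetical custom report logic, annotated line by line as a
    -- reviewer might during a code review:
    SELECT i.InvoiceNumber,               -- identifier traced to source
           i.CustomerName,
           i.Amount
    FROM AR.Invoice AS i
    WHERE i.InvoiceDate >= '2023-10-01'   -- period filter matches the
      AND i.InvoiceDate <  '2024-01-01'   -- control's intended scope
      AND i.Amount > 10000                -- threshold matches the policy
    ORDER BY i.Amount DESC;               -- presentation only; excludes no rows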

Method 2: Perform full false accept/reject testing. This involves selecting a separate sample for each of the completeness and accuracy attributes, as follows:

  • Report to Source for Accuracy. To assess whether the information is accurately reflected on the report, select a sample of items from the report and agree them to the underlying system for specific criteria (e.g., customer name, invoice number). The specific criteria will depend on what information is used in the performance of the control. Depending on how the report or spreadsheet is used, testing for accuracy can include footing, cross-footing, and/or recalculating the report to verify formulas, as well as selecting a sample from the report and agreeing it to the underlying system as described above.
  • Source to Report for Completeness. To assess whether the information in the system is completely reflected in the report, select a different sample of items from the system and agree them to the report for specific criteria. Depending on how the report or spreadsheet is used, testing for completeness can be as easy as agreeing the totals to the financial statements or general ledger (e.g., a fixed asset roll-forward) or as involved as selecting samples from the source and agreeing them to the report as described above. Both directions are illustrated in the sketch after this list.
  • Integrated Test Facility. Using dummy transactions, assess whether the report includes or excludes the transactions as appropriate.
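
A minimal sketch of both directions, assuming the report extract has been loaded into a staging table named ReportExtract alongside the same hypothetical AR.Invoice source used above; an empty result in each direction supports completeness and accuracy, respectively.

    -- Source to report (completeness): source rows missing from the report.
    SELECT InvoiceNumber, CustomerName, Amount
    FROM AR.Invoice
    WHERE InvoiceDate BETWEEN '2023-10-01' AND '2023-12-31'
    EXCEPT
    SELECT InvoiceNumber, CustomerName, Amount
    FROM ReportExtract;

    -- Report to source (accuracy): report rows that disagree with the source.
    SELECT InvoiceNumber, CustomerName, Amount
    FROM ReportExtract
    EXCEPT
    SELECT InvoiceNumber, CustomerName, Amount
    FROM AR.Invoice
    WHERE InvoiceDate BETWEEN '2023-10-01' AND '2023-12-31';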

Conclusion: Developing a Sustainable Approach to ITDs

For any ITD testing process, it is important to come up with an approach that can be sustained going forward, which means staying on top of any changes in people, processes, or systems that affect the key controls. Changes made to ITDs used in the performance of a key control should be communicated as soon as possible to management's testing function. This information should also be verified during walkthrough meetings and tests of controls; ITD and system inventory listings are living documents and should be revisited frequently.

Consistent communication with your external auditor is also very important to understand any changes in their documentation requirements year over year. By staying on top of your ITD documentation, you will put your organization in a strong position to have a smooth SOX testing process.

Lori Richards is a Partner with Cooper-Richards, LLC. Lori has more than twenty-five years of experience as an information systems and financial auditor independently and previously with a Big Four firm, focusing on information technology controls and internal audit functions. Lori has led multiple SOX 404 compliance efforts for both IT and application controls with clients in several industries. Connect with Lori on LinkedIn.

Mike White is a Partner with Cooper-Richards, LLC. Mike has over thirty years of experience, including almost twenty-five years working at a Big Four public accounting firm, where he provided assurance and advisory services to help clients manage and mitigate financial, operational, and regulatory risks. At Cooper-Richards, Mike manages multiple engagements focusing on SOX compliance and internal audit projects. Connect with Mike on LinkedIn.