THE METHODOLOGY

Where does your audit function stand today?

This assessment is the recommended starting point for any audit leader working through this framework. Your results map to a maturity level and route you to the sections most relevant to your current situation. Ten minutes here will make every other page more useful. Your answers are saved locally, so you can step away and resume later.

This assessment is not a scorecard. It is a structured starting point for understanding where your function is today across the six dimensions that determine whether modernization will hold. Answer based on where things actually stand -- not where you want them to be. The most useful result is an honest one.

About your team

A little context helps us tailor your results. All fields are optional.

How large is your internal audit function?
Does your team include a dedicated analytics or data role that is not responsible for traditional audit fieldwork?
A dedicated role means this person focuses on analytics, data strategy, or methodology design rather than carrying a standard audit fieldwork assignment.
Has your organization completed a major enterprise system implementation in the last 24 months?
Step 1 of 6

How easily can your team get data when you need it?

We depend on IT or other teams to pull data for us
Data extracts are requested by ticket or email and delivered on IT's timeline.
Extracts provided by IT — your team has limited ability to retrieve data directly from source systems. Access requires a formal request and you work with whatever format IT provides.
We can pull some data ourselves but often need help
Some reports or exports are self-service, but complex queries or unfamiliar systems still require IT assistance.
Some self-service access — your team can run standard reports in key systems but relies on IT for extracts from databases, ERP modules, or less familiar platforms.
We can get most data independently with some effort
The team can navigate most key systems and pull needed datasets, though manual steps or workarounds remain.
Most data independently obtained — access exists for primary systems. Your team knows where the data lives and can retrieve it, but the process is not yet documented or streamlined into repeatable pathways.
We reliably access data using reporting tools or queries
Documented, structured pathways exist for primary data sources and most team members can use them independently.
Structured, documented pathways — data access is reliable and repeatable. Team members can pull data from major systems without IT involvement. Access methods are documented and shared across the function.
We have structured or automated data feeds for audit use
Automated extracts or pipelines deliver data to audit on a defined schedule without manual requests.
Automated feeds and lineage — data flows to the audit function automatically. Extracts are scheduled, versioned, and traceable back to source systems. The team spends time on analysis, not data retrieval.
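To make the top level concrete, here is a minimal sketch of what a scheduled, versioned extract with a lineage record could look like. It is illustrative only: the SQLite source file (erp.db), the payments table, and its columns are hypothetical stand-ins for whatever your source systems actually are.

```python
# Illustrative sketch only: a versioned extract with a lineage record.
# The source database (erp.db), the payments table, and its columns are
# hypothetical placeholders.
import csv
import hashlib
import json
import sqlite3
from datetime import datetime, timezone
from pathlib import Path

SOURCE_DB = "erp.db"  # stand-in for the real source system connection
QUERY = "SELECT id, vendor_id, amount, posted_at FROM payments"
OUTPUT_DIR = Path("audit_extracts")


def run_extract() -> Path:
    """Pull the population, write a timestamped CSV, and record lineage."""
    stamp = datetime.now(timezone.utc).strftime("%Y%m%dT%H%M%SZ")
    OUTPUT_DIR.mkdir(exist_ok=True)
    out_file = OUTPUT_DIR / f"payments_{stamp}.csv"

    with sqlite3.connect(SOURCE_DB) as conn:
        cursor = conn.execute(QUERY)
        headers = [col[0] for col in cursor.description]
        rows = cursor.fetchall()

    with out_file.open("w", newline="") as fh:
        writer = csv.writer(fh)
        writer.writerow(headers)
        writer.writerows(rows)

    # Lineage record: source, query, row count, timestamp, and a content hash
    # so the extract stays traceable back to the system it came from.
    lineage = {
        "source": SOURCE_DB,
        "query": QUERY,
        "rows": len(rows),
        "extracted_at": stamp,
        "sha256": hashlib.sha256(out_file.read_bytes()).hexdigest(),
    }
    lineage_file = OUTPUT_DIR / f"payments_{stamp}.lineage.json"
    lineage_file.write_text(json.dumps(lineage, indent=2))
    return out_file


if __name__ == "__main__":
    run_extract()  # in practice a scheduler (cron, Task Scheduler) triggers this
```

The specifics -- the scheduler, the storage format, the hashing -- will differ by environment. The point is that the extract runs without a manual request and carries enough metadata to be traced back to source and verified later.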
Step 2 of 6

How does most of your team analyze data day to day?

Answer based on how the majority of your audit team works day to day, not your most advanced analyst or specialist role.

Primarily Excel with filters, pivots, and manual calculations
Most analysis involves spreadsheet formulas, sorting, and manual data manipulation.
Excel filters and pivots — VLOOKUP, PivotTables, and manual calculations are the primary tools. Results are built by hand each time and typically require significant rework between audits.
Mostly Excel, with a few team members using other tools occasionally
One or two individuals sometimes use BI tools or query languages, but Excel remains the team default.
Occasional analytics platforms — individual use of Power BI, ACL, SQL, or Python exists in isolated cases. This capability is not broadly shared or embedded in standard procedures.
Several auditors regularly use analytics platforms beyond spreadsheets
A meaningful portion of the team uses tools like Power BI, SQL, or analytics platforms for audit work.
Multiple auditors use analytics — the function is not reliant on a single person for advanced analysis. Several team members can run queries, build dashboards, or perform structured population analysis.
Analytics are built into repeatable workflows used across the team
Analytics steps are documented and most auditors follow standardized procedures rather than building from scratch each time.
Repeatable analytics pipelines — analytics procedures are standardized and version-controlled. A new team member can follow documented steps and produce consistent results without rebuilding the analysis.
Continuous monitoring or automated analytics run regularly
Monitoring routines run on a schedule and surface exceptions automatically without manual initiation.
Monitoring and automation embedded — automated analytics execute on a defined schedule and flag anomalies for review. The team focuses on interpretation and follow-up rather than running the analysis itself.
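For a sense of what analytics that run without manual initiation can look like at their simplest, here is an illustrative sketch of a rule-based exception test. The input file and column names (vendor_id, invoice_no, amount, approver) are hypothetical, and real monitoring routines are usually more involved.

```python
# Illustrative sketch only: a simple rule-based exception test that a scheduler
# could run. The input file and the column names (vendor_id, invoice_no,
# amount, approver) are hypothetical.
import csv
from pathlib import Path

EXTRACT_FILE = Path("audit_extracts/payments_latest.csv")   # placeholder path
EXCEPTIONS_FILE = Path("audit_extracts/payment_exceptions.csv")
AMOUNT_THRESHOLD = 50_000.00


def flag_exceptions() -> int:
    """Apply two example rules to the full population and write the exceptions."""
    seen_invoices: set[tuple[str, str]] = set()
    exceptions: list[dict[str, str]] = []

    with EXTRACT_FILE.open(newline="") as fh:
        for row in csv.DictReader(fh):
            # Rule 1: the same vendor/invoice combination appears more than once.
            key = (row["vendor_id"], row["invoice_no"])
            if key in seen_invoices:
                exceptions.append({**row, "rule": "duplicate vendor/invoice"})
            seen_invoices.add(key)

            # Rule 2: a large payment has no recorded approver.
            if float(row["amount"]) >= AMOUNT_THRESHOLD and not row["approver"]:
                exceptions.append({**row, "rule": "large payment without approver"})

    if exceptions:
        with EXCEPTIONS_FILE.open("w", newline="") as fh:
            writer = csv.DictWriter(fh, fieldnames=list(exceptions[0].keys()))
            writer.writeheader()
            writer.writerows(exceptions)
    return len(exceptions)


if __name__ == "__main__":
    print(f"{flag_exceptions()} exceptions flagged for review")
```

Run on a schedule, a routine like this shifts the team's effort from performing the test to reviewing what it flags.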
Step 3 of 6

When you finish an audit, how much of the analytics work carries forward to the next one?

Select the option that best reflects how the majority of analytics are performed across the audit team, not the most advanced example that exists within the function.

We rebuild most analytics from scratch each time
Each audit starts fresh — there are no templates or saved procedures to build on.
Starting from scratch — analytics are rebuilt each engagement. Workpapers may be saved but the logic or queries are not reused in a structured way. Each audit reinvents the approach.
We have some Excel templates we reuse
A few spreadsheet templates exist for common calculations, but they still require significant manual setup each time.
Excel templates — some reuse exists in spreadsheet form. Templates need to be repopulated manually with new data each cycle. The structure exists but is not yet connected to a repeatable data pull process.
We have repeatable workflows but still rely on manual data pulls
Analytics steps are documented and consistent, but data still needs to be manually pulled before each run.
Repeatable workflows with manual data — the analysis logic is documented and reusable. Queries or scripts may exist. But the data itself must be manually requested or retrieved before the workflow can run.
Most analytics are standardized and reused with minimal rework
Procedures are documented and reusable — a new engagement requires parameterization, not rebuilding from scratch.
Standardized and reused — analytics are consistent across engagements. Scripts, queries, and workpaper templates are maintained and version-controlled. New auditors can follow the documented approach without significant ramp-up.
Analytics run automatically on scheduled intervals
Monitoring or analytics jobs run on a defined schedule and deliver results without manual action.
Automated on schedule — analytics execute automatically and surface results at defined intervals. The team reviews outputs and takes action on exceptions rather than initiating the analysis each time.
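One way to picture "parameterization, not rebuilding" is an analytic whose logic is fixed and whose engagement-specific inputs are passed in and recorded. The sketch below is illustrative only; the entity codes, dates, and file layout are hypothetical.

```python
# Illustrative sketch only: a standardized analytic that is parameterized per
# engagement rather than rebuilt. Entity codes, dates, and the file layout are
# hypothetical.
import csv
import json
from dataclasses import asdict, dataclass
from datetime import date
from pathlib import Path


@dataclass
class EngagementParams:
    """The only inputs that change between engagements."""
    entity: str
    period_start: date
    period_end: date
    amount_threshold: float


def run_standard_test(source: Path, params: EngagementParams) -> list[dict]:
    """Scope the population to the engagement and flag items over the threshold."""
    flagged = []
    with source.open(newline="") as fh:
        for row in csv.DictReader(fh):
            posted = date.fromisoformat(row["posted_at"])
            in_scope = (
                row["entity"] == params.entity
                and params.period_start <= posted <= params.period_end
            )
            if in_scope and float(row["amount"]) >= params.amount_threshold:
                flagged.append(row)

    if flagged:
        out = source.with_name(f"{params.entity}_flagged.csv")
        with out.open("w", newline="") as fh:
            writer = csv.DictWriter(fh, fieldnames=list(flagged[0].keys()))
            writer.writeheader()
            writer.writerows(flagged)

    # Save the parameters next to the output so the workpaper records exactly
    # what was run and the next cycle only needs new dates.
    params_file = source.with_name(f"{params.entity}_params.json")
    params_file.write_text(json.dumps(asdict(params), default=str, indent=2))
    return flagged


if __name__ == "__main__":
    run_standard_test(
        Path("audit_extracts/payments_latest.csv"),
        EngagementParams("PLANT-04", date(2024, 1, 1), date(2024, 6, 30), 25_000.0),
    )
```

Recording the parameters alongside the output also gives the workpaper a precise record of what was run, so the next cycle is a matter of changing dates rather than rebuilding logic.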
Step 4 of 6

When issues occur in the business, how quickly does audit typically find out?

We find out during or after an audit, often long after the fact
Issues surface through periodic testing cycles — significant time passes between the event and detection.
During or after an audit — detection is reactive and tied to scheduled engagement cycles. By the time an issue is identified, it may have been occurring for months or longer.
We find issues during audit testing, but still after the activity occurred
Testing is structured, but findings always reflect past events rather than current or near-current activity.
During audit testing — issues are identified during fieldwork in a structured way. But the timing is still retrospective — audit is looking back at what already happened, not forward at what is happening now.
We review certain risks periodically and catch some issues earlier
Targeted reviews of specific risk areas happen between audits, reducing the detection lag for those areas.
Periodic risk reviews — some proactive coverage exists for high-risk areas. Quarterly or monthly reviews in select domains mean issues are caught within weeks rather than months for those areas.
We have analytics or dashboards that flag anomalies soon after they occur
Monitoring routines surface exceptions within days or weeks of the underlying activity.
Anomaly flagging — dashboards or scheduled analytics identify exceptions shortly after they occur. Exception owners receive alerts and closure is tracked. Detection lag is measured in days or weeks, not months.
Automated signals detect issues as they happen and allow rapid response
Exception signals are generated automatically and routed to owners in near-real-time.
Automated near-real-time signals — issues are detected and escalated as they occur across monitored areas. Audit provides strategic oversight of the signal environment rather than initiating detection manually.
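As an illustration of what automated routing can involve, the sketch below turns flagged exceptions into owned, time-stamped open items. The owners, rules, and file paths are hypothetical, and actual delivery would normally go through email or a ticketing system rather than a local log.

```python
# Illustrative sketch only: routing flagged exceptions to named owners and
# logging them as open items until closure. Owners, rules, and paths are
# hypothetical; real delivery would go through email or a ticketing system.
import csv
import json
from datetime import datetime, timezone
from pathlib import Path

EXCEPTIONS_FILE = Path("audit_extracts/payment_exceptions.csv")  # placeholder
OPEN_ITEMS_LOG = Path("audit_extracts/open_items.jsonl")

# Each rule has an accountable owner and an escalation contact.
ROUTING = {
    "duplicate vendor/invoice": {
        "owner": "ap.manager@example.com",
        "escalate_to": "controller@example.com",
    },
    "large payment without approver": {
        "owner": "disbursements.lead@example.com",
        "escalate_to": "cfo.office@example.com",
    },
}
DEFAULT_ROUTE = {"owner": "audit.inbox@example.com", "escalate_to": None}


def route_exceptions() -> int:
    """Turn each flagged row into an owned, time-stamped open item."""
    routed = 0
    raised_at = datetime.now(timezone.utc).isoformat()
    with EXCEPTIONS_FILE.open(newline="") as fh, OPEN_ITEMS_LOG.open("a") as log:
        for row in csv.DictReader(fh):
            item = {
                "raised_at": raised_at,
                "status": "open",        # closure is tracked by updating this
                "detail": row,
                **ROUTING.get(row["rule"], DEFAULT_ROUTE),
            }
            log.write(json.dumps(item) + "\n")
            routed += 1
            # A real implementation would also notify the owner here.
    return routed


if __name__ == "__main__":
    print(f"{route_exceptions()} exceptions routed to owners")
```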
Step 5 of 6

How is your team currently using AI tools in audit work?

We are not using AI tools yet
No AI-assisted tools are currently part of the audit workflow.
No AI tools — the team has not yet adopted AI-assisted capabilities in audit procedures. Traditional methods are used exclusively for all audit work.
We use AI occasionally for research, summarizing, or drafting documents
Tools like ChatGPT or Copilot are used informally to support writing and research tasks.
Informal AI use — AI assists with productivity tasks such as summarizing documents, drafting audit communications, or researching topics. It is not integrated into audit procedures or evidence evaluation.
We use AI to help build queries, workflows, or analytics
AI tools help write scripts, structure queries, or design analytics approaches during engagement work.
AI-assisted analytics development — AI accelerates the building and design of analytical work products. Team members use AI to write SQL, generate Python scripts, or structure data transformation logic.
AI-enabled tools help us review data and flag potential issues
Purpose-built or configured AI tools scan populations and surface anomalies as part of audit procedures.
AI-enabled issue flagging — AI assists in reviewing data populations and identifying potential exceptions for auditor review. The AI does the scanning; auditors evaluate and act on the results.
AI is integrated into monitoring workflows and runs continuously
AI-powered monitoring runs on a schedule, surfaces signals, and routes exceptions without manual initiation.
Continuous AI monitoring — AI is embedded in operational monitoring workflows and runs without manual triggers. Exception routing, threshold management, and signal escalation are handled by configured AI systems.
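Where AI assists with query or script drafting, the useful discipline is keeping the drafts inside a reviewable workflow. The sketch below is illustrative only: draft_sql() is a hypothetical placeholder for whichever AI assistant a team uses, not a real API, and the point is the logging and review step around it.

```python
# Illustrative sketch only: keeping AI-assisted query drafting inside a
# documented, reviewable workflow. draft_sql() is a hypothetical placeholder
# for whichever AI assistant a team uses -- it is not a real API.
import json
from datetime import datetime, timezone
from pathlib import Path

WORKPAPER_LOG = Path("audit_extracts/ai_assist_log.jsonl")


def draft_sql(prompt: str) -> str:
    """Placeholder for an AI assistant call; returns a canned draft here."""
    return (
        "SELECT vendor_id, COUNT(*) AS invoice_count, SUM(amount) AS total_spend\n"
        "FROM payments GROUP BY vendor_id HAVING COUNT(*) > 100"
    )


def request_reviewed_query(prompt: str, reviewer: str) -> str:
    """Get an AI draft, log it for the workpapers, and mark it pending review."""
    draft = draft_sql(prompt)
    entry = {
        "requested_at": datetime.now(timezone.utc).isoformat(),
        "prompt": prompt,
        "draft_sql": draft,
        "reviewer": reviewer,
        "status": "pending_review",   # a person approves before anything runs
    }
    WORKPAPER_LOG.parent.mkdir(exist_ok=True)
    with WORKPAPER_LOG.open("a") as log:
        log.write(json.dumps(entry) + "\n")
    return draft


if __name__ == "__main__":
    print(request_reviewed_query(
        "Draft a query summarizing invoice volume and spend by vendor",
        reviewer="senior.auditor@example.com",
    ))
```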
Step 6 of 6

How would you describe the analytics capability across your audit team?

Consider the depth of skill across the team as a whole, not your most advanced individual.

Analytics capability is limited to basic Excel use across most of the team
Most team members work with standard spreadsheet functions — advanced analysis is rare or absent.
The team primarily uses Excel for filtering, sorting, and basic calculations. Analytics beyond standard spreadsheet functions are uncommon.
Most of the team is comfortable with Excel; one or two individuals have more advanced skills
Excel is the standard across the team, with isolated examples of more advanced capability.
Excel is the dominant tool for most staff. One or two individuals occasionally use more advanced tools or techniques but this is not typical across the team.
Several team members can perform structured analysis using tools beyond spreadsheets
A subset of auditors can use analytics platforms, write queries, or work with larger datasets.
Several auditors can use analytics platforms, write basic queries, or perform structured data analysis. This capability exists but is not yet broadly distributed.
A meaningful portion of the team can build repeatable analytics or work with large datasets independently
The function is not dependent on a single specialist — multiple people can design and run analytics.
Multiple team members can independently build analytics, write queries, or analyze large datasets. The function is not dependent on a single specialist.
Analytics capability is distributed broadly with multiple staff able to design, build, and maintain workflows
Analytics skill is a core competency — most team members can contribute to building and maintaining repeatable routines.
Analytics skill is a core competency across the team. Multiple staff can design, build, and maintain repeatable workflows and monitoring routines without relying on one individual.

    Your selections are saved automatically in your browser. Return to this page to resume or review your results.

    Understanding the maturity levels

    The five maturity levels represent a progression from traditional audit operations toward AI-enabled oversight. Most audit functions sit somewhere in the middle -- and that is fine. The goal is to know where you are and what the next move is.

    Level 1
    Traditional Audit

    Sample-based testing, manual data processes, and periodic coverage. This is a functional starting point -- but not proportionate to the complexity of a modern electrical distribution enterprise. The gap between this approach and the organization's risk profile is likely widening.

    Level 2
    Emerging Analytics

    Analytics tools are present and some team members use them, but coverage is inconsistent and procedures are not yet repeatable or documented. Data access depends on external help. The foundation is taking shape -- the capability now needs to be structured and scaled.

    Level 3
    Integrated Analytics

    Analytics are embedded in regular audit work, data access is largely self-sufficient, and procedures are documented and repeatable across the team. Monitoring is being piloted. This is a strong foundation -- the next challenge is scaling and formalizing it.

    Level 4
    Continuous Monitoring

    Population-level testing is the default for high-risk areas. Monitoring routines are operational with defined thresholds, escalation paths, and closure tracking. Audit provides ongoing visibility rather than periodic snapshots. Governance coverage of AI systems is developing.

    Level 5
    AI-Enabled Oversight

    The audit function provides active governance over AI-enabled decision systems, operates intelligent monitoring at enterprise scale, and delivers strategic insights to executives and the audit committee in near-real-time. This is the destination -- few functions are here yet.

    Next in The Methodology: ERP Transition →

    Diagnosing where the function has lost its footing and how to sequence the recovery

    Recommended next step
    Reference Library
    Explore the standards, frameworks, and research that anchor the AI-Ready Audit Model -- useful for supporting modernization business cases and professional development.
    Open the reference library →