Why Your AI Strategy Needs an Audit Trail
AI adoption depends on trust. Here’s why transparency and auditability are essential for HR leaders.
Emily Faracca
Multimedia Content Writer
Workday
Audio also available on Apple Podcasts and Spotify.
Imagine walking into your office to find a new expert consultant has been hired to manage your team’s payroll and compliance. This consultant is brilliant, lightning-fast, and never sleeps, but they refuse to explain how they reach their conclusions. They hand you a finalized compensation report and simply say, "Trust me."
This is precisely the dilemma facing today's workforce, but with AI. Organizations are eager to capture the efficiency of AI, yet adoption stalls because the humans behind the screens can’t see the gears turning.
Yes, people understand that the technology is powerful and full of possibilities. But what does it mean for the routines and processes we’ve all come to know? And exactly how does AI arrive at decisions with big operational implications?
Trust comes before adoption, and transparency is central to trust. This was a key focus explored by Cristina Goldt, Workday general manager of workforce management and payroll, and Harper Carroll, AI expert and educator, in a recent conversation.
Their ultimate takeaway? The era of the black box is over.
One of the most interesting dynamics in AI adoption today is what might be called the trust–adoption paradox.
“Adoption is really the key,” Goldt says. “How do you get people to trust if they don't adopt, but they're not adopting if they don't trust?”
This circular challenge is playing out across HR right now. Leaders want their teams to leverage AI to automate repetitive tasks, accelerate workflows, and reduce compliance risk. But employees hesitate when systems feel opaque, especially in functions tied to compensation, financial reporting, or regulatory exposure.
The breakthrough often happens only after teams see the technology in action themselves. “What we're seeing is that they are more trusting as they use it more,” Goldt explains.
Trust and confidence grow through experience. But experience requires enough transparency to take the first step. For HR leaders, this means adoption strategies cannot focus solely on capability. They must focus on clarity.
There’s work to be done. Sentiment pulses from Deloitte showed that trust in company-provided generative AI tools had fallen to 31%, while trust in agentic AI systems that act independently “dropped 89% during the same period, as employees grew uneasy with technology taking over decisions that were once theirs to make.”
In many parts of the business, AI experimentation can tolerate ambiguity. Marketing teams might test generative tools. Product teams might experiment with optimization engines.
Payroll is different. Compensation decisions, tax calculations, benefits administration, and regulatory compliance all demand precision and defensibility.
“It’s not about a black box where you do not know why it ended up with that decision,” Goldt says. “It really is knowing all the steps.”
Even when AI is not making final decisions, it is influencing outcomes. That influence must be traceable. In financial and people systems, explainability serves several critical functions.
Transparency turns AI from a mysterious optimizer into a collaborative partner. When systems surface their reasoning, HR professionals can validate, challenge, and improve them. That interaction builds capability on both sides.
In finance and payroll, audit trails are essential. One of the most practical trust builders in AI systems is auditability: the ability to trace outputs back through the process that produced them.
“If we're talking about a financial process or a payroll process, you audit it,” says Goldt. “So there's an audit mechanism to build that trust.”
Carroll adds that AI often has “quite a good thought process,” and that “if it comes to a decision, it naturally is an optimizer, so we can learn from its thought process.”
More than protecting against error, auditing reinforces accountability. For HR leaders, especially those overseeing global payroll operations or navigating evolving regulatory environments, the reputational and financial risks of non-compliance are enormous.
Continuous monitoring and explainable workflows mitigate that exposure. More importantly, they create psychological safety within teams. Employees can rely on the system because they can interrogate it.
This is especially important in conservative or compliance-heavy functions—the very teams that might initially seem least likely to embrace AI. Interestingly, those teams are often among the strongest adopters once transparency is established. That’s because, as Goldt observes, “they are seeing the value.”
The value becomes visible when the process is visible.
For HR leaders, AI transparency goes beyond system design. It’s a leadership mandate.
Teams take cues from leadership behavior. If leaders treat AI outputs as unquestionable authority, employees will feel displaced. If leaders treat AI as a transparent collaborator—interrogating outputs, modeling critical thinking, and reinforcing human accountability—employees will feel empowered.
“Leaders who adopt AI, their employees are twice as likely to adopt it as well,” Carroll says. “So you as a leader really have a kind of a responsibility to adopt AI.”
The future of AI in HR will be defined not by raw automation capability, but by how clearly organizations can articulate the specific ways AI supports their people and processes.
The era of opaque systems is ending. The organizations that thrive will be those that embrace a “glass box” philosophy—one where processes are visible, auditable, and collaborative.
Performance metrics and efficiency proclamations might excite the C-suite, but when it comes to enterprise adoption, visibility is everything. For risk-conscious HR leaders, this means asking a new set of questions when evaluating AI.
Removing the veil of mystery around AI helps everyone get on board. Build a culture of transparency and trust as a strategy, and adoption will naturally follow as the outcome.
Is your business ready to transform its approach to HR and payroll? Download the Workday HCM and Payroll Buyer's Guide to refine your solution selection criteria and build a compelling business case for change.