Closing the AI trust gap.

Only 52% of employees welcome AI. Why? There is a lack of trust that AI will be deployed responsibly.

Depiction of two women on opposite sides of the AI trust gap.


Understanding sentiment with stats.

of leaders welcome AI adoption in their organization.

of employees say their company has shared guidelines on responsible AI.

of leaders agree AI processes should allow for human intervention.

of employees believe there isn’t enough clarity around what should and should not be automated.

"When I think about the future of AI, I think about the different disciplines coming together—from the humanities, the social sciences, engineering, legal, policy, HR, IT—to make sure that we take the time now to learn from past technological revolutions so that this one creates the best possible future, for everyone."

—Kathy Pham, VP, Artificial Intelligence and Machine Learning, Workday


Factors that are driving distrust.

AI needs human intervention.

Leaders and employees want human involvement in AI processes but are unclear on the best way to achieve it.

Putting employees’ interests second.

Employees worry their organization will put company interests above their own when implementing AI.

Awareness of AI governance.

Employees remain skeptical about whether organizations are collaborating on AI regulation and sharing responsible AI guidelines.


Leaders closing the AI trust gap.

See how leaders are taking on the challenge of building trust through responsible AI practices, and hear about the opportunities that lie ahead.

Ready to talk? Get in touch.