
Narrowing the workplace’s AI ‘trust gap’

AI has proven its potential when it comes to efficiency, creating new and exciting career paths and enabling workers to focus on higher-value tasks. However, some questions remain in the workplace around its capabilities and limitations – including risks and possible overreliance on AI solutions. There’s real opportunity for employers to build confidence around AI and demonstrate its importance for augmenting potential, driving value and developing careers.

The lack of confidence and trust between employees and employers, known as the AI trust gap, is not surprising given the relative immaturity of GenAI and the exponential pace at which the technology is evolving. The World Economic Forum (WEF) reported this year that only 55% of employees are confident their organization will ensure AI is implemented in a responsible and trustworthy way, and 42% believe their company doesn't have a clear understanding of which systems should be fully automated and which require human intervention.

To unlock AI’s full potential, our organizations will need to bridge this trust gap and ease lingering trepidation. As we’ve leaned further into AI investment and adoption at EY, our priority is keeping people at the centre of all change. One way we’ve sought to better understand workforce sentiment is by conducting regular focus groups with broad representation of EY people. The key finding?

That employees want to be closely tied to leadership on the AI journey.

This ongoing consultation has taught us five important lessons along the way:

  1. Develop a set of principles for responsible AI – The WEF reports that a set of guiding principles delineating the organization's ethical boundaries and commitments should be at the heart of any successful AI strategy. This ensures a unified approach to ethical decision-making in both developing and deploying AI applications. To set our AI foundation, EY formalized a set of principles that cover everything from accountability, security and privacy to transparency, fairness and inclusivity, and professional responsibility.
  2. Learn what’s causing concern – According to EY research in the US, three-quarters of workers are concerned AI will make jobs obsolete and nearly two-thirds are anxious about AI replacing jobs. At the same time, approximately two-thirds are concerned they’ll fall behind if they don’t use AI at work, and another two-thirds are anxious about not knowing how to use AI ethically. I mentioned our focus groups and employee feedback; we also use surveys to take the pulse of our people. Once you know what people are worried about, communications can be tailored to assuage specific concerns.
  3. Establish regular communication – Communicate clearly and effectively around AI adoption and opportunities. That means laying out a strategic vision and value proposition, along with intentions to upskill the workforce, so employees know they’ll be part of the change. From there, cultivate an open dialogue and communication channels where questions can be addressed, feedback can be considered and employees are encouraged to co-create solutions. Communications shouldn’t just detail what’s going to happen, but how and why. Being clear about how AI augments the work, and allowing employees to become familiar with the tools, can help them embrace AI opportunities. In September 2023, we launched EYQ, our own AI assistant with secure chat capabilities that our people use to ask questions, gather ideas, brainstorm and boost overall productivity. Adding these kinds of user-friendly tools into everyday roles has helped to boost user confidence.
  4. Enable new skills development – Surprisingly, given the positive sentiment toward GenAI and its anticipated increase in usage, bolstering GenAI skills was found to be a low priority in EY’s 2023 Work Reimagined survey: just 17% of employees and 22% of employers named training in GenAI skills as a top priority. However, by demonstrating a commitment to invest in upskilling the workforce, people know they aren’t being left behind. Upskilling everyone (not just some groups) will create a more level playing field and help avoid worker displacement. To lessen feelings of discomfort at EY, we’ve invested in foundational AI learning for all. More than 50,000 people have already earned AI learning badges, and more than 100,000 people have taken our AI learning.
  5. Keep people at the centre of transformation – A human-centric approach to AI can help organizations get ahead of mistrust. Research EY conducted with Oxford University’s Saïd Business School highlights the critical role of human behavior, particularly emotional factors, in the success of transformations. Focusing on the people involved in the change, considering their emotions, and mastering the six drivers of transformation success can increase the likelihood of success by a factor of 2.6, not to mention cultivate trust and drive better outcomes, including a stronger culture and enhanced productivity.

At EY, we’re seeing the benefits of AI in our work, with new tools reducing more cumbersome tasks so our professionals can focus on higher-value work. We don’t yet have the answer to every AI question, but we’re working hard to be transparent along the way. Building employee trust is a process that requires effort from leaders and an inclusive environment where everyone feels like an important stakeholder. Today’s organizations have an opportunity to lay the foundation for successful AI implementation by demonstrating the value of AI and how it can augment work and careers – and by empowering employees to be part of the process.

What are some of the ways you’ve seen this ‘trust gap’ successfully reduced in your workplace?