Diversity Is Key to Eradicating Implicit Bias in AI Solutions


The age of digitalization is upon us. Stories about the impact technological advances are having, and will have, on our lives, our work, and our futures are everywhere. And truly, diversity is key to eradicating implicit bias in AI solutions, and in technology in general.

The term implicit bias refers to the process by which our brains notice patterns and make generalizations based on observations and experiences. We often refer to this process as stereotyping, and our brains do it unconsciously, all the time. For each of us, our unconscious, or implicit, biases play a role in how we understand the larger world around us.

But, are we considering the role implicit biases will play in the development of this technology?

“This tendency for stereotype-confirming thoughts to pass spontaneously through our minds is what psychologists call implicit bias. It sets people up to overgeneralize, sometimes leading to discrimination even when people feel they are being fair,” write Keith Payne, Laura Niemi, and John M. Doris, for Scientific American.

“We all bring unconscious biases into the workplace,” writes Laura Berger. “These deeply subconscious attitudes span race, gender, appearance, age, wealth and much more. They influence everything from the car you drive to the employee you promote and the one you don’t. And because they are so reflexively triggered without our knowledge, they are virtually unconcealable.”

So as we contemplate our futures, I find myself wondering: Who is monitoring the unconscious biases held by those developing the technological solutions to tomorrow’s societal problems?

If we are not already discussing this, we need to start. Today.

AI in L&D: Benefits and concerns

An example of our increasing reliance on technology in L&D is the use of Artificial Intelligence (AI). My colleague, Annika von Redwitz, and I are keenly aware of the stated benefits of using AI in learning and development. As we see it, the impact of AI on L&D has the potential to disrupt the delivery of corporate learning in the future.

However, while we envision much good to come out of this, we both share concerns about the implicit biases programmers may be imparting to the technology they develop for L&D.

Why? Learning and development are critical to any company’s success today. And, to be successful, L&D must prepare leaders, train managers, inspire employees, develop great communicators, promote diversity, and ensure teams are high-performing. According to PwC, by the 2030s, 38 percent of all U.S. jobs could be replaced by AI and automation.

“Many people say AI will get ‘smarter’ over time as it is used,” writes Annika in a recent article we co-authored for “Training Industry.” “Of course, this is true, but we need to make sure the recognition software doesn’t inhibit creativity or reinforce thinking patterns that may need to change – not unlike what can happen when internal trainers do all the training in organizations for their peers.”

“Unconscious bias is our tendency to make mental shortcuts,” said Natalie Johnson, a partner at Paradigm, a firm that helps companies with diversity and inclusion. “While these shortcuts are helpful–they enable us to make decisions quickly–they can be prone to error. They can especially be prone to error when making decisions about people.”

Research published by Infosys in 2017 shows AI is perceived as a long-term strategic priority for innovation, with 76 percent of the respondents citing AI as fundamental to the success of their organization’s strategy, and 64 percent believing that their organization’s future growth is dependent on large-scale AI adoption.

“Tech companies have made big advances in terms of building artificially intelligent software that gets smarter over time and potentially makes life and work easier,” writes Michelle Cheng, for Inc. “But these examples reveal an uncomfortable reality about A.I.: even the most intelligent bots can be biased.”

Ideally, thanks to digitalization, we will all have more time to focus on people and human interaction. But we need to remember that human beings are developing technology like AI, each of them bringing implicit biases that impact the solutions they design.

Not only is it essential that diverse teams (of humans) work well together to develop those algorithms–it is imperative that we continue discussing how to manage the potential for problems caused by stereotypes and unconscious biases.

A version of this post was first published on Inc.

Photo credit: 123rf.com

The Role of AI in Learning and Development


We have entered the Age of Artificial Intelligence. And, while many of us have heard how AI will impact market segments like manufacturing or R&D, I find myself wondering: What about other areas of business–like L&D? How will AI affect learning and development?

As James Paine points out, “It wasn’t so long ago that artificial intelligence was reserved to the realm of science fiction according to the public.”  AI grew exponentially in 2017 and is projected to be even bigger in 2018.

So, what will we need to know to make the best use of AI in Learning and Development?

It’s a bit challenging. Most of us are not yet even consciously aware of the AI we’re already using. From online shopping’s search and recommendation functions to voice-to-text on our phones and AI-powered personal assistants like Alexa or Siri, our personal and work lives are already impacted by these new technologies.

Leading research and advisory company Gartner projects that AI bots will power 85 percent of customer service interactions by 2020 and will drive up to $33 trillion of annual economic growth.

What role will AI play in Learning and Development?

Given the fast pace of technological and societal change, L&D teams have to stay abreast of the latest approaches and methodologies as they develop their learning strategies. Gone are the days of one-size-fits-all. AI will provide insights based on the enormous amount of data it has collected and analyzed, which will facilitate the creation of customized learning programs–faster than before.

Access to these insights and data will allow us to develop a better understanding of learner behaviors and to predict needs by recommending and positioning content based on past behavior, according to Doug Harward and Ken Taylor in their article for Training Industry.

Adaptive learning that is personalized to the individual is a powerful way to engage today’s workforce, but Harward and Taylor point out that the challenge facing L&D is to be able to make sense of the data and to leverage those insights to drive business value.
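To make that idea concrete, here is a minimal, purely illustrative sketch of what “recommending content based on past behavior” can look like under the hood. The course catalog, skill tags, and learner history are hypothetical, not drawn from any real L&D platform; production systems use far richer signals and models.

from collections import Counter

# Hypothetical catalog: each course is tagged with the skills it covers.
catalog = {
    "Giving Feedback": {"communication", "leadership"},
    "Data Storytelling": {"communication", "analytics"},
    "Coaching Basics": {"leadership", "coaching"},
    "SQL Fundamentals": {"analytics", "data"},
}

def recommend(completed, top_n=2):
    # Score unseen courses by how much they overlap with the skills
    # the learner has already shown interest in through past behavior.
    interest = Counter(tag for course in completed for tag in catalog[course])
    scores = {
        course: sum(interest[tag] for tag in tags)
        for course, tags in catalog.items()
        if course not in completed
    }
    return sorted(scores, key=scores.get, reverse=True)[:top_n]

print(recommend({"Giving Feedback"}))  # e.g. ['Data Storytelling', 'Coaching Basics']

Even a toy example like this shows where the value, and the risk, sits: the recommendations are only as good, and only as fair, as the data and assumptions behind them.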

“As with AI in all its applications across diverse industries, there will be many positives, negatives, and…unknowns,” says Massimo Canonico, head of solutions engineering for Docebo. He sees a potential for reduction in the time spent in program development. But Canonico raises some concerns: Legacy L&D teams may feel they are relinquishing vital aspects of their jobs to automation, while the reality is that AI is an algorithm, not a magic wand, and will not be able to fix everything. “It will not fix garbage content,” he writes.

What do we need to consider in developing, using and promoting the use of AI products?

“Today learning is about ‘flow,’ not ‘instruction,’ and helping bring learning to people throughout their digital experience,” says Josh Bersin. He believes it’s imperative that L&D focus on “experience design,” “design thinking,” the development of “employee journey maps,” and much more experimental, data-driven solutions in the flow of work.

Bersin believes the job of L&D and HR is to understand what employees’ jobs are, learn about the latest tools and techniques to drive learning and performance, and then apply them to work in a modern, relevant, and cost-effective way. “We’ve been doing this for decades, and now we just have to learn to do it again – albeit with a vastly new set of technologies and experiences,” he states.

Practical considerations for evaluating and assessing AI solutions will include whether they have been developed mobile-first, that is, designed for use on mobile devices so content is displayed for easy mobile consumption.

One of the most important considerations in choosing an AI solution will be the level of analytics the solution can deliver. “If we are going to succeed when it comes to personalized learning, we have to understand how we learn, and when we learn most effectively,” says Rob May, in a post for Training Journal. However, he cautions, leaders in L&D and HR must remember that technology should never replace human interaction.

May’s comment resonates with me. I taught in a German program for five years, one that selected the best Ph.D. candidates from the country’s top schools in AI and robotics. The students traveled from company to company around Germany to attend courses, and my class on Intercultural Communication got the best scores on evaluations.

While the students were absolute wizards on the technological front, I was teaching them soft skills: how to sell their ideas at conferences, position their products or projects internationally, and develop partnerships abroad. The inclusion of the human touch made the course both popular and useful.

How is bias eliminated in AI?

One of the most fascinating and challenging issues related to AI in L&D is bias. How can we eliminate bias in the development of these tools? AI can be taught to provide the best interpretation of the data sets, the right course for an individual, or the perfect candidate for an open position. But it needs to be programmed to do so. And the human beings who create the AI solutions come to their work complete with conscious and unconscious biases.

So, it becomes increasingly clear that the developers of AI and machine learning solutions must come from a diverse pool, and that the data used to train the algorithms in those tools must be free of bias. “Even though AI learns–and maybe because it learns–it can never be considered ‘set it and forget it’ technology. To remain both accurate and relevant, it has to be continually trained to account for changes in the market, your company’s needs, and the data itself,” state the authors of How AI Can End Bias: Yvonne Baur, Brenda Reid, Steve Hunt, and Fawn Fitter.
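As a small illustration of what inspecting that training data might involve in practice, here is a minimal sketch, using hypothetical records and group labels, of one basic check a team could run before each retraining cycle: are all groups represented, and labeled, at comparable rates? Real bias audits go much further, but the habit of looking at the data is the point.

from collections import Counter

# Hypothetical training records: (group, was_labeled_high_potential)
records = [
    ("group_a", True), ("group_a", True), ("group_a", False),
    ("group_b", True), ("group_b", False), ("group_b", False),
]

totals = Counter(group for group, _ in records)
positives = Counter(group for group, label in records if label)

for group in totals:
    rate = positives[group] / totals[group]
    print(f"{group}: {totals[group]} examples, {rate:.0%} labeled high potential")

# Large gaps in representation or label rates are one signal to revisit the
# data before training, and again at every retraining cycle.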

The benefits of AI are many, and the concerns valid. In the final analysis, however, we will need to remember to deal with AI solutions in the same way we build learning and development programs: Identify the problem we’re trying to solve or the topic on which we are training, and then find the best technological solution to help facilitate the end result.

A version of this post was first published on Inc.com

Photo by Alex Knight on Unsplash