DB 7 – MGT 3210

Description

Case 7 – Biased Feedback Increases Employee Exit Rates, Study Says

By Mark Feffer – September 22, 2023

Although the pervasiveness of bias in the workplace has gotten increasing attention in recent years, the on-the-ground challenges haven’t changed much, according to new research from Textio.

In the last decade, “what hasn’t changed are the same broad patterns of bias in feedback that show up in the workplace year after year,” said Textio CEO Kieran Snyder.

And there’s a clear connection between the quality of performance feedback and employee retention rates, the report said. A key finding: Employers can change their retention patterns by making sure all employees regularly receive clear and actionable feedback.

Research published last year by Textio found that women, Black, and Hispanic workers systematically received significantly lower-quality feedback than their coworkers. This year, Textio’s report, Language Bias in Performance Feedback, found that not much has changed during the intervening months: women and people of color continue to receive the lowest-quality feedback, and those groups are leaving their employers at the highest rates.

Low Quality, Low Action

Among the report’s findings:

Over 50% of employees received at least some feedback that was not actionable.

People receiving low-quality feedback are 63% more likely to leave their organizations than others, and 38% show signs of attrition risk in their current roles.

Black employees get 26% more unactionable feedback than non-Black employees, despite only receiving 79% as much feedback overall.

Most men, 83%, say they understand what’s required to earn their next promotion. In comparison, just 71% of women, non-binary, and transgender people, and only 54% of Asian people of all genders, feel the same way.

People who get performance reviews containing “I think” hedging statements are 29% more likely to leave the company within a year.

In addition, women and people of color consistently receive lower-quality feedback than white men, the report said. As a result, they’re far more likely to leave their company within 12 months.

Bias About Bias

Those who believe generative AI and similar technologies will help address bias should bear in mind that end users – employees and candidates – aren’t sold on the idea. For example, a significant number of employed job seekers believe AI recruiting tools are more biased than their human counterparts.

Such concerns aren’t new. A 2019 report from NYU’s AI Now Institute found that, like most of the technology sector, the employees of companies building AI solutions are largely male and white.

“We live in a biased world, so AI systems absorb this training data and may correctly replicate decisions humans make,” John Harney, CTO of DataScava, a New York-based provider of unstructured data-mining solutions, said at the time. “Do we expect AI systems to somehow filter out bias when we can’t?” Solutions to that conundrum aren’t easy, he said. From a technical point of view, creating such filters can’t help but reduce a system’s intelligence, even though unfiltered systems produce unintended outcomes.

What is/are the problem/s?

What are the causes? (The article gives you many possible causes; what do you think? Apply your subject knowledge and personal experience to understand why this issue is happening. The article mainly talks about things from an international perspective. How do these problems translate to a US context?)

What solutions do you propose? 
