Unpacking AI's bias problem in recruitment.
In this week's Careers Newsletter, read BoF senior correspondent Sheena Butler-Young's latest: AI's Big Bias Problem A report by Bloomberg this month is casting fresh doubts on generative artificial intelligence's ability to improve the recruitment outcomes for human resource departments. In addition to generating job postings and scanning resumés, the most popular AI technologies used in HR are systematically putting racial minorities at a disadvantage in the job application process, the report found. In an experiment, Bloomberg assigned fictitious but "demographically distinct" names to equally-qualified resumés and asked OpenAI's ChatGPT 3.5 to rank those resumés against a job opening for a financial analyst at a real Fortune 500 company. Names distinct to Black Americans were the least likely to be ranked as the top candidate for a financial analyst role, while names associated with Asian women and white men typically fared better. This is the sort of bias that human recruiters have long struggled with. Now, companies that adopted the technology to streamline recruitment are grappling with how to avoid making the same mistakes, only at a faster speed. —> READ MORE | |
|
| Case Study — How to Turn Data Into Meaningful Customer Connections | Before fashion businesses can put artificial intelligence to work or target the right shoppers online, they need good data and a deep understanding of who their customers are and what they want.
In our latest BoF Professional case study, "How to Turn Data Into Meaningful Customer Connections," BoF's technology correspondent Marc Bain, examines how fashion businesses can truly understand their customers, allowing them to make smarter decisions that serve shoppers and drive results. It provides an in-depth look at the data capabilities and consumer-research methods Tapestry Inc. has used to fuel success at its Coach brand, particularly among Gen-Z shoppers, who now see Coach's handbags as among the most coveted on the market. | | |
No comments:
Post a Comment