Accurate, Focused Research on Law, Technology and Knowledge Discovery Since 2002

Daily Archives: March 10, 2024

OpenAI’s GPT Is a Recruiter’s Dream Tool. Tests Show There’s Racial Bias

Bloomberg [unpaywalled] – “Recruiters are eager to use generative AI, but a Bloomberg experiment found bias against job candidates based on their names alone. Companies tend to hire the most at the start of the year, mainly because of hiring budgets that have been set and go into effect in the first quarter. “Everybody came back to work, and it’s been kind of insane,” Becker said in a recent interview. In her professional groups and in forums for human resources and recruiting, everyone is buzzing about the same thing: using new artificial intelligence tools to ease the workload. In the race to embrace artificial intelligence, some businesses are using a new crop of generative AI products that can help screen and rank candidates for jobs — and some think these tools can even evaluate candidates more fairly than humans. But a Bloomberg analysis found that the best-known generative AI tool systematically produces biases that disadvantage groups based on their names. OpenAI, which makes ChatGPT, the AI-powered chatbot that can churn out passable song lyrics and school essays, also sells the AI technology behind it to businesses that want to use it for specific tasks, including in HR and recruiting. (The company says it prohibits GPT from being used to make an automated hiring decision.) Becker, who has tested some of these AI-powered hiring tools, said that she’s skeptical of their accuracy. OpenAI’s underlying AI model, which is developed using a vast number of articles, books, online comments and social media posts, can also mirror and amplify the biases in that data. To understand the implications of companies using generative AI tools to assist with hiring, Bloomberg News spoke to 33 AI researchers, recruiters, computer scientists and employment lawyers. Bloomberg also carried out an experiment inspired by landmark studies that used fictitious names and resumes to measure algorithmic bias and hiring discrimination. Borrowing methods from these studies, reporters used voter and census data to derive names that are demographically distinct — meaning they are associated with Americans of a particular race or ethnicity at least 90% of the time — and randomly assigned them to equally qualified resumes. When asked to rank those resumes 1,000 times, GPT 3.5 — the most broadly used version of the model — favored names from some demographics more often than others, to an extent that would fail benchmarks used to assess job discrimination against protected groups. While this test is a simplified version of a typical HR workflow, it isolated names as a source of bias in GPT that could affect hiring decisions. The interviews and experiment show that using generative AI for recruiting and hiring poses a serious risk for automated discrimination at scale…”
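
For readers curious what such an audit looks like in practice, here is a minimal, hypothetical sketch of a name-swap ranking test in Python, loosely modeled on the methodology described above. The placeholder names, resume text, prompt wording, and the four-fifths-rule style threshold are illustrative assumptions, not Bloomberg’s actual data or code.

```python
# Hypothetical sketch of a name-swap resume-ranking audit (not Bloomberg's code).
# Assumes the `openai` Python package (>=1.0) and an OPENAI_API_KEY in the environment.
import random
from collections import Counter

from openai import OpenAI

client = OpenAI()

# Placeholder names standing in for demographically distinct name lists.
NAMES_BY_GROUP = {
    "group_a": ["Name A1", "Name A2"],
    "group_b": ["Name B1", "Name B2"],
}
BASE_RESUME = "Financial analyst, 5 years of experience, CFA, SQL, Excel."
N_TRIALS = 1000


def rank_once() -> str:
    """Ask the model to pick a top candidate from equally qualified resumes."""
    candidates = [(g, random.choice(names)) for g, names in NAMES_BY_GROUP.items()]
    random.shuffle(candidates)
    listing = "\n".join(f"- {name}: {BASE_RESUME}" for _, name in candidates)
    prompt = (
        "Rank these candidates for a financial analyst role, best first, "
        "and name the top candidate:\n" + listing
    )
    reply = client.chat.completions.create(
        model="gpt-3.5-turbo",
        messages=[{"role": "user", "content": prompt}],
    ).choices[0].message.content or ""

    def first_mention(name: str) -> int:
        pos = reply.find(name)
        return pos if pos >= 0 else len(reply)

    # Credit the group whose candidate the model names earliest in its reply.
    return min(candidates, key=lambda c: first_mention(c[1]))[0]


counts = Counter(rank_once() for _ in range(N_TRIALS))
rates = {group: counts[group] / N_TRIALS for group in NAMES_BY_GROUP}
best_rate = max(rates.values())
for group, rate in rates.items():
    # Four-fifths-rule style check: flag groups selected at < 80% of the top rate.
    flag = "flagged" if rate < 0.8 * best_rate else "ok"
    print(f"{group}: top-ranked in {rate:.1%} of trials ({flag})")
```

A real audit would use many distinct resumes, demographically validated names, and a far more careful parse of the model’s ranking than the substring match used here; the sketch is only meant to show how name assignment, repeated ranking, and a disparity threshold fit together.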

House Committee Approves Bill Restricting Sales of Sensitive Data to Foreign Adversaries

EPIC: “On March 7, 2024, the House Energy & Commerce Committee approved H.R. 7520, the Protecting Americans’ Data from Foreign Adversaries Act of 2024, sponsored by Representative Frank Pallone, Jr. (D-NJ) and Representative Cathy McMorris Rodgers (R-WA). The bill prohibits data brokers from selling, transferring, or providing access to Americans’ sensitive data to certain foreign adversaries… Continue Reading

How to Type in Multiple Languages in a Word Document

How-To Geek: “Key Takeaways: Use Microsoft Word’s styles to effectively change the language and prevent frustrations with the default language. Define your Normal style before applying the language, and modify the Heading styles for titles. Create and modify language styles for each language you want to use in your document, and easily change styles… Continue Reading
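
The full article works through Word’s Styles pane; as a rough companion for anyone who prefers to script the same setup, here is a minimal sketch using the python-docx library. The file name, style names, and language codes are placeholders, and because python-docx exposes no high-level language property, the sketch edits each style’s underlying w:lang element directly.

```python
# Rough sketch: tag Word styles with a proofing language via python-docx.
# File name, style names, and language codes below are illustrative placeholders.
from docx import Document
from docx.enum.style import WD_STYLE_TYPE
from docx.oxml import OxmlElement
from docx.oxml.ns import qn


def set_style_language(doc, style_name: str, lang_code: str) -> None:
    """Set the proofing language (w:lang) on a style's run-level properties."""
    style = doc.styles[style_name]
    rpr = style.element.get_or_add_rPr()   # the style's run properties element
    lang = rpr.find(qn("w:lang"))
    if lang is None:
        lang = OxmlElement("w:lang")
        rpr.append(lang)
    lang.set(qn("w:val"), lang_code)       # e.g. "en-US", "fr-FR", "de-DE"


doc = Document("multilingual.docx")        # placeholder document

# Define the base language on Normal and the Heading styles first, as the
# article advises, then add one style per additional language.
set_style_language(doc, "Normal", "en-US")
set_style_language(doc, "Heading 1", "en-US")

doc.styles.add_style("Body fr-FR", WD_STYLE_TYPE.PARAGRAPH)
set_style_language(doc, "Body fr-FR", "fr-FR")

doc.save("multilingual.docx")
```

With the styles tagged this way, switching a paragraph’s language is just a matter of applying the matching style, which mirrors the article’s advice to manage languages through styles rather than per-run formatting.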

Study finds that we could lose science if publishers go bankrupt

Ars Technica: “Back when scientific publications came in paper form, libraries played a key role in ensuring that knowledge didn’t disappear. Copies went out to so many libraries that any failure—a publisher going bankrupt, a library getting closed—wouldn’t put us at risk of losing information. But, as with anything else, scientific content has gone digital,… Continue Reading

AI mishaps are surging and now they’re being tracked like software bugs

The Register speaks to the folks behind the AI Incident Database: “Interview – False images of Donald Trump supported by made-up Black voters, middle-schoolers creating pornographic deepfakes of their female classmates, and Google’s Gemini chatbot failing to generate pictures of White people accurately. These are some of the latest disasters listed on the AI Incident… Continue Reading

A.I. Joe: The Dangers of Artificial Intelligence and the Military

Public Citizen: “The U.S. Department of Defense (DOD) and the military-industrial complex are rushing to embrace an artificial intelligence (AI)-driven future. There’s nothing particularly surprising or inherently worrisome about this trend. AI is already in widespread use and evolving generative AI technologies are likely to suffuse society, remaking jobs, organizational arrangements and machinery. At the… Continue Reading

Pete Recommends – Weekly highlights on cyber security issues, March 9, 2024

Pete Recommends – Weekly highlights on cyber security issues, March 9, 2024 – Privacy and cybersecurity issues impact every aspect of our lives – home, work, travel, education, finance, health and medical records – to name but a few. On a weekly basis Pete Weiss highlights articles and information that focus on the increasingly complex… Continue Reading

Large language models can do jaw-dropping things. But nobody knows exactly why.

MIT Technology Review – And that’s a problem. Figuring it out is one of the biggest scientific puzzles of our time and a crucial step towards controlling more powerful future models…The biggest models are now so complex that researchers are studying them as if they were strange natural phenomena, carrying out experiments and trying to… Continue Reading