Illustration: Eniola Odetunde/Axios

A huge controversy in the U.K. over an algorithm used to substitute for university-entrance exams highlights problems with the use of AI in the real world.

Why it matters: From bail decisions to hate speech moderation, invisible algorithms are increasingly making recommendations that have a major impact on human beings. If they're seen as unfair, what happened in the U.K. could be the start of an angry pushback.

What's happening: Every summer, hundreds of thousands of British students sit for advanced-level qualification exams, known as A-levels, which help determine which students go to which universities.

  • Because of the coronavirus pandemic, however, the British government canceled A-levels this year. Instead, the government had teachers give an estimate of how they thought their students would have performed on the exams.
  • Those predicted grades were then adjusted by Ofqual, England's regulatory agency for exams and qualifications, using an algorithm that weighted the scores based on the historic performance of individual secondary schools.
  • The idea was that the algorithm would compensate for the tendency of teachers to inflate the expected performance of their students and more accurately predict how test-takers would have actually performed.
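Ofqual's actual model was considerably more complex, but the general shape of the adjustment described above can be sketched as a weighted blend of a teacher's prediction and the school's historical average. Everything below (the function name, the blending rule, and the 60% weight) is illustrative, not Ofqual's published methodology:

```python
# Toy sketch of school-history weighting (NOT Ofqual's actual model):
# pull each teacher-predicted grade toward the school's historical
# average, with the weight on school history chosen by the modeler.

def adjust_grade(teacher_prediction: float,
                 school_historic_avg: float,
                 history_weight: float = 0.6) -> float:
    """Blend a teacher's predicted grade with the school's past results.

    history_weight is the fraction of the final score taken from the
    school's historic average; the remainder comes from the teacher's
    prediction. All names and the 0.6 default are illustrative.
    """
    return (history_weight * school_historic_avg
            + (1 - history_weight) * teacher_prediction)

# A strong student at a historically weak school is pulled down...
print(adjust_grade(teacher_prediction=90, school_historic_avg=60))  # 72.0
# ...while a middling prediction at a high-performing school is pulled up.
print(adjust_grade(teacher_prediction=70, school_historic_avg=85))  # 79.0
```

With a large weight on school history, an individual's own predicted grade matters less than where they went to school, which is the pattern of downgrades and upgrades reported below.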

The catch: It didn't quite work out that way.

  • When students received their predicted A-level results last week, many were shocked to discover that they had "scored" lower than they had expected based on their previous grades and performance on earlier mock exams.
  • For some, the algorithm-weighted results meant that they were now ineligible for the university programs they had expected to attend, potentially altering the course of their future careers.
  • Around 40% of the predicted performances were downgraded, while only 2% of marks increased. The biggest victims were students with high grades from less-advantaged schools, who were more likely to have their scores downgraded, while students from richer schools were more likely to have their scores raised.

Of note: The BBC reported that among the students hurt by the algorithmic scoring was an 18-year-old who last year won an award for writing a dystopian short story about an algorithm that sorts students into bands based on economic class.

Be smart: The testing algorithm essentially reinforced the economic and societal bias built into the U.K.'s schooling system, leading to results that a trio of AI experts called in The Guardian "unethical and harmful to education."

  • Accusations of algorithmic bias have also been aimed at the International Baccalaureate (IB), which canceled its exams and used a similar predictive model, one that likewise seemed to penalize students from historically lower-performing schools.

What's new: After days of front-page controversies, the British government on Monday abandoned the algorithmic results, instead deciding to accept teachers' initial predictions.

Yes, but: British students may be relieved, but the A-level debacle showcases major problems with using algorithms to predict human outcomes. "It's not just a grading crisis," says Anton Ovchinnikov, a professor at Queen's University's Smith School of Business who has written about the situation. "It's a crisis of data abuse."

  • To students on the outside, the algorithms used to adjust their grades appeared to be an unexplained black box — a frequent concern with AI systems. It wasn't clear how students could appeal predicted scores that often made little sense.
  • Putting what Ovchinnikov notes was a "disproportionately large weight" on schools' past performance meant that students — especially those from disadvantaged backgrounds — lost the chance to be treated as individuals, even though scoring high on A-levels and going to an elite university arguably represents one of the best opportunities for individuals to improve their lot in life.
  • To avoid such disasters in the future, authorities need to "be more inclusive and diverse in the process of creating such models and algorithms," says Ed Finn, an associate professor at Arizona State University and the author of "What Algorithms Want."

"AI systems are easy to scale. But if there's a problem, it's also easier to duplicate that problem."
— Anton Ovchinnikov

The bottom line: Bias, positive and negative, is a fact of human life — a fact that AI systems are often meant to counter. But poorly designed algorithms risk entrenching a new form of bias that could have impacts that go well beyond university placement.

