GitHub has a vision to make code more secure by design
GitHub is betting a yearslong investment in generative AI will fulfill the software security dreams of cybersecurity practitioners and government officials.
Why it matters: Some of the most common network vulnerabilities exploited by malicious hackers result from insecure and outdated coding practices.
- But many programmers don’t keep track of which coding frameworks or popular open-source tools have been tied to which software flaws, leaving a mountain of security vulnerabilities that hackers are eager to exploit.
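The article doesn't give a concrete example, but a classic illustration of the kind of insecure coding practice at issue is SQL injection: building a query by pasting in user input, versus the parameterized query a security-aware assistant would suggest. The snippet below is a generic sketch, not taken from GitHub or Copilot.

```python
# Illustrative only: an insecure pattern (SQL injection) and its common fix.
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE users (name TEXT)")
conn.execute("INSERT INTO users VALUES ('alice')")

user_input = "alice' OR '1'='1"  # attacker-controlled value

# Insecure: string formatting lets the input rewrite the query itself.
insecure = conn.execute(
    f"SELECT name FROM users WHERE name = '{user_input}'"
).fetchall()

# Secure: a parameterized query treats the input as data, not SQL.
secure = conn.execute(
    "SELECT name FROM users WHERE name = ?", (user_input,)
).fetchall()

print(insecure)  # the injection matched every row
print(secure)    # the literal string matched nothing
```

Here the insecure version returns all rows because the `OR '1'='1'` clause is executed as SQL, while the parameterized version correctly returns none.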
The big picture: GitHub, a Microsoft-owned service that allows developers to store their code, widely released a generative AI-enabled tool, called GitHub Copilot, last year to help bridge this knowledge gap.
- The program gives developers predictive suggestions to speed up the time it takes to write code, as well as to make their code more secure.
Driving the news: A year after launching the product, GitHub released a report this week at the Collision conference in Toronto detailing how Copilot users are responding to the product.
- So far, users are accepting an average of 30% of Copilot’s suggestions. The acceptance rate "steadily increased as developers became more familiar with the tool," the report noted.
- Users with less coding experience were slightly more inclined to accept Copilot's suggestions, the report found.
- GitHub estimates that more than 1 million developers and 20,000 organizations are using Copilot.
- However, most generative AI security products released so far focus on analyzing information about cyber threats or assessing whether a network is affected by a recently discovered vulnerability. Few, if any, help coders write more secure code from the very start of a project.
What they're saying: "It’s kind of like a driver assistance system," GitHub CEO Thomas Dohmke tells Axios. "It doesn’t prevent all accidents that can happen, but it makes traffic a little bit more secure."
Between the lines: Dohmke says he doesn’t see generative AI resulting in fewer jobs for developers and programmers.
- Instead, he believes generative AI will just speed up the writing process and help clear out the backlog of projects that developers often face, as well as allow them to conduct security reviews at an earlier stage.
- "If you go to any software company, backlogs are endless, ideas are unlimited, and things take way too long," he says.
The intrigue: GitHub's usage data on Copilot arrives as the Biden administration is pushing organizations to adopt new principles that prioritize cybersecurity in all aspects of product development.
- Recent high-profile breaches involving popular open-source projects have been tied to flaws in code, like the widespread Log4j flaw disclosed in 2021. That's partly because open-source projects are often maintained by volunteers who don't have the resources to conduct frequent security reviews.
- Mike Hanley, GitHub’s chief security officer and senior vice president of engineering, told Axios that as Copilot gets better, he envisions opportunities for the technology to help review old open-source code, like Log4j, for new vulnerabilities.
Yes, but: GitHub, Microsoft and OpenAI are facing a 2022 class action lawsuit claiming that Copilot — which was trained on public code scraped from the web — has failed to credit the authors of that code.
- In May, the three companies asked a judge to dismiss the case.