Welcome to Codebook, a cybersecurity newsletter written by a man who has gotten lost in a Starbucks.
Situational Awareness: Google CEO Sundar Pichai is currently (as of newsletter release) testifying before the House Judiciary Committee. You can stream that here.
If you've got tips or story ideas, I'd love to see them. Just reply to this email.
Illustration: Rebecca Zisser/Axios
At a Federal Trade Commission hearing on Wednesday, Malcolm Harkins, chief security and trust officer at Cylance, will pitch his pet idea: The government should hold companies that make security software — like his — accountable.
The big picture: Harkins is hoping the FTC, or at least someone, will require companies to "disclose all of the controls that failed" during a breach — from the security flaws exploited by hackers to the security products that failed to catch the attacks.
To be clear, this isn't the type of idea the FTC usually goes for. The FTC's regulatory powers are largely based on its mandate to fight unfair practices — in cybersecurity that means deceptive claims of privacy protection. It's not an IT advice shop.
Why it matters: If Harkins' idea ever gets adopted, we'd know a lot more about blind spots in breach prevention.
That doesn't mean breaches always result from problems with products. But a company whose product roster matches that of a breached competitor might want to know how that combination failed.
Security vendors would almost certainly push back against any such scheme, as has happened whenever Harkins has raised the idea in the past.
The bottom line: The security industry is not at a point where it's comfortable with the message that "even the best products staffed with the best people will occasionally fail" — nor is the public ready for that nuance.
Symantec traced a recent flurry of activity from the espionage group MuddyWater, known to spy on Middle Eastern countries, to a public GitHub source code repository.
The big picture: GitHub is an online platform that programmers use to collaborate on open source software development. If you keep an open repository on GitHub, then anyone who knows where to look can find your toolkit.
The intrigue: MuddyWater got its name for being hard to attribute. The group uses a lot of scripts (essentially a list of system commands), rather than executable malware, to make its work harder to track.
The bottom line: The new campaign discovered by Symantec was heavily focused on Pakistan and Turkey, with a handful of victims in Russia, Saudi Arabia and other countries and multinational organizations. Symantec found the group had upgraded its toolkit.
Palo Alto Networks published an analysis of corporate cloud security on Tuesday, and some of the results weren't pretty: 49% of cloud databases aren't encrypted.
Why it matters: Any important database should be encrypted. That's not purely a cloud problem. There are inherent security advantages and a few disadvantages to the cloud, but the bottom line is that no matter where you put data, basic security hygiene is still important.
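To see why an unencrypted database is a problem no matter where it lives, here's a minimal sketch (the table and email address are made up for illustration): data written to an unencrypted database file can be read straight off the disk by anyone who obtains the file, such as from a leaked backup or an open storage bucket, with no credentials or database access required.

```python
import os
import sqlite3
import tempfile

# Create a small unencrypted database, as a stand-in for any
# cloud database stored without encryption at rest.
path = os.path.join(tempfile.mkdtemp(), "customers.db")
conn = sqlite3.connect(path)
conn.execute("CREATE TABLE users (email TEXT)")
conn.execute("INSERT INTO users VALUES ('alice@example.com')")
conn.commit()
conn.close()

# An attacker holding the raw file never needs to "log in" —
# the plaintext is sitting right there in the bytes on disk.
raw = open(path, "rb").read()
print(b"alice@example.com" in raw)
```

With encryption at rest enabled, those same raw bytes would be ciphertext, which is exactly the basic hygiene the study found half of cloud databases lacking.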
In the same study, Palo Alto found that corporations were failing to meet many industry standards — missing more than half of the CIS AWS Foundations Benchmark best practices, around two-thirds of NIST best practices and GDPR requirements, and around 90% of HIPAA and PCI requirements.
Photo: Michael Heim/EyeEm via Getty Images
More organizations are able to detect breaches on their own, according to CrowdStrike data released Tuesday.
Details: The company's data shows that 75% of incident response customers discovered breaches on their own, as opposed to, say, being alerted by researchers that their data is for sale on the dark web. That's up from 68% last year.
Why it matters: A company that discovers a breach on its own can reduce the attacker's dwell time — the amount of time a hacker can cause damage in a system or steal files.
That's still not perfect: "It's better, but we still see an average 80 days of dwell time," said CrowdStrike CSO and president of services Shawn Henry.
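Dwell time is just the gap between when an attacker first got in and when the defender noticed. A quick sketch, with a purely hypothetical incident timeline (the dates are invented to match the 80-day average Henry cites, not taken from any real case):

```python
from datetime import datetime

def dwell_time_days(first_compromise: datetime, detection: datetime) -> int:
    """Days an attacker operated undetected inside the environment."""
    return (detection - first_compromise).days

# Hypothetical timeline: initial foothold in late September,
# detected in early December.
compromised = datetime(2018, 9, 20)
detected = datetime(2018, 12, 9)
print(dwell_time_days(compromised, detected))  # 80
```

Shaving days off that gap is the whole point of self-detection: every day of dwell time is another day the intruder can move laterally or exfiltrate data.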
Codebook will return Thursday.