U.S. banking regulators on Friday assumed control of Silicon Valley Bank, the country's 16th largest bank and a top financial institution for technology and life sciences companies.
The big picture: This is the largest bank failure since Washington Mutual in 2008.
A hacker who uses the pseudonym "Denfur" is selling a database they claim includes stolen sensitive data from at least 55,000 customers of D.C.'s health insurance marketplace, including members of Congress and their staffs.
Employers are struggling to figure out how to fold ChatGPT into their workflows without risking the security of their corporate secrets, customer information and intellectual property.
The big picture: Engineers and programmers across many industries have found great utility in refining code or double-checking their work by running it through ChatGPT.
Why it matters: SVB is a cornerstone of the tech and life sciences startup economy. It's also America's 16th largest bank, and its failure would be the biggest since Washington Mutual.
Machine learning techniques can be successfully deployed to better identify food insecurity outbreaks across the world long before they take place, according to a new study.
Why it matters: The timely disbursement of humanitarian aid can be a matter of life or death during a food crisis. How we gather information and when we respond can make all the difference.
Police asked an Ohio businessman for video from his Ring doorbell camera, then issued a warrant for footage from more than 20 other cameras at his home and business.
Chinese-owned TikTok faces the threat of a ban over fears that the user data it collects could get fed to Beijing.
What's happening: Congress' long-running inability to pass a comprehensive privacy law has left online personal information vulnerable to being mined, hoarded and poached.
Why it matters: Virtually every major technology today opens data vulnerabilities that can cause havoc.
"Data privacy" may sound like an abstraction to much of the U.S. public, but our national failure to set privacy rules can have very concrete consequences.
Zoom out: Legal experts and privacy advocates have long warned of the dangers of the U.S.'s failure to bring privacy law into the 21st century.
It means that government authorities have a freer hand to seize digital information as evidence.
Private companies are freer to gather and resell the personal information of their customers and users.
In both public and private sectors, the absence of tough rules governing data handling makes every breach and hack more potentially damaging.
AI experts fear that chatbots like ChatGPT, trained on vast troves of internet text, are already seeded with an unknowable volume of personal data.
On its own, that's little different from what's available on Google or any other search engine today.
The difference is that ChatGPT and similar programs are capable of "remembering" and reusing information users share with them in unpredictable ways.
That means that details from any legal document, medical report, financial calculation or other input that someone shares with these systems might turn up again — accurately or erroneously — in answers to someone else's query, with no indication of the original source.
Our thought bubble: Every time you type into ChatGPT, consider that you might be sharing secrets with a thing that has an impossibly vast memory — and doesn't have a clue what a secret even is.
Between the lines: There may well be ways to equip generative AI systems with guardrails to protect against this kind of unintended sharing.
But right now developers have little incentive to build them, and the rest of us have no visibility into what data the systems are holding onto.
The bottom line: The faster technology advances and the more central it becomes in our lives, the more we'll miss having a good privacy law.
Religious leaders are dabbling in ChatGPT for sermon writing, and largely reaching the same conclusion: It's great for plucking Bible verses and concocting nice-sounding sentiments but lacks the human warmth that congregants crave.
Why it matters: As scarily good generative artificial intelligence tools start to disrupt all manner of professions, men and women of the cloth are pondering how eerily close it can come to projecting a human — or divine — soul.
Silicon Valley Bank, long one of the most popular financial institutions among tech and life sciences startups, saw its shares fall more than 60% on Thursday, wiping out a whopping $9.4 billion in market value.
Driving the news: Several top venture capital firms, including Coatue and Founders Fund, have suggested to some portfolio companies that they strongly consider pulling money out of SVB, as concerns grow over the bank's stability.
Epic Games is moving full steam ahead with its 4-year-old online store, launching tools today to self-publish video games and prepping it for the day it can finally appear on iPhones and Android devices.
Why it matters: The company behind Fortnite and Unreal Engine has major ambitions to change global marketplaces for games and apps.
The Pentagon is attempting to better compete with Silicon Valley for civilian cyber talent in a newly released workforce strategy.
Driving the news: The Defense Department released a cyber workforce strategy Thursday that details training programs, recruitment process changes and apprenticeship programs it hopes to pursue between 2023 and 2027.
Silicon Valley Bank said it will launch a $1.25 billion common stock sale, plus another $500 million of depositary shares, and said private equity firm General Atlantic is buying $500 million of common stock in a separate transaction.
Why it matters: SVB is something of an avatar for the health of U.S. tech and life sciences startups, and right now it's calling for a doctor.
Two years after joining Meta, civil rights head Roy Austin Jr. tells Axios he believes the company has made strides on everything from how it designs products to checking for unintended discriminatory impacts.
Austin said the company has now completed, or has ongoing projects addressing, 97 of the 117 to-do items identified as part of a civil rights audit completed in 2020.
Facebook parent Meta is sharing an updated data set for voice and face recognition AI that it hopes others in the industry will use to test how accurately their systems work across a diverse set of people.
Why it matters: Machine learning-driven artificial intelligence — which powers everything from these recognition algorithms to the popular ChatGPT — is only as fair as the data used to train and test it. The more representative the data, the less likely it is that human bias will turn into automated discrimination.
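The point about representative test data can be made concrete with a disaggregated evaluation: instead of a single overall accuracy number, score a model separately for each demographic group in the test set. A minimal sketch, using entirely made-up toy data (the function name and data are illustrative assumptions, not Meta's actual tooling):

```python
def accuracy_by_group(predictions, labels, groups):
    """Return {group: accuracy} computed over three parallel lists.

    A model that looks accurate overall can still perform far worse
    on a group that is underrepresented in the training data -- which
    a representative test set makes visible.
    """
    totals, correct = {}, {}
    for pred, label, group in zip(predictions, labels, groups):
        totals[group] = totals.get(group, 0) + 1
        if pred == label:
            correct[group] = correct.get(group, 0) + 1
    return {g: correct.get(g, 0) / totals[g] for g in totals}


# Toy example: overall accuracy is 62.5%, but the per-group breakdown
# shows the model is perfect on group A and poor on group B.
preds = [1, 1, 0, 1, 0, 0, 1, 0]
labels = [1, 1, 0, 1, 1, 0, 0, 1]
groups = ["A", "A", "A", "A", "B", "B", "B", "B"]
print(accuracy_by_group(preds, labels, groups))  # {'A': 1.0, 'B': 0.25}
```

The same pattern applies to face or voice recognition: the more groups the test set covers, the more such gaps can be measured rather than hidden in an aggregate score.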