Wired has a look at the eclectic groups of coders organizing hackathons around the country to archive publicly available government science data.
The problem: There's a worry that the Trump administration might delete huge troves of environmental and scientific data. And it's already started: coders discovered, for example, that some of NASA's atmospheric carbon dioxide datasets were empty.
The work: It could be as simple as tagging websites to be saved for posterity in the Internet Archive or as difficult as building tools to manage the download of gigabytes' worth of datasets from the DOE (see the sketches after this list).
The goal: Compiling the data and monitoring changes or deletions on government websites is a huge task, so automation is key. Ideally, the groups can build a large network of volunteers in every state working around the clock to code and archive data as quickly as possible.
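To make the "tagging websites to be saved" step concrete, here is a minimal sketch (not from the article or any specific hackathon group) of submitting a page to the Internet Archive's public "Save Page Now" endpoint. The target URL is purely illustrative.

```python
# Minimal sketch: ask the Internet Archive's Wayback Machine to capture a page.
# Assumes the public "Save Page Now" endpoint at https://web.archive.org/save/<url>.
import requests


def save_to_wayback(url: str) -> str:
    """Request a snapshot of `url` and return the location of the archived copy."""
    resp = requests.get(f"https://web.archive.org/save/{url}", timeout=60)
    resp.raise_for_status()
    # The response URL points at the newly archived snapshot.
    return resp.url


if __name__ == "__main__":
    # Hypothetical example target for illustration only.
    print(save_to_wayback("https://www.epa.gov/climate-change"))
```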
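And as one hedged illustration of how monitoring for changes or deletions could be automated, the sketch below hashes each page's content on every run and flags any page whose hash differs from the previous visit. The state file name and target URLs are assumptions for the example, not anything described in the article.

```python
# Minimal sketch: detect changed or removed pages by comparing content hashes
# between runs. Local state file and target URLs are illustrative assumptions.
import hashlib
import json
import pathlib

import requests

STATE_FILE = pathlib.Path("page_hashes.json")  # hypothetical local state


def check_for_changes(urls: list[str]) -> list[str]:
    """Return the URLs whose content hash changed since the last run."""
    previous = json.loads(STATE_FILE.read_text()) if STATE_FILE.exists() else {}
    changed = []
    for url in urls:
        digest = hashlib.sha256(requests.get(url, timeout=30).content).hexdigest()
        # Flag the URL if we have seen it before and its content hash differs.
        if previous.get(url) not in (None, digest):
            changed.append(url)
        previous[url] = digest
    STATE_FILE.write_text(json.dumps(previous, indent=2))
    return changed


if __name__ == "__main__":
    # Hypothetical example targets for illustration only.
    print(check_for_changes(["https://www.nasa.gov", "https://www.energy.gov"]))
```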