David McCabe

FTC may take a deeper look at artificial intelligence

AP / Susan Walsh

Acting FTC Chairwoman Maureen Ohlhausen said the agency hopes to take a closer look at artificial intelligence "because it has a consumer protection element to it but also has a competition element to it."

Why it matters: In a split from the Obama years, the federal government in the Trump era has done very little to look at the policy questions posed by AI. That's starting to change — at a time when Silicon Valley is pouring more money than ever into the technologies.

The bigger picture: Ohlhausen acknowledged there's promise in using artificial intelligence to process massive amounts of data. "They may say that you're at risk for cancer you didn't realize you were at risk for, or here's a product that would suit you really, really well," she said. "But it also could be used to harm consumers."

Ohlhausen argued that the FTC — which focuses on whether the consumer has been harmed — is equipped to address these challenges.

On a related note: The regulator was also asked about whether she would continue the efforts of the Obama administration to look at how algorithms can be biased. "We do enforce laws that are to protect consumers from discrimination, and I think that's appropriate for us to continue to think about and to continue to be vigilant for," she said.


Google economist: tech will help workers get new skills


Google's chief economist says that technology will help people transition into new roles, even as it changes the nature of jobs. Hal Varian noted that technology has made it easier for people to learn crucial job skills — while on the job. Drivers, he noted, no longer need to have a perfect grasp of a city's geography; they can learn as they go because technology exists to help with navigation. Online content, such as Khan Academy, can help teach new skills.

"This cognitive assist is really a big deal because it allows for the kind of on the job training you're talking about," he said Monday at the Technology Policy Institute's annual conference.

Why it matters: Some argue that the tech industry has a responsibility to help workers who are impacted by automation, robotics and artificial intelligence. Varian told Axios that helping people pick up new skills is also good for business.


One idea for regulating Google and Facebook's control over content

Paul Sakuma / AP

We reported this morning on the mounting pressure on major web platforms over their role in moderating content. A conservative activist named Phil Kerpen circulated a confidential memo earlier this year on the mechanics and politics of how to regulate the political neutrality of major web companies like Google and Facebook. Find the full text below.

Why it matters: Moves to turn these ideas into concrete policy or regulation haven't happened. But the memo is certainly getting attention, especially as major web platforms try to walk the fraught line of removing extremist content while also maintaining an open platform for free speech in the wake of the Charlottesville attack.

Worth noting: "The unpublished draft memo represents preliminary thoughts on complex issues," Kerpen said when contacted by Axios.

Confidential Strategy Memorandum: Layer-Neutral Net Neutrality And The Private Censorship Problem

Social media (Facebook, Twitter) and search (Google) companies with dominant market position represent themselves as politically neutral while systematically promoting liberal views and limiting or even banning conservatives. They do so while enjoying blanket liability protection and with the full approval of liberal elites. Far too many conservative media and intellectuals defend the politically biased practices of these companies on the basis that viewpoint discrimination by private entities is beyond the reach of government.

That view ignores the reality that basic network economics create a high bar to competition – a problem that's been with us since the railroads – and that incumbents with market power therefore pose a serious threat to free speech.

Worse, that view incorrectly assumes the political bias of these companies is a free-market phenomenon, when it is largely a result of federal law that insulates these companies from a natural market constraint on being an active political player: legal liability for publishing false and malicious claims.

Section 230 of the Communications Decency Act includes a finding by Congress that "The Internet and other interactive computer services offer a forum for a true diversity of political discourse," but has enabled precisely the opposite by allowing sites to exercise editorial control without becoming legally responsible for user-generated content.

CDA 230's provision for "Good Samaritan blocking and screening of offensive material" is so broad, allowing sites to filter or block content that is "harassing, or otherwise objectionable," that it effectively gives carte blanche to promote an aggressive political agenda without any risk of legal consequence.

Moreover, the very companies that are now exploiting these liability protections and their enormous incumbent market power were the principal corporate proponents of imposing draconian regulation on ISPs via the FCC in the context of net neutrality, which morphed into Title II public utility regulation. The arguments they made in that context apply in every respect to themselves, as both critics and supporters of net neutrality regulation have long observed.

The Title II order is ticketed for imminent revocation under Republican FCC Chairman Ajit Pai, and deservedly so. It has had a profound negative impact on broadband investment and represents a dangerous precedent of a federal regulatory agency dramatically expanding its own power without authorization by Congress.

At the same time, however, the Internet ecosystem is not likely to be satisfied with going back to the old, unenforceable broadband statement given the battle-scars on all sides of the net neutrality fight. Stakeholders will seek bright line rules requiring transparency and prohibiting blocking and throttling from the place the debate always should have taken place: Congress.

The legislative process is likely to be led by the Senate Commerce Committee and its chairman, John Thune of South Dakota, who unfortunately may fear taking on powerful edge companies after receiving unexpected criticism from the right when he held important hearings on the systematic political manipulation of the "Trending News" feature by Facebook.

The Trump administration should urge Thune to think bigger than just the ISPs and make clear that they will provide robust cover from the right if he takes on the challenge of political bias from the edge.

Social media and search companies, and possibly others, should be subject to the same neutrality rules because they possess the same benefits of market power that come from enormous fixed costs as well as, in the case of social network platforms, the network lock-in effects of having a large user-base.

Putting everyone in the same boat has enormous advantages, ensuring the exercise is genuinely pro-consumer rather than devolving into the familiar attempt by the edge to seek regulatory predation of the core.

The most likely approach would be a supercharged transparency rule requiring clear disclosure of how traffic is treated, and clear specification of the standards used for limiting speech, including any possible viewpoint discrimination.

Platforms that represent themselves to the public as neutral would be subject to enforcement actions if they violate those representations through a consumer-protection framework.

Platforms that elect not to be neutral would be free to exercise editorial control, but would have to prominently disclose they are doing so – and would no longer be eligible for a section 230 safe harbor to shield them from the legal consequences of the material they choose to publish.

Critics will raise First Amendment objections, but their arguments will smack of hypocrisy if they supported the FCC neutrality rules for ISPs, which also provide a legal template.

In USTA v. FCC the DC Circuit upheld so-called net neutrality regulation of broadband providers and laid out a roadmap for neutrality regulation without running afoul of the First Amendment:

If a broadband provider nonetheless were to choose to exercise editorial discretion—for instance, by picking a limited set of websites to carry and offering that service as a curated internet experience—it might then qualify as a First Amendment speaker. But the Order itself excludes such providers from the rules. The Order defines broadband internet access service as a "mass-market retail service"—i.e., a service that is "marketed and sold on a standardized basis"—that "provides the capability to transmit data to and receive data from all or substantially all Internet endpoints." That definition, by its terms, includes only those broadband providers that hold themselves out as neutral, indiscriminate conduits.

Search and social can, by the same logic, be required to enforceably identify themselves as neutral or non-neutral platforms.

Jack Dorsey of Twitter has said: "We think of it as an information utility and a communications network," making it functionally identical to the ISPs Twitter lobbied the FCC to regulate.

If Twitter is in fact an advocate for liberal views – as it appears to be – then it should be forced to say so clearly, as should Facebook and Google. And if they choose to be First Amendment speakers rather than neutral conduits, then they should be willing and able to defend the material they label as "fact checked" in court.

By simply proposing this framework, the Trump administration would make clear that the asymmetry of companies identified with conservative causes risking regulatory retaliation while companies identified with liberals are given a free pass is over.

Moreover, while the initial response will be indignation from the left as well as search and social companies – possibly including mass mobilization of site users, which is a potent political weapon – the focus on transparency, a core value of younger voters, as well as the hypocrisy of these companies supporting for ISPs precisely what they oppose for themselves puts these companies in an untenable position.

They are therefore likely to rely principally on the argument that regulation is unnecessary, to issue even stronger statements of political neutrality, and to actually improve their behavior to prevent regulation.

Rather than fighting a standalone rearguard action to defend rollback at the FCC, this approach puts us on offense on the net neutrality issue and assures a positive outcome whether or not the bill passes.


The walls are closing in on tech giants

Illustration: Rebecca Zisser / Axios

Tech behemoths Google, Facebook and Amazon are feeling the heat from the far left and the far right, and even the center is starting to fold.

Why it matters: Criticism over the companies' size, culture and overall influence in society is getting louder as they infiltrate every part of our lives. Though it's mostly rhetoric rather than action at the moment, that could change quickly in the current political environment.

Here's a breakdown of the three biggest fights they're facing.

Battle over content: Both sides are increasingly wary of the outsized role that Facebook and Google play as moderators of public discourse, as was seen following the violence in Charlottesville. In the White House, Steve Bannon has reportedly argued that Facebook and Google should be regulated like public utilities.

  • Right-wingers worry the progressive-leaning companies aren't going to give their views a fair shake. Recently they opposed Google's firing of an engineer whose internal memo questioned women's aptitude for engineering jobs. They've also criticized YouTube policies meant to combat offensive speech. They see a company with the ability — and, in their eyes, motive — to sideline their views.
  • A policy memo quietly circulated earlier this year by activist Phil Kerpen recommended rules to keep online platforms politically neutral, potentially subjecting platforms that violated that neutrality to government enforcement actions. In an email obtained by Axios, Kerpen said the general strategy would "get us on offense and scare the hell out of Google, Facebook, Twitter." (Kerpen told Axios that the "unpublished draft memo represents preliminary thoughts on complex issues.")
  • Sen. Ted Cruz told Axios that he's worried about "large tech companies putting their thumb on the scales and skewing political and public discourse." He asked during a June hearing whether "these global technology companies have a good record protecting free speech, and what can be done to protect the first amendment rights of American citizens."
  • On Monday, Fox News host Tucker Carlson said that since Google "has the power to censor the internet, Google should be regulated like the public utility it is to make sure it doesn't further distort the free flow of information."
  • The left's fixation on whether fake news impacted the election has ensnared Facebook and other platforms in investigations into Russia's influence during the campaign. Top Senate Intelligence Committee Democrat Sen. Mark Warner has spoken about fake news with Facebook staffers multiple times this year in both Silicon Valley and Washington, a source said.
  • There's also frustration that Facebook didn't remove the event page for the white supremacist rally in Charlottesville until right before it happened.

Battle over liability: Big tech firms are in a panic about a bipartisan bill that would let sex trafficking victims sue web platforms that hosted content implicated in the crime.

  • For decades online platforms have been heavily shielded from liability for what users post. That's central to the business models of Google, Facebook, Airbnb and other internet companies and they fear this bill opens the door to new liability risks for other types of user-generated content.
  • There's been a flood of opposition to the bill from trade groups representing tech giants and outside groups that have received funding from the companies. Backers of the bill say the companies had the opportunity to step up. "We offered them a chance to provide constructive feedback and they chose not to, and instead decided to oppose a strongly bipartisan bill to help stop online sex trafficking," said Kevin Smith, a spokesman for bill sponsor Sen. Rob Portman.

Battle over size: Legal experts are crafting the antitrust case against tech giants.

  • The left-leaning team at the New America Foundation's Open Markets program has been pushing this issue hard for years, but has recently started to get traction. A law review note produced by a fellow for the program has brought new attention to Amazon's effect on competition. Amazon cares enough about the concerns to have met with members of the New America team in June.
  • "They deserve to be highly profitable and successful," Sen. Elizabeth Warren said of major tech companies in a speech last year. "But the opportunity to compete must remain open for new entrants and smaller competitors that want their chance to change the world again."

The other side: Big tech companies generally argue they compete aggressively with each other as well as upstarts, that they have no interest in injecting bias into how they moderate user content, and that their liability protections have enabled the internet sector to thrive.

Across the pond: European regulators have taken action on their concerns about the companies' growing clout. Google faces a massive fine over allegedly anticompetitive behavior (and two other investigations) and Facebook has been docked for allegedly misleading regulators when it bought WhatsApp.

The political establishment is starting to buy in to these concerns, too: Democrats are urging tougher antitrust enforcement as part of their "Better Deal" platform. Republican leadership staffers told Google, Facebook and Amazon that aggressive pro-net neutrality advocacy would put their policy objectives at risk; sources say they invoked privacy as one issue where the companies could be vulnerable.

As history shows, it takes time for talk to turn to action: AT&T's antitrust disputes with its skeptics festered for over a decade, and Microsoft's opponents agitated for years before the government took them seriously. And fringe arguments have a way of becoming mainstream: Critics of Ma Bell and Microsoft looked like outliers before picking up steam.

Trump said to pick antitrust lawyer as FTC chair

Alex Brandon / AP

Politico reports that Joseph Simons is Trump's planned pick to lead the Federal Trade Commission. He's a partner at Paul Weiss, a major corporate law firm, where he co-chairs its antitrust practice. He was the top official in the FTC's Bureau of Competition in the early 2000s. When asked, the White House told Axios it didn't have any personnel announcements.

The key question: How would he approach the growing trend of corporate consolidation? The FTC vets major mergers — including Amazon's proposed acquisition of Whole Foods — and investigates violations of consumer protection law. Simons' law firm bio notes that he was "responsible for overseeing the re-invigoration of the FTC's non-merger enforcement program."


Hate speech tests tech's core principles

AP File Photo

Several major tech firms are reevaluating their core value of openness as they clamp down on white supremacist rhetoric on their platforms. After protests turned violent in Charlottesville, companies are taking a harder line against hateful content than they have in the past.

Why it matters: The tech industry's vision has been to create open, neutral platforms that allow all viewpoints. Every time it filters content or restricts users' access, it has to balance that goal with concerns about hate speech that could lead to violence. This week's events have caused many companies to recalibrate that balance. Many are still grappling with where to draw the line between free speech and dangerous extremism.

Not all companies took action right away. The growing outcry over the Charlottesville violence led to several major platforms taking a stand in the days following. Some on the left argue that Facebook, for example, didn't act quickly enough.


  • Sunday: GoDaddy dropped white supremacist site The Daily Stormer from its domain registration system.
  • Monday: The Daily Stormer registered its domain with Google, which quickly severed the relationship. Recode reported that Facebook had removed a white nationalist group from its platform. Airbnb CEO Brian Chesky said that the company "will continue to stand for acceptance and we will continue to do all we can to enforce our community commitment"; the company had already banned users who were planning on attending the Charlottesville event.
  • Tuesday: PayPal said it would "as we consistently have in the past – limit or end customer relationships and prohibit the use of our services by those that meet the thresholds of violating" its acceptable use policy. It has reportedly stopped working with some white nationalist websites.
  • Wednesday: Facebook announced the removal of a white nationalist's account five days after the attacks. Cloudflare ended its relationship with The Daily Stormer while Twitter appeared to suspend its accounts. Spotify removed music by white supremacist bands. Apple Pay stopped letting certain white supremacist sites sell items using its product. Squarespace removed some sites from its service "in light of recent events."

The dilemma: Where do companies draw the line? Cloudflare CEO Matthew Prince was candid about this in an internal email published by Gizmodo on Wednesday evening:

"It's important that what we did today not set a precedent. The right answer is for us to be consistently content neutral. But we need to have a conversation about who and how the content online is controlled. We couldn't have that conversation while the Daily Stormer site was using us. Now, hopefully, we can."

In a Facebook post Wednesday, Mark Zuckerberg, too, made clear the company has to be careful in how it deals with hate speech: "Debate is part of a healthy society," he said. "But when someone tries to silence others or attacks them based on who they are or what they believe, that hurts us all and is unacceptable."

Our thought bubble: Tech companies have always been wary of getting too involved in the process of vetting the content they host. The events since Sunday show the limits of that thinking as their platforms become more pervasive.


Zuckerberg: “The last few days have been hard to process”

Eric Risberg / AP

Facebook CEO Mark Zuckerberg said that his company has taken steps to curb hate speech on its platform after a white nationalist protest that led to violence. Zuckerberg said that the site has "always taken down any post that promotes or celebrates hate crimes or acts of terrorism — including what happened in Charlottesville."

He added: "With the potential for more rallies, we're watching the situation closely and will take down threats of physical harm. We won't always be perfect, but you have my commitment that we'll keep working to make Facebook a place where everyone can feel safe."

Key context: Zuckerberg's statement — which included a broader condemnation of bigotry — comes as tech firms are under new pressure to deal with extremist content. Facebook has been criticized for how long it took to delete an event page associated with the Charlottesville protests. It has since banned an account associated with white nationalism.

Our thought bubble: These are Zuckerberg's first comments on the weekend's events in Virginia. That's notable because he has spent the better part of this year working to better understand what binds American communities. He's weighed in on the president's efforts to bar trans service members and remove the U.S. from the Paris accords, but he was silent for days on some of the tensest 72 hours in America since the week of the election.


'March on Google' postponed

Mark Lennihan / AP

The protests planned to be held at several U.S. Google offices this weekend have been postponed by organizers. The so-called March on Google was announced after the company fired the author of a controversial memo about Google's diversity efforts and women's affinity for technical roles.

Details: In a statement posted online Jack Posobiec — the pro-Trump activist who organized the event — said that "credible Alt Left terrorist threats for the safety of our citizen participants" were among the reasons behind the decision. The statement said a "threat was made to use an automobile to drive into our peaceful march." No future date was given.

Context: The phrase "alt left" was used by President Trump yesterday to describe counter-protesters in Charlottesville during the white supremacist event that turned violent this weekend. Posobiec is a conspiracy theorist who has publicized things online like "Pizzagate," which alleged Democratic politician involvement in a child-sex ring.


Waymo nabs new policy chief from Senate

Courtesy of Waymo

Alphabet's self-driving branch is getting some more Washington firepower. Waymo has hired Senate Commerce Committee staffer David Quinalty as its new Head of Federal Policy and Government Affairs. He'll interface with lawmakers and federal transportation regulators as part of the job.

Why it matters: Self-driving technology is moving full speed ahead in Silicon Valley, and Washington is trying to keep pace. Lawmakers in both chambers — including the committee Quinalty currently works for — are working on bills related to autonomous vehicles. If the companies are able to secure federal preemption of state regulations, it would be a huge win that they say would make it much easier to roll out the technology nationwide.


Tech companies push for privacy update at SCOTUS

Jon Elswick / AP

Facebook, Apple and Google want the Supreme Court to update its understanding of privacy laws in light of new technology.

  • Context: The companies were among the firms that filed a brief in a case challenging the warrantless collection of data revealing a cell phone's location.
  • Here's the key quote: "Although amici do not take a position on the outcome of this case, they believe the Court should refine the application of certain Fourth Amendment doctrines to ensure that the law realistically engages with Internet-based technologies and with people's expectations of privacy in their digital data."
  • Why it matters: The court is taking up a major question of how you handle privacy law at a time when everyone owns a smartphone.