Google is shutting down Google+, and the world is devastated.
In March 2018, internal auditors at Google detected a massive data breach within the Google+ system. The breach was caused by a bug that had been in the Google+ API since 2015, giving developers unauthorized access to users' account data. When a user gave an app permission to access their public profile data, the bug let the app pull their non-public data as well -- and that of all their friends.
As part of a new privacy measure rollout, the company announced on Monday that it is shutting down the social network for consumers.
Google says it fixed the bug in March, but before it did, the company's internal privacy team, Project Strobe, ran two weeks of tests to understand the bug better. In those two weeks, they found that at least 496,951 users' full names, email addresses, birth dates, genders, profile photos, places lived, occupations, and relationship statuses were potentially exposed to 483 apps, although Google says it has no evidence the data was abused.
Infographic Courtesy of the Wall Street Journal
But what legal experts find most concerning about the breach isn't that it occurred. It's the fact that Google decided not to tell you about it.
Thankfully, that's what journalists are for. The Wall Street Journal broke the story Monday afternoon, sharing news of the data breach with the public for the first time.
According to documents obtained by the Journal, Google executives were informed of the Google+ bug and its breach of user data, but opted not to inform the public. Reportedly, the company feared falling under regulatory pressure and public scrutiny in light of recent privacy scandals like Cambridge Analytica.
The Journal reports:
A memo reviewed by the Journal prepared by Google’s legal and policy staff and shared with senior executives warned that disclosing the incident would likely trigger “immediate regulatory interest” and invite comparisons to Facebook’s leak of user information to data firm Cambridge Analytica.
Chief Executive Sundar Pichai was briefed on the plan not to notify users after an internal committee had reached that decision, the people said. ...
The document shows Google officials felt that disclosure could have serious ramifications. Revealing the incident would likely result “in us coming into the spotlight alongside or even instead of Facebook despite having stayed under the radar throughout the Cambridge Analytica scandal,” the memo said.
It “almost guarantees Sundar will testify before Congress.”
The company's fear is not unfounded. Since 2017, tech companies have come under intense scrutiny from government regulators and privacy advocates. Facebook is currently under Congressional investigation for its role in the Cambridge Analytica scandal. Governments worldwide are working to enact regulations governing how tech companies use and share individuals' private data.
Upon discovery of the Google+ bug, lawyers reportedly advised executives that Google wasn't required to share information about the data breach with the public. They were correct: in the United States, no federal law requires companies to disclose a breach of this kind.
However, Europe's General Data Protection Regulation, which took effect this year in May, requires companies to notify regulatory officials of a data breach within 72 hours, under threat of fines of up to 2 percent of a company's global annual revenue (which, for Google, is a lot). Google's failure to disclose the bug might have exposed it to penalties under this law, but the GDPR didn't take effect until two months after the Google+ breach was discovered.
Still, some users plan to sue and may launch a class action lawsuit against the company.
In a statement released Monday, Google described vast changes to the way it manages developer permissions and user interactions with terms of service. Regarding the data breach users were never informed of, the company, in a claim corroborated by the memo the Journal reviewed, said it had no reason to believe developers had misused the private data they could access, and that it could not reliably determine which users had been affected even if they had.
“Whenever user data may have been affected, we go beyond our legal requirements and apply several criteria focused on our users in determining whether to provide notice,” a Google spokesperson said in a statement.
In deciding whether to inform the public of the breach, Google considered "whether we could accurately identify the users to inform, whether there was any evidence of misuse, and whether there were any actions a developer or user could take in response," the spokesperson said. "None of these thresholds were met here."
Do you think Google should have informed users about the data breach?
You can read Google's statement on the bug and Google+ here.