For the past few months, Google has largely escaped the growing backlash against the tech industry, but yesterday the dam broke: the company disclosed that a bug in the rarely used Google+ network had exposed personal information belonging to as many as 500,000 users. Google found and fixed the bug in March, just as the Cambridge Analytica story was heating up, but now that the news is out, the damage is already spreading. The consumer version of Google+ is being shut down, privacy regulators in Germany and the United States are already expected to take action, and former SEC officials are publicly speculating about what Google may have done wrong.
The vulnerability itself was relatively small. At the heart of the problem was a specific developer API that could be used to view private profile information. But there is no evidence the API was ever actually used to view private data, and because Google+ had so few users, it's unclear how much private data was even exposed. It's plausible that no one ever exploited the bug at all: the API was theoretically accessible, but the actual number of applications that requested access was just 432 (again, this is Google+ we're talking about).
Legally, Google appears to be in the clear. There are plenty of laws about reporting breaches, including GDPR as well as national-level legislation, but by those standards what happened to Google+ was not technically a breach. Those laws are organized around the idea of unauthorized access to information: you have a right to know if someone steals your credit card number or phone number. In this case, Google knew the data had been exposed to developers, but it had no evidence that any data was actually taken. Without clear evidence of data theft, there was no legal reporting requirement. As far as the lawyers were concerned, it wasn't a breach, and quietly fixing the problem was enough.
There is a real case against disclosing this kind of bug. Every system has vulnerabilities, so a good security strategy means constantly finding and fixing them. Counterintuitive as it may seem, the safest software is often the software that finds and patches the most bugs. Requiring companies to publicly report every bug could create a perverse incentive, penalizing exactly the products that work hardest to protect their users.
(Of course, Google has been abruptly disclosing other companies' bugs for years through Project Zero, which is one reason critics are eager to point out the apparent hypocrisy. But third-party disclosure is a completely different dance: Project Zero's reporting is generally used as an incentive for vendors to patch.)
That logic is better suited to software bugs than to social networks and privacy issues, but it is the accepted wisdom of the cybersecurity world, and it led to Google's decision to keep this bug quiet.
But with Facebook freshly fallen from grace, the legal questions and the cybersecurity debates seem almost beside the point. The contract between tech companies and their users feels more fragile than ever, and the story here is simpler: this is less a breach of data than a breach of trust. Something went wrong, and Google didn't tell anyone. If The Wall Street Journal hadn't reported it, it's not clear we would know even now. It's hard to avoid the uncomfortable follow-up question: what else aren't they telling us?
It's too early to say whether Google will face a real backlash over this. The small number of affected users and the relative unimportance of Google+ suggest it may not. But even if this particular vulnerability is minor, failures like it pose a real threat to users and a real risk to companies that depend on trust. The muddle of bugs, breaches, and vulnerabilities makes it harder to see what companies actually owe users when privacy fails, and how much control users really have. These are the defining questions for technology in this era, and judging by the past few days, the industry is still working out the answers.