
Sure, but at the same time, when Google announced that Google+ had a huge security breach of 52M accounts, they didn't publicly disclose it until well after the fact because they didn't think it was serious enough. I wish Google would follow their own principles.


The most recent post on Project Zero's blog is about a Chrome vulnerability: https://googleprojectzero.blogspot.com/2019/05/trashing-flow...

And it had the exact same automatic 90 day disclosure applied: https://bugs.chromium.org/p/chromium/issues/detail?id=944062

"Please note: this bug is subject to a 90 day disclosure deadline. After 90 days elapse or a patch has been made broadly available (whichever is earlier), the bug report will become visible to the public."

In fact they've reported a lot of Chrome vulnerabilities: https://bugs.chromium.org/p/project-zero/issues/list?colspec...

And Android ones: https://bugs.chromium.org/p/project-zero/issues/list?colspec...

Hey, look at that! Equal treatment for all.


Do you mind rephrasing that to make Google seem eviler? I have a confirmation bias to fulfill.


By that standard, literally no company in the industry is following these principles, because internal findings are not routinely disclosed. Internal vulnerability researchers have access to information outsiders don't, so as you can imagine, the bugs you're not hearing about are pretty lurid. Every major tech company in North America spends millions annually on third-party software security tests; did you think these just weren't turning things up? What did you think was happening to the reports?


For what it's worth, Mozilla routinely discloses internal findings, subject to the same policy as external findings: the bug report is opened up once the fix has shipped to a sufficient fraction of users.

So it's not "literally no company". ;)

Disclosure: I work for Mozilla and I have reported a number of security bugs on our code, the vast majority of which are now public.


Mozilla certainly discloses more than other vendors do. I'm talking to Mozilla security team members about this now, and maybe one of them can jump in here and correct me, but I don't think they can claim that all their internal findings are reliably (and meaningfully, in advisory form) disclosed.

Regardless: that's a good point. I should have said, public disclosure of internal findings is not an industry norm. Mozilla is a good counterexample to the argument that everyone close-holds internal findings.


That's a good point about advisories. All the findings are eventually public in the form of non-hidden bug reports, but not all of them may have advisories issued. Doubly so if the finding happens before the affected code has first shipped in a release (so buggy code gets checked in, then internal testing finds a security bug in it before it ships, and that bug is fixed).


I don't think that is the point, though, so much as that Google has one standard for its internal findings and another for Project Zero, which deals with other companies under the justification that strict disclosure is better. Mozilla doesn't audit other companies, so what they do with their internal findings isn't relevant to that argument. One can of course argue whether the double standard is good or justified, but I don't think that changes the fact that there is an argument there. If someone wanted to sue Google (ha!) over a Project Zero disclosure, that is likely something they would try to argue: that Google knows disclosing has consequences.


I could be mistaken, but I think internally reported issues that don't make it to release aren't assigned CVE numbers, which might be what he means by "disclosed".

Of course, as you say, we do rate almost all security issues, and eventually make them public, so the information is only a bugzilla search away! https://bugzilla.mozilla.org/buglist.cgi?keywords=sec-critic...


Google's goal with Project Zero is supposedly to raise the stakes in security. I'm happy they're doing it, but if they're going to enforce a non-negotiable 90-day public disclosure policy, it leaves a bad taste in my mouth when Google itself doesn't care to follow it for its own services.

Project Zero has long maintained that any serious company should be able to meet a 90-day disclosure timeframe, and yet here comes Google+...


Project Zero was not the group that discovered the G+ vulnerability, though. Project Zero's terms do not bind other teams within the company who have not agreed to them.



The Google+ API issue wasn't a breach, but a vulnerability that was identified and patched within a week, before being announced publicly.

How is that different?


This is complete whataboutery.

Disclosure is one thing, remediation is another. The former is only instrumental to the latter. In the G+ leak, remediation was swift; so disclosure was not required.

(Btw, the affected accounts were 500K, reportedly, not millions.)


That was not discovered by P0. If P0 had found it, there is no reason to believe they wouldn't have disclosed it within 90 days.



