[HN Gopher] Google launches new vulnerability reward platform
       ___________________________________________________________________
        
       Google launches new vulnerability reward platform
        
       Author : theafh
       Score  : 248 points
       Date   : 2021-07-27 13:04 UTC (9 hours ago)
        
 (HTM) web link (security.googleblog.com)
 (TXT) w3m dump (security.googleblog.com)
        
       | herpderperator wrote:
       | > To recap our progress on these goals, here is a snapshot of
       | what VRP has accomplished with the community over the past 10
       | years:
       | 
       | > Total bugs rewarded: 11,055
       | 
       | > Number of rewarded researchers: 2,022
       | 
       | So each person found an average of 5.5 bugs? That seems really
       | high, no?
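The arithmetic behind the quoted stats is easy to verify; a quick check of the mean (which, being a mean, says nothing about the skew toward a few prolific hunters):

```python
# Figures quoted from the blog post above.
total_bugs_rewarded = 11_055
rewarded_researchers = 2_022

# Mean bugs per rewarded researcher.
average = total_bugs_rewarded / rewarded_researchers
print(f"{average:.2f}")  # prints 5.47
```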
        
       | Cromation wrote:
       | Is it just me or does the graphic towards the end of the post
        | misspell triage? I don't believe "tirage" fits at all for that
        | use, but I could be wrong.
        
         | invalidusernam3 wrote:
         | Report it as a bug and get that bug bounty!
        
         | sirdarckcat wrote:
         | It can be tiring at times.
        
         | alphabet9000 wrote:
          | It was fixed, but here's the one this person was referring to:
         | https://web.archive.org/web/20210727142303im_/https://lh4.go...
        
         | haydenchambers wrote:
         | Don't you always draw wine from a barrel after someone reports
         | a bug?
        
       | sschueller wrote:
       | I didn't see anything regarding higher rewards. Have they
       | increased them at all?
       | 
        | $29m for 10 years of bug fixes seems like a steal for a
        | multi-billion-dollar company, especially if some of the bugs
        | that have been reported and fixed were potentially lethal for
        | the company.
        
         | asah wrote:
         | They pay what they need to pay. I'm sure they're also
         | monitoring the dark web for black hats selling vulnerabilities
         | instead of collecting bounties.
        
           | smorgusofborg wrote:
            | Google's and Apple's bounties weren't high enough to get
            | anyone in the dozens of shady governments with access to
            | NSO Group's services to risk adding burner phones and then
            | analyzing these attacks.
            | 
            | I think any explanation other than "the bounties are too
            | cheap" is even less savory.
        
             | tptacek wrote:
             | People will sell bugs on the grey market no matter what
             | Google pays, because not everybody can do business with
             | Google.
             | 
             | A reminder that grey market exploit purchases are tranched;
              | the figures you hear for them are payout _caps_, not lump
             | sums. If your bug is burned before all the tranches pay
             | out, you're SOL.
        
               | smorgusofborg wrote:
               | True, but I think you missed my point a little.
               | 
                | The NSO Group has been entrusting its vulnerabilities
                | to people who would happily embezzle given a
                | sufficient amount of money. If the bounty is ~2X these
               | people's typical price in many countries that are NSO
               | Group clients, then it is remarkable these bugs aren't
               | being burned as a form of embezzlement. Your description
               | of tranched payment risk only makes that more remarkable.
        
               | tptacek wrote:
               | NSO's clients have effectively unlimited budgets. The
               | bang for the buck on exploits is probably pretty shocking
               | compared to alternative intelligence collection methods;
               | sending actual people out to do stuff is incredibly
               | expensive. When you raise the price of exploits --- which
               | you should do for other reasons! --- you don't
               | necessarily harm NSO. Since they effectively take a cut
               | of exploit valuation, you may even help them.
        
               | smorgusofborg wrote:
                | As with any asset that's hard to lock down, if the
                | scrap value is high enough relative to employees'
                | salaries, then they can't estimate how many phones can
                | be exploited before the next "steal stuff from work"
                | event.
               | 
               | As such the NSO Group would end up limiting its clients
               | to fit its pipeline and have trouble buying exploits
               | since most other market participants have a single
               | workforce with a drastically lower rate of loss.
               | 
               | I think that would devolve to countries needing to pay
                | liability pricing per attack on possible honeypot
                | phones, etc., and more countries being cut off like
               | Morocco, so no more using it like an unlimited plan. Sure
               | these countries have unlimited budgets relative to their
               | own GDPs, but when they can't find stability acting as a
               | group they are all back to bidding for unique
               | vulnerabilities, and there probably aren't 212 great ones
               | on every platform at all times.
        
               | tptacek wrote:
               | NSO's clients are all organizations with effectively
               | unlimited budgets. That's the premise. Even the shadier
               | companies in NSO's space sell principally to state
               | actors. It's unlikely that Google can drive the price of
               | an exploit past the level that _any_ country can pay for.
               | These are petty cash figures.
               | 
               | NSO builds implant technology, so they add some value of
               | their own, but NSO is essentially a middleman in this
               | market. Driving up the prices of the underlying asset,
               | when buyers aren't price sensitive, _helps the
               | middleman_.
        
       | mrRandomGuy wrote:
       | With rewards this low it's no surprise that people sell bugs and
       | exploits to often shady third party entities.
        
         | tptacek wrote:
         | Nobody is selling XSS bugs to NSO.
        
         | drtgh wrote:
         | "How I Found a Vulnerability to Hack iCloud Accounts and How
         | Apple Reacted to It"
         | 
         | https://news.ycombinator.com/item?id=27564236
         | 
          | Spoiler: Apple didn't pay him (they even tried to fool him a
          | second time). From reading HN, it's not the first time this
          | has happened.
        
       | deckar01 wrote:
       | I would rather see more transparency once you are a reporter than
       | shinier leaderboards. It is extremely frustrating to spend a week
       | reverse engineering a vulnerability in an opaque cloud service
       | only to be told it was a known issue (but private), won't be
       | fixed (but is within 48 hours), and that you don't qualify for
       | any compensation. I would like to see private issues shared with
       | reporters when they are independently discovered. I would like to
       | see status updates from developers. I would like to see some kind
       | of shared compensation system that acknowledges it can take more
       | than one person to investigate a problem before it is fixable and
       | that even time spent replicating a vulnerability has value.
        
         | jcims wrote:
         | *Not a Google employee but have worked for a bug bounty*
         | 
         | I agree everything you've stated would be desirable, and if
         | there was a strong culture and policy of supporting bounty
         | programs from the CEO on down, this could potentially be
         | achievable. However:
         | 
         | dupes - On the bounty side dupes are _extremely_ common and
         | buddies telling buddies about their finds is going to drive
         | fraud up quite a bit. In my triage work I saw very clear
         | attempts at this regularly.
         | 
         | wontfix - This one is largely due to the fact that many bug
         | bounties don't have authority over or even shared reporting
         | structure with the product teams. There's probably room for a
         | consolation prize as long as the bug is in scope but that's
         | about it. The fact that the bug goes away later could be a fix
         | or could just be part of a new release. This should be
         | extremely rare though and is worth following up with the
          | program (again, as long as it's in scope).
         | 
         | sharing issues - This is going to struggle mightily with legal
         | without good contracts and NDAs for each researcher.
         | 
          | status updates - Agree it's frustrating, but it's challenged
          | by the
         | product team/bug bounty alignment noted above. Most bounty
         | programs don't get info from devs either and if product teams
         | don't listen about fixing bugs the likelihood that they are
         | going to regularly report on fixes is almost nil.
         | 
          | shared comp - Unless I'm missing your point, you can
          | self-organize outside of the bounty program (and many do)
          | for this.
        
           | dylan604 wrote:
           | >dupes - On the bounty side dupes are extremely common and
           | buddies telling buddies about their finds is going to drive
           | fraud up quite a bit. In my triage work I saw very clear
           | attempts at this regularly.
           | 
            | Wouldn't looking at the logs of when the reports were
            | received pretty much clearly show the first person to make
            | the report? How does this get confused enough to be an
            | issue?
        
           | stingraycharles wrote:
            | Sounds like the conclusion is that bounty programs need to
            | work more closely with the product teams if they want to
            | be more effective.
           | 
           | Phrased differently, internal organizational challenges
           | should never be a valid reason why a bug is disqualified.
           | It's completely irrelevant from an outsider's perspective.
        
             | jcims wrote:
             | I don't know if 'extremely valid' is good English but it's
             | how I would characterize your assessment. I totally agree.
             | 
             | The challenge, however, is constructing a sustainable model
             | to incentivize product teams to reciprocate this closer
             | working relationship. I would say all but the smallest of
             | companies running bug bounties also have an internal
             | security function that is already doing reporting on
             | vulnerabilities, time to fix, etc. etc. So whatever
             | internal 'reputation' there might be across product teams
             | is well established (and in my experience the culture
             | around bug fixing across product orgs is consistent from
              | both internally discovered and externally reported bugs).
             | 
             | Another thing that happens is that there is typically a
             | backlog on bugfixes, so if a researcher reports a new bug
             | that's Medium priority, it's not going to get prioritized
             | against a backlog of Criticals or Highs. Most of the lack
             | of feedback from devs is simply the fact that there's been
              | no action. Once the bug is front and center, the fixes
              | are extremely simple, done within a few days, and rolled
              | out.
        
               | [deleted]
        
               | stingraycharles wrote:
               | I think there are two separate issues, though:
               | 
               | * properly qualifying a bug bounty submission
               | 
               | * actually fixing the thing
               | 
                | What I understood from your original comment was that
                | sometimes things get qualified as "wontfix" because
               | of internal struggles. I understand that these struggles
               | exist, as with any large org with conflicting priorities,
               | especially when the issues are not 0day severity.
               | 
               | I think, however, from an outsider's perspective,
               | acknowledging it's a valid bug, but also saying "this
               | will not get fixed any time soon" is infinitely better
               | than just marking it as wontfix.
               | 
               | As such, I'd argue that the minimum effort the product
               | teams would have to put into the bug bounty program is
               | properly qualifying the submissions. If that, for some
               | reason, is too much to ask, I submit it's a higher level,
               | leadership problem that they want to have a bug bounty
               | program without the necessary resources to qualify the
               | bugs.
        
               | jcims wrote:
               | Your points are totally valid and I agree with them in
               | principle, I just see them as existing on a spectrum.
               | 
               | One pretty interesting example is GitLab's bounty program
               | because you can see both sides, both the reported issues
               | and how they are tracked on the engineering side:
               | 
               | https://hackerone.com/gitlab/hacktivity?type=team
               | 
                | https://gitlab.com/gitlab-org/gitlab/-/issues?label_name=Hac...
               | 
               | There's very good organizational alignment here from what
               | I can see, and overall the program seems to be quite
               | adaptive and well supported (e.g. the comment about
               | eliminating the 50% bounty here
               | https://hackerone.com/reports/1154542#activity-11394543,
               | 'unduping' a report here
               | https://hackerone.com/reports/402658 ). It's about as
                | best-case as you can get for a bug bounty: a
                | relatively small, single-product company that is
                | exposed to the sunshine effect of a public issue
                | tracker.
               | 
               | That said, if you peruse every one of those issues you're
               | going to find instances where they drop the ball, take
               | too long, make mistakes, etc. etc.
               | 
                | So I would just say that if you ask anyone who has run a
               | bug bounty, they will likely say they were surprised how
               | difficult it was. You'll also generally find that they
               | are advocates for the researcher community and want to
               | run the best program they can. Everything else is largely
               | a function of the dark art of business prioritization and
               | the quality of the experience will be a function of how
                | well they and their team are able to navigate it.
        
         | anonymouswacker wrote:
         | Sounds like a full time job vs. a gig.
        
           | notatoad wrote:
            | Yeah, these are all features of working on the development
            | team of a project. Involving every researcher who wants to
            | work on a bounty at that level would be an insane amount
            | of management overhead.
           | 
           | It makes sense if you're vetting people beforehand to make
           | sure that giving them this level of access and communication
           | is worth the effort, but that's what a job interview is.
        
             | deckar01 wrote:
             | Not work on the bug, just be kept in the loop like the
             | initial reporter. Putting in the same effort as the first
             | reporter should earn you the same trust that is afforded to
             | the first reporter.
        
               | ehsankia wrote:
               | If you're not going to win a prize, why would you need to
               | be kept in the loop about an internal vulnerability that
               | is being worked on?
               | 
               | As for getting a prize, it goes back to the dupe issue
               | which is the source of a lot of abuse. There's no way to
               | prove you also worked on it or if you just got the info
               | from your friend and want to double your winnings.
        
         | tptacek wrote:
         | It's really clear to me why people _want_ more transparency on
          | this stuff. I'd want it too if I were submitting to bounties.
         | 
         | But the transparency you're asking for is difficult to actually
         | provide. Meanwhile, for a vendor at Google's scale, there is
         | essentially zero upside to screwing over bounty hunters. At any
         | realistic valuation for a vulnerability, these are rounding
         | error sums to the business. In fact, the exact opposite
         | incentive exists: these bounty programs are deemed to be
         | performing well when they pay out _more_ money, not less. The
          | people managing these bounties aren't paying with their own
         | money. They'd rather make you happy and encourage you to submit
         | more stuff.
         | 
         | The two phenomena you describe here are real and common. But
         | it's just simply the case that vendors are generally working
         | through backlogs of issues, triaged by severity. If you're told
         | your finding is a dupe of a private issue, it is overwhelmingly
         | likely that it is. If you pay for every independent discovery
          | of an issue, people game that; worse than the deadweight loss
         | of the bogus bounties, you set up crazy incentives on your dev
         | team to fix marginal issues because they're being gamed, rather
         | than triaging according to real severity.
         | 
         | And, bounty hunters do turn up real bugs that aren't real
         | security issues, but are still bugs. If those bugs are easy to
         | fix, they're going to get fixed! You have the same weird gaming
         | and precedent issues if you start paying out non-exploitable
         | bugfix findings; you encourage people to find and report non-
         | exploitable bugs, and you screw up the team incentives on what
         | to fix.
         | 
         | I don't expect people to like any of this logic, because it
         | boils down to "you should just trust the Google VRP people".
         | But: you should. This isn't worth the cortisol. If it's driving
         | you up a wall, maybe don't participate? There are other ways to
         | market security bug finding skills. :)
         | 
         | If you've never worked triage on a bounty before, my guess is
         | that you can't really imagine how terrible the median
         | interaction is. Maybe the next evolution of these programs will
         | be long-term contract relationships with trusted, successful
         | vuln hunters that address some of these concerns by separating
         | out the people who are good at this stuff from the median
         | submitter (the median submitter invests 2 weeks trying to
         | litigate whether copying a cookie out of the Chrome inspector
         | and pasting it into curl constitutes an account takeover
         | vulnerability).
        
           | jcims wrote:
           | >If you've never worked triage on a bounty before, my guess
           | is that you can't really imagine how terrible the median
           | interaction is.
           | 
           | I deleted three sentences about this very topic in my earlier
           | comment because it turned into an ugly rant, lol.
           | 
           | >Maybe the next evolution of these programs will be long-term
           | contract relationships with trusted, successful vuln hunters
           | 
           | I'm actually somewhat surprised that bounty programs missed
           | the whole 'gig workers are employees' issue.
        
             | tptacek wrote:
             | For what it's worth, I think the argument that bounty
             | participants are gig workers is pretty silly.
        
               | jcims wrote:
               | Totally agree FWIW :)
        
           | nwellnhof wrote:
           | > there is essentially zero upside to screwing over bounty
           | hunters.
           | 
           | Well, I felt pretty screwed over after Google decided not to
           | reward me for discovering CVE-2021-30560.
        
             | tptacek wrote:
             | I'm not saying you shouldn't feel that way. I'm saying
             | Google has no incentive to actually screw you over; that
             | they have in fact the exact opposite incentive.
        
           | scarybeast wrote:
           | @tptacek -- you're so awesome. I keep meaning to reply on
           | some of these security threads but then I see you've made the
            | relevant points of sanity in a well-reasoned manner.
           | 
           | For what it's worth, when I was setting up the culture and
           | values of Google's first bug bounty programs, I hammered "be
           | magnanimous" into the reward committees. i.e. look for
           | reasons to reward more, not less. Find the value in the
           | information provided, even if the person is being a jerk.
           | etc. I don't think this culture has changed. There are teams
           | of people rooting for incoming reports to succeed, and they
           | get excitement and joy from issuing large bounties (because
           | this means Google security is getting stronger).
        
         | cwkoss wrote:
         | > only to be told it was a known issue (but private), won't be
         | fixed (but is within 48 hours), and that you don't qualify for
         | any compensation
         | 
         | This kind of issue is rampant and opaque among bug bounty
         | programs.
         | 
         | IMO if a company says a bug is a wontfix, that's an immediate
         | moral justification for public disclosure.
         | 
         | If they say it's a known issue, they don't give a timeline of
         | when it will be fixed, and it's still not fixed in a week,
         | that's also moral justification for public disclosure. If
         | multiple people have reported the same vulnerability, there is
         | a high likelihood that the bug is known by even more people.
          | Bugs that are not promptly addressed waste your bug bounty
          | participants' time.
         | 
         | Companies with bug bounty programs need to treat security
         | researchers with more respect, and when they don't the moral
         | imperative shifts towards public disclosure so that others are
         | warned that vulnerabilities exist and are not being addressed.
         | Too many companies set up a bug bounty program as a box
         | checking exercise and then have a lackadaisical attitude about
         | addressing reports.
         | 
         | I found a pretty obvious XSS on Tesla's website. Submitted
         | through bugcrowd and got no information besides "marked as
         | duplicate". Publicly disclosed, bugcrowd temporarily suspended
         | me for disclosing, but it was fixed within a week. Nothing
         | lights a fire under people's asses like airing their dirty
         | laundry. If they had told me "we are working on a fix and
         | expect it to be live in 3 weeks" I would have respected that
         | and held off on disclosure.
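For context on the class of bug in the anecdote above: a reflected XSS usually comes down to user input echoed into HTML without escaping. A minimal sketch, assuming a hypothetical search page (illustrative only, not Tesla's actual code):

```python
import html

def render_results(query: str, escape: bool = True) -> str:
    """Render a hypothetical search-results page from user-supplied input."""
    # Escaping the input before interpolation is the standard fix.
    body = html.escape(query) if escape else query
    return f"<p>Results for: {body}</p>"

payload = "<script>alert(1)</script>"
vulnerable = render_results(payload, escape=False)  # live <script> tag: XSS
fixed = render_results(payload)                     # rendered as inert text
```

The vulnerable variant hands the browser an executable `<script>` element; the escaped variant renders the payload as harmless text, which is why templating engines escape output by default.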
        
           | tptacek wrote:
           | For what it's worth: you don't need an artificial moral
           | justification to post immediately. Post immediately if that's
           | your thing.
        
             | cwkoss wrote:
             | I specified "moral" because most bug bounty programs' terms
             | have a blanket prohibition against public disclosure (at
             | least until vulnerability is resolved), and in some cases
             | public disclosure could be legally ambiguous as well
             | (because CFAA is so vague and broad).
        
         | sirdarckcat wrote:
          | about duplicates - Google has this thing called grants
         | https://bughunters.google.com/about/rules/5479188746993664 that
         | pay people for doing security research, even if they don't find
         | any bugs. we agree that doing security research is valuable
         | even if no bugs are fixed.
         | 
         | about having access to private bugs, we don't want to share
         | vulnerabilities with others without the researcher's
         | permission, but the original researcher can make bugs public on
         | the new website, you can see some of them here
         | https://bughunters.google.com/report/reports
        
         | thaumasiotes wrote:
         | > I would like to see private issues shared with reporters when
         | they are independently discovered.
         | 
         | Other people have covered most of the rest of your comment, but
         | there are things to say about this one specifically. The
         | reporting platforms support it. It's a very frequent request
         | from researchers who file duplicate reports. It's rare for a
         | company to do this, because (1) it doesn't do anything to help
         | with the issue being reported; (2) it doesn't change how the
          | report will be handled; and (3) researchers _hate_ it. They
          | don't hate it when they file a duplicate report and want to see
         | what scooped them. But they hate it when their report gets
         | shared with someone who filed a duplicate finding.
         | 
         | Sharing duplicate reports causes a lot more fights than it
         | solves. And it tends to piss off the higher-productivity
         | researchers in an attempt to soothe lower-productivity ones,
         | who generally aren't soothed anyway.
         | 
         | The ticket I saw the most outrage on over this issue was one
         | that I duped to another ticket with a higher number. (HackerOne
         | assigns ticket numbers serially. Fun!) How could I justify
         | calling this report a duplicate of one that, as anyone could
         | see, was filed later?
         | 
         | Well, the second report began
         | 
         | > Hi, I reported this through email and was told that in order
         | to claim a reward I should open a ticket in the HackerOne
         | program...
         | 
         | (This situation is odd enough, and easy enough to explain
         | without sharing private information, that I could explain it to
         | the guy on the lower ticket number without causing problems (or
          | sharing the other ticket directly; that's a big no-no). I
         | present it here as an example illustrating that, even if you
         | think you have ironclad evidence that the program is out to
         | screw you over, it probably isn't.)
         | 
         | I definitely saw companies doing things that were unfair
         | considering the program as a whole. That generally happened as
         | part of an effort to preserve a relationship with a researcher
         | who frequently filed useful reports. I never saw a ticket duped
         | inappropriately. If you want my advice on how to get the most
         | money out of your bug reports, it's this:
         | 
         | - Don't antagonize the team handling you.
         | 
         | - As much as you can get away with, demonstrate what you can do
         | with the issue you report. The program will say "we investigate
         | the impact of every issue and pay out according to the highest-
          | severity _potential_ impact." They are sincere. But they don't
         | have the motivation you do to find the highest-severity
         | potential impact. Any impact you actually demonstrate will
         | automatically be considered, because nobody had to notice it
         | was possible.
         | 
         | - Sometimes an issue might ambiguously fall into a low-paying
         | category, or maybe a high-paying category. Try to characterize
         | it as belonging to the high-paying category.
         | 
         | - Decide whether you're a "deep" guy who finds complex issues
         | on a handful of platforms or a "shallow" guy who finds the same
         | set of related issues everywhere he can with minimal effort.
         | Both approaches work. If you're a "shallow" guy, make sure that
         | when you report an issue, it's really there, and don't argue
         | too much over what you think it should be worth. If you're a
         | "deep" guy, you have more scope to develop a personal
         | relationship with the team who handles you.
         | 
         | - Watch for existing programs to add new platforms. This often
         | happens when a company with an existing program makes a new
         | acquisition. The new platform probably has a lot of low-hanging
         | fruit.
         | 
         | - Clean up after yourself. The most outrage I've ever seen on
         | the _company_ side was someone who demonstrated full-on RCE on
         | a company server. He installed a webshell. And his webshell was
          | still up when he filed his report. Don't be that guy. You can
         | be disqualified from a bug that would have paid tens of
         | thousands of dollars.
        
       | botwriter wrote:
       | I take it you'll get more money selling bugs to other actors.
       | 
        | The story about the guy who only got $5k for owning Starbucks'
       | entire user database sticks in my mind.
        
       | bostondavidvc wrote:
       | Whoa, this kind of impressed me (linked from the blog post)
       | https://bughunters.google.com/about/patch-rewards
       | 
       | Payouts for security-positive improvements to security-critical
       | OSS projects:
       | 
       | * $20,000 for setting up continuous fuzzing with OSS-Fuzz
       | 
       | * $10,000 for high-impact improvements that prevent major classes
       | of vulnerabilities
       | 
       | but the low end of the scale is kind of neat too:
       | 
       | * "$1,337 for submissions of modest complexity, or for ones that
       | offer fairly speculative gains."
       | 
        | * "$500, our 'one-liner special,' for smaller improvements
        | that still have merit from the security standpoint."
       | 
       | ... and you can qualify for these _even if your day job is
       | working on one of these OSS projects_!
       | 
       | > Q: I'm a core developer working on one of the in-scope
       | projects. Do my own patches qualify?
       | 
       | > A: They most certainly do.
       | 
       | Neat stuff.
       | 
       | (Googler here, but I don't work on the VRP.)
        
         | Trias11 wrote:
         | 1. Press [Submit]
         | 
          | 2. Thank you for your submission; that was an already-known
          | issue.
        
           | biryani_chicken wrote:
           | Will project maintainers avoid writing issue tickets before
           | sending the patch to this platform?
        
         | HenryKissinger wrote:
          | They need to multiply these amounts by 50x. Cybersec
          | researchers make 6-7 figures. 20k is almost nothing.
        
           | H8crilA wrote:
            | Not sure why you're downvoted, but the $3M/year total
            | rewards payout is likely smaller than the corporate
            | administrative and developer time (for review) costs. I.e.
            | if this were a charity it would pay out less than 50 cents
            | on the dollar.
        
             | tptacek wrote:
             | I downvoted because "cybersec researchers" do not in fact
             | routinely make 7 figures. For strong pentester types
             | reporting the typical (real) vulnerability the VRP handles,
             | the median is probably in the low 6's.
        
               | na85 wrote:
               | 6 figures from breaking systems and reporting them
               | responsibly?
               | 
               | Sounds amazing, what's the catch?
        
               | tptacek wrote:
               | There's no catch. You want a job as a pentester. That job
               | is in high demand.
        
           | fooker wrote:
            | Not everyone can move from wherever they are to the Bay
            | Area though.
        
       | zibzab wrote:
        | If only Google could run their other businesses as
        | frictionlessly as this...
       | 
       | Found an interesting bug? Let's have a 1-1 chat over lunch,
       | drinks are on us BTW.
       | 
        | Your developer account was banned by some bot gone wild? Sorry,
        | no human interaction is allowed in my department. Maybe if you
        | have a famous buddy that can pester some hotshots on Twitter...
        
         | [deleted]
        
         | shadowgovt wrote:
         | It's tough; I wish I had a good solution to the latter problem.
          | Half the trouble is that the majority of what Google bans is
          | semi-automated bot farms; the "minimal human contact"
          | policies are to avoid making the company vulnerable to
          | social engineering.
         | 
         | (That vulnerability goes deep. People would look up the phone
         | numbers of Google offices and call with stories about kidnapped
         | family members and the need to get into a Gmail account to find
         | the ransom note.)
        
           | saalweachter wrote:
           | To some extent, it all comes down to identity.
           | 
           |  _If_ you knew which unique person was associated with each
           | account and _if_ you knew which unique person you were
           | talking to on the phone, it would be possible to verify both
           | that you are talking to the person you should be _and_ that
            | you aren't talking to Notorious Social Engineer #2344 trying
           | to do something shady.
           | 
           | But instead you're stuck trying to figure out if "Saal
            | Weachter" on the phone is the proper owner of the
           | 'saalweachter' account, or even if it is someone named "Saal
           | Weachter" on the phone at all.
        
       | ds wrote:
       | There must be something I am missing, because I don't understand
       | how most bug bounty programs can be so underpaid.
       | 
       | If I ran Google's program, I would immediately 10x all payments,
       | unironically. Yes, that means paying 1 million bucks for
       | something you previously paid 100k for. Drop in the bucket. You
       | also get a ton more eyeballs on you, letting you patch everything
       | ASAP.
       | 
       | But they don't do this. I don't know why. Security through
       | obscurity? I suppose that works if you are myspace.com in 2021.
       | Nobody likely gives a shit to try and hack it, but at the end of
       | the day this is still Google so that really doesn't apply.
       | 
        | The downside of not paying handsomely is that people realize
        | they can make more money selling to third-party vendors (which
        | some do), then every once in a while you get a bad PR story
        | showing that your stuff was hacked and exploited for
        | months/years, and it potentially knocks a few points off your
        | stock price.
       | 
        | Money is really the be-all and end-all. If you pay more than
        | third-party vendors, I can see almost no reason people would
        | sell to them. At that point, your only adversaries are
        | government employees of nation states and the staff of
        | companies dedicated to finding vulnerabilities.
        
         | askesisdev wrote:
         | Third party vendors don't buy vulnerabilities on Google's
         | infrastructure and web services. Third parties like Zerodium
         | are interested in 0days on Android, iOS, Windows, Chrome...
         | 
          | You could try to sell it to criminal organizations or
          | monetize the vulnerability yourself, but it doesn't make any
          | sense to be in that situation if you are making six figures
          | as a bug bounty hunter, even if you didn't have any ethical
          | qualms regarding such acts.
        
         | bsamuels wrote:
         | Bug bounty prizes are set to encourage a certain quantity of
         | bugs to be reported.
         | 
         | If you offer 10x as much, your triage channels will get
         | overwhelmed and you'll have to deal with a bunch of hostile
         | researchers and development teams who hate your guts because
         | you just blocked their next 2 sprints.
         | 
         | If a bug bounty program is effective, then the payouts should
         | trend up slowly over time as your security program becomes more
         | efficient and produces more secure code.
         | 
          | It's important to remember that the purpose of bug bounty
          | programs is not to reduce the number of bugs in the code base
          | - it is a validation measure to check whether your controls
          | are effective or if additional controls need to be added
          | elsewhere.
        
           | gibba999 wrote:
           | As a customer, I'd be okay with dev teams being blocked for
           | their next 2 sprints if it meant security I can trust.
           | 
           | Google Docs, Search, and Mail do little in 2021 that I need
           | that they didn't do in 2016. There's a lot more churn than
           | bona fide improvement. Most tech just doesn't change that
           | much. Heck, I'd take an online version of WordPerfect 7 from
           | 1996 if it was trustworthy. That's a quarter-century. There's
           | nothing Google Docs does, aside from collaboration, that I
           | need that WP7 didn't do.
           | 
           | On the other hand, I strongly distrust Google to maintain my
           | data securely. As far as I can tell, aside from backwards
            | compatibility/legacy reasons, the major reasons people use
            | Office 365, for better or worse, are issues like compliance
           | and security.
           | 
           | Security bugs ought to be sold to Google, found, and fixed.
           | They shouldn't be sold to a ransomware gang or a government.
        
       | egberts wrote:
       | Oh cool, look! Lower rewards!
        
       | dang wrote:
       | It's here: https://bughunters.google.com/. Not sure which is the
       | better top-level URL.
        
         | encryptluks2 wrote:
         | Dang, that is super slow to load on mobile. Seriously takes
         | like 5 seconds.
        
           | jackson1442 wrote:
           | Yikes, it gets a 19 on Lighthouse for performance on desktop.
           | It gets a 5 for performance on mobile.
           | 
           | Guess it won't be ranked very well on Google Search!
        
       ___________________________________________________________________
       (page generated 2021-07-27 23:00 UTC)