[HN Gopher] Stealing Your Private YouTube Videos, One Frame at a...
___________________________________________________________________
Stealing Your Private YouTube Videos, One Frame at a Time
Author : gbrown_
Score : 853 points
Date : 2021-01-11 12:46 UTC (9 hours ago)
(HTM) web link (bugs.xdavidhu.me)
(TXT) w3m dump (bugs.xdavidhu.me)
| jjice wrote:
| A bit off topic, but what does YouTube gain from offering private
| and unlisted videos? It's convenient hosting, but it seems like
| they don't get nearly the same benefit as from a public video.
| CobsterLock wrote:
| My gut guess is training data for their ML. Do ads run on
| unlisted/private videos?
| jjcon wrote:
| They have orders of magnitude more video data than they can
| use for training that is already public
| Causality1 wrote:
| To keep people from going to other video hosts who offer those
| features. It's a value-add for the platform. It lets you upload
| videos well in advance of publishing them, or temporarily
| remove them. You also have to consider that viewer numbers
| are so low for private and unlisted videos that they may as
| well be free to host, aside from the drive space they occupy.
| justusthane wrote:
| Offering features that users want attracts users to the
| platform and in turn generates revenue. Not every feature has
| to generate revenue directly.
| baud147258 wrote:
| For unlisted videos, I'd say it's a useful feature for when
| you want to easily share a video with some people, but not the
| whole world (especially if they already have a YT account). And
| for YT, it allows creators to keep using YT instead of having
| to go looking for another solution.
| rwmj wrote:
| Not sure what Youtube gains except keeping people in the
| ecosystem, but Youtub_ers_ sometimes offer early access to
| their Patreon subscribers, which is done by uploading a private
| video and then changing the video to public after the exclusive
| period has finished.
| ghkbrew wrote:
| I'd imagine keeping people in the ecosystem is plenty reason.
| Sharing sites live and die by their creators, who are a small
| fraction of the users. Making life marginally more convenient
| for them should have disproportionate returns.
| CydeWeys wrote:
| I use unlisted and private videos. There's plenty of stuff I
| only want friends/families to see, but not the whole wide
| world. So, private or unlisted, then send the URL. It works
| great. So great, in fact, I can't think of anything better. Do
| you just not have this use case, or are you aware of a product
| that does this better?
| amalcon wrote:
| I don't think the question is why it's useful to consumers,
| but rather how does it lead to revenue for YouTube? Do
| private videos drive substantial direct ad revenue, increase
| engagement, or similar?
|
| Personally I'd bet on driving general engagement with the
| platform in some way, but the particular manner is not clear
| to me.
| Closi wrote:
| I bet Google just doesn't want you going to competing video
| sites and uploading content there instead of YouTube.
| elif wrote:
| Well, without private, i would not have uploaded anything
| to youtube. After a few personal uses, I eventually started
| using youtube to upload and share content. I assume they
| want to capture the audience like me.
|
| Also, the versatility of youtube as a tool leads me to buy
| youtube premium for $10/mo.
|
| Also, the most professional content creators upload
| private, then schedule the video public at a time that will
| get the most exposure by the youtube algorithm. They also
| pre-prepare multiple thumbnails and swap them out for the
| first few hours of public exposure. it's a calculus.
| CydeWeys wrote:
| Bleagh, I misread the post.
|
| Well in that case the answer is super simple: The same
| reason Google provides any other free service, whether it
| be Maps, Gmail, Photos, Search, Hangouts, Meet, Pay,
| whatever. The more Google services you use and the more
| time you spend using them, the more you can be monetized.
| ThisIsTheWay wrote:
| jjice asked what the benefit is to YouTube, not the
| benefits/use cases for users.
| awakeasleep wrote:
| It's an essential feature for scheduling and organization.
|
| Imagine your business is built on Youtube. You want to be able
| to test things in your videos internally, and upload them prior
| to a scheduled release date.
| ridaj wrote:
| As I understand it, private and unlisted videos are often used
| as a draft mode for videos that eventually become public.
| Uploaders want to check everything, make sure all the
| transcoding is done, etc before flipping a video to public.
| Additionally, unlisted videos are sometimes used to run ads.
| This way, the ads can be played on YouTube without being listed
| on the advertiser's channel.
| swyx wrote:
| this was all very easy to follow and made sense. kudos to the
| author.
|
| for the experienced hackers in the room - what would your
| reasonable next step be if you wanted to get audio or higher
| resolution video?
|
| just wondering because i often see these researchers not stop
| after finding the first exploit, and its often the subsequent
| exploits built up from their knowledge of the system that uncover
| the really damning security holes
| markjgx wrote:
| If you don't want to download every single frame you could feed
| these into DAIN (Depth-Aware Video Frame Interpolation)
| https://github.com/baowenbo/DAIN
| Mindwipe wrote:
| Interesting.
|
| There are a lot of product announcements that are handled by
| uploading private videos that are made public at a given time, so
| there'd be quite a lot of attacker interest in this exploit if it
| hadn't been fixed. Worth the bounty.
| archi42 wrote:
| Not sure why this ~is~ was downvoted. IMHO that's quite a
| realistic scenario of how IDs of private videos might "leak".
| Mindwipe wrote:
| Yeah, I know.
|
| I think HN is getting brigaded pretty badly for anyone who's
| said that Parler's security was garbage, so maybe that?
| BHSPitMonkey wrote:
| It isn't a very noteworthy scenario, because it requires the
| person who _has_ been given access to a private video/ID to
| share it with someone who shouldn't have access. In that
| already-rare scenario, the person with access can simply
| download or record the video anyway, thereby leaking it (with
| audio and high-resolution to boot). And that's with
| everything working as intended.
| derangedHorse wrote:
| Maybe people disagree with the last statement of it being
| worth the bounty (as many including myself think it's worth
| way more)
| Mindwipe wrote:
| Oh, I would certainly not disagree with that, I meant worth
| at _least_ the bounty.
| Tinyyy wrote:
| Nice! I think you meant if the video is /30/ FPS, then the time
| between each frame is 33ms.
| jarym wrote:
| Nice work to the researcher and also gotta give YT credit for
| nailing down a lot of entry points in the first place and
| responding responsibly to this disclosure.
| jacquesm wrote:
| If it's on youtube, it isn't private.
| dkdk8283 wrote:
| Hunting vulns like this is super tedious - glad that there are
| bounty programs and hunters with the time to find and responsibly
| disclose bugs.
| xwdv wrote:
| Now if only they could get decent payouts.
| jcims wrote:
| The answer is pretty simple, don't participate if you don't
| like the terms.
| j0ej0ej0e wrote:
| I don't think it's as simple as that, since you don't know
| what you're going to find at the starting line.
|
| Last month on HN someone got £7,500 from FB, but everyone
| thought he should have got more:
| https://news.ycombinator.com/item?id=25401294
|
| There's also a Darknet Diaries episode (can't remember
| which) about a guy who found a bug and got into Instagram's
| S3 buckets and source code. He felt he should have got the
| $1M bug bounty, but instead Facebook claimed he went
| further than he had permission to, and he got fuck all.
| SeeManDo wrote:
| This reply led to a google search for darknet diaries.
| Thanks!
| [deleted]
| Jabbles wrote:
| s/everyone/someone
| Deukhoofd wrote:
| Well, this author states he got a $5000 payout, I'd consider
| that pretty decent.
| dessant wrote:
| Security researchers will disagree with you. This payout
| amount is considered exploitative for a bug that could
| cause major financial loss to clients, and reputational
| damage to Google.
|
| Most companies exploit security researchers and pay them
| bounties comparable to the discounted rates found on Fiverr
| for various services.
| dimitrios1 wrote:
| Well the market disagrees with security researchers.
|
| 5000 dollars is akin to a very healthy contractor rate
| of $200 an hour at 25 hours of work, which is a
| conservative estimate of how much time OP spent
| discovering this. That to me feels pretty fair pay, based
| on things in reality, not some future value of potential
| cost savings that require some pretty hand-wavy maths to
| quantify.
| robocat wrote:
| Hourly rates are not appropriate.
|
| * experts are paid for applying their knowledge, not
| their time[1][2]
|
| * A "fair" time-based system should also pay for
| unsuccessful searches, e.g. the previous month spent
| unsuccessfully searching for a bug in Chrome.
|
| * if person A spends 1 hour finding bug X, and person B
| spends 1000 hours finding exactly the same bug X, then it
| is a fallacy that you could pick a fair hourly rate.
|
| Also, I'll mention that you don't get paid according to
| how much damage a bug can cause. 1: usually the damages
| occur to a third party (e.g. users of Microsoft Windows,
| not Microsoft). 2: imagine you find ten bugs that could
| wipe out the business Acme - you can't get paid 10x
| Acme's value (not even just 1x Acme's profits.)
|
| [1] https://quoteinvestigator.com/2018/01/14/time-art/
|
| [2] https://www.snopes.com/fact-check/know-where-man/
| gfxgirl wrote:
| Maybe the solution is to start leaking them (anonymously
| of course) and then when the damages add up say "I was
| going to disclose this but you don't pay". A few
| incidents and maybe they'll start paying?
| asdfasgasdgasdg wrote:
| Considering there is no practical attack here -- you
| don't know the private video's ID -- it's unlikely that
| that would serve as much of an incentive in this case.
| They'd just get the bug for free. Plus, youtube has no
| way of verifying you had the bug before it was leaked
| publicly.
| gfxgirl wrote:
| My suggestion wasn't that someone should get credit. My
| suggestion was only that leaking the bugs and causing
| actual damage would eventually raise the price paid. As
| for proof of date of discovery, write a letter and sign
| it with bitcoin or equivalent.
| toast0 wrote:
| If a video used to be public, but was turned private,
| there could be references to it in other places.
| BHSPitMonkey wrote:
| That still leaves vanishingly small odds for a user to
| actually be exploited, in addition to the very low
| resolution and total lack of audio... both of which would
| be overcome by somebody simply downloading the video
| while it was public. As soon as the video was public
| _and_ shared/discovered by other people, it was already
| owned to begin with.
| [deleted]
| throwaway2245 wrote:
| For this kind of specialism, I'd expect a company to pay an
| internal employee that much ($5000) per week of work - and
| contractors should be charging double.
|
| The fact that a company has undervalued this work and
| failed to identify it as important, and someone external
| has identified it, makes it worth even more.
|
| A ransom demand on YouTube might be unbounded in value,
| e.g.: https://www.lexology.com/library/detail.aspx?g=e4d1be
| 15-18db...
| Jimmc414 wrote:
| Factor in how rare it is to find a software defect like
| this and how many fruitless hours of work go into finding
| one, and the author is hardly breaking even. The only
| financial advantage I see is bestowed by the credibility
| gained from the publicity. Considering the monetary damage
| that a defect like this could do to YT, and considering
| the thriving black market for zero days, $5000 seems
| irresponsibly low.
| sneak wrote:
| It's not, when you consider it in the context of the
| unpaid work one has to do to find payout bugs like this.
|
| For context, this is approximately what Google has to pay
| an entry-level engineer to work for ~40 hours.
|
| Finding a bug of this severity level in a publicly
| accessible service with a bug bounty program every 40 hours
| of work is... a stretch of the imagination for an entry
| level person.
| unityByFreedom wrote:
| After winning a $5k payout I bet you can get a decent
| consultancy going. These payouts are foot-in-the-door or
| for hobbyists, like Kaggle is for ML.
| krageon wrote:
| "Do it for the exposure" is a nasty thing to say to
| people, even if you couch it in disguising terms.
| sangnoir wrote:
| > Isn't the entire pen-test/security industry based on
| exposure (of your "brand") though? What you can bill
| depends directly on how well known you are & your past
| work, in lieu of an objective measure of how good at it a
| person is. Geohot could bill thousands (or tens of
| thousands) per hour of his time and no one would bat an
| eye, but if some guy named Blake tried that, he'd be
| laughed out of the room.
| leetcrew wrote:
| disagree, especially when the "exposure" is not the
| entire compensation package.
| hinkley wrote:
| Then you are pitting nascent white hat hackers against
| seasoned black hats and how do you expect that's going to
| turn out?
| rewq4321 wrote:
| Security researchers can easily get $500 per hour
| consulting.
|
| The author should "charge" based on a percentage of the
| value that this bug fix gives to Google. I'd argue that
| for such a huge platform this bug is worth tens if not
| hundreds of thousands. It would certainly cause way more
| reputational damage than that if there were a large-scale
| data leak based on this.
| hinkley wrote:
| More importantly, the value of zero day exploits on the
| black market can run into the hundreds of thousands, we
| are told.
|
| As the music industry learned the hard way, if you make
| it too hard to be a good guy, everyone will become a bad
| guy.
| cg-enterprise wrote:
| I think that it's a pretty reasonable payout - people running
| bug bounties are mostly interested in actual security impact
| and practical exploitability. And the necessity to hit
| Google's servers thousands of times to extract even a short
| video reduces that by quite a large proportion; which is
| definitely not a shot at the researcher, it's a great find,
| but exploitability is definitely lower than a more direct
| form of IDOR (e.g. input a private video ID and get the
| whole video in response).
|
| By the same logic, blind SQLi will typically be valued 'less'
| (hence pay out less) than SQLi with output.
| jonnycomputer wrote:
| My concern here is that, from the perspective of those
| looking for these, the relevant time is not only that
| spent identifying this bug, but all the time spent looking
| for bugs where there were none, or where they didn't see
| them.
| SkyBelow wrote:
| They get paid so little because there isn't any other legal
| means to exchange a bug for money. A monopoly on the only
| legal way to sell a good means you get bad prices.
|
| But I do wonder if it would be possible to set up a legal
| alternative. I suspect if you did, you would find lawmakers
| lobbied to make it illegal, and it would already be deemed
| unethical by the corporate-designed ethics systems.
| SCHiM wrote:
| It seems like many people disagree. But it's true.
|
| Microsoft used to pay 20k for exploit primitives that could
| potentially lead to privilege escalation. These days the
| bounty program seems to require a demonstration (read: a
| working exploit).
|
| Zerodium offers up to 80k for a working local privilege
| escalation exploit. Depending on the workings, if that
| exploit can be used to break out of a browser sandbox you
| might earn a bonus.
|
| The whitehat bounties are not market rate, if you only look
| at the monetary rewards.
| tachyonbeam wrote:
| The other thing to keep in mind here is that serious
| vulnerabilities can also be sold on the dark web for ~50K+
| IIRC. They can also be sold to multiple different entities
| looking to build bot nets or whatever it is that people who
| are always wearing Guy Fawkes masks while sitting in dark
| rooms do. If you're paying just 5-20K, you're kind of
| counting on the people who find the exploit being nice and
| doing the right thing.
|
| Maybe that's mitigated because people with the know-how to
| find exploits like that are usually well-educated and not
| desperately in need of money, but people can be greedy.
| ridaj wrote:
| Also mitigated by the concern researchers should probably
| have for selling exploits to "bad guys", and the
| associated liabilities, criminal and otherwise
| anthony_r wrote:
| Exactly. Only one of these paths does not have a
| hidden/probabilistic cost. Hard to say what the market
| rate is for exposing yourself to criminal prosecution;
| it's not like there's a hedging market for such things.
|
| Though if as a corp you cover the black market rate fully
| then there's really no reason for a researcher to ever
| sell on the black market.
| SCHiM wrote:
| What's criminal about selling to Zerodium though? I guess
| a case could be made that selling on the darknet is 'not
| in good faith'. That is an important criterion for
| determining if you're aiding and abetting where I'm from.
|
| But this problem is not there with all avenues for grey
| market transactions.
| otterley wrote:
| If the payout were too small, nobody would bother researching
| vulnerabilities and claiming payment while following the
| rules. The fact that people are doing it suggests the market
| is working (and perhaps the payment is even high considering
| how much participation is going on).
| rossmohax wrote:
| Classic "confused deputy" problem. What is the current
| recommendation in the modern microservices world to solve it?
|
| When user agent (UA) makes authenticated call to service A, which
| in turn makes call to service B:
|
| UA -[user auth]-> A -[????]-> B
|
| how to pass authentication information from A, when making a call
| to B? Options I can think of:
|
| - pass the UA token as is. The problem is that the token
| becomes too powerful and can be used to call any service.
|
| - pass A's own token and pass the user auth info as an
| additional field. This doesn't solve the confused deputy
| problem, since A's own token can be combined with any
| user's auth info, so service B can be tricked into serving
| data in B that doesn't belong to the user.
|
| - Mint a new unique token derived from the tuple (A's own
| token, UA token, B's service name). B then extracts user
| information from the token presented by A and authorizes
| the request. This seems to solve the confused deputy
| problem, because A has no access to other UA tokens, so it
| can't mint a new token for the wrong user. The downside is
| that token minting should probably be done in another
| service, and it requires a call to that service for almost
| every request between two microservices, making it a
| bottleneck pretty quickly.
|
| I've never seen the last one in real life; maybe it has
| some critical flaws I am failing to see?
| Natanael_L wrote:
| Capabilities systems are designed specifically for this
| purpose. In such a system, a capability specifically for the
| user's right to access A and B is exposed as handle / token,
| and services A and B can't access anything without first being
| given an exposed capability handle. Notably, capabilities can
| be constrained so that it's not keys to the kingdom.
| rossmohax wrote:
| Are there open-source projects that can be used to build
| such a system?
| ryukafalz wrote:
| There are a few but Cap'n Proto is probably the most mature
| at this point: https://capnproto.org
| majkinetor wrote:
| AFAIK Keycloak by Red Hat, which provides auth as a
| service, passes the token as is.
|
| Not sure what you mean by "token becomes too powerful and
| can be made to call any service." Each sub-service can have
| in the token what is required to access it, and that can be
| managed by the main frontend service.
|
| There is a limit to token size, but you can easily optimize
| claims and such to not go overboard in the majority of
| cases.
| rossmohax wrote:
| > token becomes too powerful and can be made to call any
| service.
|
| If the UA token is passed as is down the chain of
| microservices, then every service starts to accept it.
| Intercepting this single token allows an attacker to crawl
| the whole internal system. It won't grant access to other
| users' data, but nevertheless it doesn't seem like a secure
| solution to me.
|
| > Each sub-service can have in token what is required to
| access it, and that can be managed by main frontend service.
|
| This would require the UA token to contain an audience
| claim for every single internal service; this is unlikely
| to pass security review.
| majkinetor wrote:
| > Intercepting this single token allows an attacker to
| > crawl the whole internal system.
|
| It can intercept it, but cannot change it. It can replay
| it for a while (though only briefly, depending on the
| lifetime of your access token, which is usually minutes),
| but you can protect against that.
|
| > This would require UA token to contain audience claim of
| every single internal service, this is unlikely to pass
| security review.
|
| I have penetration tests on my main service. Sub-services
| are not accessible and can be secured to the desired level
| on the internal network. I have never had security
| inspections on internal services (I work on highly critical
| gov systems). Maybe in some domains it's like you say, but
| I believe it's generally not a problem. Furthermore, we
| need some perspective on this - there are multiple easier
| ways to hack a service, and there probably exists a big
| number of other exploits that are easier to achieve.
| lukevp wrote:
| If the token having claims is a security issue, the entry
| point could swap the user's token (containing just their
| unique ID and an expiration) for an authorized token with
| claims, and keep that token within the local network. Then
| there's a single token broker layer and claims are secure.
| I'm not sure why claims would be an issue to have in the
| original token though, could you provide some more info on
| that?
| Hello71 wrote:
| Your third solution basically reinvents Kerberos. I don't think
| Kerberos envisioned services making calls to each other though.
| In the 1980s, I think it was assumed that the client would
| contact each service separately and combine the results itself.
| tyingq wrote:
| Hash token from A with shared secret that A and B both know,
| but UA does not, then pass both the token and the hash?
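| A minimal sketch of that scheme (all names hypothetical): A
| attaches an HMAC over the UA token, computed with the secret
| shared only by A and B, so B can check that the request really
| came through A before trusting the user info in the UA token.

```python
import hashlib
import hmac

# Secret shared by services A and B; the user agent never sees it.
SHARED_SECRET = b"shared-by-A-and-B-only"

def a_forwards(ua_token: str) -> tuple[str, str]:
    # A passes along the UA token plus a tag proving it handled it.
    tag = hmac.new(SHARED_SECRET, ua_token.encode(),
                   hashlib.sha256).hexdigest()
    return ua_token, tag

def b_accepts(ua_token: str, tag: str) -> bool:
    # B recomputes the tag; a user agent (or attacker) without the
    # shared secret cannot forge it for a tampered token.
    expected = hmac.new(SHARED_SECRET, ua_token.encode(),
                        hashlib.sha256).hexdigest()
    return hmac.compare_digest(tag, expected)
```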
| rossmohax wrote:
| I like it. One simplification might be to just pass 2
| tokens: the UA token as is, and A's own token.
|
| Service B then uses A's token for authentication, but the
| UA token for authorization.
| withinboredom wrote:
| You could also have A just sign the token for the same
| effect.
| tedunangst wrote:
| This isn't a confused deputy problem. There's simply no
| authentication on the endpoint. As the article says, it's
| Insecure Direct Object Reference.
| [deleted]
| cipherboy wrote:
| > I've never seen last one in real life, maybe it has some
| critical flaws I am failing to see?
|
| Doesn't Kerberos solve this with s4u2self and s4u2proxy and
| other delegated credentials?
|
| I'll admit it isn't quite the exact same, but the general idea
| is the same.
| johnmaguire2013 wrote:
| I believe Macaroons[1] attempt to solve this problem.
|
| [1] https://research.google/pubs/pub41892/
| recursive wrote:
| If the UA token has all the necessary permissions embedded in
| it, then it cannot be used to call any service for which the
| user is not authorized.
| ec109685 wrote:
| Spotify uses per-user encryption, which is an approach that
| can solve this:
| https://engineering.atspotify.com/2018/09/18/scalable-user-p...
|
| That way account A couldn't access account B's decryption key
| to get to their private video data
| argomo wrote:
| Give user agent two tokens: one for A and one for B (let's call
| it UB). Pass UA and UB to A. A passes its own token to B plus
| the UB token. B uses user info from UB and roles from both UB
| and A's token.
|
| UB has a list of allowed intermediates (in this case, A) so
| user agent doesn't send it to every service.
|
| In my implementation there were various kinds of tokens, so UB
| couldn't be used by itself to invoke B directly.
|
| For our situation all this complexity turned out to be not
| worth it. :-/
| Sodman wrote:
| If you use a service-mesh (such as Istio), you can have all
| inter-microservice communications be over mutual TLS. Assuming
| you only expose an API gateway to the outside world, have the
| gateway handle authentication, then each service can handle any
| feature-level authorization with that user info.
|
| Bonus: When using a mesh service like this, you can also
| ban/rate-limit/load-balance/canary calls between any two
| microservices if necessary.
| jeffbee wrote:
| The idea that client A has its identity authenticated by
| service B, and that service B checks that client A is
| authorized to access some endpoint, does not solve the
| problem of B accessing content on behalf of A that user U
| should not get to see.
|
| The way Google does mutual authentication between services
| (which, I reiterate, does not address this problem) is
| described in great detail at
| https://cloud.google.com/security/encryption-in-
| transit/appl...
| majkinetor wrote:
| Amazing "tutorial".
|
| It definitely made me rethink each of the interactions on
| my services.
| londons_explore wrote:
| The researcher suggests that finding private YouTube video
| IDs would itself be a bug...
|
| YouTube video IDs were generated by taking an internal
| integer and encrypting it with a fixed key. That key was
| leaked in the early days of YouTube (pre Google buying it).
|
| That means there is a bunch of early video IDs that are
| predictable. That makes this bug much worse.
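| To illustrate why a leaked fixed key makes early IDs predictable,
| here is a toy sketch (NOT YouTube's actual cipher or key; every
| name and value here is made up): if an ID is just a fixed-key
| permutation of a sequential counter, anyone holding the key can
| regenerate the early IDs by encrypting 1, 2, 3, ...

```python
import base64
import hashlib
import struct

# Hypothetical leaked key; the real scheme and key are not public.
LEAKED_KEY = b"hypothetical-fixed-key"

def _round(half: int, key: bytes, i: int) -> int:
    # Round function for a toy 4-round Feistel network over 64 bits.
    digest = hashlib.sha256(key + struct.pack(">IQ", i, half)).digest()
    return struct.unpack(">I", digest[:4])[0]

def id_for_counter(n: int, key: bytes = LEAKED_KEY) -> str:
    # Encrypt the 64-bit counter, then base64url-encode: 8 bytes give
    # an 11-character ID, the same length as real YouTube video IDs.
    left, right = n >> 32, n & 0xFFFFFFFF
    for i in range(4):
        left, right = right, left ^ _round(right, key, i)
    packed = struct.pack(">II", left, right)
    return base64.urlsafe_b64encode(packed).rstrip(b"=").decode()

# With the key leaked, the earliest video IDs are fully enumerable:
early_ids = [id_for_counter(n) for n in range(1, 6)]
```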
| bredren wrote:
| Also notable: already-linked YouTube videos are made
| private all of the time.
|
| So this bug created a method for continued, albeit silent
| and low-res, access to private but known videos of note.
|
| I think videos can be edited by the owner after being made
| private. So it would be possible for new "private" data
| from a known video ID to leak this way.
| swyx wrote:
| i dont see how that would be a "bug" per se - the concept of
| "private" is set based on the account ownership, not the id
| itself. you could set a public video to private, or vice versa.
| whatever the ID, google still has to check against your authz
| before showing the video. this seems better than relying on
| security by obscurity.
| Dylan16807 wrote:
| > the concept of "private" is set based on the account
| ownership, not the id itself
|
| I feel like this article should demonstrate why private
| should be both things whenever possible.
|
| > you could set a public video to private, or vice versa
|
| You could, but if you _don't_ leak the ID all over then it
| should provide an extra step of security.
|
| > security by obscurity
|
| Hiding an ID like this isn't all that different from hiding a
| key.
| IndySun wrote:
| _It had ... an interesting feature called Moments... To be
| honest I am not quite sure what advertisers use this
| feature for, nevertheless, it seemed interesting_
|
| For putting their ad at the most relevant point, maybe, or
| better still, putting their ad at the point to which their
| audience will skim.
| gymalpha18 wrote:
| What tool did he use to view proxy logs?
| jabroni_salad wrote:
| Telerik Fiddler is what I typically use.
| stragies wrote:
| I thought there was an unwritten rule on HN that you don't
| mention closed-source, subscription-based software behind
| an email registration without making those downsides
| abundantly clear? Don't enable the next SolarWinds.
| jonplackett wrote:
| Nice work. Are there really still no updates on why this was the
| case? Seems like this all happened a year ago.
| SoSoRoCoCo wrote:
| This hack is great because the dev didn't have to read assembly
| code or parse WireShark logs or deploy any kind of cracking
| software. They simply observed the ecosystem carefully and
| deduced a failure mode through some serious mental BFS.
| phkahler wrote:
| >> This hack is great because the dev didn't have to read
| assembly code or parse WireShark logs or deploy any kind of
| cracking software.
|
| Monitoring network traffic (HTTP requests) and logs is
| similar to reading any other logged data or disassembled
| code. Patching in a different video ID is sort of like
| patching ASM to implement some hack. The automation created
| at the end to extract and assemble the video was basically
| the creation of cracking software for this particular
| exploit.
|
| What one person calls arcane knowledge is another's everyday
| tools. This is a case where _I_ see obscure technical stuff,
| but web devs see regular stuff ;-)
| SoSoRoCoCo wrote:
| > What one person calls arcane knowledge is another's
| everyday tools.
|
| Point taken. If this had been something about Android I'd be
| staring at my screen drooling like a dog looking at a TV.
| ehsankia wrote:
| It's also worth noting that the author doesn't go into too
| much detail on the how, and focuses more on the what.
|
| > With my first account, I started using YouTube, trying
| every feature, pressing every button I could find, and
| whenever I saw an HTTP request with a video ID in it, I
| changed it to the target Private video
|
| Was this done with some tooling or scripts, or purely by
| eyeing devtools? I could see that step, for example, being
| very similar to "parsing Wireshark logs".
|
| I agree that the level of detail included makes it fairly
| readable without being too scary for non-experts.
| Kaze404 wrote:
| As a web dev trying to get into reverse engineering, this was
| super interesting to read. With the knowledge I have, it does
| seem like both things require a certain amalgamation of
| barely-related knowledge to be able to do effectively, and I
| didn't realize that until just now.
| hn_throwaway_99 wrote:
| I have no idea what BFS means in this context.
| airstrike wrote:
| I'm guessing brute force search
| nitrogen wrote:
| Or breadth first search.
| SoSoRoCoCo wrote:
| Both, For Sure. :)
| [deleted]
| hackerpain wrote:
| This bug was found by my friend. David has found many
| creative bugs, and we worked on a project together :)
|
| Must say, he's a quick thinker, and he's just 17 or 18.
| dutch3000 wrote:
| why would i want to look at other peoples private YT videos?
| loydb wrote:
| Really nice work! And good response from Google.
| ju_sh wrote:
| Fantastic work and great write-up!
| mk89 wrote:
| This is indeed probably one of the most common issues with
| all these independent services: microservices, "let's build
| this fast", "I just need that fragment of the API, so I'm
| not gonna call the main API, let me just build a quick
| wrapper around it", ...etc.
|
| This is _absolutely_ one of the biggest issues I have seen
| in several companies.
| thoughtsunifi12 wrote:
| delete thoughtsunufic account
| colek42 wrote:
| Transitive identity is a difficult problem. SPIFFE has set
| up a working group to try to solve it.
| https://groups.google.com/a/spiffe.io/g/transitive-identity-...
| homero wrote:
| That API should have had rate limiting at a minimum. I
| would never let an API endpoint be called thousands of
| times like that.
| blindm wrote:
| It's worth mentioning that many smartphone users upload content
| to Youtube seemingly mistakenly and unaware that they are doing
| so (also known as fat-fingering[0]). A quick search for:
| DSC001.mp4
|
| And then filtering that by recently uploaded always yields
| _interesting_ results. For those that don't know, `DSC-XXX` is a
| standard naming scheme for digital cameras. More on the default
| naming scheme in the following link[1]
|
| [0] https://www.urbandictionary.com/define.php?term=fat-
| fingerin...
|
| [1] https://datagenetics.com/blog/december22012/index.html
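The default naming scheme mentioned above can be sketched as a small generator of candidate search terms. This is a hypothetical illustration; real cameras vary in prefix and counter width, as the linked article details:

```python
def default_filenames(prefix="DSC", count=5, digits=4, ext="mp4"):
    """Generate default-style camera filenames, e.g. DSC0001.mp4,
    following the common <prefix><zero-padded counter> pattern."""
    return [f"{prefix}{n:0{digits}d}.{ext}" for n in range(1, count + 1)]

print(default_filenames(count=3))
# ['DSC0001.mp4', 'DSC0002.mp4', 'DSC0003.mp4']
```

Feeding terms like these into a search, filtered by upload date, is all the "technique" being described here amounts to.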
| Anthony-G wrote:
| The DataGenetics article was interesting but doesn't explain
| where "DSC" comes from. Wikipedia [1] explains that it's an
| abbreviation for _Digital Still Camera_, corroborated by the
| _Design rule for Camera File system (DCF)_ [2], which also
| specifies "DCIM" (an abbreviation for _Digital Camera IMages_)
| as the name of the DCF image root directory.
|
| [1]
| https://en.wikipedia.org/wiki/Digital_camera#Digital_Still_C...
|
| [2]
| https://en.wikipedia.org/wiki/Design_rule_for_Camera_File_sy...
| jrochkind1 wrote:
| "still" as in, not film/video/moving. That it's in this case
| moving videos named after "digital still camera" is kind of
| amusing.
| sfblah wrote:
| I just tried that and didn't get very much on YouTube. Maybe
| they've cleaned it up somehow?
| claudiulodro wrote:
| If you're interested in diving deeper into those sorts of
| videos, http://astronaut.io/ was shared on HN a while ago and
| discussed previously[1]. It definitely digs up some interesting
| stuff.
|
| [1] https://news.ycombinator.com/item?id=13413225
| w-m wrote:
| https://youtu.be/sAMotY8CJkQ
|
| (6 views at time of posting it here)
| zapdrive wrote:
| Is that a human hanging upside down?
| seiferteric wrote:
| deer
| TaylorAlexander wrote:
| Hi, could you please add "NSFW" to that? I am a vegan and
| these images are very disturbing to me. I mean, I saw someone
| say it was a deer, so I only clicked briefly to confirm before
| posting this - I am mostly saying this for others, so people
| are aware it can be disturbing to some.
|
| Thank you.
| zmarty wrote:
| Reminds me of the joke: How can you tell if somebody is a
| vegan?
| immewnity wrote:
| https://underviewed.com/ and http://defaultfile.name/ pull
| random videos named by default. Pretty intriguing to look
| through.
| creade wrote:
| Also good in this genre is Jon Bois' Accidental Upload Film
| Review [0]
|
| [0] https://www.sbnation.com/2015/7/17/8990773/accidental-
| upload...
| chevill wrote:
| https://www.youtube.com/watch?v=HsVe-kF_w90
| 29athrowaway wrote:
| There is a subreddit for this:
|
| https://www.reddit.com/r/IMGXXXX/ (SFW)
| kristofferR wrote:
| https://www.youtube.com/watch?v=At3PyQH8Mx8
| newsbinator wrote:
| This doesn't seem to work well. At least I only see 1 video
| with this search param from the last year.
| notretarded wrote:
| If you're looking for porn, try bing
| silentsea90 wrote:
| One of the few times I could follow an exploit write up
| completely! Smartly done and well written!
| sam1r wrote:
| Timeline says on Jan 17, $5000 issued. Let's hope op gets it!
| encom wrote:
| He should have, it's been almost a year.
| sam1r wrote:
| Ridiculous. This is such a relevant bug, optimal for
| exploitation globally, impacting not only publishers and
| journalists but the rest of the un-democratic world.
| progval wrote:
| > [Jan 17, 2020] - Reward of $5000 issued
|
| It's 2020, not 2021. The issue was reported in December
| 2019.
| drcoopster wrote:
| He neglected to mention his time machine.
| sam1r wrote:
| Good catch.
| EE84M3i wrote:
| Google pays $5000 for IDORs? That's really good.
| ffhhj wrote:
| He could have created a service to retrieve private videos and
| earned way more from subscriptions/ads.
| wegs wrote:
| Ummm... no. If this is like a week's worth of work, you're
| looking at $250k annualized income, $125k after overhead.
|
| In practice, you're not finding a bug like this every week.
|
| The bug bounty programs were originally intended to give a
| white hat market alternative to the black hat and gray hat
| markets. They don't do that. If I find a bug, and I want
| profit, I'm much better off selling to my government than to
| Google.
|
| One can only imagine the number of exploits the US, China,
| Russia, North Korea, etc. have in their cyber-warfare vaults.
|
| Exploits compound. Often, two minor exploits make a major
| exploit.
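The back-of-the-envelope arithmetic in the comment above, spelled out (the $5000 bounty and one-bug-per-week rate are the comment's own assumptions):

```python
bounty = 5000            # payout per bug, per the article's timeline
weeks = 50               # assumed working weeks in a year
gross = bounty * weeks   # annualized income at one bug per week
net = gross // 2         # rough halving for overhead, per the comment
print(gross, net)
# 250000 125000
```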
| seastonATccs wrote:
| I got a $5k payout from Google for a serious OAuth bypass bug.
| I'm not a security researcher, so I wouldn't have any idea how,
| or really any desire, to sell something like this to a
| government. But I'd have to agree that if I had publicly
| revealed the bug, Google would have lost orders of magnitude
| more business, or faced fines from governments, far beyond $5k.
| sam1r wrote:
| Baller alert.
| bitwize wrote:
| If everything were priced at its actual value, SV would
| collapse like a house of cards. The whole industry is based
| on obtaining for bargain basement prices engineering or
| research work which could be worth billions.
| gitanovic wrote:
| That's exactly why there is a black market for 0-day
| exploits... because they are worth more than what the
| companies owning those liabilities pay for them.
| jimmaswell wrote:
| Every time someone gets a bug bounty there's someone saying
| it's not enough and it should have been a bajillion dollars
| instead. $5000 for a week's work is great and clearly it's
| working.
|
| Some points to consider are that there's risk involved
| dealing with the black market, including getting the payout
| in a way that doesn't trace back to you and legal liability
| if you're caught, a company has no reason to pay >=$x for an
| exploit that will cost them $x, and beyond that I suspect a
| lot of people simply feel better about telling the company
| about an exploit than selling it to criminals who will use it
| for extortion and theft.
| JamesSwift wrote:
| So lets say you are able to find one of these every other
| week. $130k pretax. But you are finding 26 different bugs
| that (judging on the responses in this comment section)
| require fairly clever thinking, and you are doing it
| consistently.
|
| I don't think companies owe it to researchers to
| exclusively supply their income, but I think there's room
| for improvement on the payout when most of the point is to
| deter selling on the black/gray market.
| EE84M3i wrote:
| All I meant was that this is significantly better than other
| programs. I've seen similar bugs pay out in the hundreds.
| sam1r wrote:
| Beautifully put.
| JeremyBanks wrote:
| nice find
___________________________________________________________________
(page generated 2021-01-11 22:00 UTC)