_______ __ _______
| | |.---.-..----.| |--..-----..----. | | |.-----..--.--.--..-----.
| || _ || __|| < | -__|| _| | || -__|| | | ||__ --|
|___|___||___._||____||__|__||_____||__| |__|____||_____||________||_____|
on Gopher (unofficial)
(HTM) Visit Hacker News on the Web
COMMENT PAGE FOR:
(HTM) Doom has been ported to an earbud
JonathanFly wrote 5 hours 57 min ago:
Is there no way to play Doom with just the earbuds? There's a mod that
adds audio cues to make Doom playable for the blind: [1] Adding high
quality binaural audio to Doom would make this even more viable.
Earbuds have accelerometers which could also be incorporated to add
additional cues.
(HTM) [1]: https://www.youtube.com/watch?v=vtoAo__2kYo
neonmagenta wrote 8 hours 6 min ago:
Have we successfully ported Doom to a beeper yet?
linehedonist wrote 10 hours 54 min ago:
The Doom port itself is pretty fun, but I love the presentation of it.
Brilliant idea to let people play the game themselves on the actual
hardware.
nacozarina wrote 12 hours 55 min ago:
someone made a crude web server out of a vape pen a few weeks ago, we
can't be too far from running doom on one
tqi wrote 13 hours 5 min ago:
There's gotta be a Moore's law corollary for "Doom ported to [blank]"
milestones. I wonder where this all ends? Doom ported to a mechanical
pencil! Doom ported to a clipper card! To a lightbulb??
utopcell wrote 9 hours 1 min ago:
It all ends when it is ported to a paperclip machine [1]
(HTM) [1]: https://www.decisionproblem.com/paperclips/index2.html
guerrilla wrote 14 hours 51 min ago:
Awesome advertising for the PineBuds Pro. No chance the Fairbuds can
do this? I don't know much about them.
Also, with DOOM running on all these things now, is it still impossible
to get it to run well on a 386?
theragra wrote 9 hours 23 min ago:
I think there was a port that was much less demanding, and likely
suitable for 386, but needs to be backported to x86
wolvoleo wrote 17 hours 23 min ago:
But can it run crysis?
frizlab wrote 17 hours 43 min ago:
> wow this front end code is atrocious, state management is everywhe-
> shhhh don't look don't look it's ok just join the queue
love it
Yiin wrote 12 hours 13 min ago:
I'm more impressed that it wasn't written by an LLM than I am
judgemental about the code itself
j1elo wrote 18 hours 30 min ago:
Next up idea: ThunderDoom
(HTM) [1]: https://news.ycombinator.com/item?id=46750419
mikeayles wrote 17 hours 41 min ago:
(HTM) [1]: https://www.reddit.com/r/apple/comments/1ihufa0/doom_running...
catlifeonmars wrote 19 hours 4 min ago:
As an aside, I really like the style of the page. I wish it were
available as a classless CSS drop-in stylesheet.
anthk wrote 19 hours 23 min ago:
It's possible to run Zork I-III under Frotz on a pen, on some FPGAs,
and even by interpreting a PostScript file. Even on the Game Boy, the
C64, MSX... So Doom is not the most ported game ever.
WXLCKNO wrote 20 hours 30 min ago:
A few more years and some more ram on these earbuds and we'll be able
to run some nice local earbud kubernetes clusters
epenn wrote 20 hours 44 min ago:
In light of this I propose "Doom's Law" as the ultimate expression of
late stage capitalism:
- Society continues to produce more and more powerful devices.
- More and more of these devices begin running Doom.
- When this reaches the saturation point, society becomes Doom.
moktonar wrote 20 hours 54 min ago:
We should definitely send a playable copy of doom to aliens on a golden
record on the next Voyager mission
listeria wrote 19 hours 58 min ago:
and risk having them interpret it as a declaration of war?
neurostimulant wrote 21 hours 5 min ago:
> Earbuds don't have displays, so the only way to transfer data to/from
them is either via bluetooth, or the UART contact pads.
Bluetooth is pretty slow, you'd be lucky to get a consistent 1mbps
connection, UART is easily the better option.
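For a rough sense of scale, a back-of-envelope sketch (the ~4 MB WAD
size and the 3 Mbaud UART rate are assumptions for illustration, not
figures from the post):

```python
# Rough transfer-time comparison for shipping a Doom WAD to the buds.
# Assumptions: a ~4 MB WAD, the ~1 Mbit/s Bluetooth rate quoted above,
# and a 3 Mbaud UART with standard 8N1 framing (10 symbols per byte).
wad_bytes = 4 * 1024 * 1024

bt_bps = 1_000_000                 # consistent Bluetooth throughput, bits/s
uart_baud = 3_000_000              # assumed UART rate
uart_bps = uart_baud * 8 // 10     # 8N1: 10 baud symbols carry 8 data bits

bt_seconds = wad_bytes * 8 / bt_bps       # ~34 s over Bluetooth
uart_seconds = wad_bytes * 8 / uart_bps   # ~14 s over UART

print(f"Bluetooth: {bt_seconds:.1f} s, UART: {uart_seconds:.1f} s")
```

At higher assumed baud rates the UART advantage only grows, and it
avoids Bluetooth's throughput variability entirely.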
Does this mean you can run a doom instance on each bud? Is it viable
to make a distributed app to use the computing power of both buds at
once?
arin-s wrote 20 hours 54 min ago:
This was a stretch goal, multiplayer: one earbud versus the other.
It's not that hard to implement but I've got a few other things to
clear away first.
Using them for distributed computation though? interesting use of
free will xD
jentulman wrote 18 hours 36 min ago:
Single player, but stereoscopic?
One display from each bud.
"VR Doom has been ported to an earbud(s)" ;)
RandomTeaParty wrote 18 hours 37 min ago:
Compute one half of the screen on each
Left ear for the right eye and vice versa
shevy-java wrote 21 hours 13 min ago:
I am a bit sad that it is always Doom.
Now ... I played the game when I was young. It was addictive. I don't
think it was a good game but it was addictive. And somewhat simple.
So what is the problem then? Well ... games have gotten a lot bigger,
often more complicated. Trying to port that to small platforms is
close to impossible. This makes me sad. I think the industry, excluding
indie tech/startups, totally lost the focus here. The games that are
now en vogue do not interest me at all. Sometimes they have
interesting ideas - I liked Little Nightmares here - but they are
huge and very different from the older games. And often much more
boring too.
One of my favourite DOS games was master of orion 1 for instance. I
could, despite its numerous flaws, play that again and again and
again. Master of Orion 2 was not bad either, but it was nowhere near
as addictive and the gameplay was also more convoluted and slower.
(Sometimes semi-new games are also ok such as Warcraft 3. I am not
saying ALL new games are bad, but it seems as if games were kind of
dumbed down to be more like a video to watch, with semi-few interactive
elements as you watch it. That's IMO not really a game. And just
XP grinding for the big bad wolf to scale to the next level, deal
out more damage, as your HP grows ... that's not really playing
either. That's just wasting your time.)
Guvante wrote 20 hours 57 min ago:
Most people don't realize that games were small back then because
they had to be.
The value of being small for most users almost doesn't exist. If you
have bandwidth limits then yeah download size is important but most
don't.
So the only meaningful change optimizations make is "will it run well
enough" and "does it fit on my disk".
Put more plainly, "if it works at all it doesn't matter" is how most
consumers (probably correctly) treat performance
optimizations/installation size.
The sacrifices you talk about were made at explicit request of
consumers. Games have to be "long enough" and the difference between
enough game loop and grinding is a taste thing. Games have to be
"pretty" and for better or worse stylized takes effort and is a taste
thing (see Wind Waker) while fancy high res lighting engines are
generally recognized as good.
I will say, though, that while being made by indies means they are
optimized terribly, the number of stylized short games is phenomenally
high; it can just be hard to find them.
Especially since it is difficult for an hour-or-two game to be as
impactful as a similar-length movie, they tend to not be brought up
as frequently.
bitmasher9 wrote 19 hours 26 min ago:
Storage space is at a premium. The PS5 has about 650gb of usable
space. At ~100gb/game, which is not uncommon, you can store 6 games
on the console without needing to free up hard drive space.
Filesize matters, especially to people with limited bandwidth and
data caps. The increasing cost of SSDs only makes this situation
more hardware constrained.
RadiozRadioz wrote 21 hours 5 min ago:
It's Doom in part because it's a significantly popular game, that was
open sourced, with low resource requirements (but not too low to be
trivial), with an innovative custom engine that people find
interesting, originally created by a person whom many respected or
admired growing up, and the game itself is cool. And now there is
enough inertia to keep choosing it.
m-p-3 wrote 21 hours 3 min ago:
I wish there were more ports of Duke Nukem 3D :(
dclowd9901 wrote 21 hours 8 min ago:
I'm with you. I want to play Freespace 2 on earbuds.
optimalsolver wrote 21 hours 20 min ago:
Relevant SMBC, "Computer scientist vs computer engineer":
(HTM) [1]: https://www.smbc-comics.com/comic/2011-02-17
npsomaratna wrote 22 hours 0 min ago:
On a tangent: I remember reading John Carmack saying that as game
engines became more complex, he had to relinquish the idea of writing
all the (engine) code himself, and start to rely on other folks'
contributions as well (this was in an interview after the release of
Doom 3).
I wonder what his feelings are in this age of AI.
fennecbutt wrote 25 min ago:
I think what really happened is that as Carmack became more senior he
got more and more out of touch with the technology. So I don't
understand why people still refer to his words as gospel, especially
since the domain he's now in is so far outside of his original
specialty.
Doom 2016 would've never been possible with him at the helm. Then
again now they're adding viking bullshit to it so design by executive
committee kills yet another beloved franchise.
nortlov wrote 21 hours 3 min ago:
Carmack made extensive use of AI during Doom development: Approximate
Interpolation.
Insanity wrote 21 hours 49 min ago:
John is now on a mission to make AGI a reality. I'd say given his
own investment there, he's probably positive about it.
Just speculation on my part of course.
Also, "Masters of Doom" is such a good book. Recommend it for
anyone who wants to peek behind the scenes of how Carmack, Romero,
and iD software built Doom (and Wolf3D etc).
lgvld wrote 17 hours 35 min ago:
Yes such a fantastic book. I thought about it recently while
reading this article about the three teenagers who created the
Mirai botnet, same vibes, you would probably like it:
(HTM) [1]: https://www.wired.com/story/mirai-untold-story-three-young...
arin-s wrote 22 hours 6 min ago:
The standalone viewer (connected directly to the earbuds) also works on
mobile: [1] No touch controls though, it just plays the intro loop
(HTM) [1]: https://files.catbox.moe/pdvphj.mp4
nehalem wrote 22 hours 14 min ago:
Whenever I see another supposedly menial device including enough
general purpose hardware to run Doom, I wonder whether I should think
of that as a triumph of software over hardware or an economic failure
to build cheaper purpose-built hardware for things like sending audio
over a radio.
fennecbutt wrote 31 min ago:
Ah yes the "good old days when we wrote assembly" perspective.
Like, I get it, but embedded device firmware is still efficient af.
We end up stuffing a lot of power into these things because contrary
to say wired Walkman headphones, these have noise cancellation,
speech detection for audio ducking when you start having a
conversation, support taking calls, support wakewords for assistants,
etc.
tracerbulletx wrote 12 hours 3 min ago:
Less computing power is not necessarily cheaper.
gjsman-1000 wrote 15 hours 20 min ago:
I will never understand people who treat MHz like a rationed
resource.
pibaker wrote 17 hours 40 min ago:
You should see it as the triumph of chip manufacturing: advanced,
powerful MCUs have become so cheap, thanks to manufacturing
capabilities and economies of scale, that it is now cheaper to use a
mass-manufactured general-purpose device that may take more material
to manufacture than a simpler bespoke device that will be produced at
low volumes.
You might be wondering "how on earth a more advanced chip can end up
being cheaper." Well, it may surprise you but not all cost in
manufacturing is material cost. If you have to design a bespoke chip
for your earbuds, you need to now hire chip designers, you need to go
through the whole design and testing process, you need to get someone
to make your bespoke chip in smaller quantities which may easily end
up more expensive than the more powerful mass manufactured chips, you
will need to teach your programmers how to program on your new chip,
and so on. The material savings (which are questionable - are you
sure you can make your bespoke chip more efficiently than the mass
manufactured ones?) are easily outweighed by business costs in other
parts of the manufacturing process.
the_fall wrote 17 hours 54 min ago:
> economic failure to build cheaper purpose-built hardware for things
like sending audio over a radio.
You're literally just wasting sand. We've perfected the process to
the point where it's inexpensive to produce tiny and cheap chips that
pack more power than a 386 computer. It makes little difference if
it's 1,000 transistors or 1,000,000. It gets more complicated on the
cutting edge, but this ain't it. These chips are probably 90 nm or 40
nm, a technology that's two decades old, and it's basically the
off-ramp for older-generation chip fabs that can no longer crank out
cutting-edge CPUs or GPUs.
Building specialized hardware for stuff like that costs a lot more
than writing software that uses just the portions you need. It
requires deeper expertise, testing is more expensive and slower, etc.
Aurornis wrote 18 hours 38 min ago:
> Whenever I see another supposedly menial device including enough
general purpose hardware
The PineBuds are designed and sold as an open firmware platform to
allow software experimentation, so there's nothing bad nor any
economic failures going on here. Having a powerful general purpose
microcontroller to experiment with is a design goal of the product.
That said, ANC Bluetooth earbuds are not menial products. Doing ANC
properly is very complicated. It's much harder than taking the
input from a microphone, inverting the signal, and feeding it into
the output. There's a lot of computation that needs to be done
continuously.
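To give a feel for that continuous computation, here is a toy
adaptive-canceller sketch: a plain LMS filter in NumPy. This is only an
illustration of the idea, not the buds' firmware; real ANC uses
filtered-x variants, secondary-path models, and fixed-point DSP. The
noise path coefficients and filter settings below are made up.

```python
import numpy as np

rng = np.random.default_rng(0)
n, taps, mu = 5000, 16, 0.01

noise = rng.standard_normal(n)            # reference-mic signal
path = np.array([0.6, 0.3, 0.1])          # assumed acoustic path to the ear
at_ear = np.convolve(noise, path)[:n]     # noise as heard at the eardrum

w = np.zeros(taps)                        # adaptive filter weights
err = np.zeros(n)                         # residual noise after cancelling
for i in range(taps, n):
    x = noise[i - taps + 1 : i + 1][::-1]  # newest reference samples first
    anti = w @ x                           # anti-noise estimate
    err[i] = at_ear[i] - anti              # what the listener still hears
    w += mu * err[i] * x                   # LMS weight update, every sample

early = float(np.mean(err[taps:500] ** 2))  # residual power while adapting
late = float(np.mean(err[-500:] ** 2))      # residual power once converged
```

Even this toy does a multiply-accumulate pass per audio sample; at
48 kHz that is a steady, never-ending compute load, which is the point
being made above.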
Using a powerful microcontroller isn't a failure, it's a benefit
of having advanced semiconductor processes. Basically anything small
and power efficient on a modern process will have no problem running
at tens of MHz speeds. You want modern processes for the battery
efficiency and you get speed as a bonus.
The speed isn't wasted, either. Higher clock speeds mean lower
latency. In a battery powered device, having an MCU running at 48MHz
may seem excessive until you realize that the faster it finishes
every unit of work the sooner it can go to sleep. It's not always
about raw power.
Modern earbuds are complicated. Having a general purpose MCU to allow
software updates is much better than trying to get the entire
wireless stack, noise cancellation, and everything else completely
perfect before spinning out a custom ASIC.
We're very fortunate to have all of this at our disposal. The
grumbling about putting powerful microcontrollers into small things
ignores the reality of how hard it is to make a bug-free custom ASIC
and break even on it relative to spending $0.10 per unit with a proven
microcontroller manufacturer at scale.
utopiah wrote 6 hours 44 min ago:
> Doing ANC properly is very complicated. It's much harder than
taking the input from a microphone, inverting the signal, and
feeding it into the output. There's a lot of computation that
needs to be done continuously.
Neat, any recommended reading on the topic?
seabass-labrax wrote 14 hours 40 min ago:
The other aspect to consider is changing requirements. Maybe a
device capable of transmitting PSTN-level audio quality wirelessly
would have been popular twenty years ago, but nowadays most people
wouldn't settle for anything with less than a 44.1kHz sample rate. A
faster processor means that there's some room for software upgrades
later on, future-proofing the device and potentially reducing
electronic waste. Unfortunately, that advantage is almost always
squandered in practice by planned obsolescence and an industry
obsession with locked-down, proprietary firmware.
tobinc wrote 18 hours 46 min ago:
I think it's just indicative of the fact that general purpose
hardware has more applications, and can thus be mass produced more
cheaply at greater scale.
Waterluvian wrote 18 hours 47 min ago:
I imagine it's far more economical to have one foundry that can
make a general purpose chip that's overpowered for 95% of uses than
to try to make a ton of different chips. It speaks to how a lot of
the actual cost is the manufacturing and R&D.
sdenton4 wrote 18 hours 30 min ago:
The only real problem I could see is if the general purpose
microcontroller is significantly more power-hungry than a
specialized chip, impacting the battery life of the earbuds.
On every other axis, though, it's likely a very clear win: reusable
chips means cheaper units, which often translates into real
resource savings (in the extreme case, it may save an entire
additional factory for the custom chips, saving untold energy and
effort.)
daft_pink wrote 19 hours 9 min ago:
If you look at the bottom of the page, it's an advertisement for
someone looking for a job to show off his technical skill.
ornornor wrote 18 hours 59 min ago:
Okay? Is that good or bad or what?
notarobot123 wrote 19 hours 41 min ago:
It's intuitive to think of wasted compute capacity as correlating
with a waste of material resources. Is this really the case though?
mathgeek wrote 19 hours 0 min ago:
Waste is subjective or, at best, hard to define. It's the classic
"get rid of all the humans and nothing would be wasted" aphorism.
mlyle wrote 19 hours 44 min ago:
Marginal cost of a small microprocessor in an ASIC is nothing.
The RAM costs a little bit, but if you want to firmware update in a
friendly way, etc, you need some RAM to stage the updates.
gpm wrote 20 hours 21 min ago:
Neither - it's a triumph of our ability to do increasingly complex
things in both software and hardware. An earbud should be able to
make good use of the extra computing capacity, whether it is to run
more sophisticated compression saving bandwidth, or for features like
more sophisticated noise cancelling/microphone isolation algorithms.
There are really very few devices that shouldn't be able to be better
given more (free) compute.
It's also a triumph of the previous generation of programmers to be
able to make interesting games that took so little compute.
buildbot wrote 20 hours 17 min ago:
Plus there's actually less waste, I would imagine, by using a
generic, very efficiently mass produced, but way overkill part.
vs. a one off or very specific, rare but perfectly matched part.
echelon wrote 20 hours 17 min ago:
There are enough atoms in that earbud to replace all of the world's
computers.
We've got a long way to go.
tt24 wrote 20 hours 26 min ago:
Incredible to see people try to spin the wild successes of market
based economies as an economic failure.
Hardware is cheap and small enough that we can run doom on an earbud,
and I'm supposed to think this is a bad thing?
hashmap wrote 20 hours 20 min ago:
I can sort of see one angle for it, and the parent story kind of
supports it. Bad software is a forcing function for good hardware -
the worse that software has gotten in the past few decades the
better hardware has had to get to support it. Such that if you
actually tried like OP did, you can do some pretty crazy things on
tiny hardware these days. Imagine what we could do on computers if
they weren't so bottlenecked doing things they don't need to do.
tt24 wrote 16 hours 9 min ago:
That wasn't the GP's claim. Their implication was that it's an
economic failure that we don't produce less powerful hardware.
hashmap wrote 13 hours 37 min ago:
Yeah, that's more or less what I'm getting at.
__MatrixMan__ wrote 21 hours 48 min ago:
If it can run Doom it can run malware.
TrainedMonkey wrote 22 hours 0 min ago:
> CPU: Dual-core 300MHz ARM Cortex-M4F
It's an absolutely bonkers amount of hardware scaling that has
happened since Doom was released. Yes, this is tremendous overkill
here, but the crazy part is that this fits into an earpiece.
wolvoleo wrote 17 hours 20 min ago:
Yes but also Doom is very very old.
I bought a kodak camera in 2000 (640x480 resolution) and even that
could run Doom on it. Way back when. Actually playable with sounds
and everything.
Here's an even older one running it:
(HTM) [1]: https://m.youtube.com/watch?v=k-AnvqiKzjY
mlyle wrote 19 hours 37 min ago:
This is the "little part" of what fits into an earpiece. Each of
those cores is maybe 0.04 square millimeters of die on e.g. 28nm
process. RAM takes some area, but that's dwarfed by the analog and
power components and packaging. The marginal cost of the gates
making up the processors is effectively zero.
trhway wrote 16 hours 26 min ago:
so 1mm2 peppered by those cores at 300MHz will give you 4 Tflops.
And whole 200mm wafer - 100 Petaflops, like 10 B200s, and just at
less than $3K/wafer. Giving half area to memory we'll get 50
PFlops with 300Gb RAM. Power draw is like 10-20KW. So, given
these numbers, I'd guess Cerebras has tremendous margin and is
just printing money :)
mlyle wrote 15 hours 7 min ago:
Yes, assuming you don't need to connect anything together and
that RAM is tinier than it really is, sure. At 28nm,
3megabits/square millimeter is what you get of SRAM, so an
entire wafer only gets you ~12 gigabytes of memory.
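The arithmetic behind that estimate, taking the 3 Mbit/mm^2 density and
the 200 mm wafer from the comments above and ignoring dicing and edge
loss:

```python
import math

# SRAM density and wafer size quoted in the comments above
sram_mbit_per_mm2 = 3
wafer_diameter_mm = 200

wafer_area_mm2 = math.pi * (wafer_diameter_mm / 2) ** 2  # ~31,416 mm^2
total_mbit = wafer_area_mm2 * sram_mbit_per_mm2          # ~94,000 Mbit
total_gbyte = total_mbit / 8 / 1024                      # ~11.5 GB

print(f"{total_gbyte:.1f} GB of SRAM per wafer")
```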
And, of course, most of Cerebras' costs are NRE and the stuff
like getting heat out of that wafer and power in.
trhway wrote 14 hours 45 min ago:
Why not ddram?
mlyle wrote 14 hours 37 min ago:
Same reason why Cerebras doesn't use DRAM. The whole point
of putting memory close is to increase performance and
bandwidth, and DRAM is fundamentally latent.
Also, process that is good at making logic isn't
necessarily good for making DRAM. Yes, eDRAM exists, but
most designs don't put DRAM on the same die as logic and
instead stack it or put it off-chip.
Almost all these microcontrollers that are single-die have
flash+SRAM. Almost all microprocessor cache designs are
SRAM (with some designs using off-die L3 DRAM) -- for these
reasons.
trhway wrote 14 hours 17 min ago:
CPU cache is understandably SRAM.
>The whole point of putting memory close is to increase
performance and bandwidth, and DRAM is fundamentally
latent.
When the access patterns are well established and
understood, like in the case of transformers, you can
mitigate latency by prefetch (we can even have very
beefed up prefetch pipeline knowing that we target
transformers), while putting memory on the same chip
gives you huge number of data lines thus resulting in
huge bandwidth.
mlyle wrote 13 hours 31 min ago:
With embedded SRAM close, you get startling amounts of
bandwidth -- Cerebras claims to attain >2 bytes/FLOP in
practice -- vs H200 attaining more like 0.001-0.002 to
the external DRAM. So we're talking about a 3 order of
magnitude difference.
Would it be a little better with on-wafer distributed
DRAM and sophisticated prefetch? Sure, but it wouldn't
match SRAM, and you'd end up with a lot more
interconnect and associated logic. And, of course,
there's no clear path to run on a leading logic process
and embed DRAM cells.
In turn, you batch for inference on H200, where
Cerebras can get full performance with very small batch
sizes.
Telemakhos wrote 21 hours 45 min ago:
I remember playing Doom on a single-core 25MHz 486 laptop. It was,
at the time, an amazing machine, hundreds of times more powerful
than the flight computer that ran the Apollo space capsule, and now
it is outclassed by an earbud.
fennecbutt wrote 29 min ago:
And by an order of magnitude or more, too!
tadfisher wrote 19 hours 35 min ago:
And perhaps more fittingly, that PC couldn't decode and play an
MP3 in real time.
iberator wrote 20 hours 0 min ago:
Can we finally end this Apollo computer comparison forever? It
was a real-time computer NOT designed for speed but for real-time
operations.
Why don't you compare it to, let's say, a PDP-11, a VAX-11/780,
or a Cray-1 supercomputer?
NASA used a lot of supercomputers here on earth prior to mission
start.
mlyle wrote 19 hours 28 min ago:
> It was a real time computer NOT designed for speed but real
time operations.
More than anything, it was designed to be small and use little
power.
But these little ARM Cortex M4F that we're comparing to are
also designed for embedded, possibly hard-real-time operations.
And dominant factors in experience on playback through earbuds
are response time and jitter.
If the AGC could get a capsule to the moon doing hard real-time
tasks (and spilling low priority tasks as necessary), a single
STM32F405 with a Cortex M4F could do it better.
Actually, my team is going to fly a STM32F030 for minimal power
management tasks-- but still hard real-time-- on a small
satellite. Cortex-M0. It fits in 25 milliwatts vs 55W. We're
clocked slow, but still exceed the throughput of the AGC by
~200-300x. Funnily enough, the amount of RAM is about the same
as the AGC :D It's 70 cents in quantity, but we have to pay
three whole dollars at quantity 1.
> NASA used a lot of supercomputers here on earth pior to
mission start.
Fine, let's compare to the CDC 6600, the fastest computer of
the late 60's. M4F @ 300MHz is a couple hundred single
precision megaflops; CDC6600 was like 3
not-quite-double-precision megaflops. The hacky "double
single precision" techniques have comparable precision-- figure
that is probably about 10x slower on average, so each M4F could
do about 20 CDC-6600 equivalent megaflops or is roughly 5-10x
faster. The amount of RAM is about the same on this earbud.
His 486-25 -- if a DX model with the FPU -- was probably
roughly twice as fast as the 6600 and probably had 4x the RAM,
and used 2 orders of magnitude less power and massed 3 orders
of magnitude less.
Control flow, integer math, etc, being much faster than that.
Just a few more pennies gets you a microcontroller with a
double precision FPU, like a Cortex-M7F with the FPv4-SP-D16,
which at 300MHz is good for maybe 60 double precision
megaflops-- compared to the 6600, 20x faster and more
precision.
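The "double single" trick mentioned above represents one double as an
unevaluated sum of two singles. A minimal sketch of the representation
(the pairwise arithmetic on such pairs, per Dekker/Knuth, is where the
~10x slowdown comes from):

```python
import math
import numpy as np

x = math.pi                       # a value float32 cannot hold exactly

hi = np.float32(x)                # leading single: nearest float32 to x
lo = np.float32(x - float(hi))    # trailing single: rounded residual

err_single = abs(float(hi) - x)            # ~8.7e-8 with one float32
err_pair = abs(float(hi) + float(lo) - x)  # far smaller with the pair
```

The hi/lo pair carries roughly twice the significand bits of a lone
single, which is why the technique gives "comparable precision" to
true doubles on FPUs that only do single precision in hardware.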
mlyle wrote 16 hours 41 min ago:
I have thought about this a little more, and looked into
things. Since NASA used the 360/91, and had a lot of 360's
and 7900's... all of NASA's 60's computing couldn't quite fit
into a single 486DX-25. You'd be more like 486DX2-100 era to
replace everything comfortably, and you'd want a lot of RAM--
like 16MB.
It looks like NASA had 5 360/75's plus a 360/91 by the end,
plus a few other computers.
The biggest 360/75's (I don't know that NASA had the highest
spec model for all 5) were probably roughly 1/10th of a
486-100 plus 1 megabyte of RAM. The 360/91 that they had at
the end was maybe 1/3rd of a 486-100 plus up to 6 megabytes
of RAM.
Those computers alone would be about 85% of a 486-100.
Everything else was comparatively small. And, of course, you
need to include the benefits from getting results on
individual jobs much faster, even if sustained max throughput
is about the same. So all of NASA, by the late 60's,
probably fits into one relatively large 486DX4-100.
Incidentally, one random bit of my family lore; my dad was an
IBM man and knew a lot about 360's and OS/360. He received a
call one evening from NASA during Apollo 13 asking for advice
about how they could get a little bit more out of their
machines. My mom was miffed about dinner being interrupted
until she understood why :D
varjag wrote 22 hours 1 min ago:
Earbuds often have features like mic beam forming and noise
cancellation which require a substantial degree of processing power.
It's hardly unjustified compared to your Teams instance making fans
spin or Home Assistant bringing down an RPi to its knees.
grishka wrote 13 hours 32 min ago:
These sorts of things feel like they would be quite inefficient on
a general-purpose CPU so you would want to do them on some sort of
dedicated DSP hardware instead. So I would expect an earbud to use
some sort of specialized microcontroller with a slow-ish CPU core
but extra peripherals to do all the signal processing and
bluetooth-related stuff.
nehalem wrote 21 hours 45 min ago:
No doubt; maybe I should have emphasised the "general" part of
"general purpose" more. Not a hardware person myself, I wonder
whether there would be purpose-built hardware that could do the
same more cheaply - think F(P)GA.
nicoburns wrote 17 hours 26 min ago:
> I wonder whether there would be purpose-built hardware that
could do the same more cheaply
Where are you imagining cost savings coming from? Custom
anything is almost always vastly more expensive than using a
standardised product.
Aurornis wrote 18 hours 32 min ago:
> I wonder whether there would be purpose-built hardware that
could do the same more cheaply - think F(P)GA.
FPGAs are not cost efficient at all for something like this.
MCUs are so cheap that you'd never get to a cheaper solution by
building out a team to iterate on custom hardware until it was
bug free and ready to scale. You'd basically be reinventing the
MCU that can be bought for $0.10, but with tens of millions of
dollars of engineering and without economies of scale that the
MCU companies have.
danielbln wrote 22 hours 1 min ago:
It's already very cheap to build though. We are able to pack a ton of
processing into a tiny form factor for little money (comparatively,
ignoring end-consumer margins etc.).
An earbud that does ANC, supports multiple different audio standards
including low battery standby, is somewhat resistant to interference,
and can send and receive over many meters. That's awesome for the
price. That it has enough processing to run a 33 year old game...
well, that's just technological progression.
A single modern smartphone has more compute than all global compute
of 1980 combined.
ck2 wrote 18 hours 51 min ago:
I need that in lunar-lander exponents
(imagine the lunar lander computer being an earbud ha)
danielbln wrote 18 hours 26 min ago:
Well, current smartphone would be about 10^8 times faster/more
than the lunar lander.
A single Airpod would be about 10^4 times as powerful as the
entire lunar lander guidance system.
Or to put another way: a single Airpod would outcompute the
entire Soviet Union's space program.
rogerrogerr wrote 22 hours 5 min ago:
Or a third option - an economic success, where economies of scale
have made massively capable hardware the cheapest option for many
applications, despite being overkill.
AlecSchueler wrote 19 hours 6 min ago:
Or the fourth option, an environmental disaster all around
compiler-devel wrote 18 hours 0 min ago:
Nobody cares, unless they're commenting for an easy win on
internet message boards.
Aurornis wrote 18 hours 35 min ago:
It's the opposite. Using an off the shelf MCU is much more
efficient than trying to spin your own ASIC.
Doing the work in software allows for updates and bug fixes,
which are more likely to prevent piles of hardware from going
into the landfill (in some cases before they even reach
customers' hands).
dubbie99 wrote 18 hours 41 min ago:
The materials that go into a chip are nothing. The process of
making the chip is roughly the same no matter its power. So
having one chip that can satisfy a large range of customers'
needs is so much better than wasting development time making a
custom, just-good-enough chip for each.
AlecSchueler wrote 18 hours 32 min ago:
> The materials that go into a chip are nothing.
They really aren't. Every material that goes into every chip
needs to be sourced from various mines around the world,
shipped to factories to be assembled, then the end goods need
to be shipped again around the world to be sold or directly
dumped.
High power, low power, it all has negative environmental
impact.
cortesoft wrote 17 hours 21 min ago:
That doesn't contradict the point, though. The negative
impact on the environment is not reduced by making a less
powered chip.
AlecSchueler wrote 4 hours 25 min ago:
No, hence "all around."
direwolf20 wrote 18 hours 26 min ago:
Which materials are they and how would you suggest doing it
with fewer materials?
AlecSchueler wrote 4 hours 24 min ago:
Cease production.
cruffle_duffle wrote 39 min ago:
Why are you on a technology site?
AlecSchueler wrote 21 min ago:
I'm not sure why you're asking this or what you're
insinuating. The site is called Hacker News; it should
be open to anarcho- and eco-hackers too. Not all of us
believe in infinite growth.
Do you want to expand on why you're on this site?
I've been here for more than 15 years and I'm not the
person I was when I signed up or when I went through
life in a startup.
serf wrote 17 hours 47 min ago:
ultra pure water production itself is responsible for
untold amounts of hydrofluoric acid and ammonia, and most
etching processes have an F-gas involved, and most plants
that do this work have tremendously high energy (power)
costs due to stability/HVAC needs.
it's not 'just sand'.
direwolf20 wrote 17 hours 20 min ago:
How would you suggest doing it with fewer materials?
robotresearcher wrote 11 hours 16 min ago:
The claim was that "the materials that go into a chip
are nothing". Arguing that that is not the case does
not really put someone on the hook to explain or even
have any clue how to do it better.
stackghost wrote 18 hours 3 min ago:
In theory, graphene based semiconductors would eliminate a
lot of need for shipping and mining.
hansvm wrote 12 hours 39 min ago:
Maybe. They have the potential for faster semiconductors,
but only after adequate modifications. Graphene isn't a
semiconductor, and it isn't obvious that we'll find a way
to fix that without (or even with) rare resources.
cyberrock wrote 20 hours 51 min ago:
Also see: USB 3+ e-marker chips. I'm still waiting for a Doom port
on those.
lombasihir wrote 22 hours 23 min ago:
can we run doom on water pump?
KellyCriterion wrote 22 hours 37 min ago:
I'm waiting for the post "Doom ported to disposable Vape chip" :-D
primitivesuave wrote 21 hours 15 min ago:
The Puya PY32 series MCUs found in most vapes have 3 KB of RAM and
24 KB of ROM, whereas Doom requires at least 4 MB of RAM. Assuming
Moore's law also applies to the computing power inside a disposable
vape, we should be seeing that post in around a decade :)
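A back-of-envelope check of that timeline (my sketch, using the 3 KB and 4 MB figures from the comment above and assuming RAM capacity doubles at a fixed Moore's-law cadence):

```python
import math

# How many doublings take a vape MCU's 3 KB of RAM to Doom's 4 MB minimum?
vape_ram = 3 * 1024          # bytes of RAM in a PY32-class MCU
doom_ram = 4 * 1024 * 1024   # bytes Doom needs at minimum

doublings = math.ceil(math.log2(doom_ram / vape_ram))
print(doublings)  # 11 doublings

# "Around a decade" holds if capacity doubles roughly yearly;
# at the classic 18-24 month cadence it is closer to 16-22 years.
```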
DJBunnies wrote 22 hours 35 min ago:
(HTM) [1]: https://www.youtube.com/watch?v=rVsvtEj9iqE
KellyCriterion wrote 22 hours 12 min ago:
Good catch!
Though it misses my primary condition: "disposable" - ha! :-D
(this one is a refillable one, and it looks like he is streaming
the content from his PC?)
But a very cool link, thanks for sharing! :)
7777777phil wrote 22 hours 51 min ago:
List of Doom ports:
(HTM) [1]: https://en.wikipedia.org/wiki/List_of_Doom_ports
ryan-duve wrote 10 hours 51 min ago:
[666]
7777777phil wrote 5 hours 6 min ago:
this?
(HTM) [1]: https://news.ycombinator.com/item?id=40378759
omoikane wrote 18 hours 40 min ago:
Subreddit of Doom ports:
(HTM) [1]: https://www.reddit.com/r/itrunsdoom/
automatic6131 wrote 23 hours 10 min ago:
Do we have Doom on a USB-C plug microcontroller yet?
KellyCriterion wrote 22 hours 37 min ago:
Disposable Vape CPU!
TheCraiggers wrote 23 hours 13 min ago:
At first I thought you found a way to control/view the game
acoustically and I was very curious how that worked.
But, this probably makes more sense.
branon wrote 23 hours 36 min ago:
How are the PineBuds Pro, anyone have them? The Pine64 IRC network
doesn't have a channel for PineBuds discussion so I haven't had an easy
opportunity to ask.
yjftsjthsd-h wrote 17 hours 41 min ago:
Mine have been great. Full disclosure, I deliberately don't use
ANC... in fact, I may have installed firmware that doesn't have it.
So I can't comment on that. But just as Bluetooth earbuds, they do
their job.
TheCraiggers wrote 23 hours 10 min ago:
I'm also curious. I used to be a big supporter of Pine64 but the
e-ink tablet and phone debacles have kinda soured me on them.
utopiah wrote 23 hours 16 min ago:
Was using them just this morning. I've been using them since they
came out. Great device, but the battery is quite limited: ~2 hrs
tops with ANC on.
arin-s wrote 23 hours 22 min ago:
To be honest, I've never actually used them for their intended
purpose.
No idea what the comfort or audio quality is like.
There's a Pinebuds channel on the Pine64 discord, you can ask
questions there :)
jurakis wrote 1 day ago:
This is awesome! The number of devices Doom has not been run on
shrinks by the day haha
arin-s wrote 1 day ago:
Hi, I ported DOOM to the Pinebuds Pro earbuds.
It's accessible over the internet, so you can join the queue and play
DOOM on my earbuds from your PC!
More info as well as links to the github repos can be found on the
site.
RandomTeaParty wrote 18 hours 41 min ago:
What compression ratio does your jpeg encoding achieve?
arin-s wrote 18 hours 2 min ago:
It ranges from 5.8:1 to 4.7:1 depending on scene complexity.
Keep in mind I calculated these values using the 8-bit
palette-based framebuffer that DOOM uses, not a 24-bit one that a
regular RGB buffer would use.
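A quick sanity check of what those ratios mean in bytes, assuming Doom's standard 320x200, 8-bit palettized framebuffer (the resolution is my assumption; the ratios are from the comment above):

```python
# One byte per pixel in Doom's palettized framebuffer.
FRAME_BYTES = 320 * 200  # 64000 bytes per raw frame

def compression_ratio(jpeg_bytes: int) -> float:
    """Ratio of raw palettized frame size to encoded JPEG size."""
    return FRAME_BYTES / jpeg_bytes

# A 5.8:1 ratio implies a JPEG of roughly 11 KB per frame:
print(round(FRAME_BYTES / 5.8))  # 11034 bytes
```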
(DIR) <- back to front page