Slurrrrrrrrrrrrrrrrrrrrrrrrrrrrp
-
An ESP32 probably doesn't have a BIOS to crash. My bet is a Raspberry Pi
They are RPis.
I saw this setup at my local 7/11
-
Having worked on ad machines before: a lot of them connect to an external FTP site to pull down the latest version of the logo. With things like this, nobody cares whether it's secure or not
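For illustration, that pull-down can be as small as this sketch (the host and path are hypothetical; the helper takes an already-connected session so the transfer logic stands on its own):

```python
from ftplib import FTP
from io import BytesIO

def fetch_logo(ftp, remote_path):
    """Pull one logo file over an already-connected FTP session."""
    buf = BytesIO()
    ftp.retrbinary(f"RETR {remote_path}", buf.write)  # plain FTP, no TLS
    return buf.getvalue()

if __name__ == "__main__":
    # Hypothetical server and path -- note there's no authentication or
    # integrity check anywhere, which is exactly the point above.
    with FTP("ftp.example.com") as ftp:
        ftp.login()  # anonymous login
        data = fetch_logo(ftp, "logos/mtndew.png")
```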
Until it displays porn
-
Kind of discriminatory, what about the right??
We're not excluding that, it's between the left testicle and the potato
-
Until it displays porn
Then they get to watch porn while filling up their slurpee. Win win to me.
-
This implies every drink and its display is handled by its own computer running linux. Potentially mtndew has a different IP than coca cola.
Internet of Slurp
-
Exactly. This implementation makes no sense. Unless the logos are animated, need to change frequently, or are supposed to show advertising (I hope not), a backlit plastic label would do the same job just fine. In fact, that has done the same job for decades at this point.
This implementation makes a lot of sense if you think about the ability to support a variable number of screens without the need for complex routing and addressing.
It also has increased reliability, where one failure doesn't break the whole system.
As for the need for it - well, it's "slurp": they're trying to sell cold sugar to impulsive people who like flashy things. That implies animations on the screen and being "not boring".
The fact that they changed to screens by itself means the backlit plastic label was doing a poorer job than this abomination.
-
Kind of discriminatory, what about the right??
Dog's right testicles are running Java
-
They are RPis.
I saw this setup at my local 7/11
Reminds me of a Rug Doctor rental machine I saw that was proudly displaying the default Raspberry Pi OS background and a login prompt
-
This post did not contain any content.
Is it just me who feels that having one processing unit per display is a waste?
I mean, I get it why they did it (it's way easier to just have one SBC per-display, both on the hardware and the software sides), but if designing such a system I would still try to come up with a single board solution if only because waste gets on my nerves.
-
Man it's so crazy how many small computers are around us. Just a few years ago that would have been a plastic label they swapped out when needed.
The difference in what can be done, and the amount of work that needs to go into it, between discrete digital electronics and just having a microcontroller or even microprocessor there is HUGE.
Also, with microcontrollers and microprocessors most of the work moves from the Electronics Engineering and circuit-design space to Software Engineering and software development, and the latter experts are easier to find. Plus the development cycle is way more friendly when it's just code you can change and upload at will, rather than physical circuits where simulation can only go so far before you have to actually create the physical hardware.
Even more entertaining, microcontrollers are so stupidly cheap (the most basic ones cost a few cents) that throwing in a microcontroller is almost always significantly cheaper than doing the control stuff with discrete electronics.
(For example, a screen that size can be controlled by an ESP32, which costs maybe $1 or $2 if you embed it in your circuit yourself, though that wouldn't be running Linux and programming it would be much more low-level; it's probably the cheapest you can go.)
I actually got an EE degree back when embedded circuits were just starting to be used, so I didn't really get taught how to use them, then went for a career in software instead of electronics and came back to digital electronics years later, and it's like night and day between the discrete digital electronics age and the everything-is-a-computing-device era.
-
Is it just me who feels that having one processing unit per display is a waste?
I mean, I get it why they did it (it's way easier to just have one SBC per-display, both on the hardware and the software sides), but if designing such a system I would still try to come up with a single board solution if only because waste gets on my nerves.
You'd think a damn sticker would be good enough
-
My slurped machine needs an HD upgrade? But I just upgraded it dammit!
Anecdotally, a friend had a bunch of Raspberry Pis running inside specific devices, running hot; the SD cards would eventually fail.
Started properly venting and cooling the Pis... the SD cards stopped failing (they didn't have to be MilitaryGrade either).
-
The difference in what can be done, and the amount of work that needs to go into it, between discrete digital electronics and just having a microcontroller or even microprocessor there is HUGE.
Also, with microcontrollers and microprocessors most of the work moves from the Electronics Engineering and circuit-design space to Software Engineering and software development, and the latter experts are easier to find. Plus the development cycle is way more friendly when it's just code you can change and upload at will, rather than physical circuits where simulation can only go so far before you have to actually create the physical hardware.
Even more entertaining, microcontrollers are so stupidly cheap (the most basic ones cost a few cents) that throwing in a microcontroller is almost always significantly cheaper than doing the control stuff with discrete electronics.
(For example, a screen that size can be controlled by an ESP32, which costs maybe $1 or $2 if you embed it in your circuit yourself, though that wouldn't be running Linux and programming it would be much more low-level; it's probably the cheapest you can go.)
I actually got an EE degree back when embedded circuits were just starting to be used, so I didn't really get taught how to use them, then went for a career in software instead of electronics and came back to digital electronics years later, and it's like night and day between the discrete digital electronics age and the everything-is-a-computing-device era.
You're forgetting the main driving factor behind being able to personalize a screen vs a plastic label: advertising.
-
Exactly. This implementation makes no sense. Unless the logos are animated, need to change frequently, or are supposed to show advertising (I hope not), a backlit plastic label would do the same job just fine. In fact, that has done the same job for decades at this point.
supposed to show advertising
I'm betting on this.
-
supposed to show advertising
I'm betting on this.
**sharp exhale** You're probably right. It's just like the gas pumps. A big soda cup takes a few seconds to fill up, and the system knows that's when you're holding the button down, staring at the tap. All that makes you an advertising target for the duration.
Is there some version of Occam's Razor where "enshitification" is the most likely answer?
-
Kind of discriminatory, what about the right??
That one runs BSD.
-
Dog's right testicles are running Java
So that's why it keeps swelling and needs constant purging of all that pus.
-
I've never seen one of these, but I assume it performs other functions - surely monitoring sensors, probably reporting that data, maybe allowing triggering maintenance functions, etc.
That said, processing and storage is so cheap on this scale that it's probably better (and cheaper) to go with a tried and true, widely supported system, than it is to optimize with custom hardware/firmware.
I assume it performs other functions
Advertising.
-
Idk, if they're plugging in one for each screen it sounds like a lot; there are libraries to do most of this. It would only take me about a month, or someone competent a couple of days, to write this. I know there are libraries to display, but I don't know what else this is monitoring/controlling. So that seems safe.
So there's a computer hardware cost that goes from ~$5x(4?) per machine to ~$45x4 per machine. That's about a 2-hours-of-code-per-machine difference, assuming you were paying ~$80/hr to write it, which is reasonable.
Even assuming no code was needed for the Pi, production takes twice as long as expected, and electricity costs don't matter (which, next to the condenser, they may not), you break even at ~16 machines. 20 if you want to throw in some other random arbitrary cost.
Even if you assume Pi Zeros at, what, $20 each? You still break even before 100 units.
So it would take less than a hundred machines for smaller chips to pay off. I'd believe an exec didn't (or didn't even ask someone else to) do this math, but how long have Pis had multiple video outs?
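The back-of-the-envelope math above can be sketched out like this (every number here is an assumption lifted from the comment, not an actual cost):

```python
import math

# Assumed numbers, roughly following the comment above: a cheap
# microcontroller at ~$5/screen vs a Pi at ~$45/screen, four screens per
# machine, and about a month (160 h) of custom firmware work at $80/hr.
MCU_COST, PI_COST = 5, 45      # hardware cost per screen, assumed
SCREENS = 4                    # screens per machine
DEV_HOURS, DEV_RATE = 160, 80  # one-off firmware development, assumed

def break_even_machines():
    """Machines needed before per-unit hardware savings repay the dev cost."""
    savings_per_machine = (PI_COST - MCU_COST) * SCREENS  # $160/machine
    dev_cost = DEV_HOURS * DEV_RATE                       # $12,800 one-off
    return math.ceil(dev_cost / savings_per_machine)

print(break_even_machines())  # 80 -> under a hundred machines, as above
```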
Idk, if they’re plugging in one for each screen it sounds like a lot; there are libraries to do most of this.
X11 can easily handle multiple screens.
Not sure about the Pi's limitations, but COTS SBCs can too.
-
I've never seen one of these, but I assume it performs other functions - surely monitoring sensors, probably reporting that data, maybe allowing triggering maintenance functions, etc.
That said, processing and storage is so cheap on this scale that it's probably better (and cheaper) to go with a tried and true, widely supported system, than it is to optimize with custom hardware/firmware.
I've seen a very similar print out when installing/loading Arch for the first time.