My HL2030 is incredibly easy to use from Linux
Because it’s a disproportionate amount of effort to natively support an extra OS (particularly one as fragmented as Linux), and doubly so when that OS has a small userbase that largely isn’t interested in proprietary cloud services in the first place because of data privacy and security concerns.
Obviously not all Linux users are super worried about that stuff (I mean, I use Linux and have a Google Pixel), but on average the Linux userbase is way, way more aware of it than most users, who just want their photos backed up without having to worry about it.
Yep, this is it. I volunteered for my school’s IT department in high school, this was basically the logic. The laptops are cheap and easy to manage/administrate. Whether or not they were Linux was a non-issue.
Edit: also, since ChromeOS is basically just a browser, there wasn’t much that could break, and if something did break, everything was stored in Google Drive anyway, so you could just factory-reset the device and hand it back to the student without needing to buy any kind of higher-level support contract.
the internet in general kind of sucks these days. Reddit has burned down a lot of the things that made its search results so useful in the past. Every forum post more than a few years old is a forest of broken links; the top of basically any internet search whatsoever is an ocean of SEO spam. And that’s before you get into the sheer amount of information that isn’t searchable at all because it’s on platforms like discord.
I’ll tell you why I haven’t deleted reddit – aside from tech-heavy discussion here (Linux, Reddit, tech generally, that sort of thing), there isn’t a fediverse equivalent to things like the sports or food subreddits I follow.
I agree, discussions on lemmy are higher-quality and friendlier, for sure. But for a lot of the things I use reddit for, equivalents just don’t really exist here yet.
The old ThinkPads that came with those self-repair manuals maybe were. But the new ones are more or less the same as most other modern laptops. I guess they don’t have soldered SSDs, which is good, but the Framework is definitely better for repairability.
SSDs are very cheap these days
The motherboard itself is also open-source: https://github.com/system76/virgo/
The X220 is quite easily the best laptop ever made imo, and I’ll never understand why they don’t just slap modern hardware into it and re-release it.
I use manjaro, but it isn’t what I would call stable.
Who proposed doing that?
Agreed. If verbal agreements and handshake deals can be legally binding contracts, I don’t see why emoji wouldn’t be.
I can tell you with confidence that DACs can only convert digital sound data into analogue, and that’s due to the audio jack being older than digital audio.
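For illustration, here’s a minimal sketch of the digital-to-analogue mapping a DAC performs. The 16-bit signed PCM format and the ±1 V output swing are my own illustrative assumptions, not any real device’s spec:

```python
# Conceptual sketch of a DAC: map discrete sample codes to analog
# voltage levels. 16-bit signed PCM and a +/-1.0 V swing are assumed
# for illustration only.

def pcm16_to_voltage(sample: int, v_ref: float = 1.0) -> float:
    """Map a signed 16-bit PCM sample to a voltage in [-v_ref, +v_ref]."""
    return (sample / 32768.0) * v_ref

# A few digital codes and the analog levels they would drive:
for code in (-32768, -16384, 0, 16384, 32767):
    print(f"{code:6d} -> {pcm16_to_voltage(code):+.4f} V")
```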
Right. But the principle is the same; hardware that isn’t compatible with pre-existing systems has a control circuit, and a digital interface. The digital computer sends instructions to the controller, and the controller carries out the instructions.
An analogue device isn’t directly compatible with a digital device, much like how digital sound data (songs, audio tracks in videos, system sounds, etc…) and analogue audio aren’t directly interchangeable.
Correct. That is why there is dedicated control circuitry designed for making analog and digital systems talk to each other – as there will be for optical analog computers and every other type of non-conventional computing system.
It’s true that conventional systems will not, by default, be able to communicate with analog computers like this one. To control them, you will send the question (instructions) to the control circuitry, which does the calculation on the hardware, and returns an answer. That’s true for DACs, it’s true for FPGAs, it’s true for CPUs, it’s true for ASICs.
Every temperature sensor, fan controller, camera, microphone, and monitor is also doing some sort of conversion between digital and analog signals. The light being emitted by the monitor to your eyes is a physical phenomenon that can be measured as an analog value (by taking a picture of your computer monitor on film, say). How does your monitor produce this analog signal? It has a control circuit that can take digital commands and convert them into light in specific patterns.
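To make the sensor side concrete, here’s a sketch of the opposite direction (analog to digital), roughly how a driver might interpret a raw ADC reading from a temperature sensor. The bit depth, reference voltage, and scale factor are made-up example numbers, not any specific part’s datasheet:

```python
# Analog -> digital sketch: a sensor outputs a voltage, an ADC
# quantizes it to a count, and software converts the count back
# into a meaningful number. All parameters below are assumed.

ADC_BITS = 10        # assumed 10-bit ADC
V_REF = 3.3          # assumed ADC reference voltage
MV_PER_DEG_C = 10.0  # assumed sensor scale: 10 mV per degree C

def adc_count_to_celsius(count: int) -> float:
    volts = (count / (2**ADC_BITS - 1)) * V_REF
    return (volts * 1000.0) / MV_PER_DEG_C

print(adc_count_to_celsius(77))  # raw count of 77 -> ~24.8 C
```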
Using an analogue device to accelerate something requires at least some information to be lost in translation, even if the file size is stupidly large.
I don’t think you’ve understood what analog computers are used for (actually, I’m not sure that you’ve understood what analog computing even really is beyond that it involves analog electrical signals). Analog computers aren’t arbitrarily precise like digital computers are in the first place, because they are performing the computation with physical values – voltage, current, light color, light intensity – that are subject to interference from physical phenomena – resistance, attenuation, redshift, inertia. In other words, you’re really worried about losing information that doesn’t exist in a reliable/repeatable way in the first place.
A lot of iterative numerical methods need an initial guess and can be iterated to an arbitrary degree of precision. Analog computers are usually used to provide the initial guess to save iteration FLOPs. The resolution just isn’t that important when you’re only trying to get into the ballpark in the first place.
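As a toy illustration of that division of labor, here’s Newton’s method for a square root, run once from a cold start and once from a ballpark guess standing in for an analog computer’s low-precision output. The example is my own, not from any real hybrid system:

```python
# A coarse initial guess is still valuable: Newton's method refines
# it to full precision, and starting in the right ballpark cuts the
# iteration count. The "analog guess" is a low-precision stand-in.

def newton_sqrt(target: float, guess: float, tol: float = 1e-9):
    x, steps = guess, 0
    while abs(x * x - target) > tol:
        x = 0.5 * (x + target / x)  # Newton step for f(x) = x^2 - target
        steps += 1
    return x, steps

print(newton_sqrt(12345.0, 1.0))    # cold start: many iterations
print(newton_sqrt(12345.0, 100.0))  # ballpark guess: only a few
```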
In other words, this computer is designed to solve optimization problems. Say you’re getting results based on the color and intensity of the light coming out of it, the way you might read tide predictions off electrical voltages on an old desktop analog computer. It’s not that relevant to get exact values for every millisecond at a sampling rate of a bajillion kilohertz; you’re looking for a ballpark value, not false precision.
So if you were designing an expansion card, you would design a controller that can modulate the color and intensity of the light going in, and modulate the filter weights in the matrix. Then you can send a digital instruction to “do the calculation with these values of light and these filter values”. The controller would read those values, set up the light sources and matrix, turn on the light, read the camera sensors at the back, and tell you what the cameras are seeing. Voila, you’re digitally controlling an analog computer.
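Here’s a hypothetical sketch of that control flow, with the “optics” simulated in plain arithmetic so the digital-side protocol is what’s on display; none of these names correspond to a real device or driver API:

```python
# Invented-for-illustration controller: program the inputs, program
# the filter matrix, fire the light, read the sensors. The "optics"
# here are simulated with a plain matrix-vector multiply.

class OpticalCardController:
    def __init__(self):
        self._vector, self._matrix, self._result = None, None, None

    def set_light_sources(self, intensities):  # program input light levels
        self._vector = list(intensities)

    def set_filter_weights(self, matrix):      # program the filter matrix
        self._matrix = [list(row) for row in matrix]

    def run(self):                             # "turn on the light"
        self._result = [
            sum(w * x for w, x in zip(row, self._vector))
            for row in self._matrix
        ]

    def read_cameras(self):                    # digitize what the sensors see
        return self._result

card = OpticalCardController()
card.set_light_sources([1.0, 2.0])
card.set_filter_weights([[0.5, 0.5], [1.0, -1.0]])
card.run()
print(card.read_cameras())  # [1.5, -1.0]
```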
I would use one of the tools listed in the Arch Wiki; I have an Intel chip, so I’ve never used any myself.
Once you find a tool that can undervolt, the usual recommendation is to lower the voltage incrementally until you see unstable behavior and crashes, then raise it back to the last good voltage and run a stress test to verify.
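Here’s the shape of that procedure as a sketch; set_offset_mv and passes_stress_test are placeholders for whatever your undervolting tool and stress test actually are, and no real tool’s API is implied:

```python
# Placeholder hooks: wire these to your actual undervolting tool
# and stress test. Deliberately unimplemented here.

def set_offset_mv(mv: int) -> None:
    raise NotImplementedError("call your undervolting tool here")

def passes_stress_test() -> bool:
    raise NotImplementedError("run your stress test here")

def find_stable_undervolt(step_mv: int = 5, limit_mv: int = 150) -> int:
    """Step the undervolt down until instability, then back off one step."""
    offset = 0
    while offset + step_mv <= limit_mv:
        set_offset_mv(-(offset + step_mv))
        if not passes_stress_test():
            break              # too far; previous offset was the last good one
        offset += step_mv
    set_offset_mv(-offset)     # settle on the last known-good offset
    return offset
```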
just the readme for throttled
it would be the same way expansion cards work now; it would have digital control circuitry that can communicate with the analog circuitry.
We already have expansion cards that can do this. Audio cards are an example of an expansion card that converts between digital and analog signals.
The same goes for things like graphics cards, ASICs, or FPGAs; it’s not a different type of signal, but it is an architecture that isn’t compatible with the rest of the computer because it’s specialized for a certain purpose. So there’s control circuitry that lets it talk to the rest of the system, and a driver on the computer that tells it how to.
I’d recommend it to everyone. I don’t use it every day, but there are a million and one ways to brew with it, it’s very handy for traveling, and it’s super easy.
I use it particularly for when I’m at the end of a bag of coffee and don’t have enough left to do a French Press or a pour-over – I have a couple of Aeropress recipes that use 10-12 grams.
that’s a great tip, thanks for posting
I kind of wish I had played with ROMs and stuff earlier. I still like the idea, but I don’t use it because I use mobile payments so much that it would be a PITA not to have that working.
Yeah, I think that’s what she’s complaining about