

Sure, but you can do better for $3. Smaller, more powerful, similar power consumption


Sorry, it just seemed like too good an opportunity to try and get a publishable result of the day 🤣


Dependence and addiction are different things. I find the story of mathematician Paul Erdős’s bet on stopping amphetamines illustrates it best: he had no problem stopping, but said of the experience:
“You’ve showed me I’m not an addict. But I didn’t get any work done. I’d get up in the morning and stare at a blank piece of paper. I’d have no ideas, just like an ordinary person. You’ve set mathematics back a month.”
If you’re addicted to stimulants, that’s a health problem in its own right and has both poor sensitivity and poor specificity for ADHD. If someone takes stimulants and forms dependence but not addiction, that is somewhat sensitive for ADHD but has terrible specificity.


Takeaway option one: Society does a terrible job accommodating individuals who are any sort of neurodivergent, and many laws, especially those around drug use and possession, don’t actually improve society and instead primarily harm those who are most vulnerable. The takeaway here being that the laws/society are the problem that needs to be revised.
Takeaway option two: Neurodivergence = criminal, and anyone with a diagnosis of ADHD/Autism/etc. should be placed under enhanced supervision as they are likely to do crime in the future. Those people are the problem and preemptive detention is the only possible solution to prevent them crime-ing all over the place.
Take a guess which direction the UK is going to go with this?

when its values were literally opposed to all of these.
That’s only true if you are viewing US history from the perspective of the white colonizers.
Ultranationalism isn’t always about race. The 1930s US was waaaayyy more into using race as a signifier for nationalism than the modern US, albeit I would argue the modern US is more nationalistic overall, though the 1930s is when those seeds were being planted.
Also, you want to talk about repressive and totalitarian? 1850-1950 was the century of Jim Crow, company towns, robber barons, indigenous genocide, concentration camps, the red scare, yellow journalism, etc.
Nazi Germany was just as “opposed” to all of your points so long as you were Von Germanic Protestant.
You are intentionally misrepresenting history and whitewashing the US and engaging in the exact ultranationalism you claim to be speaking against. Shut your keyboard.
Highway of Tears: there have been several leads and several serial killers caught. The original list in 1980 included Larry Vu, Eric Charles Coss, and Phillip Innes Fraser, but they were later removed after the “Highway of Tears” designation to focus exclusively on First Nations women.
The lack of males on the list is due primarily to the categorization, not to a lack of victims.


Safe from theft, vandalism and harassment? Mostly yes! Safe from police looking to solve homelessness via criminalization? No.


It wasn’t originally my claim
Sorry, I wasn’t paying attention and missed that. I apologize.
loads of modern computers don’t use DDR5 or ECC variants of older generations at all, so don’t have any error-correcting memory. If the wrong bit flips, they just crash.
Integrated memory ECC isn’t the only check, it’s an extra redundancy. The point of that paper was to show how often single bit errors occur within one part of a computer system.
memory errors are really rare
Right, because of redundancies. It takes 2 simultaneous bit flips within the same ECC word to cause an uncorrectable memory error, and even then it’s still only a ~10% chance annually per device according to the paper I cited.
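For intuition, here’s a rough back-of-envelope sketch of why correcting single-bit flips makes uncorrectable errors so much rarer than raw bit flips. The flip rate and capacity below are made-up placeholders for illustration, not numbers from the paper:

```python
# Illustrative only: how single-error correction turns a high raw bit-flip
# rate into a rare uncorrectable-error rate. The rates here are placeholders,
# NOT figures from the paper cited above.
from math import comb

bits_per_word = 72                         # typical ECC word: 64 data + 8 check bits
p_flip = 1e-9                              # assumed per-bit flip chance per scrub interval
words_per_device = (16 * 2**30 * 8) // 64  # words in 16 GiB of data

# SECDED corrects any single flipped bit; a word only becomes uncorrectable
# when 2 or more of its bits flip within the same interval.
p_word_bad = sum(
    comb(bits_per_word, k) * p_flip**k * (1 - p_flip)**(bits_per_word - k)
    for k in range(2, bits_per_word + 1)
)

p_device_bad = 1 - (1 - p_word_bad) ** words_per_device
print(f"P(raw flip somewhere)         ≈ {1 - (1 - p_flip)**(words_per_device * bits_per_word):.3e}")
print(f"P(uncorrectable, with SECDED) ≈ {p_device_bad:.3e}")
```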


it’s talking about machines with error correcting RAM, which most consumer devices don’t have.
It’s a paper from 2009 talking about “commodity servers” with ECC protection. Even back then it was fairly common and relatively cheap to implement, though it was more often integrated into the CPU and/or memory controller. Since DDR5 launched in 2020, on-die ECC has been mandatory in the memory chips themselves as well.
gives figures around 10% for the chance of an individual device experiencing an unrecoverable error per year, which isn’t really that often
Yes, that’s my point. Your claim of “computers have nearly no redundancy” is complete bullshit.
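For anyone curious what that integrated ECC actually does, here’s a minimal sketch of the classic SECDED scheme (single-error-correct, double-error-detect) commonly used in ECC memory, shown on a toy 4-bit word with an extended Hamming(8,4) code instead of the real Hamming(72,64). My own illustration, not code from the paper:

```python
# Minimal SECDED sketch: single-bit errors are silently corrected,
# double-bit errors are detected but not correctable.

def encode(data4):
    """data4: list of 4 bits -> 8-bit codeword [p0, p1, p2, d1, p4, d2, d3, d4]."""
    d1, d2, d3, d4 = data4
    p1 = d1 ^ d2 ^ d4              # covers positions 3, 5, 7
    p2 = d1 ^ d3 ^ d4              # covers positions 3, 6, 7
    p4 = d2 ^ d3 ^ d4              # covers positions 5, 6, 7
    word7 = [p1, p2, d1, p4, d2, d3, d4]   # positions 1..7
    p0 = 0
    for b in word7:                # overall parity over the whole codeword
        p0 ^= b
    return [p0] + word7

def decode(code8):
    """Return (data4, status) where status is 'ok', 'corrected' or 'uncorrectable'."""
    p0, *word7 = code8
    # Syndrome: which parity checks fail; points at the flipped position (1..7).
    s1 = word7[0] ^ word7[2] ^ word7[4] ^ word7[6]
    s2 = word7[1] ^ word7[2] ^ word7[5] ^ word7[6]
    s4 = word7[3] ^ word7[4] ^ word7[5] ^ word7[6]
    syndrome = s1 + 2 * s2 + 4 * s4
    overall = p0
    for b in word7:
        overall ^= b               # 0 if total parity is still even

    if syndrome == 0 and overall == 0:
        status = "ok"
    elif overall == 1:             # odd total parity -> exactly one flip, fixable
        if syndrome != 0:
            word7[syndrome - 1] ^= 1
        status = "corrected"
    else:                          # even parity but nonzero syndrome -> two flips
        status = "uncorrectable"
    return [word7[2], word7[4], word7[5], word7[6]], status

cw = encode([1, 0, 1, 1])
cw[5] ^= 1                         # single bit flip: silently repaired
print(decode(cw))                  # -> ([1, 0, 1, 1], 'corrected')
cw[2] ^= 1                         # second flip in the same word: detected, not repaired
print(decode(cw))                  # -> (..., 'uncorrectable')
```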


I think you are both overestimating the ability of biological systems and underestimating the ability of mechanical systems to be repaired.
Biological systems have incredible self-repair capabilities, but are otherwise largely unrepairable. To fix issues with biological systems you mostly have to work within the bounds of those self-repair mechanisms which are slow, poorly understood and rather limited.
Losing a few skin cells is perfectly normal. Corrupting a few skin cells can cause cancers or autoimmune disorders. Losing a few Purkinje cells can lead to significant motor impairment and death.
Computers, and mechanical systems in general, can have a shit ton of redundancy. You mention ECC, but neglected to mention the layers of error correction, BIST, and redundancy that even the cheap, broken, cost-optimized, planned-obsolescence consumer crap most people are familiar with makes heavy use of.
A single bit flipped by a gamma ray will not cause any sort of issue in any modern computer. I cannot overstate how often this and other memory errors happen. A double bit flip can cause issues in a poorly designed system and, again, is not just caused by cosmic rays. However, it’s not usually that hard to add multiple redundancies if that is a concern, such as for high-altitude, extreme-environment, or highly miniaturized objects. It does increase cost and complexity though.
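As a toy illustration of that kind of added redundancy (my own sketch, not something specific from above): triple modular redundancy, a standard technique in aerospace and other high-radiation designs, where three copies of a computation run and a majority vote masks a fault in any single copy:

```python
# Toy sketch of triple modular redundancy (TMR): run three copies and take a
# majority vote, so a fault in any single copy is masked. Real TMR uses three
# independent hardware units voting in lockstep; this just shows the idea.
import random
from collections import Counter

def tmr(compute, *args):
    """Return the majority result of three runs of `compute`."""
    results = [compute(*args) for _ in range(3)]
    value, votes = Counter(results).most_common(1)[0]
    if votes >= 2:
        return value
    raise RuntimeError("all three copies disagree: uncorrectable fault")

def flaky_add(a, b):
    """Adder that occasionally returns a corrupted result, standing in for a damaged unit."""
    result = a + b
    if random.random() < 0.05:
        result ^= 1 << random.randrange(8)  # pretend radiation flipped one bit
    return result

print(tmr(flaky_add, 2, 3))  # almost always 5, even when one copy misbehaves
```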
The huge benefit of mechanical systems is that they are fully explainable and replaceable. CPU got a bunch of radiation and seems to be acting a bit weird? Replace it! Motor burnt out? Replace it! The new system will be good as new or better.
You can’t do that in a biological system. Even with autografts (using the person’s own tissues for “replacements”) the risk of scarring, rejection and malignancy remains fairly high, and the result isn’t “good as new” but somewhere between ‘death’ and ‘minor permanent injury’. Allografts (donor tissues) often need lifelong medications and maintenance to not fail, and even “minor” transplants carry the risk of infection, necrosis and death.


And don’t forget cheaper!
Which is why it’s imperative that little Timmy is sent to the mines despite all the risks and occupational health hazards that will eventually kill him.


Microsoft when Bing first came out was literally like “it is highly recommended that everyone here use Bing Search”.


A lot of opioids are fairly closely related; Oxycodone and Oxymorphone can be used as precursors to make Naloxone and vice versa. I recall the Four Thieves Vinegar Collective doing a similar trick, though I don’t remember what the actual precursor was.


EssilorLuxottica might be “unknown”, but it is the eyeglass maker with a functional monopoly on the industry and the parent company of many better-known ‘companies’ such as: Ray-Ban, Oakley, Persol, Oliver Peoples, Vogue Eyewear, LensCrafters, Pearle Vision, Sunglass Hut, EyeMed, etc.


The implication is that the variables are conflated and the conclusions are overblown.
It should come as no surprise that acute trauma and injecting a foreign substance cause a relatively significant immunological response. The issue is that for the “chronic phase”, which is where the novelty of this research lies, the evidence shown is far from definitive compared to the story being told, and what results are shown aren’t overly significant.
Even if you believe the paper 100%, the conclusion is that the effect of getting tattooed is, arguably, similar to catching the flu once. However, the paper itself tries to obfuscate that so they have a more impactful result, and the marketing/outreach/media site that was linked here doubles down on it, trying to sell the story of “tattoos==illness and death”!!!


Oh honey… This is barely below average.


The full paper is here and, as usual, the effect is hardly anything and has been decontextualized in order to get a publishable result.
This one is so bad that it doesn’t use established baselines or do any form of statistical analysis on the results (a sketch of what even a minimal check could look like is below), instead opting for their own “baseline” measurements using very small sample sizes. It also plays a smoke-and-mirrors game where it shows a result for the short-term immunological response and then uses that to insinuate that the ‘slightly reduced but still likely well within the error of the poor control’ long-term effects are worth noting.
Other major flaws:
At best it’s a very poorly communicated and poorly designed experiment, but I suspect that’s due to result hunting.
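For reference, “any form of statistical analysis” could be as little as a two-sample significance test on the reported measurements. A minimal sketch, using made-up placeholder numbers rather than the paper’s data:

```python
# Hypothetical sketch of the minimum analysis the paper skipped: compare the
# "chronic phase" measurements against a control group with a significance
# test instead of eyeballing them against a tiny self-made baseline.
# The numbers below are made-up placeholders, NOT the paper's data.
from scipy import stats

tattooed = [3.1, 2.8, 3.5, 3.0, 2.9, 3.3]   # some immune marker, arbitrary units
control  = [2.9, 3.0, 2.7, 3.2, 2.8, 3.1]

t_stat, p_value = stats.ttest_ind(tattooed, control, equal_var=False)  # Welch's t-test
print(f"t = {t_stat:.2f}, p = {p_value:.3f}")
# With n=6 per group and this much overlap, p comes out large -- i.e. no
# evidence of a real difference, which is exactly the kind of check missing.
```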
Depends on when you count, but for the Iraq war specifically that’s about right for the official foreign coalition forces.
However, the Iraq war was just one front of the nearly dozen wars being fought as part of the “Global War on Terrorism” which NATO perpetrated: the short list being Afghanistan, Iraq, Libya, Pakistan, Somalia, Syria and Yemen, but arguably also including Cameroon, the Philippines, Kashmir, etc.
Sure, of the European countries only the UK and Poland officially deployed to Iraq, but they were actively collaborating with the rest of NATO in a much broader conflict, and don’t ask where the mercenaries and security consultants that were used heavily during those conflicts came from.
It’s not a misconception, it’s misdirection.
There’s also the issue that you need to be a deranged psychopath to get wealthy in the first place.
In what world have any of these “crashed and burned”?