

That’s usually what I accomplish.
The alternator doesn’t like it though


Doctor offices used to charge a dollar per page to transfer records.
Circuit City used to charge a 25% restocking fee if you bought something and it wasn’t right.
Banks here used to charge a fee for writing too many checks, another fee for having not enough activity, or another fee for just having an account.
A family member of mine gets charged for sticking their own finger.
Hey, I’m not sure where you got your factor of 5 years, but it was a number I pulled out of my ass. At a repair depot I typically didn’t see drives that lived much longer than 17k hours (just under 2 years). That didn’t mean they always failed at that age, only that the systems that came through had at most about that much runtime on them.
Regarding the 136 vs 150 million numbers, those numbers are pure bullshit. MTBF is a raw calculation of how long it should take these devices to fail, based on total operational runtime divided by the number of failures experienced in the field. They most likely took a small number of warranty failures over a massive number of manufacturing runs and projected that it would take that long for about half their drives to fail.
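Back-of-the-envelope, that projection works roughly like the sketch below (the fleet size, runtime window, and failure count are made-up illustration numbers, not anything from the article or a vendor):

```python
# Illustration only: made-up fleet numbers, not vendor or Backblaze data.
fleet_size = 1_000_000      # drives shipped
hours_each = 2_000          # hours each drive ran during the warranty window
failures = 1_500            # warranty failures reported in that window

total_device_hours = fleet_size * hours_each
mtbf_hours = total_device_hours / failures

print(f"MTBF = {mtbf_hours:,.0f} hours (~{mtbf_hours / 8766:,.0f} years)")
# MTBF = 1,333,333 hours (~152 years), a lifetime no single drive will ever see.
```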
In reality, you will see failure spikes over the lifetime of a product. The initial failures spike and then drop off. I recall reading either the data behind this article or something similar when they realized that the bathtub curve may not be the full picture. They just updated it again with numbers through last year, and you can see that it would be difficult to project an average lifetime of 20 years, much less 150.
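If it helps to picture what “failures spike and drop off” means, the bathtub curve is just a hazard rate with a few rough components; here is a toy sketch with completely made-up constants:

```python
# Toy bathtub-curve hazard rate: constants are invented purely to show the shape.
import math

def annual_failure_rate(t_years: float) -> float:
    infant_mortality = 0.30 * math.exp(-t_years / 0.5)  # early failures die off quickly
    random_failures = 0.02                              # flat baseline during useful life
    wear_out = 0.01 * (t_years / 4.0) ** 3              # ramps up late in life
    return infant_mortality + random_failures + wear_out

for year in (0.1, 1, 2, 4, 6):
    print(f"year {year}: ~{annual_failure_rate(year):.1%}")
```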
My last thought on this is that when Backblaze mentions consumer vs enterprise drives they are possibly discussing SATA vs SAS. This comes from the realization that enterprise workstation drives are still just consumer drives with a part number label on them (seen in Dell and HP Enterprise equipment). Now, they could be referring to more expensive SATA drives, but I can’t imagine that they are using anything but SAS at this point in their lifecycle.
It’s the future we should all aspire to.


The optometrists over here charge an extra fee for lens fittings on top of the fee for an eye exam. I think it’s $80 for us after insurance.


There is a theory that Ghislaine Maxwell is/was a Reddit mod.


Depending on the company’s policies, you may be able to install FOSS software. Beyond the other comments so far, looking at a problem differently may provide different insight.
The downside is that you will likely miss out on integrations provided by the proprietary software. Also, your company’s IT may not appreciate your experimentation. I had one coworker dual-boot their computer into Ubuntu, which eventually broke the install. Shortly after that, the company began locking things down tightly.
We tried the cheapest stuff Amazon will sell and went back to Powerade Zero.
The next step is trying to build your own electrolyte drink, but that is a recipe for disaster.
Puns aside, I was concerned about giving myself an electrolyte imbalance. I asked the doctor about it and they told me to just eat food, weirdo (sass for dramatic emphasis, but I definitely got the look I give end users when they ask stupid questions).
This conversation got me super excited to try it, only to see that it costs 3x as much as Powerade Zero packets. Since we easily chew through a 10-pack of packets a week, I can’t bring myself to have her try it.


Stupid BAM Broadcom BAM Legacy BAM Wireless BAM Drivers BAM
I just read that recently. Let me see if I can run that source back down.
Edit: All-in-One CompTIA Server+ Certification Exam Guide, Second Edition (Exam SK0-005), Daniel LaChance, McGraw-Hill, 2021, page 138. The table there says that SATA is not designed for constant use.
Edit 2:
https://www.hp.com/us-en/shop/tech-takes/sas-vs-sata
Reliability:
> SAS: Designed for 24/7 operation with higher mean time between failures (MTBF), often 1.6 million hours or more
> SATA: Suitable for regular use but not as robust as SAS for constant, heavy workloads, with MTBF typically around 1.2 million hours
They are saying that SAS is a better option with a longer MTBF, but I don’t expect my drives to last 5 years, much less 136.
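For reference, those “years” come straight from dividing the quoted MTBF hours by the hours in a year; a quick sanity check using the figures from the HP page above:

```python
# Converting the quoted MTBF figures to years -- this is where the
# absurd-sounding 130+ year lifetimes come from.
HOURS_PER_YEAR = 24 * 365.25  # ~8766

for label, mtbf_hours in (("SATA", 1_200_000), ("SAS", 1_600_000)):
    print(f"{label}: {mtbf_hours:,} hours ≈ {mtbf_hours / HOURS_PER_YEAR:.0f} years")
# SATA: 1,200,000 hours ≈ 137 years
# SAS: 1,600,000 hours ≈ 183 years
```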
My own two cents here is that you probably don’t want to use SATA ZFS JBOD in an enterprise environment, but that’s more based on enterprise lifecycle management than utility.


I’m hugely done with the adverb hugely.



Here is my combination lab and workbench. I have been so busy trying to buy/sell/trade computers that I have fallen significantly behind on cleaning as I go. I also just got the network rack:

I haven’t had time between work, hustling, and home maintenance to finish managing the cabling or dealing with the NAS:

The goal is to get the NAS in the rack, UPS to the items in the rack, the 3D printer under the bench, and the monitors on the wall and off the bench. Then I’ll start in on plastic organizers for the bits and parts that clutter my bench.


I only ask because I work in an HP environment and run off a G3 SFF myself.


Is that an EliteDesk 800 G3, G4, or G5?


You could try a high-volume air pump, but those only seem to do about 5 psi.
If you only need a limited number of uses, you could try CO2 cartridges or use blanks to fill a smaller compression chamber.


Yikes! Was the entire rack plugged into a single UPS?
I have been toying with the idea of using USB storage, but my concern is that the controllers are not meant to be used that heavily. Supposedly SATA controllers are also not built for the abuse I have been throwing at them in my machines, and I don’t want to push it.
Phone book sized!
Way cooler than the Uline books I would get.