https://www.sony.jp/products/overseas/contents/support/infor...
Also, these TVs are apparently fire hazards. It doesn't matter that they're 20 years old (at the point of the "recall" in 2010).
I doubt the parts necessary to fix them were out of production; you can get parts for truly ancient electronics still. Things like capacitors don't become obsolete. The recall doesn't specify exactly which component is problematic, but says it's age-related, which usually points to capacitors.
Especially if customers accepting shorter lifetimes allowed companies to lower prices.
Moreover, we're talking about televisions and old Macs. TVs with higher resolutions might come out, but lower resolution ones continue to be sold new (implying demand exists at some price), and then why should anybody want to replace a functioning old TV with a newer one of the same resolution?
Much older computers continue to be used because they run software that newer computers can't without emulation (which often introduces bugs), or because they have older physical interfaces compatible with other, often extremely expensive, older hardware.
If people actually wanted to replace their hardware instead of fixing it then they'd not be complaining about the inability to fix it.
It depends. Older computers usually guzzle power, especially if you look at the absolutely awful Pentium 4 systems. You're probably better off getting a RasPi or something, depending on what exactly you're trying to do. Newer systems have gotten much better with energy efficiency, so they'll pay for themselves quickly through lower electricity bills.
>TVs with higher resolutions might come out, but lower resolution ones continue to be sold new (implying demand exists at some price)
We're already seeing a limit here. 8k TVs are here now, but not very popular. There's almost no media in that resolution, and people can't tell the difference from 4k.
For a while, this wasn't the case: people were upgrading from 480 to 720 to 1080 and now to 4k.
>and then why should anybody want to replace a functioning old TV with a newer one of the same resolution?
They probably don't; if they're upgrading, they're getting a higher resolution (lots of 1080 screens still out there), or they're getting a bigger screen. It's possible they might want newer smart TV features too: older sets probably have support dropped and don't support the latest streaming services, though usually you can just get an add-on device that plugs into the HDMI port so this is probably less of a factor.
https://en.wikipedia.org/wiki/List_of_Intel_Pentium_4_proces...
The Northwood chips were 50 to 70 W. HT chips and later Prescott chips were more like 80 to 90 W. Even the highest chips I see on the page are only 115 W.
But modern chips can use way more power than Pentium 4 chips:
https://en.wikipedia.org/wiki/Raptor_Lake
The i5-14600K has a base TDP of 125 W and turbo TDP of 181 W, and the high-end i9-14900KS is 150 W base/253 W turbo. For example, when encoding video, the mid-range 14600K pulls 146 W: https://www.tomshardware.com/news/intel-core-i9-14900k-cpu-r...
More recent processors can do more with the same power than older processors, but I think for the most part that doesn't matter. Most people don't keep their processor at 100% usage very often anyway.
Even many Pentium 4-based systems would idle around 30 watts and peak at a little over 100, which is on par with a lot of modern desktops, and there were lower and higher power systems both then and now. The top end Pentium 4 had a TDP of 115W vs. 170W for the current top end Ryzen 9000 and even worse for current Intel. Midrange then and now was ~65W. Also, the Pentium 4 is twenty two years old.
And the Pentium 4 in particular was an atypically inefficient CPU. The contemporaneous Pentium M was so much better that Intel soon after dumped the P4 in favor of a desktop CPU based on that (Core 2 Duo).
Moreover, you're not going to be worried about electric bills for older phones or tablets with <5W CPUs, so why do those go out of support so fast? There are plenty of people whose most demanding mobile workload is GPS navigation, which has been available since before the turn of the century and widely available for nearly two decades.
> For a while, this wasn't the case: people were upgrading from 480 to 720 to 1080 and now to 4k.
Some people. Plenty of others who don't even care about 4k, and then why would they want to needlessly replace their existing TV?
> They probably don't; if they're upgrading, they're getting a higher resolution (lots of 1080 screens still out there), or they're getting a bigger screen.
That's the point. 1080p TVs and even some 720p TVs are still sold new, so anyone buying one isn't upgrading and has no real reason to want to replace their existing device unless it e.g. has a design flaw that causes it to catch fire. In which case they should do a proper recall.
You can't compare CPUs based on TDP; it's an almost entirely useless measurement. The only thing it's good for is making sure you have a sufficient heatsink and cooling system, because it tells you only the peak power consumption of the chip. No one runs their CPUs flat-out all the time unless it's some kind of data center or something; we're talking about PCs here.
What's important is idle CPU power consumption, and that's significantly better these days.
>older phones or tablets with <5W CPUs, so why do those go out of support so fast?
That's an entirely different situation because of the closed and vendor-controlled nature of those systems. They're not PCs; they're basically appliances. It's a shitty situation, but there's not much people can do about it, though many have tried (CyanogenMod, GrapheneOS, etc.).
>Plenty of others who don't even care about 4k
Not everyone cares about 4k, it's true (personally I like it but it's not that much better than 1080p). But if you can't tell the difference between 1080p and an NTSC TV, you're blind.
>1080p TVs and even some 720p TVs are still sold new
Yes, as I said before, we're seeing diminishing returns. (Or should I say "diminishing discernible improvements"?)
Also, the 720p stuff is only in relatively small screens. You're not going to find a 75" TV with 720p or even 1080p; those are all 4k. The low-res stuff is relegated to very small budget models where it's really pointless to have such high resolution.
But for certain video games on a large screen, I can definitely tell the difference between 1080p and 4k. Especially strategy games that present a lot of information.
Btw, as far as I can tell, modern screens use significantly less power, especially per unit of area, than the CRTs of old, even if that CRT is still perfectly functional.
It isn't. You can find both ancient and modern PCs that idle anywhere in the range from 10 to 30 watts, and pathological cases for both where the idle is >100W. Some of the newer ones can even get pretty close to zero, but the difference between zero and 30 watts for something you're leaving on eight hours a day at $0.25/kWh is ~$22/year. Which is less than the interest you'd get from sticking the $600 cost of a new PC in a 5% CD.
And many of the new ones are still 30 watts or more at idle.
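For anyone who wants to sanity-check that arithmetic, here's the back-of-the-envelope version in Python (the wattage, hours, electricity rate, and CD numbers are just the assumptions from the comment above, not measurements):

    # Idle-power cost of an old PC vs. interest on the price of a new one.
    # All inputs are assumptions from the comment above.
    idle_watts = 30          # extra idle draw of the old machine
    hours_per_day = 8
    rate_per_kwh = 0.25      # $/kWh

    kwh_per_year = idle_watts * hours_per_day * 365 / 1000
    print(f"extra electricity: ${kwh_per_year * rate_per_kwh:.2f}/year")  # ~$21.90

    cd_interest = 600 * 0.05  # $600 new-PC cost parked in a 5% CD
    print(f"CD interest:       ${cd_interest:.2f}/year")                  # $30.00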
> That's an entirely different situation because of the closed and vendor-controlled nature of those systems.
It's a worse situation, but if the complaint is that they abandon their customers long before the customer wants to stop using the device, they certainly match the criteria.
> But if you can't tell the difference between 1080p and an NTSC TV, you're blind.
Being able to discern a difference and caring about it are two different things. If your use for a TV is to watch the news and play 90s video games then the resolution of the talking heads doesn't matter and the classic games aren't in 1080p anyway.
> The low-res stuff is relegated to very small budget models where it's really pointless to have such high resolution.
Which is the point. If you have an old 30" TV and no space for a 72" TV, you do not need a new 30" TV.
For TVs specifically, the technology changed a lot. For a long time, everyone was stuck on the NTSC standard, which didn't change much. At first, everyone had B&W TVs, so once you had one, there was no reason to change. Then color TV came out, so suddenly people wanted those. After that, again no reason to change for a long time. Later, they got remote controls, so sometimes people would want one of those, or maybe a bigger screen, but generally a working color TV was good enough. Because TVs were glass CRTs, bigger screens cost a lot more than smaller ones, and there wasn't much change in cost here for a long time.
Then HDTV came out and now people wanted those, first in 720p, and later in 1080i/p. And flat screens came too, so people wanted those as well. So in a relatively short amount of time, people went from old-style NTSC CRTs to seeing rapid improvements in resolution (480p->720p->1080p->4k), screen size (going from ~20" to 3x", 4x", 5x", 6x", now up to 85"), and also display/color quality (LCD, plasma, QLED, OLED), so there were valid reasons to upgrade. The media quality (I hate the word "content") changed too, with programs being shot in HD, and lately 4k/HDR, so the difference was quite noticeable to viewers.
Before long, the improvements are going to slow or stop. They already have 8k screens, but no one buys them because there's no media for them and they can't really see the difference from 4k. Even 1080p media looks great on a 4k screen with upscaling, and not that much different from 4k. The human eye is only capable of so much, so we're seeing diminishing returns.
So I predict that this rapid upgrade cycle might be slowing, and probably stopping before long with the coming economic crash and Great Depression of 2025. The main driver of new TV sales will be people's old TVs dying from component failure.
Or not seeing diminishing returns. Which is the point.
Televisions improved over time:
- screens got flatter
- screens got larger
- image quality improved
- image contrast increased (people used to close their curtains to watch TV)
- televisions got preset channels
Last year's model only does 4k; my eyes need 8k
edit: Anywhere between 208K and 277K.
However that doesn't imply TVs were that reliable.
Before the 90s, TV repairman was a regular job, and TVs often needed occasional expensive servicing. I remember a local TV repair place in the 90s which serviced "old" TVs.
A modern TV may have an expected lifespan of five years. TVs from several decades ago had lifespans of... several decades. Quality has plummeted in that market.
A rock lasts billions of years, but its quality as a TV is rather questionable.
There's nothing inside today's monitors or TVs that can't run for at least 10 years. Our main TV, 42" 720p LCD, is from 2008, and I have monitors that are just as old.
It comes down to people not knowing a lot about it - and I'm not blaming anyone for that; we all have our interests, and most people have more than enough to do already without worrying about what goes on inside their stuff.
Also, electronics are, to a lot of people in a lot of places, so cheap that they would rather just curse a little and buy a new thing, instead of bothering with taking the thing to a shop. And of course a few hours of skilled labour in a big city in the west might also be almost as expensive as making a whole new TV in a factory in Asia plus shipping, so it might not even make economic sense.
In many/most places, these repair shops don't even exist any more, because the products have gotten too complicated/integrated/parts-unavailable, and the economics are nonsensical.
The 2nd most common failure mode has gotta be MLCC (multilayer ceramic capacitor) cracks/shorts.
Measuring voltages, peak to peak, is a bit more work.
You can look for physical signs of degradation (bulgy, leaky, discolored), but to really test a capacitor for capacitance, you need to take it out of the circuit, at which point you may as well put a new, high-quality capacitor in.
The OEM capacitors likely have a just-barely-adequate voltage rating; a new one with a higher voltage rating (and the same capacitance and a compatible type) may last longer in circuit as well.
That's not necessarily true; a higher voltage rating equals higher ESR, which means more heat.
It covers the middle third of the screen, top to bottom, and the entire bottom 1/4 of the screen with some odd spots as well, it's really distracting and essentially makes the TV useless (to me).
I'll probably still stick to OLED though unless there's a new technology that can give me the same blacks and brights; can't keep up with all of the new marketing terms these days so not sure if there is something that can compete but last longer.
My other TV about the same vintage is starting to have stuck pixels in the corner.
Modern failure modes aren’t nearly as graceful.
[1] https://www.bestbuy.com/site/tcl-40-class-s3-s-class-1080p-f...
But, we shouldn’t let companies get away with selling products that catch fire after working fine for 20 years.
People still run these Trinitron TVs to this day.
What? That's nuts. Why bother buying a TV if you're immediately going to throw it in the trash?
Louis Rossmann made many videos on this.
We're talking about expensive premium phones to start with, so relatively expensive after-warranty service is not shocking.
This may actually eventually sway me into the Apple camp. This, and what seems like much better theft discouragement.
But as a customer it will overall be more expensive for you.
Re wastefulness - a decent laptop causes 10x more pollution to manufacture than a phone, and a desktop PC 10x that. TVs. Cars. Clothing. Phones are very far down a very long line of higher-priority targets for an eco-friendly approach.
Other examples have a longer shelf life, or are at least repairable without being tied to a manufacturer. Notebooks have similar problems, and the critique transfers to them in a similar way. I see synergies in possible rules here, of course.
The amount of horror stories I've seen over the years from independent repairers is just terrible. Just last year a friend had a screen hot snotted back on their Galaxy.
What represents a more efficient economy? The one where broken phones get reused for parts, or the one where you have to throw them away?
This is just incredibly dishonest framing and completely ignoring what the right to repair and third party repair shop issue is all about.
> Buying genuine parts, which are available from Apple,
It is not a margin problem, it is an availability problem. Apple does not allow third party repair shops to stock common parts, such as batteries or displays for popular iPhones. This is only possible when providing the devices serial numbers. This effectively prevents third party repair shops from competing with Apple or Apple authorized service providers because they have artificially inflated lead times.
Becoming Apple authorized isn't an option for actual repair shops because that would effectively disallow them from doing actual repairs where possible, rather than playing Dr. Part Swap. Everything Apple does in the repair space essentially boils down to doing whatever they can to avoid having competition in that space.
> eats into the margins
Replacing a 45-cent voltage regulator on a mainboard is cheaper than replacing the entire mainboard with everything soldered on, but doesn't allow for very nice margins.
> There is very little honour in the repair market
There is very little honour in any market. Honour does not get rewarded nowadays; people are in <insert market> to make money, and if you're lucky they still take a little pride in their work. Whether a repair shop offers good service should be up to the consumer to determine, not up to Apple (or any electronics manufacturer that employs the same tactics).
> makeup applied to it by a couple of prominent youtubers and organisations.
That is called marketing, which Apple is also pretty good at. They're also lying when they say they're environmentally conscious while having their Genius Bar employees recommend an entirely new screen assembly on a MacBook just because a backlight cable came loose.
> The amount of horror stories I've seen over the years from independent repairers is just terrible.
The amount of horror stories I have experienced with Apple is no joke either. Apple is always taking the sledgehammer approach with their repairs. I've had the pleasure myself to deal with Apple repairs once for my old 2019 MBP. It wouldn't take a charge anymore, went to the Genius Bar and received a quote for a new mainboard costing well over 1000 EUR. Being familiar with some of the more technical videos of Rossmann etc, I found one electronics repair store that actually does board level stuff and got it fixed for a fraction of the price (iirc it was ~200 EUR).
For instance, maybe Apple could supply parts in bulk to repair shops but require registration of those parts prior to usage. The repaired iPhone would function regardless but loudly alert the user that unregistered parts were used to repair it. Gray market parts naturally aren’t going to be able to be registered (either due to serial not existing in their system or having been parted out from stolen devices), and thus the user is given some level of assurance that they’re not paid for questionable repair services.
Also mentioned here: https://amiga.resource.cx/exp/a3640
They also had backwards caps on the CD32 and A4000
That's why the next computer Sir Clive launched was the Cambridge Computers Z88. But note, some of the later bicycles were Sinclair Research branded:
https://en.wikipedia.org/wiki/Sinclair_Zike
https://en.wikipedia.org/wiki/A-bike
Amstrad did not acquire or develop the Sinclair QL, for instance, but it did sell Sinclair-branded x86 PCs.
https://www.computinghistory.org.uk/det/3404/sinclair-pc200/
It sold on some stock of existing ZX Spectrum hardware, but mostly it sold models it developed:
* Spectrum +2 -- a Spectrum 128 with a mechanical keyboard and built-in cassette drive
* Spectrum +3 -- a redesigned Spectrum 128 with a DOS from Locomotive Software and a 3" (not 3.5") floppy drive. Dropped compatibility with Spectrum 48 peripherals such as Interface 1 and Microdrives, and with Spectrum 128 peripherals such as the numeric keypad, serial ports, etc. Added the ability to page out the ROM and replace it with RAM, so it could run CP/M 3, also ported by Locomotive.
* Spectrum +2A, the black +2: a cut-down +3 with a cassette drive.
These were designed by Amstrad engineers and contractors, and manufactured by Amstrad. No Sinclair involvement I'm aware of at all.
Definitely starting Wednesday off productively.
I think this will allow me to classify today as productive.
It's something to check, but the polarized ones should be clearly marked as such.
PSA: Electrolytic capacitors have a rough lifespan of 10 years. Any much older than that need to be checked out-of-circuit for ESR and then capacitance. Also, tantalums (historically) suck(ed). [0] Quality audio equipment from the 80's like a/d/s/ car amps used only ceramic caps and other over-engineered passives, and have the potential (pun intended) to basically last forever.
0. https://www.eevblog.com/forum/projects/whenwhy-(not)-to-use-...
https://en.wikipedia.org/wiki/Capacitor_plague#Premature_fai...
The normal lifespan of a non-solid electrolytic capacitor of consumer quality, typically rated at 2000 h/85 °C and operating at 40 °C, is roughly 6 years. It can be more than 10 years for a 1000 h/105 °C capacitor operating at 40 °C. Electrolytic capacitors that operate at a lower temperature can have a considerably longer lifespan. ... The life of an electrolytic capacitor with defective electrolyte can be as little as two years.
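Those figures fall out of the usual rule of thumb that electrolytic lifetime roughly doubles for every 10 °C you stay below the rated temperature. A quick sketch of that rule (this is the simplified Arrhenius model; real parts vary, so treat the output as a ballpark, not a spec):

    # Rule-of-thumb electrolytic lifetime: doubles per 10 degC below rated temp.
    def estimated_life_hours(rated_hours, rated_temp_c, operating_temp_c):
        return rated_hours * 2 ** ((rated_temp_c - operating_temp_c) / 10)

    for rated_h, rated_t in [(2000, 85), (1000, 105)]:
        hours = estimated_life_hours(rated_h, rated_t, operating_temp_c=40)
        print(f"{rated_h} h / {rated_t} degC cap at 40 degC: "
              f"~{hours:,.0f} h (~{hours / 8766:.1f} years of 24/7 use)")
    # 2000 h / 85 degC cap at 40 degC:  ~45,255 h (~5.2 years of 24/7 use)
    # 1000 h / 105 degC cap at 40 degC: ~90,510 h (~10.3 years of 24/7 use)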
What I have seen done is cheaping out on parts in order to get the price as low as possible, because customers shop primarily on price.
Not to lash out, but it kind of hits a nerve for me, because people think we design products to purposely fail. Hell no, we try really hard to do the opposite, but everyone just loves to buy the cheapest shit.
The $25 LED bulb that will last for eternity will rot on the shelf next to the $3 bulb that will probably be dead in 6 months. And one more "they build these things to fail" complaint will be posted online.
Additionally, the issue is that as a consumer, it's not easy to differentiate between quality markup and greedy markup. I don't see the cap manufacturer on the box, so the $25 light bulb might last 10 years, or it might last 6 months just like the $3 one. At least with the $3 one I can come back and buy another...
This is just fashion, right? As something becomes commoditised, it starts to become subject to fashion, which means cheap and looks fashionable is more important than durability. So you can buy one every year and keep up, and throw away the old one.
This is so true and insightful. Even plenty of so-called “luxury goods” are trash. And outside of luxury, it seems like most markets are dominated by roughly 90-99.9% products that are all identically garbage, just ranging from 1%, 2% up to 800% profit margins. So the person buying the high-end brand and the one buying from Aliexpress both have the same quality components (trash tier) and sometimes the same PCBs and designs. If you want a good quality refrigerator, you can buy a Viking or something for $9,000, or spend any amount of your choice from $800-5000 for something that will die in 2 years and be outside of warranty. It makes my blood boil, especially in areas like appliances, where the amount of waste, by tonnage, is super offensive.
It's not my fault if other people are too dumb to comprehend TCO because I would buy the $25 bulb if it had a 30 year warranty.
And? That just sounds like they have good engineers. If you are designing a machine, you have a target lifetime. You'd obviously want the product to last through the warranty period, because warranty claims are a cost to the company.
Every choice of a component affects lifetime. Designers of mass-market products can't just use premium components everywhere -- the mass market will not pay steep premiums for otherwise equivalent products.
Value engineering and planned obsolescence are not the same thing, but they are often confused.
That being said, Samsung appliances suck and I hate them. Mine failed within warranty several times.
> And, with the pressures of multinational oligopolies and BlackRock/Vanguard/State Street.. there is little incentive to invest $100M into a moderately-superior incandescent lightbulb using yesterday's technology that lasts 100kh and 5k cycles and sells for $1 more than the next one.
It isn't that. It's pressure at the shelf that does it. Consumer behavior simply does not reward equivalent-feature products with premium components that claim (true or not) to have a longer lifespan. Unfortunately, they will buy based on their uninformed sense of quality first.
If you release a light bulb that is identical to the best selling one on the shelf, but claims 10x lifespan, your competitor will do something like gluing a weight in theirs, putting some marketing BS on the box, and will put you out of business. Consumers just don't pick products based on actual quality.
> It isn't that. It's pressure at the shelf that does it. Consumer behavior simply does not reward equivalent-feature products with premium components that claim (true or not) to have a longer lifespan. Unfortunately, they will buy based on their uninformed sense of quality first.
This is a failure of the sales channel(s) and manufacturers to market and educate properly, not a failure of the customer.
Many people think there's some unethical conspiracy going on, and consumers actually want a product that lasts a long time, but companies are refusing to give it to them. But this is projection of individual preferences on to the market as a whole. Consumers want cheap shit that is in fashion, and their buying preferences prove this time and again. Maybe you want a 50 year old toaster in your kitchen, other people are buying products based on other factors.
If consumers really wanted to pay a premium for high duty-cycle equipment with premium lifespans, they can already do that by buying commercial grade equipment. But they don't.
If you are familiar with the history of home appliances, you'd probably come to appreciate the phrase 'value engineering'. Even poor people can afford basic electric appliances now because of the ingenious ways that engineers have designed surprisingly usable appliances out of very minimal and efficient designs.
If you look at ads for electric toasters 100 years ago, you'd see they cost over $300 in today's money adjusted for inflation. Thank god for value engineering.
It seems to me that there is also a social dynamic to things. If consumer-grade products become a race to the bottom, then it is going to become more difficult for regular people to purchase products which aren't low quality. There's also a degree to which society (e.g. in the form of government policy, cost-of-living adjustments, etc.) factors in differences in prices.
It completely changed the way our societies operate. I think it is a good thing that people have the option to buy crappy washing machines, rather than being forced to use the washboard and bucket my grandmother used. Yeah, they sometimes do develop a bad belt, or the timer mechanism might fail. But it beats being unwillingly forced into homemaking as a career.
The world only has so much wealth to go around, and that isn't the moral quandary of the engineer picking an item on a BOM on Tuesday morning to fix. If anything, squeezing a few more pennies out of that BOM is going to lift some people at the fringes out of poverty. At the opposite end of the product value equation, every unused and functional component in every product that is no longer in service, is wealth that is wasted that could have been spent elsewhere.
If it squeezes a small but solid chunk out of product lifetime too, then it's also likely to harm people on the fringes. If they can buy it with one less month of savings, but then it breaks a couple months earlier, they're probably worse off. (For actual pennies divide both of those numbers by some orders of magnitude.)
People don’t want to walk their clothes basket down to the laundromat for one more month while they save for the nicer washer that lasts a little bit longer. They want the cheap one now, because they just got off some shitty shift at work, and they’re sick and tired of lugging their laundry down the street. Having a quality washer [x] years from now is not a desired part of the equation. Immediacy is of higher value.
2. If everything lasts twice as long for 15% more, you can get a half-expired used one for even cheaper.
> they already know that. They're not dumb.
I think they're not dumb and they already know it's extremely difficult to figure out which brand fits that criteria, if any, so it's not worth it because it's such a gamble.
That's also true. At the individual unit level, small differences in MTTF/MTBF are negligible because product failures are naturally distributed anyway. The mean time is just a mean, and nobody gives a shit about a good mean product failure rate when theirs happened to fail below the mean. That's true no matter how much you spent.
> At the opposite end of the product value equation, every unused and functional component in every product that is no longer in service, is wealth that is wasted
This also is a counterpoint to your position though: of everything that goes into say, a fridge, it’s all wasted in 5 years now because Samsung chooses to put a PCB that is barely fit for purpose and will just fail with an error code, and because instead of putting such a failure-prone part behind a door and using an edge connector so it can be swapped in 5 minutes, they bury it God-knows-where in the chassis requiring an $800 labor charge, and charge $300 for the part. (As though that PCB is actually more complex than an iPad logic board, lol). So the whole 600 pounds of steel, refrigerant, insulation, glass, ice maker, and the compressor goes to the dump since who would invest $800 in a fridge that could have the same failure in a month and only the part is warrantied (you have to pay labor again). The poor people you’re worried about are buying these components over and over again because the appliance makers like this system. All this is done in bad faith. They’re morally bankrupt compared to their grandfathers who made appliances that lasted decades.
> Poor people could always purchase used appliances.
The reality in mid 20th century US demonstrates this isn't the case. Most went without the modern appliances that are commonplace today.
That costs a ton. I just want a better lifespan, I don't want to 20x the duty cycle and also pay B2B prices.
It's too hard to figure out which consumer products have a better lifespan, so companies do a bad job of catering to that need. This makes companies try too hard to be cheapest, and they often fall below the sweet spot of longevity versus price. Then everyone is worse off. That's not the fault of the engineer but it still means the engineer is participating in making things worse.
Therein lies the problem. A more durable product exists, and yet even you don't want to pay more for it. And you are likely much more privileged than the rest of the world. What do you expect the rest of the world to be doing? Most of the world isn't picky about whether their hand mixer has plastic bushings or ball bearings. They're choosing between any appliance at all and mixing their food with a spoon.
> It's too hard to figure out which consumer products have a better lifespan, so companies do a bad job of catering to that need.
There are many companies that try to break this barrier over and over, with tons of marketing material proclaiming their superiority. Why do they all fail? Because their hypothesis is wrong. The majority of the mass market doesn't want appliances that last for tens of thousands of hours. Most people use their appliances very lightly and for short periods of time before replacing them.
I think a lot of people on this forum have points of view tainted by privilege. Poor people aren't dumb, they know that they are buying cheap stuff that doesn't last as long as more premium options. They're making these options intentionally because a bird in the hand is worth more than two in the bush to them.
This is disingenuous as hell.
I want to buy a version that cost 15% more to make. I don't want to buy a version that cost 3x as much to make (or is priced as if it does).
When I can't find the former, that is part of problem. When I don't buy the latter, that is not part of the problem.
> They're making these options intentionally because a bird in the hand is worth more than two in the bush to them.
The best way to have the most birds in hands on an ongoing basis is to optimize for both price and lifetime per dollar, not just price.
Well, it just doesn't work that way. Premium components that truly extend product life are multiplicatively more expensive than what you'll find in value engineered products, if not exponentially so. Furthermore, a product that is 15% more expensive than competitors won't sell 15% fewer units, it will sell significantly fewer units, and then your fixed costs will also be higher, on top of the higher BOM costs.
Quality products with measurably longer lifespans in pretty much any product category are significantly more expensive than lower quality equivalents. The entire global manufacturing industry isn't in on some conspiracy.
It's not a conspiracy when a product that is somewhat more expensive but lasts much longer per dollar doesn't sell well, but it is a market failure.
I think if it was clear at a glance that such a product lasts much longer, there'd be enough buyers to avoid the low-volume costs. At least in many markets, for many kinds of product.
> And you also said that the difference between a 5 year washing machine and a 30 year washing machine is probably "a matter of tens of dollars". Why is it suddenly multiplicatively or exponentially more expensive?
You're conflating my statements about BOM costs and the final price of the product, which are two entirely different things. Demand is not a constant for your product at any price (because the market is likely elastic, and you have competitors). Demand will go down as price increases, often sharply. If you add tens of dollars in BOM cost, your product sells fewer units as a result, and now you have fewer units to spread the (potentially significant) fixed costs across. So, unfortunately the tens of dollars in BOM cost might mean hundreds in cost to the end consumer.
But if you insist they would, then we can talk about a world where that level of quality is the minimum. Somehow. I don't really care how. It would be better, yeah?
You realise incorrectly, I would say. It's very defensible to claim that Western society has produced the most - by a giant margin - social, economic and technological advances in history, and boiling it down to this is just a bit silly, in my opinion.
> Every choice of a component affects lifetime. Designers of mass-market products can't just use premium components everywhere -- the mass market will not pay steep premiums for otherwise equivalent products.
Dying just out of warranty is only okay if the warranty covers the actual expected lifetime of the product. And for appliances, it doesn't.
The difference between a 5 year washing machine and a 30 year washing machine is not very big. Anyone pinching those specific pennies is doing a bad thing.
At least in the US, people move frequently, and a washing machine that lasts for decades isn't even a benefit, because they'll likely have left it behind.
> The difference between a 5 year washing machine and a 30 year washing machine is not very big. Anyone pinching those specific pennies is doing a bad thing.
Absolutely right, it's only a matter of tens of dollars, probably. However, retail consumer appliances live and die at the margins. Nobody is opening up their washer to inspect the components to see if the $510 washer has better components than the $499 washer. All else equal, they're buying the $499 washer 90% of the time. Your fixed costs are going to eat you alive when spread across your fewer units, and retailers will stop carrying your product because it isn't moving.... All the while the $499 washer is going to be sitting in that home 5 years from now when the realtor puts a sign out front. And literally zero people are buying a house based on the bearings in the washing machine.
You say this in the same breath you talk about people being desperate for any cheapest appliance instead of having nothing?
> And literally zero people are buying a house based on the bearings in the washing machine.
Well that's them being dumb.
Yes? I think you’re suggesting that the existence of old machines would be good for the poor. That’s true. However, manufacturers don’t make used machines. They only make new ones. So the forces of supply and demand do not apply.
> Well that's them being dumb.
Wat. No I’d say choosing a home based on location, school district, or inherent qualities of the home itself is a less dumb idea.
As far as I can tell you were discounting the value of old machines, and suggesting long durability wasn't useful. I'm glad you agree they're useful.
And I know manufacturers make new machines. I'm suggesting that if the cheapest machines were much more durable at not-much higher prices, the end result would be better for everyone including the poor people that would otherwise have bought the even cheaper model.
(If we switched cold turkey it would be worse for them for a couple years before it got better. So let's not switch cold turkey. But that's not a reason to act like the current situation is anywhere near optimal. It's great that appliances have gotten massively cheaper than they used to be, but we could do even better.)
> Wat. No I’d say choosing a home based on location, school district, or inherent qualities of the home itself is a less dumb idea.
Wat. Do you think that's an either-or choice?
It's reasonable to say people choose a home based on price, right? If there's a washer and dryer as part of the package, the expected lifetime is basically an offset to the price.
Often worse -- in many markets a buyer will pick whichever available option has a plurality of their preferences. Most buyers are going to prioritize the location, size, and permanent qualities of the home, and that's going to narrow them down to a short-list of options. Major renovations tend to affect the price of a house somewhat, but the quality of individual appliances typically does not, because they are easily changed and account for maybe 1% of the value of a home. Even a home without appliances entirely will tend to sell just as fast and for prices similar to other homes.
Products have an expected lifespan longer than the warranty period; treating the warranty period itself as the design target is malicious. I'd like to see MTBF numbers on everything so people can band together and sue the shit out of manufacturers who do this. It would also make it easier to check on the $25 light bulb.
Also, it is mathematically stupid, because products do not fail at consistent rates, nor are they used by customers at equal rates. If you want to minimize warranty costs, you need to target some mean lifetime well beyond the warranty period.
MTBF (or MTTF) might be a useful number if you buy 100 light bulbs, but it is not really a useful number when you're buying one appliance. Product failures don't follow a normal distribution. The stuff that ticks people off about shitty products is the infancy-failure part of the bathtub curve -- it's when you get 13 months out of a $200 blender that fails in infancy that you're pissed, not when you get 24 months out of a $20 blender that fails from end-of-life.
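To make the "a mean is just a mean" point concrete, here's a toy simulation. The Weibull distribution is the usual stand-in for the two ends of the bathtub curve; the shapes and the 10,000-hour MTTF below are made-up illustration values, not data from any real product:

    import numpy as np
    from math import gamma

    def failure_times(shape, mean_hours, n=100_000, rng=np.random.default_rng(0)):
        # Pick the Weibull scale so the distribution has the requested mean:
        # mean = scale * Gamma(1 + 1/shape)
        scale = mean_hours / gamma(1 + 1 / shape)
        return scale * rng.weibull(shape, n)

    mean = 10_000  # identical MTTF for both hypothetical products, in hours
    infant = failure_times(0.7, mean)   # shape < 1: early failures dominate
    wearout = failure_times(3.0, mean)  # shape > 1: failures cluster at end of life

    for name, t in [("infant-mortality", infant), ("wear-out", wearout)]:
        print(f"{name}: mean ~{t.mean():,.0f} h, "
              f"dead in first 1,000 h: {(t < 1000).mean():.1%}")
    # Same mean, wildly different share of angry customers in month one
    # (roughly 21% vs. 0.1% with these shapes).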
A 30 year warranty would certainly make a difference in the decision making. But more typically you see the $3 bulb, with a 1yr "warranty", next to the $25 bulb, and the $25 bulb either has an identical 1yr warranty, or has a warranty period not commensurate to the price difference, such as a 2yr warranty.
"The Phoebus cartel engineered a shorter-lived lightbulb and gave birth to planned obsolescence"
Technology Connections explained this well in a video about a year ago: https://youtu.be/zb7Bs98KmnY
> bulb lifetime is a trade off between lumens, filament life, and energy consumption
At a specific temperature, using specific materials. There is no reason to suspect that material science would not advance, or that other constraints would not change. A specific company choosing that particular sweet spot for a particular product line is fine. But collusion between companies dictating that specific constraint (in lieu of, e.g., wattage per lumen) is too clear a marker of anti-consumer intent.
The temperature of incandescent bulbs is directly related to the light that they give off, and tungsten is the obvious best material; there are no other materials on the horizon giving an improvement for the tech. It really is a pretty well-understood tradeoff surface on a very mature technology (which was then displaced by completely different ones).
There have been plenty of discussions on HN about brands that used to produce durable products no longer doing so. I mostly buy cheap stuff because I assume that everything will be built as cheaply as possible, so I will get something that will not last anyway.
No it isn't. It is simply optimization of price and the features/form-factor that many buyers have demanded.
If anything, the lifespan of a ~$1.50 household LED bulb is quite incredible. I'm not sure exactly how anyone would be able to increase the lifespan at that price point and keep the traditional Edison form factor.
> Amazon should be required to test all [..] products on its site such that they can prove safety and standards conformance.
No, the manufacturers should be required to... the same way it works for literally every other product with safety regulations.
I don't think I've had any last more than 5 years.
If you bought a cutting edge LED bulb back in 2002 or so, those had a life expectancy of over 60 years, and the build quality was such that you could reasonably expect to get that.
There are plenty of teardowns on YT showing how poorly even major brand name LED bulbs are put together.
> I don't think I've had any last more than 5 years.
Do you shut them off every 3 hours? That's probably what the estimate on the box is based on. Run the same bulb half the day and you'll only get 2.5 years out of it.
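The arithmetic works out if you assume a typical rated life of around 11,000 hours (the exact rating varies by bulb; this number is just an assumption for illustration):

    # Rated hours divided by daily usage; 11,000 h is an assumed typical rating.
    rated_hours = 11_000

    for hours_per_day in (3, 12):
        years = rated_hours / (hours_per_day * 365)
        print(f"{hours_per_day} h/day -> {years:.1f} years")
    # 3 h/day  -> 10.0 years (the optimistic number on the box)
    # 12 h/day -> 2.5 years (running the bulb "half the day")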
> There are plenty of teardowns on YT showing how poorly even major brand name LED bulbs are put together.
I've seen them. And dissected my own. Still, at the price that modern LED bulbs are being made, I'm surprised they're built as well as they are. Brand-name Sylvania bulbs are $0.79/ea in a bulk pack on Amazon right now.
LED bulbs aren't lasting any longer than incandescent bulbs used to. My house has 2 bathrooms, one had incandescent bulbs when I moved in and I didn't bother to replace them. Those incandescent bulbs have outlived multiple sets of LED bulbs in the other bathroom.
I honestly worry about the increase in e-waste with LED bulbs vs the old incandescent bulbs.
> Do you shut them off every 3 hours? That's probably what the estimate on the box is based on. Run the same bulb half the day and you'll only get 2.5 years out of it.
Which given that LEDs should damn well last 20-30 years of always being on, this is all a farce. I can't even pay 2x the price to buy a bulb with an honestly stated lifetime on it.
> Yeah I would hope those bulbs were built pretty well, they were crazy expensive... expensive enough that they wouldn't be competitive in lifetime-per-dollar against today's crappiest bulbs even if they lasted a person's entire lifetime.
I bet they would be. Given LED bulbs last less than 3 years now, with some not even lasting 2 years, a 20-30 year bulb could cost 4x as much and be competitive.
The real problem is that those long lifetime LED bulbs are not driven as hard, so the light output isn't nearly as high. AFAIK all research in the last 20 years has been into bright LEDs with meh lifetimes, so I wonder if it is even possible to mass produce long lifetime consumer LEDs anymore.
(Except the LEDs in all my consumer electronics have no problems staying on for 5 years non-stop! Tiny output, long lifespan...)
They tend to last a lot longer for me.
The reason indicator LEDs last a different amount of time is because they are very different applications. If you want to walk around your house lit by an indicator lamp, go right ahead, but most people want more light, and more light = more heat and degradation of the diode.
https://duckduckgo.com/?t=lm&q=led+bulbs+enclosed+fixture+ra...
And that's a very time consuming and somewhat risky operation on an old machine you want to keep running. Some old PCBs are quite fragile.
I wish there was a way to test capacitors without removing them.
[1] https://www.eevblog.com/forum/beginners/is-there-any-way-to-...
What if I have a stash of big electrolytics that have been out of service for 10+ years? I know that I need to reform them over a few days, but can they even run at spec after so long out of operation?
We're talking BIG stuff: 400 V, 200+ J each
But if you wanted to use them in production, and be able to blame me when it didn’t work, I’d say no.
The last few times I made a mistake, there wasn't even an explosion, even less a short-circuit. The thing slowly boiled and bubbled or unfolded.
Anyway, it blows up because the capacitor's insulation layer isn't some stable material, it's a tiny oxide layer built over the metal plate by anodization. If you put a high voltage on it with the wrong polarity, you reverse that anodization and short the liquid and the metal electrodes.
https://wiki.console5.com/wiki/Amiga_CD32 C408 C811 "original may be installed backwards! Verify orientation against cap map"
A4000 https://wordpress.hertell.nu/?p=1438 C443 C433 "notice that the 2 capacitors that originally on A4000 have the wrong polarity"
Much worse is the Commodore A3640 68040 CPU board, aimed at the top-of-the-line A3000 and A4000: http://amiga.serveftp.net/A3640_capacitor.html https://forum.amiga.org/index.php?topic=73570.0 C105 C106 C107 silkscreen wrong; early revisions were built according to the bad silkscreen.
I think we're envious that Apple did a better job of engineering their systems
P.S. still my favorite Mac of all time was the IIcx. That one coupled with the 'full page display' was a dream.
That said, Apple did a really good job with the Mac Pro cooling fans, where the shroud spun with the blades.
I think it did better than the best PC cooling fans, like Noctua's.
If I remember right, Jobs had them not include a cooling fan. As it heated up and cooled down, the chips on the motherboard would work their way out of their sockets. So one of the official solutions to try if you were having issues was to drop it a couple of inches to try and get the chips to re-seat themselves.
Crazy.
Unless I misunderstood your story
I bet this could be done at the output side, too. And a company like Apple that values the customer experience could try to build a filter on their laptop DC inputs to reduce touch currents experienced by the user when connected to a leaky power supply. Of course, the modern design where the charging port is part of a metallic case might make this rather challenging…
(Seriously, IMO all the recent MacBook Air case designs are obnoxious. They have the touch current issue and they’re nasty feeling and sharp-edged.)
At first everything seemed OK. But when I plugged a monitor into the Pi, I Was Made To Realize a) the nice 18-volt PS really was high quality, and although it was transformer-isolated, its output ground was tied to the wall-socket earth, b) monitors also tie HDMI cable ground to earth, and so c) my lash-up now had dueling grounds that were 9V apart.
I found one a few years back when I repaired a linear power supply. This required me to reverse engineer it first because there was no service manual. I buzzed the whole thing out and found out that one of the electrolytic capacitors had both legs connected to ground. They must have shipped thousands of power supplies with that error in it and no one even noticed.
How exactly does adding capacitance help?
It helps to always think of current draw as a complete loop: out the "top" of the capacitor, through your IC, and back into the ground side (this isn't necessarily what's happening physically). A shorter loop means less inductance; shorter traces mean less resistance.
But I do recall having had inductance issues with high-frequency transfers to an ATmega.
I have no idea how a capacitor would help. Sounds like it would increase a signal delay (tiny one perhaps).
But definitely interesting, I'll need to learn about this if/when I dive further into electronics.
Capacitance is in many ways the opposite of inductance. If you place capacitors close to the power sink (e.g. the ATMega), the traces will have lower inductance and be “decoupled” from the power supply.
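Rough numbers for why placement matters, via V = L * di/dt (the inductance and current-step values below are generic ballpark figures, not measurements of any particular board):

    # Voltage droop across trace inductance when an IC suddenly draws current.
    # Ballpark illustration values, not measurements.
    di = 0.1    # 100 mA step in load current
    dt = 10e-9  # over 10 ns

    for name, L in [("far capacitor, ~10 nH of trace", 10e-9),
                    ("close capacitor, ~1 nH of trace", 1e-9)]:
        droop = L * di / dt  # V = L * di/dt
        print(f"{name}: {droop * 1000:.0f} mV droop")
    # far capacitor, ~10 nH of trace: 100 mV droop
    # close capacitor, ~1 nH of trace: 10 mV droop

A 100 mV dip on a 3.3 V or 1.8 V rail can be enough to cause misbehavior on some parts, which is why the capacitor goes as close to the power pin as practical.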
The layout CAD is often done by a different team that follows the schematic provided by design engineering. Automated workflows are common. The silk screen is predefined in a QA'd library. It is not their job to double check engineering's schematic.
The components are placed per the layout data.
Both those teams did their jobs correctly, to incorrect specifications. In fact, the factory performing assembly often is denied access to the schematic as it is sensitive IP.
If you're going to cast blame on a 30 year old computer, at least direct it at the correct group. It wasn't soldered incorrectly at the factory. They soldered it exactly how they were told to - backwards.
Just as a note, this is a fairly archaic way of working nowadays. At my place schematic design and layout go hand-in-hand, and we rejected a candidate because he didn't do the latter. The main reason is layout is no longer an afterthought, it's a key part of the electrical design of the system, and there's little room for a tedious back and forth between the circuit designer and the person doing the layout about what traces are and aren't important to optimize for various attributes.
Even the free software that I use -- KiCad -- would ding me.
We make bigger mistakes instead. ;-)
I wonder if it has the same defect
I think the -5 volts is only there in case an expansion card needs it.
Most RS-232 receivers of that era had a fair amount of gain, with a transition point close to 0V. So only a little negative voltage would be required in practice, not the entire -5V. So as long as the reversed capacitor was not entirely shorting out the rail, things would have worked.
These days, a charge pump device costs just a few dollars from AliExpress, so even though many things work fine this way, it wouldn't hurt to make them work for most if not all things.
When a replacement power supply does provide negative voltage, I try to use it, even if it does mean adapting -12 volts, which is what most ATX power supplies provide, to -5 volts, which is what, it seems, many 68030 Macs want.
Funny - I just did an ATX power supply for a Quadra 630, and it seems that most of the 68040 machines were using -12 volts instead of -5 volts (except the Quadra 605, which happened to use the same form factor and power supply as the LC / LC II / LC III).
But it just never quite worked right. I remember how frustrated and confused my older brother was. The computers would sometimes see each other but would drop off so easily.
Was this that?!
[1] https://www.blizzplanet.com/blog/comments/warcraft_ii_tides_...
WC2 worked mostly well, just at a somewhat slower game speed.
In fact I got Diablo running on the LC3. A level took like 10 mins to load and it ran at 1fps though. :)
You can just about make out "68040" in the requirements on the box art here https://images-worker.bonanzastatic.com/afu/images/7569/5203...
Edit: found a photo of the expansion system requirements, also lists 68040 as supported https://cdn.mobygames.com/covers/4056502-warcraft-ii-beyond-...
About 30 years ago I designed my first PCB with frequencies in the GHz range. It was full of challenging transmission line paths with frequencies in the hundreds of MHz and above.
I am still proud of the fact that all of the high speed signals worked as designed, with excellent signal and power integrity (the large FPGA was challenging). Emissions passed as well.
I did, however, screw up one thing: DC
I somehow managed to layout the DC input connector backwards!
These boards were very expensive ($2K), so an immediate respin was not possible.
I had to design a set of contacts to be able to flip the connector upside-down and make the electrons go in the right way.
The joke from that point forward was that I was great at multi-GHz designs but should not be trusted with DC circuits.
So you can see why it probably didn't matter that this capacitor didn't work: It's only needed for rare occasions. RS-422 is a differential form of RS-232 (https://en.wikipedia.org/wiki/RS-422) so being differential it's fairly robust against changes in load if they affect both wires. And the worst that can happen is you lose a few characters from your external modem.
In addition, electrolytics can probably work when reversed like this, at least a little bit. It's not exactly optimal and they might catch fire(!).
The two RS-422 ports are actually used quite often on these old Macs for printers, modems and AppleTalk networking. It was the only communication port, as there was no parallel port. They were backwards compatible with RS-232.
So it obviously worked well enough.
The backwards cap was measured to reduce the voltage to about -2.4v.
I suspect that all it did was reduce the maximum range, which started at a massive 1,200 meters for RS-422 (and a good 10m for RS-232)
When I was a demo coder my artist friend would just haphazardly go through all my assembler code and snip random lines out until it stopped working to improve performance.
I think I know exactly enough about electronics to ask more annoying questions than someone who doesn’t know anything at all.
Nothing as bad as PCBs as far as I'm aware.
I found a bad solder joint that looked OK but was intermittent, and had been that way, in a television built in 1948 and used for decades.
Bad design and assembly goes back forever, as near as I can tell.
Until you have to deal with negative voltage (-5V). Another out-of-bounds bug.
The reason to not (just) use optical flow is that it isn't absolute. If you pattern your surface correctly, you can ensure that every few by few pixel region on a QR code like bitmaps surface is unique, and thus can be decoded into an absolute position. Basically a 2D absolute optical encoder fast enough to be part of a motor control loop.
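A minimal 1D sketch of that idea in Python, using a De Bruijn sequence: every n-bit window of the pattern is unique, so a sensor that reads any window can decode its absolute position. The 2D "QR-like" surface described above is the same trick in two dimensions; this code is purely illustrative, not from any actual encoder:

    def de_bruijn_binary(n):
        # Greedy "prefer ones" construction of a binary De Bruijn sequence:
        # a cyclic bit string of length 2**n where every n-bit window is unique.
        seq = [0] * n
        seen = {tuple(seq)}
        while True:
            for bit in (1, 0):
                window = tuple(seq[len(seq) - n + 1:] + [bit])
                if window not in seen:
                    seen.add(window)
                    seq.append(bit)
                    break
            else:
                break
        return seq[:2 ** n]

    def window(track, i, n):
        # The n bits starting at position i, wrapping around the cyclic track.
        return tuple(track[(i + k) % len(track)] for k in range(n))

    n = 4
    track = de_bruijn_binary(n)  # 16-bit cyclic pattern
    lookup = {window(track, i, n): i for i in range(len(track))}

    # Reading any n consecutive bits recovers the absolute position:
    assert all(lookup[window(track, i, n)] == i for i in range(len(track)))
    print(track)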
This is the buried lede! I am of the opinion that half of the capacitors in any modern circuit are useless; the trouble is we don't know which half.
> He often carried a pair of wire clippers, and when he thought that one of his employees was "overengineering" a circuit, he would begin snipping components out until the picture or sound stopped working. At that point, he would tell the engineer "Well, I guess you have to put that last part back in" and walk away.[14]
Techmoan recently did a video, "The story of the 4-Track Cart" (https://www.youtube.com/watch?v=Uqoglkbe9sc), covering most of Muntz's life.
On a serious note, this is a terrible practice and nowadays won't pass any EMC certification.