I personally consider it very inspirational though I recognize that I will probably never be able to undertake such a difficult task. I can imagine that it is very inspirational to the next generation of extremely proficient and dedicated teens who want to master software development and explore leading edge hardware.
2. Some people want to use Apple's impressive ARM hardware, but their needs require an alternative operating system.
Something similar to this happened in the early days of the iPhone, with the iPhone Dev Team. Initially, iPhone "apps" were going to be web pages, but then these reverse engineers came along and developed their own toolchain. Apple realized they had to take action: their "web pages as apps" strategy wasn't going to work.
A much more plausible theory is that Apple likes their 30% app store commission from big players.
> https://www.apple.com/newsroom/2023/05/developers-generated-...
"App Store developers generated $1.1 trillion in total billings and sales in the App Store ecosystem in 2022"
People forget the only thing fueling big corps is profit.
As someone who built one of the first web apps featured by Apple, I can say that your view, too, is incomplete and revisionist.
A much more plausible theory
Theories are not necessary. Apple was very up-front about its trajectory with the iPhone at launch.
what makes you think it was set in stone?
It was not. But you got contradicted by people who actually remember what happened. It is fairly well documented, and was common knowledge even at the time. Jobs was initially sold on web apps for several reasons, and the state of iPhoneOS 1 and its internal APIs was very precarious, not stable enough for serious third-party development. Again, this was known at the time thanks to the jailbreak community, and it has been explained in detail over the years by people who have since left Apple, and by Jobs himself in Isaacson's biography.
When they pivoted towards the App Store model, there was no predicting how big it would become, or even whether it would be successful at all. The iPhone was exceeding expectations, but those expectations were initially quite modest. It was far from world-record revenue.
Neither I nor anyone else can promise you it wasn't just a simple $ calculation.
That being said, literally every signal, whether inside, outside, or leaked, was that apps and a public SDK, if the effort existed meaningfully before release, had to be accelerated due to the poor reaction to the infamous "sweet solution": web apps.
I agree it's logically possible, but I'd like to note for the historical record that this does not jibe with what happened at the time. Even setting that aside, it doesn't sound right in the context of that management team. That version of Apple wasn't proud of selling complements to their goods; they weren't huge on maximizing revenue from selling music or bragging about it. But they were huge on bragging about selling iPods.
Also, I assume you haven't read the Steve Jobs biography, which discusses this and contradicts your point.
One positive outcome of your comment is that it reminded me I still have the 2008 book, "iPhone Open Application Development" by Jonathan Zdziarski. That was a nice walk down memory lane.
https://www.amazon.com/iPhone-Open-Application-Development-A...
Could you give 3 examples of this? Because I cannot think of many.
However, since their moat is now filling with European soil, this is not something they will attempt at this point, IMO.
That didn't happen accidentally.
It would take an outright legal revolution in the definition of antitrust for this to be even a remote possibility, and frankly that is not happening.
By your logic, Apple could also be sued for not doing it. See how your logic doesn't add up?
This used to be one of the best things about Apple when Steve Jobs was still running the company: you'd get a bunch of features that a purely profit-focussed "rational business" would never approve, just because Steve wanted them. And I suspect Apple still has some of that culture.
As long as Apple wants people to develop software for its platforms and/or sell to web developers, Android developers, scientific computing, etc. they will not lock down the Mac.
Also, if they really cared, they would operate/manage brew and integrate it better, rather than have people spend their free time making Apple's products more usable for software developers.
M3 support is still not there (let alone M4) because things broke. Which is expected from Apple; they are just doing their thing and improving their products.
If they cared they would have at least hired these people by now. It wouldn't make a dent in their budget.
They don't want to leave M1/M2 half botched before moving on to the next gen that will ultimately support more features.
If you are not happy with the pace go on and contribute, but don't invent false issues.
It's still profoundly weird to me that nobody can run Safari outside macOS, even for testing. At least the EU has strong-armed them into dropping Lightning ports now, so we have that minor interoperability going for us, which is nice.
May I remind you of the famous `--my-next-gpu-wont-be-nvidia` flag in a Linux compositor? Meanwhile, Apple literally went out of their way to make secure boot for third-party OSes possible.
Apple is better at some stuff, but it's really not total domination, and you need to look at things in a very particular way to think they are indisputably better. If you factor in value for cost, everything falls apart, and it really becomes: if you have cash, you can buy the thing that is going to be much better at this very specific use case.
It's even true for their headphones, where the only thing they are better at is integration with their own ecosystem; everything else is passably competitive, but if you look for value it's just plain bad. I had AirPods, and I find it amazing how much better the Nothing Ears are for the cost, if you don't care about the Apple ecosystem (the "magic" gimmicks never worked that well for me anyway).
If Apple and other companies like them were a little less greedy we could have far more nice things for free and Alyssa and other brilliant engineers could go work on other great projects. Also if regulators were a little more vigilant and a little less corrupt.
Someday.
One important lesson I've learned regarding open source is that companies absolutely love it when you work for them for free.
Something I've learned about Apple is that one of their primary businesses is selling hardware with proprietary software running on it.
Your point about working for free is right on the money. I get that Asahi is probably intellectually stimulating to work on, but I couldn't do it knowing I am essentially enriching a company that doesn't have public benefit in mind.
The people I know at Apple actually do have public benefit in mind. They believe in things like usability, privacy, accessibility, sustainability, etc. They don't want to replace you with intrusive AI (yet). And personally I like Apple's products and am using one right now. Unfortunately all large companies tend to turn into monsters that do their best to grind up resources - natural or human - in order to turn them into money. And the designer of the iPhone regrets turning us into zombies - that was not an intended effect.
People were already zombies. They just swapped out television for smart phones.
Which is why everyone should AGPLv3 their code.
I’m pretty sure that Turing and newer work the same way. The drivers basically do nothing but load the firmware & do some basic memory management if I recall correctly.
It just so happened that, after possibly even more painstaking reverse engineering, the responsible hardware vendor later realized that servers are de facto Linux machines, and that it had better support Linux properly. That Intel chip working that well was not achieved any differently; we just like to wear rose-tinted glasses.
If you think you want to run Linux, don't buy hardware from a company that views it as a threat to their business model, simple as that.
Show me any hardware that is 100% "libre"? Even the pinephone itself has plenty of closed source blobs running as firmware.
If you take some random Windows laptop off the shelf, it will boot Linux (and continue to do so in the future) because they have to support UEFI. If you take a laptop from a "Linux-friendly" vendor off the shelf, you may even have coreboot or something on board.
But with this Apple situation, there is no guarantee the next generation of hardware won't lock you out, or they might push out an OTA firmware update that locks you out. It's like porting Linux to the PlayStation or something.
You can also trade it back in to get recycled, no one should be straight up throwing computer hardware in the trash.
The only sustainable thing you can do with a bad laptop is fix it or part it out. But for all my years taking apart fragile electronics, is it really worth the effort to take apart a device that was intentionally designed to be difficult or impossible for the owner to repair?
I also have a really old iPad 2. It works perfectly hardware-wise: screen, wifi, etc. But it is effectively a paperweight due to software.
I am logged into it with my old Apple account, which was only ever used for this tablet. I have the username and password but cannot log in because I don't know the security questions, so I can't reset the device or install apps. I even phoned Apple, but they said there's nothing they can do.
It pains me to just dump a perfectly good piece of hardware.
Apple does not have a reason to support any other operating system.
Apple engineers do however both officially and unofficially support Asahi, so there's that.
There's no getting around the sexiness of Apple hardware. But there's also no getting around how closed it is. You implicitly support Apple's business model when you buy their hardware, regardless of whether you run Asahi or Mac OS.
I believe the best solution to proprietary hardware is not to buy them in the first place. Buy from vendors who are more open/supportive, like Framework and Thinkpad. This helps those vendors keep supporting open source.
Not to mention that you'd be supporting a corporation that has this hostile stance towards their customers to begin with.
Meanwhile, other x86 and ARM manufacturers are making substantial improvements that are shortening Apple's lead. You're not losing much by buying a new CPU from them in 2024 or 2025, but you gain much more in return. Most importantly, the freedom to run any software you choose.
There’s still some offenders (Surface, HP, Broadcom) that introduce quirks that break sleep and some HID accessories but most of it works out of the box.
ARM has been the Wild West for a while but they’re going in the right direction with device trees et al. Apple however doesn’t have to care about the “wider” ecosystem since they left x86 for their own silicon and tighter integration from bottom up allows for some really nice end user experience.
I still think it’s much better to use the VM infrastructure and just run Linux as a guest. Mac as a platform is very end user friendly as-is unlike Windows.
https://github.com/ArmDeveloperEcosystem/systemready-guides?...
That is, I would be surprised if Linux on the M1 had close to macOS levels of battery life. My theory is that the better battery life on the M1 is due more to the tight integration between the OS and the hardware power profiles than to the power profiles themselves.
The only distinct advantage is Safari, which is heavily optimized for efficiency. But a lightweight Linux desktop, with fewer active services, can compensate for that.
Linux tends to be regarded as energy-inefficient because it ships with defaults that prioritize performance on desktops, workstations, and servers. Some simple tweaks with udev rules can make a big difference. For example, adjusting the energy policy of your SSD can reduce consumption by 0.5-7 W. Wake-on-LAN is also typically enabled and drains a lot of battery while the machine is suspended.
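For reference, the tweaks described above look roughly like this. This is only a sketch: the exact sysfs paths, interface names, and savings vary by machine, so treat every value here as an assumption to verify with a tool like powertop.

```shell
# Enable SATA link power management (the "SSD energy policy" tweak).
# med_power_with_dipm is a common power-saving policy on modern kernels.
for host in /sys/class/scsi_host/host*/link_power_management_policy; do
    echo med_power_with_dipm > "$host"
done

# Disable Wake-on-LAN so the NIC doesn't keep draining the battery in
# suspend (eth0 is a placeholder; find your interface with `ip link`).
ethtool -s eth0 wol d
```

To make these persist across reboots, the same settings can be applied from a udev rule instead of a one-off shell script.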
The fundamental hardware of Apple Silicon is very efficient, but I don't think macOS is inherently optimized any better than the others. In my experience, installing Linux on Intel and PowerPC MacBooks tended to increase their battery life quite noticeably.
Apple historically cares very little about Linux on the Mac, whereas it seems like you're talking about the non-Mac product lines. Indeed, they went out of their way, if I recall correctly, to make it possible and easier in the first place.
Also, what's not to say that they will decide to lock the bootloader just as they do on all their other devices? What does Apple practically gain by leaving it unlocked? If they're doing it as a gesture of good faith to their users, they're doing an awful job at it. Doing it so they can sell a negligible amount of machines to a niche group of hackers also doesn't make business sense.
Depending on the good will of a corporation that historically hasn't shown much of it to the hacker community is not a great idea.
Even T2 macs had no way to boot custom firmware on the T2 chip, without exploits.
Sure, they could've done way more, but evidently they'd rather not lock down the platform completely.
Worst case, you restore macOS and possibly sell at a moderate loss. That said, I'm still waiting.
However, as a Mac Studio M1 owner who has used Asahi as a daily driver for software development since the first release (originally Arch, later Fedora), I can confidently say that I couldn't care less. By running the software I want to run far faster than macOS could on the same hardware, Asahi has saved me countless hours and made me far more productive. And I'm incredibly grateful for this tangible benefit, regardless of what happens in the future.
I suspect this is primarily due to Linux being a more performance-optimized OS compared to macOS, which seems to have introduced a great deal of bloat over the years.
There are people running Linux on abandoned hardware from companies that went out of business, and that's okay.
Maybe there are other companies you'd prefer to support, but it's still only going to work if lots of other people buy stuff from them, too.
x86 has fundamental issues that I believe prevent it from ever achieving the MIPS per watt efficiency of anything from ARM. I mean... the newest M4 laptop will have a 24 hour battery life. That exceeds anything remotely possible in the same laptop form factor but with x86 by nearly an order of magnitude.
So now you're talking just ARM. Linux has been compilable on ARM for a while now, so where are the competing ARM laptops that are anywhere close to the power of Apple's version of ARM?
I do get what you're saying though (I'm definitely a Linux fan and have a Linux Framework laptop), but I wish it wasn't an x86 laptop because its battery life is crap (and that is sometimes important).
Better question: where are the incentives for them to make it? Apple is pretty much the only company with an outstanding architectural license to design ARM cores, and the best off-the-shelf ARM core designs don't even compete with 5-year-old x86 ones. If you're a company that has Apple-level capital and Apple-tier core design chops, you might as well embrace RISC-V and save yourself the ARM licensing fee. That's what Nvidia does for many of their GPU SOCs.
If SoftBank offered ARM licenses under more attractive terms, there would be genuine competition for good ARM CPUs. Given that Apple has a controlling stake in SoftBank, I wouldn't hold out hope.
Many other companies have done this to great effect in the past, but in recent years it has become more common to just license one of ARM's own stock cores, instead of designing your own from scratch.
This follows a period when companies like Qualcomm and Samsung were still trying to roll their own core designs from scratch, but ended up with designs that were slower and less power-efficient than the cheaper stock cores you could license from ARM.
However I think ARM platforms tend to be way less open source friendly than x86, at least on mobile. Maybe the laptops are better because they have to run Windows and Microsoft probably wouldn't put up with the shit vendors pull on Android. I don't know.
I wouldn't be surprised if Arm is strongarmed into playing hardball with Qualcomm by Apple itself.
Honestly, why is this such an appealing feature? Are you often away from an outlet for 20+ hours?
I use 6+ year old laptops that last 4 hours at most on a single charge, even after a battery replacement. Plugging them in every few hours is not a big inconvenience for me. If I'm traveling, I can usually find an outlet anywhere. It's not like I'm working in remote places where this could even be an issue.
Then there's the concern about fan noise and the appeal of completely silent computers. Sure, it's a bit annoying when the fan ramps up, but again, this is not something that makes my computers unbearable to use.
And finally, the performance gap is closing. Qualcomm's, AMD's and Intel's latest chips might not be "M4 killers" (or M3, for that matter), but they're certainly competitive. This will only keep improving in 2025.
And these things compound, as the other poster mentioned: 24 hours of light use means longer heavy use, which actually does matter to me. I often move around while using the MacBook because it's good to change up my (physical) perspective - and it means I can do that without the charger while hammering every core with rustc.
Once you see that better things are possible, it's very hard to go back to the comparatively terrible performance-per-watt and heat generation of equally powerful x86 laptops.
Yeah, there’s something a bit freeing about being able to go all day or more without charging. Just not needing to think about power or charging when you’re busy focusing on other things.
I’m glad other manufacturers got a bit of pressure to catch up as well. Now people come to expect laptops to just run for days at a time without charging.
With Apple, you first have to pay whatever exorbitant price they want to charge you. The Air is relatively affordable, but bump up the specs on a MBP and your eyes will start to water.
That's fine. You get what you pay for, right?
Except now you're stuck in their walled garden and have to pay the same tax for overpriced accessories, software, and anything else they decide to charge you for.
OR you go the Asahi route and live with the stress and uncertainty of depending on a small team of contributors to keep your machine running. I already experience this stress with Linux on supported hardware, and couldn't imagine what it would be like if the hardware had to be reverse engineered for this to work.
That said, I'll probably end up buying a used MacBook Air just to follow the progress of Asahi. If nothing else I'll have a more informed opinion the next time this discussion is brought up.
1) Internet cafes that removed outlets to encourage laptop people not to squat :)
2) Airports where you can sit anywhere you want instead of just near an outlet
3) Airplanes when the power doesn't work (has happened more than once in my experience)
4) Cars, trains, subways, buses
5) My boat, sometimes I like to work from it for a change of pace
6) Don't have to hunt for an outlet when my fam is at the grandparents' house
7) I can work on my deck without dragging a power cord out to the table
8) I can go to a pretty overlook and watch the sunset while working
9) Conference rooms, don't have to deal with the hassle
10) Libraries, same thing. I can sit outside or in the stacks (quieter there) instead of only in the reading room (those tables are the only ones with power, in my local library)
11) Power outs and just other situations where you lose power for a time
12) It's extra juice to power your phone off of (or anything else)
I'm certainly forgetting more examples.
macOS is a deal-breaker for me, whether I use it in a cubicle or on a boat.
So the only alternative is Asahi Linux, which has its own set of drawbacks.
Clearly we must have different priorities, which is fine. Just don't think that yours are somehow superior to mine.
So I completely get where you're coming from, because it's not a mutually-exclusive thing with me. As a software dev, I love open-source stuff!
I did just order the M4 Max MacBook Pro with maxed-out specs though, because I like to work on locally running machine-learning stuff.
I recently saw someone wondering why no one has tried building a laptop with as much build quality as an Apple one. A special version of Linux to run on such a laptop would offer more long-term commitment and maybe pull in more adoption.
Is the "this" in that sentence your previous paragraph of concern that Apple will purposefully break AsahiLinux?
Most of that lead is gone; x86 is cheaper, more open, and nearly as battery-friendly now.
It's also not just the CPU: the laptops themselves are simply phenomenal. I have never had an x86 laptop that's this much of a genuine pleasure to use (ignoring macOS, which is my least-favourite operating system). Some of the newer Windows laptops are ahead in places, but they're still worse in at least one metric.
Microsoft also has forced makers to drop the old S3 sleep modes which set x86 laptops back decades regarding sleep and power management.
I don’t notice the fan running on it much at all.
That's just false. On performance per watt, AMD, Intel, Nvidia, and Qualcomm still get smoked by comparable M-series chips.
On desktops the equation is different because CPUs and GPUs can go ham with the wattage.
Similarly, I completely do not understand the popularity of Apple's laptops in this community. Endemic mindless fanboyism.
I wonder if there will be a similar issue with the displays when Asahi gets around to supporting HDR on machines equipped with FALD mini-LED backlights (or XDR, as Apple calls it). HDR displays usually regulate their brightness to keep the panel from getting too hot, and if Apple does that in software too then Asahi will need to replicate it.
More likely to be repeating audio, whatever was last in the buffer.
They do it because it lets them drive the speakers much louder than they safely could with a simple power limiter. Apparently there are amplifier chips which can do that smart regulation internally, but Apple decided to do it in software for whatever reason. Maybe they can squeeze out more sound quality with a more complex DSP that way; I doubt it's cost-cutting, given their margins.
I get being upset at Apple for e.g. 30% take on in-app purchases. But what exactly would they be trying to do by making it complex to control their speakers?
Making it harder for alternative OSes to use their hardware without accidentally damaging it, further cementing macOS' position as the only OS to be trusted.
It's similar to their war against third-party repair shops by deliberately making it difficult to replace parts, even with genuine ones --- see my second link about the iPhone 12 camera.
This is the automotive equivalent of adding sensors and ECU firmware that detects if the engine is using the manufacturer's proprietary oil, and subtly changing the parameters to decrease power and fuel economy, increase emissions, and/or shorten the life of the engine if it isn't. Then blaming third-party repair shops when customers complain.
They make and sell more speakers than any other company on earth and are routinely praised for their quality-to-size ratio.
If Apple wanted to kill Asahi Linux, they wouldn't have had to lift a finger. It's the opposite: Apple engineers have had to do several small things to keep the door open to easily running third-party operating systems on their computers. Remember, a modern Mac has essentially identical hardware to an iPad. Apple has locked down the entire boot process and uses digital signatures all through the boot chain. They actively changed how Macs boot to make sure things like Asahi Linux can run on their computers at all.
I don't think they deserve special praise for that. But it does make it a stretch to claim their speakers were intentionally designed to hurt Linux-on-Mac. If they wanted to stop Asahi Linux, they had plenty of much easier, much more effective ways to do so.
Sometimes I wonder if it really makes sense to spend so much time to do the work Apple should have done in the first place & with no guarantee it will even work after a firmware upgrade or on the next model.
Spending the same effort on more open platforms or open hardware efforts might be wiser in the long term.
On the other hand, I adore the security and cross-application sandboxing that iOS brings. I wish Linux had more ways to sandbox applications from one another - since it’s clear that “trust all computer programs” is an insanely idiotic policy in 2024. And “implicitly trust all code” is de facto how Linux, npm, cargo, etc all run today.
My comment was in response to the idea that the speakers are maliciously designed to hurt asahi Linux, which just seems obviously wrong given how much power Apple has over their devices. They could completely kill asahi Linux with a flick of the wrist if they ever want to. There’s no way that’s the reason for their complex speaker setup.
If they killed off asahi Linux, they would face far more backlash than if they just made it harder for them (with ostensible reasons why), while keeping it strictly worse than macOS.
> Apple has been caught sabotaging third-party repair multiple times and this sort of design is totally in line with that strategy.
Using fewer specialized hardware chips to dynamically regulate speaker temperatures and implementing it in software seems like it would be more repair-friendly, not less. Unless you meant this in some more general sense that doesn't include repair, per se?
This, and the nannying nature of their OS, is why I could never have a Mac as a primary machine. I'm always slightly mind-boggled at the number of technical people who do. But then I guess many just live in an IDE, which works fine.
It doesn't require malice or conspiracy to wind up with a closed proprietary design that makes up for cheap, flawed hardware with software workarounds. It's the default if you don't hold the opposite as a key value and make a deliberate effort to achieve it.
It really puts into relief how weird it is that we get such good quality out of phone cameras. They’re almost as much generative AI images as they are actual photos.
Also I note that it took them more than a decade to fix the bug where left-right balance would drift.
I think it's just a different, integrated approach to hardware and software development. If you're doing things custom anyway, then why add an extra chip?
Because:
1. Software is more likely to fail at protection, and with worse consequences when it does (fire, damaged goods, warranty claims). Not just now, but also with future updates.
2. It eats away at the resources that are intended for the user. In other words, it makes the machine slightly slower, for no good reason from the user's perspective.
3. You can do things that are impossible in laptop OS software. A separate chip gives redundancy: even if the OS freezes, you can still make sure the speaker doesn't overheat. There is also real-time capability, etc.
4. It makes the OS and drivers much, much simpler, which is important if you want to support other OSes on the same laptop.
Advantages, for Apple, of doing it in software:
1. A software upgrade is easier and cheaper (assuming they never, ever fail).
2. It's cheaper.
3. You can keep competing OSes off your hardware, because it's too hard to write drivers for your closed-source secret sauce that includes functionality like "preventing parts from frying themselves".
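For the curious, the general shape of this kind of software protection can be shown in a few lines: estimate the voice-coil temperature from the power being delivered, and reduce the allowed gain as the estimate approaches a safety limit. This is only an illustrative sketch with invented constants, not Apple's or speakersafetyd's actual model.

```python
# Illustrative sketch of software speaker protection: a first-order thermal
# model of a voice coil plus a gain limiter. All constants are made up for
# the example and do not correspond to any real speaker.

AMBIENT_C = 25.0   # ambient temperature, deg C
LIMIT_C = 80.0     # coil temperature we refuse to approach
TAU_S = 2.0        # thermal time constant, seconds
C_PER_WATT = 15.0  # steady-state temperature rise per watt of drive power

def step_temperature(temp_c: float, power_w: float, dt_s: float) -> float:
    """Advance the estimated coil temperature by dt_s seconds."""
    steady_state = AMBIENT_C + C_PER_WATT * power_w
    return temp_c + (steady_state - temp_c) * (dt_s / TAU_S)

def allowed_gain(temp_c: float) -> float:
    """Scale output down linearly once within 10 deg C of the limit."""
    headroom_c = LIMIT_C - temp_c
    return max(0.0, min(1.0, headroom_c / 10.0))

if __name__ == "__main__":
    temp = AMBIENT_C
    for _ in range(200):            # 20 seconds of loud playback at 5 W
        temp = step_temperature(temp, 5.0, 0.1)
    print(round(temp, 1), round(allowed_gain(temp), 2))
```

The catch with the in-software approach is that a loop like this has to keep running, with correct constants, for the hardware to survive sustained loud playback; a dedicated amplifier chip would do the same job without trusting the OS.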
I know that Apple likes to supply the speakers with slightly more power (watts) than they can handle to get them loud, but I haven't seen or heard of this temperature monitoring so far. I couldn't find anything related to it either. Please share citations, if you have any.
https://github.com/AsahiLinux/speakersafetyd has a README and the code
Apple isn't the first to use advanced limiters to get the most out of loudspeakers; if I remember correctly, this trick is what allowed Devialet to make the Phantom what it is.
Which is why loudspeaker measurements should _always_ include something like this: http://0x0.st/XG1y.png
That said, I've been using Asahi for a month, and I'm ditching it. Maybe in a year or two it'll be stable, but for now it's got too many bugs and unsupported features. A lot of the problems come down to Wayland and KDE/Gnome, as you literally have to use Wayland. But there's plenty of other buggy or challenging parts that all add up to a very difficult working experience.
One of the biggest challenges I see is support for hardware and 3rd party apps. Not only do all the apps need to support this slightly odd Arm system, but so do hardware driver developers. I never realized before just how much of a Linux system works because most people had an incredibly common platform (x86_64). Even if Linux on Mac became incredibly popular, it would actually take away development focus on x86_64, and we'd see less getting done.
(This kind of problem is totally common among Linux laptops, btw; there's a ton of hardware out there and Linux bugs may exist in each one. Adding a new model doesn't add to the number of developers supporting them all. If anything, the Mac is probably benefited by the fact that it has so few models compared to the x86_64 world. But it's still only got so many devs, and 3rd party devs aren't all going to join the party overnight)
You're definitely right that having a usable system is not just about supporting first-party hardware. Linux on its own is a huge mess of different components that all somehow need to work together, and it's a miracle of engineering that it works as well as it does, even on well-supported hardware. I can't imagine how difficult it must be getting all of this to work on hardware that requires reverse engineering. It seems practically impossible to me.
But then again, I only use software that's available in the distribution or that I can compile myself. So naturally I don't deal with incompatible third-party software.
That's great. Is your experience somehow more valid, then?
Downvoting because of disagreement is asinine to begin with. Burying opinions that contribute to the discussion does nothing but perpetuate the hive mind.
For some reason a lot of apps have bugs on Wayland. For example, mousing over menus, only the first menu item shows up; move to a different item, then back to the first, and suddenly all the menu items show up. This happens persistently in several apps.
Some of the Flatpak apps also have serious performance issues and bugs that native ones don't. A big app I need to use is FreeCAD, for which there was no ARM build until the recent 1.0pre releases. The Flatpak build has some issues that the AppImage doesn't. But even the AppImage release has some weird bugs, and I wasn't able to figure out if they're FreeCAD bugs, Wayland bugs, Fedora bugs, or what.
And then there are the DisplayLink drivers + userland, which amazingly kind of work (after a bunch of tries), but then hard-crash or prevent the machine from resuming from suspend.
I also get weird giant green flashes on straight HDMI. And because the laptop resolution is fixed, when I attach a monitor, the laptop screen "invades" the monitor screen. Fullscreen doesn't work, the panels conflict, it's kind of a mess. I'm almost certain the latter is some kind of KDE/Wayland bug, but the green flashes must be the video driver.
There's a bunch of other issues I don't remember at the moment. But all of this adds up, sadly, to something I just can't make work. Really wanted it to.
On the other hand, I suspect people will start making choices for their hardware/software that maximise compatibility, as they already do for Linux x86. ("Don't buy NVIDIA if you want functioning Wayland", etc.) It'll be tough, but things will hopefully get better over time.
https://softwareengineeringdaily.com/wp-content/uploads/2024...
I don't know if they still do that but that's the first thing I think of every time I see Asahi mentioned or I think about giving it a try.
Because I haven't seen anything like that, so you seem to be attacking a strawthem.
Well, you just didn't look hard enough, this is from literally the last thread about Asahi before this one. Pretty much every single post about Asahi Linux on HN has transphobic comments.
They've already thought about these things and it's still what they chose to work on, so I can understand them blocking a contributing source of those comments.
That being said, I agree the more recent threads have more positivity than negativity so that's good.
Yet here someone makes a great effort, and most comments are negative Nancys asking why it's being done, or bringing up support issues with newer hardware revisions, directed at a 1-3 person outfit doing what everyone said would be impossible.