I hope software 3.0 is more human, is more intelligible, is more directly shapeable.
I dream of a world where people aren't intimidated to open the hood on the systems around them, where they can connect the dots to string up new systems and inter-system capabilities.
It's not clear what the market is for convivial computing. In this long era of computing growing ever more esoteric, receding further & further from humanity into ever-higher-built feudal data keeps, it feels like it cannot keep going this way forever. That someday some democracy & liberty that some open sourcerer crafts will spread & reshape the image of software, a 3.0 that a broad we can grasp & tangle with.
My sense, as a workaday programmer, has been that the fewer abstractions are put between the human and the hardware, the more tractable a system becomes. But, as nice as that usually is to work with, I wonder what, perhaps a century from now, the day of a programmer working on a Dyson sphere would be like. I can't imagine he'll be working at a high level of abstraction, risking nuking a solar system because an NPM package broke. But at the same time, he probably won't be looking at assembly. Maybe he works mostly on FPGAs?
> It's not clear what the market is for convivial computing […]
That market is the same as the market for ubiquitous computing, I think. After all, even the poorest nomads tending cattle have smartphones now. But I don't think it's going to be realized by, say, Microsoft, who can't even clearly demonstrate definitive step-function capabilities of their AI systems, but instead handwave, starry-eyed, about how they'll vacuum up all our data so that "we can create a calmer, more helpful and supportive era of technology, quite unlike anything we’ve seen before."[0] That kind of computing seems less like it serves the user and more like it works for a corporation, and to the extent that's the case, there's a tremendous incentive to make it impossible to inspect.
[0]: https://blogs.microsoft.com/blog/2024/10/01/an-ai-companion-...