It's just an intro for clickbait.
(I kid, mostly :)).
The Right Thing, as any Lisp programmer can tell you, is lambda.
- LISP: everything is a lambda.
- Tandem: everything is a database.
- QNX: everything is a message.
- IBM System/38: everything is a capability.
Here you are: https://en.wikipedia.org/wiki/Category_theory
;-)
Sure, I learned with DOS and Turbo Pascal, and it was wonderful. But if you ask my teachers, who learned with machine code and microcontrollers, they worried that computers had become too abstract and that kids these days have little chance to learn the true details.
I have found this not to be the case.
Very often, an inaccurate mental model is the ideal user state, and it's my job to afford and reinforce it.
But that's just me, and my experience. I'm sure there's a ton of folks that have found better ways.
That doesn't mean I don't want abstractions. It means I think what constitutes a good abstraction is determined as much by how true it is as by how intuitively appealing it is. An abstraction that corresponds to a naive user's expectations but that doesn't accurately reflect (the essential/relevant aspects of) what is actually happening is not an abstraction, it's a lie.
Edit: And the tragedy of it is that users are, by and large, extremely good at figuring out the reality behind the lies that well-meaning developers make their software tell. When our software breaks and misbehaves, its internal realities are surfaced and users have to navigate them, and more often than not they do so successfully. Internet forums are full of people reasoning, experimenting and fiddling their way to success with faulty, lying software. Apple Community is perhaps the purest example of a huge population of users navigating the broken terrain behind the abstractions they weren't supposed to think about, with absolutely no help whatsoever from the company that built them. We should have more respect for users. If they were as dumb as developers assume they are, almost none of the software we write would ever be successfully used.
Agreed. In my case, I write software for a lot of really non-technical users, and have found great utility in reinforcing inaccurate, but useful, user narratives.
So "respect" doesn't just mean assuming users are smart. It's also making house calls. Meet them where they live, and do a really, really good job of it, even if I think it's silly.
Many camera controls are where they are, because, at one time, we needed to have a physical connection between the actuator and the device. Nowadays, the buttons no longer need to be there, but remain so, because that is where users expect them to be.
Almost every car control is like that. Touch interfaces are falling flat, these days. Many manufacturers are going back to analog.
I am currently developing an app to allow folks to find on-line gatherings. It’s not an area that I have much experience in, personally, so I’m doing a lot of “run it up the flagpole, and see who salutes” kind of stuff. I just had to do a “start over from scratch” rewrite, a week ago, because the UX I presented was too complex for users.
The app I just released in January had several pivots, due to the UX being incompatible with our users. We worked on that for about three years, and had to abandon several blind alleys. Not all the delay was for that, but one thing I did was have a delicious feast of corvid, and bring in a professional designer, who informed me that all the TLC I had put into “my baby” was for naught.
The app is doing well, but we’re finding some things that users aren’t discovering, because our affordances aren’t working. You can tell, because they ask us to add features that are already there.
Often, I will be asked to add a specific feature. I usually have to ask the user what their goal is, and then work towards that goal, maybe in a different manner than what they suggested, but sometimes, their request opens my eyes to a mental model that I hadn't considered, and that gives me a good place to start.
I’m really, really hesitant to post examples here, as the very last thing that we need is thousands of curious geeks piling onto our apps.
That way, they will hear it from the users.
I am often written off as "a whiner," but they have to pay attention to the end-users.
A whole lot of, say, Bay Area software developer salaries/compensation provide genuine comfort.
Defeat and acceptance don't come into this, for most organizations: they face little-to-no accountability for security problems or other system defects, so... comfort for the developers.
The real problem with abstractions is when they are implemented poorly, or have side-effects, or just plain bugs. In other words, we will always be at the mercy of human-produced software.
Which is always, and is why clean, maintainable code uses minimal abstractions to accomplish the task. However, it seems the default these days has become "pile it on".
So when people run EXPLAIN on their SQL query, IMO you've broken the declarative abstraction.
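For what it's worth, that leak is easy to demonstrate with nothing but Python's stdlib sqlite3 module. A minimal sketch (the table and index names are made up for illustration):

```python
import sqlite3

# The query below states only *what* we want. Asking for the plan pulls us
# beneath the declarative abstraction into *how* the engine will execute it.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE users (id INTEGER PRIMARY KEY, name TEXT)")
conn.execute("CREATE INDEX idx_name ON users (name)")

plan = conn.execute(
    "EXPLAIN QUERY PLAN SELECT id FROM users WHERE name = ?", ("ada",)
).fetchall()

# Each plan row's last column is a human-readable step, e.g. an index search
# vs. a full table scan -- exactly the detail SQL was supposed to hide.
for row in plan:
    print(row[-1])
```

The exact wording of the plan text varies by SQLite version, which is itself part of the point: once you read it, you are programming against the implementation, not the abstraction.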
Even with the perfect abstraction, people will need to dig beneath the surface. This is because every layer of code is subject to side-effects due to its actual implementation. That's why we have, e.g., multiple sorting algorithms, and documentation for their individual behavior, speed, etc.: so developers can understand the limits and idiosyncrasies of each and pick the correct one for the job.
We may make incremental steps towards more bug-free code, but because everything is limited by the details of its actual implementation (whether real or an abstraction), there is no magic bullet that will make a huge leap forward in the realm of software development as a whole.
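A concrete stdlib example of the documented-idiosyncrasy point above: Python's sorted() is guaranteed stable (Timsort), and code is allowed to rely on exactly that below-the-surface property.

```python
# sorted() is documented to be stable: items with equal keys keep their
# original relative order. That guarantee is an implementation detail the
# docs deliberately surface so developers can pick the right tool.
records = [("alice", 2), ("bob", 1), ("carol", 2), ("dave", 1)]
by_score = sorted(records, key=lambda r: r[1])

# Equal scores preserve input order: bob before dave, alice before carol.
assert by_score == [("bob", 1), ("dave", 1), ("alice", 2), ("carol", 2)]
```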
Or, in other words, the developer wants to break the abstraction.
If your definition of "perfect abstraction" doesn't include the developer never needing to look beneath the surface, I'd say it's a pretty bad one.
The abstraction for SQL is related to how the work gets done, not what it produces. The missing piece is for the user to be able to control that optimization of how the work gets done, which is getting right down into the details of what SQL was abstracting in the first place.
There could be a middle ground where the user provides input about target performance and the engine uses that input during choice of execution plans.
Maybe an OPTIMIZATION clause for difficult or long running queries: ALLOW_EXTENDED_PLAN_SEARCH=True or MINIMIZE_EXECUTION_TIME_FACTOR=10 (1=just do simple, 10=expend more time+space up front to reduce overall time)
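To make the idea concrete, here is a purely hypothetical sketch; no engine accepts this clause (PostgreSQL's geqo_effort setting is the closest real analog I know of). It just shows how a client might surface a planning-effort budget:

```python
# Hypothetical: build a query string carrying the proposed OPTIMIZATION
# clause. Neither the clause nor these knob names exist in any real engine.
def with_optimization(query, extended_plan_search=False, time_factor=1):
    opts = []
    if extended_plan_search:
        opts.append("ALLOW_EXTENDED_PLAN_SEARCH=True")
    if time_factor > 1:  # 1 = just do simple, 10 = spend more effort up front
        opts.append(f"MINIMIZE_EXECUTION_TIME_FACTOR={time_factor}")
    if not opts:
        return query
    return f"{query} OPTIMIZATION ({', '.join(opts)})"

print(with_optimization("SELECT * FROM orders", time_factor=10))
# -> SELECT * FROM orders OPTIMIZATION (MINIMIZE_EXECUTION_TIME_FACTOR=10)
```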
It's true, it would be nice if the user didn't need to intervene, but unfortunately query optimization is a combinatorial problem where the engine has incomplete information, causing performance problems (or cost problems if you are on Snowflake), so the user is required to intervene some percentage of the time.
In agile software development, on the other hand, technical competence usually ends at the lowest tier. A scrum team has folks on it who make software, that's it. Then there are lots of scrum masters and business analysts who have probably never coded much; the first actual boss in the hierarchy has mostly secretarial and managerial work and will hardly look at code at all.
Point is, it's not just that software development is done in ticket-sized portions, which does not invite philosophical considerations about the number of abstraction layers that one builds and maintains. It's that software developers don't even have a seat at the table(*); they get babysat by scrum masters, make compromises during code review, are discouraged from thinking beyond the ticket, and are then of course not usually promoted into leadership roles to which they would bring their technical competence.
It appears therefore that any movements to bring awareness to the "software crisis" will be relegated to hobbyists, as the article states at the end: to "Handmade, Permacomputing, and various retro-computing circles".
(*) I partly blame Hollywood and their incessant humiliation of software/IT people, while creating endless leading roles for doctors and lawyers, effortlessly weaving their complicated terminologies into fascinating storylines, which is apparently not possible to do for us? Maybe the scriptwriting AIs can come up with something here soon.
The most agile pilled company I worked for just treated juniors & seniors as interchangeable cogs, except seniors should be able to clear more points per sprint. Active discouragement from thinking outside the scope of your ticket, keep your head down and mouth shut.
I think scrum is irremediable, but even then, some places only practice it pro-forma: divide the work into tickets at the beginning of the sprint, rewrite the tickets into what they did at the end, and go home happy.
welcome to the car factory
Why do developers only work on ticket-sized portions of the actual requirements? To put it succinctly: because they are simply too dumb. They cannot wrap their heads around it. They cannot grasp it.
Do I sound frustrated? I am. It is inscrutable.
Sorry.
Having worked for larger organizations (whatever FAANG calls itself these days, I can't keep track), as well as academia and independent education, I've seen both halves of the "production line" for newcomers to computing.
Something has to change in how we bring individuals into our field. I have some ideas based on my experiences, but you're not in the wrong for feeling frustrated about this. It is the state of things, and many companies are not equipped to handle it, because it's unexplored territory.
My boss doesn't care about Tech Debt. Get this ticket done, get it done quick, and move on. He figures he will be long gone before the tech debt racks up to the point where he would get in trouble for it. Hell, I'm not sure his higher-ups even realize the tech debt problem exists; in fact, if he is here for 4 years, they wouldn't realize what the cause of the tech debt was.
I understand many do not have the energy to fight the status quo and some may not have the… eloquence to do so. I have worked very hard for many years to end up where I am. If others don't, I expect them to at least accept where they remain. Because they don't do "don't care". They are effectively sabotaging projects.
They certainly aren't unintelligent. They still act pretty dumb. Again, I must apologize for my polemics.
Now, whatever. It's a job that pays well. I'm making rich people richer, but I'm not sure what I'm supposed to do differently. MBAs continue to strip-mine everything, and my country's upcoming elections are between two people we should be saying "Sure, grandpa" to while they tell us stories in their nursing home.
If that makes me dumb, I'll take the label, I guess. I try not to be, but when I'm having to explain to a dev for the 5th time to stop doing appsettings.json, I realize that my tilting at windmills isn't going to fix anything.
Also, this is a feature factory: https://www.productplan.com/glossary/feature-factory/
I think the reason is entirely different: ticket-sized portions of requirements are the only thing that one can hope to estimate in any useful fashion. Business side needs estimates, so they create pressure to split work into pieces they know how to handle.
Put another way, it's not that developers are "too dumb" to wrap their head around actual requirements. They're not allowed to. The business prefers devs to consistently crank out small improvements, keep "velocity", even though it leads to "organically" designed systems - i.e. complex, full of weird edge cases, and smelling of decay. The alternative would be to let the dev talk, design, experiment, take the project holistically - but that also means the work becomes impossible to estimate more precisely than "sometimes next year, maybe", which modern businesses can't stand.
It’s fine, really. But then please don’t try to overstep the role you assumed.
In my opinion, it is critical you do both: Know the big picture, the vision, build a technological vision based on that. And then you must work on this, in bite-sized pieces. From my experience, in all but the smallest projects, not working iteratively (“experiment”, as you call it) is pretty much a guarantee to build the wrong thing from a user/customer requirement standpoint. Not having the technological vision is also guaranteed to result in a steaming pile of tech debt.
I don’t see a problem with providing reasonably accurate long-term estimates either. Build your technological vision and you’ll know. Everybody knows and will understand that substantial requirement changes or newly discovered requirements will change the timeline.
Doctors and lawyers deal with people and everyday problems that are easy to turn into an interesting story. I don't see many contract lawyers or radiologists as protagonists; it's ER docs and criminal law.
Software development is about rigorously talking to a computer all day, either solving mundane, already-solved tasks in a new application, or problems that you can't even grasp without a technical background. I'm a developer who started programming as a teen over 20 years ago for fun, and I'm bored out of my mind with most of the work. I don't even try to talk about it to non-developers, because I know it's about as interesting as accounting; more people would get useful info out of their stories.
But there's two problems: they can't get into the weeds, and they also are subject to the perverse incentive of being rewarded for generating complexity.
Some people fight it, sure, but those who fight it are less likely to be promoted. You don't get rewarded for having less staff under you or eliminating your own role.
* I do not argue against abstractions, but against the unrestricted application of them.
* I do not advocate for a reversion to more constrained platforms as a solution.
* I do not advocate for users becoming "more technical" in a "suck it up" fashion.
The key to understanding the software crisis is the pair of curves for "mastery of a platform" and "growth/release cycles". We have, in the past 40+ years, seen these curves diverge in all but a few sectors. We did not address the crisis when these curves were still close together, but the second best time is now.
As for folks calling this clickbait, it is the first in my log, and reflects my thoughts on the situation we find ourselves in as developers. The sentiments are mirrored, in various forms, around multiple communities, some of them based in counterculture.
I do want to deliver some part of the solution to these problems, so I do intend on following up on "I'll show you how". I am a single entity, so give me time and grace.
At the same time, I do want to show that I have confidence in my ideas. Hubris and confidence must be applied in equal parts.
In a little more woo woo: abstractions are never pure, they will always carry a trace of the material conditions that produce them. To me, this is the story of computing.
And I agree entirely. Tracking the history of an abstraction will usually tell you its root, though it gets pretty muddy if that root is deep-seated!
I have recently started writing myself and paying more attention to what I read. I really liked your style and enjoyed reading it.
So please keep writing!
I think most of us realize that working at corporate is still a necessary evil as long as we need to pay for stuff. Frankly that sector can burn because they've been taking advantage of developers. Most of the money goes to people above you, performance bonuses are not the norm, etc. We shouldn't be actively trying to give them improvements for free because of this behavior. Let them follow the hobby/passion projects and learn these practices and limits on their own.
I don't think it's a necessary evil. I think most people either don't want to work to realize their own vision of what they want in the world, or want more than they need and are willing to sacrifice their soul for it.
I'm working on a series of articles at enlightenedprogramming.com to prove it out.
I'm also working on "show you how" solutions because, absent data (and sometimes even with it), I still get people who believe that the industry has never been better, that we're in some golden age and there is not a whole lot to improve on, and it just boggles my mind.
Personally, I'm anxious to see your proposal on "how".
That said, I think the reason the situation is not going to change is clearly economic. That's not to say that bad software is cheaper. But there's a strong incentive toward cheap, bad practices, because cutting corners allows one person/organization to save money now, with the (greater) costs being picked up later by the organization itself, the organization's customers, and society at large. Moreover, software isn't amenable to the standards of other sorts of engineering, and so there's no way to have contracts or regulations demanding software of a certain standard or quality.
Edit: The only imaginable solution would be a technological revolution that allowed cheaper and better software, with the same technology also not allowing even cheaper and worse software.
Lately I feel like we have built a society with expansive software infrastructure, where that software is doomed to be crappy and inhumane because our society actually couldn't afford to build this quantity of software well.
So another hypothetical fantasy solution would be a lot less software, like being careful about where we used software, using it in fewer places, so that we could collectively afford to make that careful intentional software higher quality.
Certainly a society like that would look very different, both in terms of the outcome and in terms of a society that is capable of making and executing a decision to do that.
Just "my hunch", but one I reflect on a lot these days.
But this is incredibly hypothetical. A lot of software labor today revolves around manipulating the user rather than aiding them, so where we'd go with "better" versions of this is hard to say.
Oh! I thought you were going to say "testing teams, design reviewers, trainers".
I'm not on-board with this "10-50x" claim for the amount of effort. I'd say maybe 3x the effort, given the developers have been well-trained, the methodology is sound, and the managers' focus is exclusively quality. That last item is one I've never experienced; the closest I came was on a banking project where the entire development team was subordinated to the Head of Testing.
Getting everybody (or even a minority of any sufficient size) to act in service to a single goal has been a problem for humanity ever since we first invented society.
It takes time and effort to optimize software for performance, user-friendliness, developer-friendliness and everything that ethical and mindful developers hold up so highly, but that’s not the kind of thing that the big money or academic prestige is very interested in. The goal is to get the most users possible to participate in something that doesn’t repulse them and ideally draws them in. The concerns of ethics, dignity, and utility are just consequences to settle, occasionally dealt with before the world raises a stink about it, but are hardly ever part of the M.O.
Imagine if developers could make a dignified living working to listen to people and see what they really need or want and can focus on that instead of having to come up with a better ‘feature set’ than their competitors. It’s essentially why we have hundreds of different Dr. Pepper ripoffs sold as ‘alternatives’ to Dr. Pepper (is this really a good use of resources?) instead of making the best Dr. Pepper or finding a way to tailor the base Dr. Pepper to the consumers taste, assuming enough people claim they want a different Dr. Pepper.
That’s easy to say writ large but I think if you examine specific areas you consider “too abstracted” you might find it a humbling experience. Most likely, there are very good reasons for those “over abstractions”, and the engineers working in those areas also find the abstraction situation hairy (but necessary or unreasonable to fix).
For example, a lot of software is built by providing a good abstraction on top of a widely used or accepted middle abstraction (eg Kubernetes on Linux + container runtimes + the traditional control plane:config layer:dataplane architecture). All that logic could be implemented directly in a new OS, but that introduces compatibility issues with users (you can build it, but you may have no users) unless you reimplement the bad abstractions you were trying to avoid. And that kind of solution is way, way, way harder to implement. I personally want to solve this problem because I hate Kubernetes’ design, but see it purely as a consequence of doing things the “right way” being so absurdly difficult and/or expensive that it’s just not worth it.
The laws are:
1. When a distinguished but elderly scientist states that something is possible, he is almost certainly right. When he states that something is impossible, he is very probably wrong.
2. The only way of discovering the limits of the possible is to venture a little way past them into the impossible.
3. Any sufficiently advanced technology is indistinguishable from magic.
Depending on one's background and when they entered this field, there exists a non-trivial probability that previous abstractions have become incorporated into accepted/expected practices. For example, at one point in computing history, it became expected that an operating system with a general-purpose file system would be in use. The problem I think you are describing is the difficulty one experiences when entering the field now. The prerequisites for contributing are significant, if one assumes the need to intricately understand all abstractions in use.
While this situation can easily be overwhelming, I suggest an alternate path: embrace the abstractions until deeper understanding can be had.
In fantasy worlds, wizards spend years studying arcane phenomena, refining them into somewhat easily and reliably usable spells and artifacts, which can be used by general public. (Don't believe depictions of magic in video games, they simplify all the annoying little details.)
The above paragraph is actually true about our world, we just happen to call these people scientists and engineers, not wizards.
(Edit) and his magic wasn’t remotely Vancean
Windows and Office now require 50-100GB of disk just to function? 1000X for what gain?
This is sheer insanity, but we overlook it — our modern systems have 5000-10000X as much disk, RAM, and CPU. Our home internet is literally a million times faster than early modems.
A stepping stone to the 1000000X gain that would make it possible for Windows and Office to emulate the mind of a Microsoft sales executive so it can value extract from the user optimally and autonomously. Also with memory safe code because Microsoft fired all the testers to save money. And so they can hire people who can't write in C or C++ to save money. And ship faster to save money.
It left scars in my spine, to the point that I still have to actively hold myself back from the reflex of hitting Ctrl+S halfway through this comment.
I have a habit of hand-writing the markup for the site, so it's missing a few features people expect from things like statically generated sites or hosted blog platforms.
I'm watching this thread for accessibility and feedback, and am taking the suggestions to heart!
There is so much software on so many platforms, such a fragmented landscape, that we cannot say anything general.
Some projects may be struggling and falling apart, others are doing well.
There is software that is hostile to users. But it is by design. There is cynicism and greed behind that. It's not caused by programmers not knowing what they are doing, but by their doing what they are told.
The users themselves drive this. You can write great software for them, and they are uninterested. Users demand something other than great software. Every user who wants great software is the victim of fifty others who don't.
Mass market software is pop culture now.
But that's simply the opposite of the case: the purpose of abstractions — even bad abstractions — is detail elision. Making it so that you don't have to worry about the unnecessary details and accidental complexity of a lower layer that don't bear on the task you have at hand and instead can just focus on the details and the ideas that are relevant to the task you are trying to achieve. Operating on top of all of these abstractions actually makes programming significantly easier. It makes getting things done faster and more efficient. It makes people more productive.
If someone wants to make a simple graphical game, for instance, instead of having to memorize the memory map and instruction set of a computer, and how to poke at memory to operate a floppy drive with no file system to load things, they can use the tower of abstractions that have been created on top of hardware (OS+FS+Python+pygame) to much more easily create a graphical game without having to worry about all that.
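The same point holds even within a single language's stdlib: each layer elides the details of the one below. A tiny sketch of two abstraction levels over the same read (the file name is made up):

```python
import os
import tempfile

# Write a small "save file" to a temp directory.
path = os.path.join(tempfile.mkdtemp(), "save.dat")
with open(path, "w") as f:
    f.write("player_x=42")

# High level: the file object hides descriptors, buffering, and cleanup.
with open(path) as f:
    data = f.read()

# One layer down: raw descriptors, explicit read sizes, manual close --
# the details the higher layer was built to elide.
fd = os.open(path, os.O_RDONLY)
raw = os.read(fd, 1024).decode()
os.close(fd)

assert data == raw == "player_x=42"
```

Both reads produce the same bytes; the difference is purely in how many machine-level concerns the programmer has to hold in their head.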
Yes, the machine and systems underneath the abstractions are far more complex, and so if you set out to try to completely fit all of them in your brain, it would be much more difficult than fitting the entirety of the Commodore 64 in your brain, but that greater complexity exists precisely to free the higher layers of abstraction from concerns about things like memory management and clock speeds and so on.
So it's all very well and good to want to understand completely the entire tower of abstractions that you operate atop, and that can make you a better programmer. But it's very silly to pretend that everyone must do this in order to be remotely productive, and that therefore this complexity is inherently a hindrance to productivity. It isn't. Have we chosen some bad abstractions in the past and been forced to create more abstractions to paper over those bad abstractions and make them more usable, because the original ones didn't elide the right details? Yes, absolutely we have. I think a world in which we abandoned C Machines and the paradigm where everything is opaque black box binaries that we control from a higher level shell but have no insight into, and instead iterated on what the Lisp Machines at the MIT AI Lab or D-Machines at Xerox PARC were doing, would be far better, and would allow us to achieve similar levels of ease and productivity with fewer levels of abstraction. But you're still misunderstanding how abstractions work IMO.
Also, while I really enjoy the handmade movement, I have a real bone to pick with permacomputing and other similar movements. Thanks to influence from the UNIX philosophy, they seem to forget that the purpose of computers should always be to serve humans and not just serve them in the sense of "respecting users" and being controllable by technical people, but in the sense of providing rich feature sets for interrelated tasks, and robust handling of errors and edge cases with an easy to access and understand interface, and instead worship at the feet of simplicity and smallness for their own sake, as if what's most important isn't serving human users, but exercising an obsessive-compulsive drive toward software anorexia. What people want when they use computers is a piece of software that will fulfill their needs, enable them to frictionlessly perform a general task like image editing or desktop publishing or something. That's what really brings joy to people and makes computers useful. I feel that those involved in permacomputing would prefer a world in which, instead of GIMP, we had a separate program for each GIMP tool (duplicating, of course, the selection tool in each as a result), and when you split software up into component pieces like that, you always, always, always necessarily introduce friction and bugs and fiddliness at the seams.
Maybe that offers more flexibility, but I don't really think it does. Our options are not "a thousand tiny inflexible black boxes we control from a higher layer" or "a single gigantic inflexible black box we control from a higher layer." And the Unix mindset fails to understand that if you make the individual components of a system simple and small, that just pushes all of the complexity into a meta layer, where things will never quite handle all the features right and never quite work reliably and never quite be able to achieve what it could have if you made the pieces you composed things out of more complex, because a meta-layer (like the shell) always operates at a disadvantage: the amount of information that the small tools it is assembled out of can communicate with it and each other is limited by being closed off separate things as well as by their very simplicity, and the adaptability and flexibility those small tools can present to the meta layer is also hamstrung by this drive toward simplicity, not to mention the inherent friction at the seams between everything. Yes, we need tools that are controllable programmatically and can communicate deeply with each other, to make them composable, but they don't need to be "simple."
Only if those abstractions are any good. In actual practice, many are fairly bad, and some are so terrible they not only fail to fulfill their stated goals, they are downright counterproductive.
And most importantly, this only works up to a point.
> Yes, the machine and systems underneath the abstractions are far more complex, and so if you set out to try to completely fit all of them in your brain, it would be much more difficult than fitting the entirety of the Commodore 64 in your brain, but that greater complexity exists precisely to free the higher layers of abstraction from concerns about things like memory management and clock speeds and so on.
A couple of things here. First, the internals of a CPU (to name but this one) have become much more complex than before, but we are extremely well shielded from them through its ISA (instruction set architecture). Some micro-architectural details leak through (most notably the cache hierarchy), but overall, the complexity exposed to programmers is orders of magnitude lower than the actual complexity of the hardware. It's still much more complex than the programming manual of a Commodore 64, but it is not as unmanageable as one might think.
Second, the reason for that extra complexity is not to free our minds from mundane concerns, but to make software run faster. A good example is SIMD: one does not simply auto-vectorise, so to take advantage of those and make your software faster, there's no escaping assembly (or compiler intrinsics).
Third, a lot of the actual hardware complexity we do have to suffer, is magnified by non-technical concerns such as the lack of a fucking manual. Instead we have drivers for the most popular OSes. Those drivers are a band-aid over the absence of standard hardware interfaces and proper manuals. Personally, I'm very tempted to regulate this as follows: hardware companies are forbidden to ship software. That way they'd be forced to make it usable, and properly document it. (That's the intent anyway, I'm not clear on the actual effects.)
> I think a world in which we abandoned C Machines and the paradigm where everything is opaque black box binaries that we control from a higher level shell but have no insight into, and instead iterated on what the Lisp Machines at the MIT AI Lab or D-Machines at Xerox PARC were doing, would be far better, and would allow us to achieve similar levels of ease and productivity with fewer levels of abstraction.
I used to believe in this idea that current machines are optimised for C compilers and such. Initially we could say they were. RISC came about explicitly with the idea of running compiled programs faster, though possibly at the expense of hand-written assembly (because one would need more instructions).
Over time it has become more complicated though. The prime example would again be SIMD. Clearly that's not optimised for C. And cryptographic instructions, and carry-less multiplications (to work in binary fields)… all kinds of specific instructions that make stuff much faster, if you're willing to not use C for a moment.
Then there are fundamental limitations such as the speed of light. That's the real reason behind cache hierarchies that favour arrays over trees and pointer chasing, not a desire to optimise specifically for low(ish)-level procedural languages.
Also, you will note that we have developed techniques to implement things like Lisp and Haskell on stock hardware fairly efficiently. It is not clear how much faster a reduction machine would be on specialised hardware, compared to a regular CPU. I suspect not close enough to the speed of a C-like language on current hardware to justify producing it.
And we are creating software today that would absolutely blow the minds of software developers working 30+ years ago. Mostly because we have amazing abstractions that make it possible to stand on the shoulders of giants.
All that is happening is that as we get better at writing software, we and our users demand more of it. So we will always be at the bleeding edge of not being able to deliver. But we still do. Every day.
It's unconstrained side effects and dependencies, resulting in an increase in complexity, that seem to cause the major issues and have to be managed.
The real problem, of course, is the human capacity to comprehend (or not) the entirety of the system or subsystem by modeling it correctly in the brain.
I started my career in the era of 8-bit microcomputers. Yes, it was great to know the entire stack from the processor up through the GUI. But I would never want to go back to those days. Development was too slow and too limited.
We are in a golden era of software development.
> It's a nice coincidence when they do.
> It's catastrophic when they don't.
Well, generally, that’s not my experience. Most software out there is not critical. Many a bloated, crappy webapp might end up doing what the user expects, badly, while sucking up an absurdly large amount of resources all day long, with erratic bugs showing up here and there. Yes, all true.
But this is not as critical as the software that handles your pacemaker or your space rocket.
Most software can afford to be crap because most projects are related to human whims, for which a lack of quality is at worst a bit of frustration, not devastation or a death sentence. What's more, I'd guess most software developers out there are neither working for Silicon-Valley-like money, nor paying their bills with the kind of software project they'd love to build out of passion. So most software that hits the market is produced for exogenous, miserable remuneration. Who would expect anything other than crap to be the output of such a process?
I really don't like missing stairs.
We're very removed from the usage of our software, and experience it in short-form (hopefully) actionable signals that we use to inform our development process. We don't get to appreciate the real pain in this "death by a thousand cuts" unless we can somehow switch bodies with a new user.
I see programming as a trade, however, and we do have the power to govern the quality of our software. There are, however, incentives, financial or not, that can get us to look the other way.
However, I think it's clear why it's this way. The business team wants cheaper developers, and the hope is that with enough abstractions they can turn development over to AI, or to monkeys with typewriters trying to produce the great works of Shakespeare. I remember reading about the Log4J exploit and wondering which Log4J developer thought that "feature" was a great idea to add in the first place. Probably someone trying to prevent the monkey from destroying the typewriter.
However, it's an excellent article and I await the next installment.
Projects running over-budget
Projects running over-time
Software was very inefficient
Software was of low quality
Software often did not meet requirements
Projects were unmanageable and code difficult to maintain
Software was never delivered
Now take the word "software" out, and how many human endeavours have one or all of these things? And then, how much software is actually pretty great? We tend to only see the failures and the flaws, while success is just a baseline that we completely ignore even as it gets continuously better. When we press the power button on our computer and it gets to a desktop, our computer has already run through hundreds of abstractions. Just at that desktop, it is already the most complicated machine we have or will interact with all day. This happens billions of times a day, all across the world, and mostly flawlessly. And that's just one tiny example.
No such limits exist on software beyond maybe performance and memory constraints. But both of those are in abundance, so we can and do patch over crap endlessly. Every one of us has had that moment where we think or say, "how does this even work?" But until the user hits the wrong edge case they have no idea how rickety the underlying code is.
> This happens billions of times a day, all across the world, and mostly flawlessly.
No, I'd argue it's much more common for there to be flaws. They're just not obvious. They're random crap like my phone continuing to vibrate after I've answered the call until I get another call or text. Or options disappearing from a web app because something didn't load 100% correctly. Or the joys of flaky Bluetooth pairing. The list is endless.
"Have you tried turning it off and on again" is evidence of this. It's normal for software systems to get into such inscrutably subtle bad states that the only fix is to wipe everything and reload from scratch.
This is kind of what I'm talking about. There is massive complexity within your device that lets you and billions of people seamlessly make calls from anywhere in the world to anywhere in the world, with devices made by all different companies, using infrastructure made by all different companies, and it all works so incredibly well that we mostly take it entirely for granted.
But yes, sometimes the phone doesn't stop vibrating.
It's one thing to appreciate the natural world around us and how it all seems to work together flawlessly to provide the reality we all experience together—but that's because it's natural, not artificial, like the world of software we have created. When things work less than perfectly in software, there are human causes behind it, which, once identified, could be resolved in order to improve everyone's lives. But instead, most people share your "good enough—it's a miracle it all works!" mentality, which causes acceptance of any and all software issues, which leads to further issues down the line as we say "good enough" and build yet another layer atop it.
It's the stuff at the edges that appears to be less reliable, but that's mostly because it's new. It doesn't really feel like that is getting worse, though. We are constantly interacting with more software than ever before, but it's not like everything is broken. The fact that you can reliably make a phone call is far more significant than the fact that the vibration doesn't stop. Both are built on ever-increasing stacks of abstractions.
The difference in that example isn't some emergent complexity -- it's just that one is more important than the other. There are lots of analogies in the physical world where less-important things are crappier than more important things. We don't consider that some crisis of abstraction.
One thing good programmers do well is choosing abstractions that are tested and validated, not popular or hyped.
The stack has not changed in decades. E.g. it's still TCP, memory allocations, and perhaps SQL. Whether you want to learn those is up to you. Learn them and you will know your way around for decades to come.
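To illustrate how stable that part of the stack is: the Berkeley sockets API below is essentially unchanged since the 1980s. A minimal one-shot TCP echo over loopback in Python (a sketch; single connection, no error handling):

```python
import socket
import threading

def echo_once(server_sock: socket.socket) -> None:
    conn, _addr = server_sock.accept()
    with conn:
        conn.sendall(conn.recv(1024))   # echo back whatever arrives

server = socket.socket(socket.AF_INET, socket.SOCK_STREAM)
server.bind(("127.0.0.1", 0))           # port 0: let the OS pick a free port
server.listen(1)
port = server.getsockname()[1]

t = threading.Thread(target=echo_once, args=(server,))
t.start()

with socket.create_connection(("127.0.0.1", port)) as client:
    client.sendall(b"still TCP")
    reply = client.recv(1024)

t.join()
server.close()
print(reply)  # b'still TCP'
```

The same socket/bind/listen/accept dance works in C, Go, or Rust; only the surface syntax changes.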
I failed to communicate clearly. Yes, I agree, the scope of human achievement is amazing, software included. However, the issues with software go far deeper than just the trivial example I gave of the phone. It's pervasive and pernicious. I assume most software developers understand this as lived experience, but I'll elaborate more.
Almost every single person I know who is not an IT professional of some sort regularly comes to me, my son, and others who are IT professionals for help with things that are broken and non-functional to the point that they cannot use it without assistance. And that's just the show-stoppers. There is tons of crappy software that they've just found workarounds for, and they've gotten so used to having to find workarounds that they've just stopped complaining. Not because it's working correctly, but they would rather be happy and accept it than get constantly worked up about something over which they have no control. This is not entirely unique to software, but there isn't really any other mass technology that is as bad as software.
Software is simultaneously miraculous and horrible. Because of computing we have capabilities that would be seen as science fiction or magic not too long ago. But because of the things called out in this article, these magical tools and technologies are plagued by endless amounts of things that just don't work right, break randomly, and fail in unexpected ways.
With physical systems and hard sciences we identify the physical constraints, model the system, and then engineer solutions around those models, refining our solutions as we refine our models. With software we make up the models, rarely fully document them, and slap things together as we go. Some companies do better than others, and some domains take quality and reliability seriously, but in my experience even that correlates to distance from physical constraints. In general the portions of our industry that are closer to real-world constraints (chip manufacturing, control interfaces, etc.) have historically been better about delivering quality software.
If I come across as frustrated it's because I am. I love building software, and I used to love using software, but I am incredibly frustrated by the gap between what we've built and what is possible.
I could keep going, but I'll end this comment with one other observation. I'm not an Apple fanboy by any means, but Apple used to be the exception to this general rule of "software just kind of generally sucks." Apple used to be "it just works." And that was because Steve Jobs was the forcing function on quality. It's possible.
Again, this isn't unique to software. Excellence in any domain requires someone to be the standard bearer. At a restaurant it may be the head chef, refusing to let anything out of the kitchen that doesn't meet their high bar, for example. In every domain there exist various pressures that push quality down (time, cost, etc.), but software lacks the natural floor established by physical constraints, and is uniquely riddled with problems, IMHO.
> But because of the things called out in this article, these magical tools and technologies are plagued by endless amounts of things that just don't work right, break randomly, and fail in unexpected ways.
There is way more software out there than one can even imagine, responsible for literally every aspect of human society. There is an insatiable need for more software, and there simply aren't enough software developers in the world to do it all. What you get is triage -- things left half-baked for lack of effort. There isn't some problem specific to software with regards to physical constraints. Cheap stuff made quickly breaks, whether it's software or power tools or children's toys. It's that simple.
> And that was because Steve Jobs was the forcing function on quality.
Exactly. What does that have to do with abstractions or the lack of physical constraints?
> What does that have to do with abstractions or the lack of physical constraints?
Someone like Steve Jobs becomes a forcing function to maintain a high bar of quality. He's the exception that proves the rule. In the general case you can't rely on someone like Steve Jobs, but in other domains you still have a higher minimal quality bar driven by physical constraints.
I concede that in every domain it is true that "cheap stuff made quickly breaks." My argument is that even then there is a minimal bar that is set by physical constraints. If the bridge doesn't stand it doesn't get built. If you cut too many corners or rush too much it just doesn't get built, or has to be rebuilt. This leads to overages and delay, but at the end of the process the bridge meets the physical specs or it falls over. I think the lack of any such physical constraints means that there is no such minimal bar for software and the floor of quality is therefore much lower, even in software by large, successful, and well-funded, "marquee" companies.
All the same factors are at play: poor planning, insufficient resources, bad management, inexperienced workers, insufficient funding, lack of testing.
Plenty of construction projects have corners cut. Just like with software, it is someone's job to patch it later.
So it’s not so much that we don’t have people pushing for quality, we do. We just can’t agree on what quality even looks like.
Completely agree with this. The number of good+reasonable solutions is almost infinite, and the number of bad solutions is also almost infinite.
What makes it even worse is that we really don't have a good method of communicating the design+structure of our models to others (tech and non-tech). As the system gets more complex the issue gets worse.
We carry so much info in our heads about system models, and that info is painstakingly acquired through inefficient communication methods and trial and error and thoughtful analysis.
It would be amazing if we could create tools that allow us to describe the essence of the model and make it directly available to our brains so we could all collectively reason about it and manipulate it collaboratively.
Fuck that, I need a job.
But I don't think that's the real motivation for articles like these: the real motivation is that things just seem out of hand.
But I think as an experienced programmer you have to balance that feeling of overwhelm with what needs to be done. It's taken me a while to arrive at this conclusion, but I believe it's important. Things will never be completely figured out and you have to be okay with it.
Over the years I've started to tune out articles that say everything is doomed or articles that propose the silver bullet to fix everything. Things are not terrible -- they are, in fact, always getting better. It's never been a better or easier time to be a software developer. But, because of that, the problems we have to solve are harder. The demands are greater. I think this appears to many like we're stuck in place because we always feel like we're behind the 8 ball. The appetite for software is insatiable.
You want a binary answer as in "it never happens with X"? Because for most kinds of project, we say it didn't go well when 1 or 2 of those happen, while we declare a victory in software if we manage to avoid 2 of them.
> And then how much software is actually pretty great?
And now I'm curious about your bar for "pretty great" software. This is obviously subjective, but most people wouldn't look at any practical software and call it great.
This is what I think is insane. We interact with probably thousands of pieces of software every day (and I'm probably off by an order of magnitude) and none of it is great? What is the bar here?
It's funny that something that was absolutely amazing yesterday is no longer even so much as great today. I can instantly connect to a remote server from my phone, type some text, and generate a pretty good poem about my cat and all of that is already "meh".
I do not consider a text generator to be the pinnacle of "great software", regardless of how well the code was written. I don't think "meh" is the wrong word.
Literally five minutes after this absolutely crazy-ass technology was released to the world, it's already the baseline.
Most of which you enumerate, if not all, can be traced back to two fundamental concepts; communication and understanding.
Communication because this is how understanding can be propagated and the lack thereof remedied.
Understanding because this is fundamental to the success of any endeavour. Without it one or more of the symptoms above will manifest. With it, they may still, but at least there can exist a pathway to success.
In my experience, the majority of problems in the software engineering industry are people problems. Not technical, nor technology, nor process problems.
This is why communication and understanding are vital to success.
Except for your brain manipulating, and taking signals from, and reasoning about, the computer. ;)
In the case of Scrum, Scrum is implemented because it gives managers and stakeholders some semblance of observability and control over the software development process. Granted, Scrum shops are merely cosplaying at implementing a rigorous, controlled methodology, but if you take that semblance away you will have angry, frustrated decision makers who resent the software teams for being opaque and unwilling to commit to budgets or schedules.
In the case of abstractions... maybe there are a bunch of junior devs who just can't learn the complexities of SQL and need an ORM layer in order to reckon with the database. (I worked at a software shop where the most senior dev was like this; part of the reason I was brought on board was because I knew how to munge the PL/SQL scripts that formed part of their ETL process.) Maybe one part needs to be completely decoupled from another part in order to allow, for example, easy swapping of storage backends, or the creation of mocks for testing. Maybe some architect or staff SE is just empire-building, but at any rate they're way above your pay grade so the thing will be built their way, with their favorite toolkit of abstractions and advocating for the use of fewer abstractions will get you nowhere.
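A minimal sketch of that decoupling argument, in Python with names invented for illustration: application code depends only on a tiny "users repository" interface, so the backing store can be swapped between real SQL and a plain-dict test double:

```python
import sqlite3

class SqliteUsers:
    """SQL-backed implementation of the (hypothetical) users repository."""
    def __init__(self, conn: sqlite3.Connection):
        self.conn = conn
        conn.execute(
            "CREATE TABLE IF NOT EXISTS users (id INTEGER PRIMARY KEY, name TEXT)")

    def add(self, name: str) -> int:
        return self.conn.execute(
            "INSERT INTO users (name) VALUES (?)", (name,)).lastrowid

    def get(self, user_id: int):
        row = self.conn.execute(
            "SELECT name FROM users WHERE id = ?", (user_id,)).fetchone()
        return row[0] if row else None

class FakeUsers:
    """In-memory test double exposing the same two methods."""
    def __init__(self):
        self.rows, self.next_id = {}, 1

    def add(self, name: str) -> int:
        self.rows[self.next_id] = name
        self.next_id += 1
        return self.next_id - 1

    def get(self, user_id: int):
        return self.rows.get(user_id)

def greet(repo, user_id: int) -> str:
    """Application code that neither knows nor cares which backend it got."""
    name = repo.get(user_id)
    return f"hello, {name}" if name else "who?"
```

Whether that extra layer is worth it depends entirely on the team and the system, which is rather the point being made above.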
If you're working on a team of John Carmacks, great! You will be able to design and build a refined jewel of a software system that contains just the right amount of abstraction to enable your team of Carmacks to maintain and extend it easily while still running smoothly on a 386. Unfortunately, most software teams are not like that, most customers are not like that, so the systems they build will develop the abstractions needed to adjust to the peculiarities of that team.
For some reason, my eyes cannot cope with white text on black backgrounds, so I usually just go to reader mode in cases like this article. But here, this option does not exist, for some reason?
I have a feeling it's just me not annotating the markup correctly, which I am in the process of fixing and porting around!
With this, it's not possible for the browser to know what the content of the article is.
Someone wanting the world's attention because of a crisis really should not add extra friction.
GUIs are where this all falls apart as they are literal islands that don’t communicate with each other in a composable manner.
I’ve been experimenting with some GUI-meets-shell-pipeline ideas with a tool I’ve been working on call guish.
https://github.com/williamcotton/guish
I’m curious to know if anyone knows of any similar tools or approaches to composable GUIs!
The parser is off-the-shelf and seems robust. The AST-to-bash aspect is a bit messy. Wrapping arguments with spaces in them with either " or ' is somewhat of an unsolved problem. Like, it's hard to tell if someone wants to embed a bash variable and thus wants " or if they are using something like awk that uses $var itself and thus need '. We'll see how it goes!
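For comparison, Python's `shlex.quote` sidesteps the ambiguity by always choosing single quotes when anything looks unsafe -- which is safe, but forecloses exactly the embedded-variable case described above:

```python
import shlex

# shlex.quote single-quotes anything containing shell-special characters.
print(shlex.quote("hello world"))  # 'hello world'
print(shlex.quote("$HOME"))        # '$HOME'  -- the variable will NOT expand
print(shlex.quote("plain"))        # plain    -- safe strings pass through
```

That's the generator's dilemma in miniature: without knowing the user's intent, the only safe automatic choice is the one that disables expansion.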
But the larger problem is not GUIs. GUIs are a problem, but they are necessarily at the top of the abstraction stack, so the problem doesn't compose any further. (Which, interestingly, means they are so much of a problem that they aren't one anymore.)
The elephant in the room nowadays are distributed systems.
There are two main dynamics that I noticed:
- Each distributed component has a high minimal cost to create and maintain, which leads people to consolidate them into fewer, more complex ones;
- You'll invariably need to interact with some component that doesn't share your values. So you need security, contracts (the legal kind), consumer protection, etc... and that creeps into your abstraction.
Outside of that, there's one thing that isn't inherent but does happen every single time: information erasure. Distributed interfaces are way less detailed than the ones in monolithic systems.
I agree -- a big reason I like Unix, and find it productive, is that it's "shallow and composable".
But GUIs have long been a weakness. I guess because there are many "global" concerns -- whether a UI is nice isn't a "modular" property.
Personally I would like more of a UI, but still retain automation
That is, the property of a shell where I can just save what I typed in a file, and then run it later
And modify it and run it again
Or I can copy and paste a command, send it to my friend in an e-mail, etc.
---
As context, I've been working on a from-scratch shell for many years, and it has a "headless mode" meant for GUIs. There are real demos that other people have written, but nobody's working on it at the moment.
Screenshots:
https://www.oilshell.org/blog/2023/12/screencasts.html#headl...
https://www.oilshell.org/blog/tags.html?tag=headless#headles...
More links here - https://github.com/oilshell/oil/wiki/Interactive-Shell - some interesting (inactive) projects like Xiki
If you find the need for a compatible shell that's divorced from the terminal, or a NEW shell that is, feel free to let me know (by e-mail or https://oilshell.zulipchat.com )
Basically we need people to test out the headless protocol and tell us how it can be improved. I think we should make a shell GUI that HAS a terminal, but IS NOT a terminal -- which looks like it has some relation to what you're building
Right now we're working mostly on the new YSH language, but I'd like to revive the GUI work too ... I'm not an experienced UI programmer, so it would be nice to have some different viewpoints :)
---
Also, I'm a big fan of ggplot, so I'm glad you included it there.
Actually ggplot is exactly where I miss having graphics in shell!
Is there a place y’all hang out or a way you communicate? I foresee needing some guidance in places.
The key idea is that OSH and YSH can run "headless" with the "FANOS" protocol (file descriptors and netstrings over a Unix domain socket), which has tiny implementations in C and Python
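For readers unfamiliar with netstrings, the framing FANOS builds on is just `<length>:<payload>,`. A minimal codec in Python (a sketch of the framing only; FANOS itself layers commands and FD passing on top):

```python
def ns_encode(payload: bytes) -> bytes:
    """Frame a payload as a netstring: b'<len>:<payload>,'."""
    return b"%d:%s," % (len(payload), payload)

def ns_decode(buf: bytes):
    """Parse one netstring; return (payload, remaining bytes)."""
    length, sep, rest = buf.partition(b":")
    if not sep:
        raise ValueError("missing ':' delimiter")
    n = int(length)
    if rest[n:n + 1] != b",":
        raise ValueError("missing trailing ','")
    return rest[:n], rest[n + 1:]

print(ns_encode(b"ECHO hello"))  # b'10:ECHO hello,'
```

The length prefix is what makes the framing unambiguous even when the payload contains newlines or arbitrary binary data.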
The reason we use FD passing with Unix sockets is so that anything that uses the terminal still works:
ls --color
pip install
cargo build
All these programs will output ANSI color if `isatty(stdout)` is true (roughly speaking).

Most people didn't quite get that -- they thought we should use "nREPL" or something -- but my opinion is that you really need a specific protocol to communicate with a shell, because a shell spawns other programs you still want to work.
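The isatty() convention mentioned above can be sketched in a few lines of Python: emit ANSI escapes only when writing to an interactive terminal.

```python
import sys

def maybe_green(text: str, stream) -> str:
    """Wrap text in ANSI green only if the stream is a terminal."""
    if stream.isatty():
        return f"\x1b[32m{text}\x1b[0m"
    return text

print(maybe_green("ok", sys.stdout))  # colored in a terminal, plain in a pipe
```

This is why FD passing matters: the spawned program checks its *own* stdout, so it must receive a descriptor that genuinely is (or is not) a terminal.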
---
Here is a pure Python 3 implementation of the recv() side of FANOS in only ~100 lines
https://github.com/oilshell/oil/blob/master/client/py_fanos....
It uses the recvmsg() binding. I'm pretty sure node.js has that too, i.e. it has Unix domain socket support
---
Anyway, thanks for taking a look - we're interested in feedback!
That is, the shell itself is a limitation
OSH is the most bash-compatible shell by a mile, AND we can easily add new hooks and so forth.
I think most stuff can already be done with FANOS, but I think completion is one area that needs some testing / feedback. For example, I imagine that when constructing a pipeline, you might want to complete exactly what's available -- like shell functions that the user has defined?
I believe the emacs architecture is the answer. emacs is not just a CLI, it's not just a GUI, it's a wonderful (but archaic) matchup of both, where they integrate seamlessly.
I am currently working on some ideas on this as well, my idea even looks superficially similar to yours, but I would say you should look into how this works in emacs as I feel like your approach is not really very "composable" (though I haven't looked very deep, sorry).
EDIT: perhaps this may also inspire you, in case you don't know it: https://gtoolkit.com/ (Smalltalk environment where literally everything is programmable, just like in emacs but kind of the other way around: the GUI is language itself, not a result of its commands).
It is as composable as a shell pipeline because all it does is make shell pipelines!
Sure, layers of abstraction are leaky and have their issues, but I don’t want to write a hipster language in a hipster editor. If you enjoy that, great.
Also, it’s easy to look at the past with rose-tinted glasses. Modern software is a bloated mess but still a million times more productive.
This is wishful thinking. Try running Windows 2000 on era-appropriate consumer hardware and tell me just how much better off we truly are now with bloated and unresponsive web apps.
> Also, it’s easy to look at the past with rose-tinted glasses. Modern software is a bloated mess but still a million times more productive.
What metrics are you using to quantify "a million times more productive"?
However, we have a project management crisis, which is not only limited to software, where people in charge of planning are distanced from people in charge of the delivery. And we don't seem to be able to bridge the gap. Agile, Scrum, whatever are indicators of this gap where "gurus" take all of us as fools and we are not able to produce anything better ourselves.
Commoditization of software development is also contributing to this mess, because people of all skill levels can find a way to partake in this activity with results of varying success. This is not good or bad, but just the nature of ease of entry into the field. Not much different than the food business, where we have both Michelin-star restaurants and the McDonald's of the world, both of which have consumers. But we don't say we have a restaurant crisis.
About that toaster point: this is the actual, real, concrete example of the software crisis. Your toaster runs bad code. Bad code that is not designed with resiliency or security in mind, but is connected to the internet. It reduces your toaster's lifetime. In the past, you might have had a toaster that lasted 10 years. Now, because of the bad software, and maybe even a mandated WiFi or Bluetooth connection, your toaster is garbage after 2 years because the vendor stopped pushing updates. Or maybe there were no updates at all. But because we might not always see it directly, or because of today's hyper-consumption and never-ending buying of new products, the crisis is not always that visible.
We might even be okay that the toaster stopped working after 2 years, and not pay attention to, or know, why that happened. But maybe it was part of the Mirai botnet https://www.cloudflare.com/learning/ddos/glossary/mirai-botn...
The toaster likely wasn't part of that, as toasters use simpler chips, but who knows.
I agree wholeheartedly that an internet connected toaster is a very stupid idea, but our ability to build such a system shows, if anything, a triumph of software (and hardware), not a crisis. What is a crisis here is societal one; the fact that there is a demand, however artificially created, for such things. Software abstractions or layers of abstractions to be able to build such devices have nothing to do with this kind of crisis.
We have managed to build systems with software that "triumph" for a very long time. Look no further than Voyager 1. A technical masterpiece which still works.
But fundamentally, about the toaster: we see these things in toasters because there are so many abstractions and possibilities. It is easy to embed software in the toaster without needing a legion of software developers from different fields.
And it adds up, and is the main source of "unintended" use cases for toasters. Many of these cases don't depend on the developer; developers don't even know about them, or weren't thinking about them, because they were abstracted away. The overall system of a toaster can be so complex these days that no single person can understand it completely on their own.
But yes, the consumer part -- that it's okay if the toaster breaks -- is indeed a social problem. But the likelihood of toasters breaking, or doing something other than what they're supposed to, could originate in, or be increased by, the software crisis.
A real estate developer would probably promise you a ten-story building made of straw if you seemed willing to pay, but a civil engineer will never build it, because, mostly, they listen to physics and to the ethical rules of the trade.
For some reason software engineers bend over and say yes sir when faced with a similar situation.
Perhaps looking at Boeing the same is becoming true of other engineering specialties as well.
It is worth remembering that software development is a very broad field ranging from changing size and color of buttons on a webpage (which is perfectly fine) to sending rockets to space.
I don't know how true it is these days but people are/were able to find software jobs after a couple months of code camps.
Doesn't that just support the author's claim that there's too much software?
FTR, my Dualit toaster doesn't run software.
I have no opinion on how much is too much software. But perhaps if we are employing people to write software for toasters, then we have too many programmers.
https://www.joelonsoftware.com/2002/11/11/the-law-of-leaky-a...
https://en.wikipedia.org/wiki/Tower_of_Babel
https://en.wikipedia.org/wiki/Hierarchy
https://en.wikipedia.org/wiki/Abstraction
https://en.wikipedia.org/wiki/Abstraction_(computer_science)
(Compare and contrast!)
Rich Hickey said something along the lines of: "A novice juggler may be able to juggle two or three balls, but the best juggler in the world can only juggle maybe nine balls. There isn't an order of magnitude difference in human ability; we hit a ceiling quickly." If we are to surpass those limits, we have no choice but to abstract.
Of course there may be bad abstractions or too many abstractions in a given case, which is what I think the author is mad at. But that's an important distinction
This part is also plainly false:
> It is no longer easy to build software, and nothing comes with a manual.
Making software has never been easier or better-documented
We don’t necessarily need indirection to understand a whole, complex thing.
In order to deal with complexity, we need to disentangle it so we can understand each part in isolation. Hiding complexity with indirection puts distance between us and the thing we have to reason about.
Abstraction is good for users (devs or not), so they don’t have to care about the details, but it doesn’t make it easier for us to build it.
It is very easy if you know the right tools for the right job, but information about these is suppressed, so you never hear about them.
What the vast majority of people think the tech tooling landscape looks like and what it actually looks like are very different. The tools we know about are mostly horrible. They try to be silver bullets but they're really not good for anything... Yet they're the most popular tools. I think it's about capital influence, as hinted by the article.
For example, with the tool I'm using now, I made a video showing how to build a relatively complex marketplace app (5+ pages) in 3 hours with login, access control, schema validation, complex filtered views from scratch using only a browser and without downloading any software (serverless). The whole app is less than 700 lines of HTML markup and 12 lines of JavaScript code. The video got like 10 views.
https://news.ycombinator.com/item?id=40326185 https://news.ycombinator.com/item?id=40530828
Correction: with the tool you created and are not-so-subtly trying to market right now. Complete with an $18/mo subscription fee, no users and (of course) a cryptocurrency that's fueled by promises and hype.
No, despite the conspiratorial twist, modern tooling is more flexible and easier to use than ever before. It has flaws, but those flaws are nothing compared to what developers had to go through a couple of decades ago. There's no big collusion to cover up your tool.
So, I started watching your video and the first question I had was: what are Codespaces? How is it secured? Where does the final app run? Can I run it on my hardware? Can I run it without cloud access? Which cloud? Will it be around next week, next month, next year... ten years from now? However don't pay too much attention to a sample of one. ;-)
I think the situation with no/low code can be summed up by analogy to desktop publishing. There is no CI/CD pipeline for DTP: you press print; it is a fully integrated environment... unlike literally all of the "continuous integration" environments, which try so hard they can't stop shouting about it because it's eating them alive. Somewhere there might be a tool for generating color seps, setting gutters, importing fonts, or creating PostScript header libraries or LaTeX; but you'll never use it or see it. Somewhere, somebody is still doing multi-plate prints with pigments exhibiting greater variety than CMYK that are only vibrant in natural light; most people don't see the point because all they get is a crappy snapshot on their phone. It's not just a creator problem: consumers don't know what's possible and don't possess devices that reveal the full scope of what's possible. Consumer devices actively compromise that scope for myriad walled-garden reasons, as well as run-of-the-mill negligence and ignorance.
I got a Cadillac as a rental a few years ago, and I'll never buy a modern Cadillac and avoid driving one if possible. I couldn't control the windshield wipers, and the console display flashed the most attention-grabbing WARNING several times that I was not looking at the road, presumably because it couldn't suss that I was wearing glasses. I didn't actually look at the console display since it happened at high speeds in heavy traffic on an unfamiliar road (I had a passenger, and I asked them "WTF is that flashing screen saying??!?"); good job, Cadillac!
It makes it impossible to introduce back end bugs or security flaws (assuming that access control rules are defined correctly; which is a lot easier to verify via a simple UI than reading thousands of lines of back end code).
Look at this app I built with it on the side: https://www.insnare.net/
It's only ~1300 lines of custom code in total. Most of it is HTML. Throughout the entire process from start to finish, I never encountered a single back end bug. Deployment essentially never fails; the UI doesn't let you do anything invalid and it helps avoid inefficient queries. Yet look at the complexity of what kinds of queries and filters I can run against the data and all the different views. It has almost 1 million records... Indexes are not even fully optimized. Look at the data that's behind access control/associated with specific accounts (e.g. the Longlists).
Go ahead and try to bypass access controls... The WebSocket API is documented here: https://github.com/Saasufy/saasufy-components/wiki/WebSocket...
It would normally have taken months and tens of thousands of lines of code (front end and back end) to build that.
The value is there in terms of security, development efficiency, maintainability etc... The hard part is getting people to see it.
If you’re going to advocate we change, it might start with recognition of the value we have and the effort it took to realize it. The flaws can only be resolved insofar as the solutions don’t dilute the gift.
We've found better ways to organize things over the years, and reduce incidental complexity too. We continue to chip away at small problems year after year. But, there's "no silver bullet," to give us a 10x improvement.
I believe I agree with the piece that we have too many layers building up lately. I would like to see several of them squashed. :-D
Uh-huh, all hail to the coming of the layer police.
Ends - "Things can be better. I'll show you how." - @author - Maybe it would have been better to under promise and over deliver instead?
----
But that got me asking whether there might indeed be a software crisis, and yes, I think there is a crisis of sorts on the personal level. Maybe for others too. It's not structural, as the author proposes. It's that the software landscape is so vast and chaotic. There's so much going on that it's almost impossible to know where to focus. FOMO, I suppose: too much to do and not enough time.
So many clever people, so much energy, doing all kinds of amazing things. For many different reasons, some good, some not. A lot of it looks, to coin a phrase, like fractal duplication, e.g. yet another JS framework, yet another game engine, yet another bullshit SaaS, just because. It seems inherent redundancy is built into the systems.
Good times, I suppose.
We're often just hiding some mechanical details when in truth we should be searching for and codifying fundamental ontologies about the given domain.
It's hard because we can't yet be users (the thing does not yet exist), yet we can't really know what we must build without a user to inform us. We can create some imaginary notions with various creative explorations, but images can often deceive.
I do believe the tools most used for software development are fundamentally terrible at modelling these ontologies; they are really little more than telling the computer to do A, then do B, and so have never really abstracted much at all.
I can't find a definition of the title term, "Software Crisis," anywhere in the post.
Is it "...[the] growing complexity of software..."?
It's difficult to reason about something with no definition.
The software crisis, if there is one, is caused by complexity. Complexity is the single enemy of a software developer. I would say that reducing complexity is the whole purpose of the software engineering field. I have many small hobby projects where I am the sole developer, and I still struggle with complexity sometimes... I've tried many languages and programming paradigms and still haven't found one that actually "solves" complexity. I am convinced, for now, that the only solution is developer discipline and, guess... good abstractions.
Because complexity doesn't necessarily come from abstractions. In fact, it's the exact opposite: the only way we know to make things "look" simpler, so that we can make sense of them, is to abstract away the problem! Do you need to know how the network works to send an HTTP request?? No!!! You barely need to know HTTP; you just call something like "fetch url" or click a link in a browser and you're done. This is not something we do because we are stuck on some local maximum. Whatever you do to "hide" complexity that is not crucial to solving a certain problem will be called an "abstraction" of the problem, or a "model" if you will. They always have downsides, of course, but those are vastly offset by the benefits. I can write "fetch url" and be done, but if something goes wrong, I may need to actually have a basic understanding of what that's doing: is the URL syntax wrong, the domain down, the network down, the internet down, is there a lack of authorization?? You may need to dig a bit, but 99% of the time you don't: so you still get the benefit of doing in one line what is actually a really complex sequence of actions, all done behind the layers of abstraction the people who came before you created to make your life easier.
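To make that concrete, here's a minimal sketch (my illustration, not from the article or any particular tool) of how the layers behind a one-line "fetch url" surface as distinct, reasonable errors when something goes wrong, using only Python's standard library:

```python
# A one-line "fetch" hides a complex sequence (URL parsing, DNS, TCP, TLS,
# HTTP). When it fails, each hidden layer surfaces as a distinct error you
# can reason about separately, which is what makes the abstraction honest.
import urllib.error
import urllib.request

def fetch(url: str) -> str:
    try:
        with urllib.request.urlopen(url, timeout=5) as resp:
            return resp.read().decode()
    except ValueError as e:
        # Bad URL syntax: caught before any network activity even happens.
        raise RuntimeError(f"malformed URL: {e}") from e
    except urllib.error.HTTPError as e:
        # The server answered but refused us (404, 401 lack of authorization...).
        raise RuntimeError(f"server said no: HTTP {e.code}") from e
    except urllib.error.URLError as e:
        # Domain down, DNS failure, network or internet down.
        raise RuntimeError(f"network problem: {e.reason}") from e
```

Each `except` branch corresponds to one of the questions above (URL syntax, authorization, network); 99% of the time none of them fire, and the one-liner is all you ever see.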
> Various efforts have been made to address pieces of the software crisis, but they all follow the same pattern of "abstract it away"
Of course they do. Abstracting away is the very definition of addressing complexity. I believe what the author is actually trying to say is that some of the abstractions we have come up with are not the best abstractions yet. I would agree with that because as hardware evolves, what the best abstraction is for dealing with it also should evolve, but it hardly does so. That's why we end up with a mismatch between our lower level abstractions (Assembly, C) and the modern hardware we write software for. Most of the time this doesn't matter because most of us are writing software on a much higher level, where differences between hardware are either irrelevant or so far below the layers we're operating on as to be completely out of sight (which is absolutely wonderful... having to write code specific for certain hardware is the last thing you want if all you're doing is writing a web app, as 90% of developers do), but sure, sometimes it does.
> We lament the easy access to fundamental features of a machine, like graphics and sound. It is no longer easy to build software, and nothing comes with a manual.
I have trouble taking this seriously. Is the author a developer? If so, how can you not know just how wonderful we have it these days?? We can access graphics and sound even from a web app running in a browser!! Any desktop toolkit has easy-to-use APIs for that... we even have things like game engines that will let you easily access the GPU to render extremely complex 3D visualisations... which most of the time work on most operating systems in use, without you having to worry about it.
Just a couple of decades ago, you would indeed have had to buy a manual for the specific hardware you were targeting to talk to a sound board, but those days are long gone for most of us (people in the embedded software world are the only ones still doing that sort of thing).
If you think it's hard to build software today, I can only assume you have not built anything even 10 years ago. And the thing is: it's easy because the "hard problems" are mostly abstracted away and you don't even need to know they exist! Memory allocation?? No worries, use one of a million languages that come with a GC... even if you want the most performant code, just use Rust and you still don't need to worry (but you can if you must!!! Go with Zig if you really want to know where your bytes go). Emit sound? Tell me which toolkit doesn't come with that ready out of the box?? Efficient hash table? Every language has one in its standard lib. Hot code reloading so you can quickly iterate?? Well, are you using Lisp? If so, you've had that since the early 80's; otherwise, perhaps try Smalltalk, or even Java (use jdb, which lets you "redefine" each class on the go, or the IntelliJ debugger, just rebuild while it's running, especially if you also use DCEVM, which makes the JVM more capable in this regard), or Dart/Flutter, which has that out of the box.
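As a tiny illustration of the "it's in the standard lib" point (my own example, not from the thread): an efficient hash table plus garbage-collected memory turns a once-fiddly task, word-frequency counting, into a couple of lines:

```python
# Word-frequency counting in Python. Counter is a dict subclass, i.e. a
# tuned hash table with O(1) average insert/lookup; freeing memory is the
# garbage collector's job, not yours.
from collections import Counter

words = "abstract away the problem then abstract away the abstraction abstract".split()
counts = Counter(words)

assert counts["abstract"] == 3
assert counts["away"] == 2
assert counts.most_common(1)[0] == ("abstract", 3)
```

The hand-rolled equivalent (open addressing, resize policy, hash function, manual frees) is exactly the kind of "hard problem" that has been abstracted away so thoroughly that most developers never need to know it exists.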
Almost any other problem you may come across, either your language makes it easy for you already or you can use a library for it, which is as easy to install as typing "getme a lib". Not to mention that if you don't know how to do something, ask AI; it will tell you exactly how to do it, with code samples and a full explanation, in most circumstances (if you haven't tried in the last year or so, try again, AI is getting scary good).
Now, back to what the problem actually is: how do we create abstractions that are only as complex as they need to be, and how many "layers" of abstractions are ideal? What programming model is ideal for a certain problem?? Sometimes OOP, sometimes FP, sometimes LP... but how to determine it? How to make software that any competent developer can come and start modifying with the minimum of fuss. These are the real problems we should be solving.
> Programming models, user interfaces, and foundational hardware can, and must, be shallow and composable. We must, as a profession, give agency to the users of the tools we produce.
This is the part where I almost agree with the author. I don't really see why there should be a limit on how "shallow" our layers of abstraction should be, because that seems to me to limit the complexity of the problems you can address... I believe the current limit is the human brain's ability to make sense of things, so perhaps there is a shallowness limit, but perhaps in the future we may be able to break that barrier.
Finally, about user agency, well, welcome to the FOSS movement :D that's what it is all about!!
Something as big and complex as the internet, which covers technologies from email to fiber, is held together by layered abstractions.
Also, software has gotten incredibly better since the 70s. We've built so much better tooling (and I believe tooling can help close the gap on growing complexity). When I was learning how to program, I had to double check I had every semicolon in the right place before hitting compile. I simply don't have to do that anymore. I can even get an entire API auto-completed using something like copilot on VSCode.
Nonetheless, a very thought-provoking article. Thank you for sharing!