For example, I think that strict formatting is a good thing. Since I first tried Prettier, I've been using it and similar tools everywhere, and I like it. I can't do vertical alignment anymore, and it sometimes eats empty lines, but that's a good compromise.
Maybe there should be a good compromise when it comes to "best practices" too? Like "DRY" is not always best, but it's always good enough, so extract common stuff every time, even if you feel it's not worth it.
I often run into this dilemma when writing Java with IDEA's default inspections. They highlight duplicated code, and then I have to either disable the inspection in some way or extract a chunk of code that I don't really think should be extracted, but I can just do it and move on...
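(For what it's worth, the "disable it" option doesn't have to be project-wide. A minimal sketch, with made-up class and method names, assuming the suppression id is "DuplicatedCode" - double-check against your IDEA version:)

    class ReportFormatter {

        // Silences IDEA's duplicated-code inspection for this method only,
        // instead of disabling it project-wide or extracting a helper just to appease it.
        @SuppressWarnings("DuplicatedCode")
        String formatDaily(int items, double total) {
            StringBuilder sb = new StringBuilder();
            sb.append("items=").append(items).append(", total=").append(total);
            return "DAILY: " + sb;
        }

        @SuppressWarnings("DuplicatedCode")
        String formatMonthly(int items, double total) {
            StringBuilder sb = new StringBuilder();
            sb.append("items=").append(items).append(", total=").append(total);
            return "MONTHLY: " + sb;
        }
    }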
I think we're more like Pythagoras: some useful theory about numbers, taken way too far until it became an actual religion[0], listening to the Delphic[1] Oracle[2].
[0] Tabs or spaces? Vi or emacs? Is the singularity the rapture of the nerds, with Roko's Basilisk as the devil and ${insert name here according to personal taste} as the antichrist? SOLID or move fast and break things?
As opposed to an actual, real religion: https://en.wikipedia.org/wiki/Pythagoreanism
[1] Not https://en.wikipedia.org/wiki/Delphi_(software)
but rather https://en.wikipedia.org/wiki/Delphi
[2] Not https://en.wikipedia.org/wiki/Oracle_Corporation
but rather https://en.wikipedia.org/wiki/Oracle
DRY is a perfect example, though, of something that is a good idea in moderation but, as the article says, is vulnerable to ‘inexperienced programmers who lack the ability to judge the applicability’, and that, if over-eagerly applied, leads to over-abstraction and premature abstraction, which do more harm than good.
In other words, the author knows better than you.
The author could have put forward precedent, principles, or examples. But instead he chose to make it about the people (inexperienced), not his arguments.
What you can legislate/codify are procedures, safety and outcomes. So for example building designs must be signed off by a structural engineer and an architect, both of whom are liable if the building collapses and kills someone. There are standards that materials must meet and rules for which materials can be used. Buildings must meet standards for fire protection, air flow, heat loss etc.
I’m not sure software is at the stage where we even know what to codify or what is good and what is not good.
- I think SW needs much more creativity than other industries.
- Typically SW is not mission critical (in mission critical things, it IS pretty much regulated to uncomfortable extremes)
You could regulate it to death, and would probably have some positive impact by some metric, but you would be easily overtaken by FOSS, where for sure there will be fewer restrictions.
Software is usually quick to write, update and deploy. And errors usually have pretty low impact. Sure, your website may be down for a day and people will get grumpy, but you can hack together a quick fix and have it online with the push of a button.
Compare that to, say, electrical engineering, where there's often a long time between finishing a design and getting a manufactured prototype (let alone mass production.) And a fault could mean damage to equipment (or people) and the cost of having to replace everything. So you'll find that there's a lot more work done up-front and the general way of working tends to be more cautious.
There's also the idea of best practices as a form of communication. This also helps for programmers, as code that looks and acts the way you expect is easier to follow. But code is primarily shared with other programmers. Other engineering disciplines (more) frequently need to collaborate with people from other domains. For example, a civil engineer's work could be shared with architects, government bureaucrats and construction managers, and best practices often provide a common familiar standard.
Compared to other engineering disciplines, software is a big unorganized mess. But it's also incredibly fast and cheap to make because of that.
It is just that high-velocity low-reliability web and consumer application development is a very large niche. A lot of our best-practices are about attempting to maintain high velocity (often with questionable results), more than increasing reliability.
And most of them show no care at all for the end user's experience.
Almost every piece of software I have to interact with on a daily basis is absolute garbage. It's full of frustrating bugs that make most of my day absolutely miserable whenever I'm forced to use a computer. To each of the devs it's just a small annoyance in their particular app, but to the end user it's one more annoyance in a death by a thousand cuts.
Software is just atrocious nowadays.
A retailer website is not the same as a trading platform, the same way that a house is not the same as a railway station. But we blindly try to apply the same "good practices" everywhere.
We also have another interesting phenomenon: our products can mutate over their lifetime, and our practices should follow (they often don't). An MVP can become a critical system, a small internal project can become a client-facing application, we can re-platform, rewrite, etc. That's very rare in other industries.
(https://softwareengineering.stackexchange.com/questions/2207...)
"Byte for byte equivalent" doesn't necessarily mean it's a copy, if the semantics of it are different.
And then get ready for the horrors of electrical connections. Not necessarily in how many there are; the real horror is how many think there is a "one true answer" there.
You can find some solace in learning of focusing effects. But, focus isn't just getting harder for individuals. :(
In the end, other engineering areas also have lots of "it depends" situations, where often there are multiple correct answers, depending on availability, legislation, safety, physical constraints, etc.
Perhaps in software engineering people are just too quick or immature to judge.
> rabbet hole
Nice pun ;)
They don't. CAD, the "programming languages" of most other engineering disciplines, is as much of a Wild West.
or Heat Exchanger efficiency calculations (https://en.wikipedia.org/wiki/Logarithmic_mean_temperature_d...) etc.
Often the models and equations rely on making assumptions in order to simplify the problem (cue the joke about the physicist and the spherical cow). This is one of the reasons things are designed with tolerances and safety factors.
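(For reference, the log-mean temperature difference mentioned above is a nice example of such a simplifying model. Roughly:

    LMTD = (ΔT_1 - ΔT_2) / ln(ΔT_1 / ΔT_2)
    Q    = U * A * LMTD

where ΔT_1 and ΔT_2 are the hot/cold temperature differences at the two ends of the exchanger, U is the overall heat-transfer coefficient and A the transfer area. The tidy closed form only holds under assumptions like a constant U along the exchanger and no phase change - exactly the kind of simplification the spherical-cow joke is about.)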
Software like CAD and particularly Computational Fluid Dynamics (CFD) packages can simulate the problem but at least with CFD you would typically perform other types of verification such as wind tunnel tests etc.
Analyzing a CAD model as you describe is more like running a compiler or type checker on code already written, which is the norm in software too, but is not in the vein of the topic of discussion.
Usually, engineering creates best practices for the industries to follow.
A lot of engineering discipline is a way to prevent engineered works from causing unintentional injury, physical or fiscal.
Most software development is far away from physical injury. And fiscal injury from software failure is rarely assigned to any party.
There's no feedback loop to push us to standardized processes to cover our asses; we'd all prefer to do things our own way. It's also pretty hard to do convincing studies to determine which methods are better. Few people are convinced by any of the studies; and there's not been a lot of "company X dominates the industry because of practice Y" kinds of things, like you see with, say, Toyota's quality practices in the 80s and 90s.
‘The reasonable man adapts himself to the world; the unreasonable man persists in trying to adapt the world to himself. Therefore all progress depends on the unreasonable man.’
The unfortunate corollary to this is that all retrogression also depends on the unreasonable man. The reasonable person (as defined here) maintains the status quo, for good or ill.
In a high-reward / low-risk environment, such as building an indie turn-based retro-style game, go with your gut feeling unless you have a good reason not to.
In a high-risk / dubious-reward environment, such as implementing cryptography, follow the best practices to a t, unless you know intimately how things work and maybe codified some of the practices.
There is a wide gamut between these two extremes.
In my experience, many "best practices" are the pitfalls you should be wary about, as they can easily translate into hundreds or thousands of lost hours of work and derail and doom entire projects. (The most annoying part of this is that the real causes won't be found, precisely because "best practices have been followed". Therefore the reputation of the best practice will stay untarnished).
Cryptography on the other hand is a well known example of something you should not touch at all unless you are an absolute expert- that's not even a "best practice" but probably the only reasonable practice.
It's standard practice to install outlets with NEMA connectors in North American buildings. Sure, you could technically swap those out with a more optimal connector that is "better" (one that prevents electricity from flowing while the plug is partially exposed, for example), but using the standard practice is best practice for human-shaped reasons that are often not apparent to early-career engineers.
However, there are many where the cost/benefit ratio is so large that you can default to "you should just do this".
I don't think I'd ever look at a company that, for instance, had no CI or versioning for a large project and think "they might have had a good reason for this". They didn't.
A lot of the time best practice can also mean “we did it this way last time and it was ok”. I don’t think anyone is saying don’t find improvements in “what is currently defined as best practice” and if they are then that’s your problem.
It's downright dangerous to assume a "best practice" in software development somehow automatically means it's some super distilled wisdom juice. A whole lot of it, in my experience, is just hype and rapid, unquestioning cargo culting slamming in right behind the hype.
Use your fucking brain. If the "best practice" is objectively a good solution to your problem then use it. If not, think of something better.
...or,
The best practice of best practices is vetting the source of the best practice to verify its authenticity.
No?
Cargo culting much?
I'd say, follow best practices only if you can say exactly why it is best practice.
This allows me to start faster and be in better shape in the short term. Then I can flex the advice more and more as I understand how a language works under the hood.
The bigger issue is that developers (me included) are usually making the decisions in their own heads. Usually the reasons are quite ok but they are not really said out loud or documented.
I've stumbled upon this both as a developer trying to get their code approved and when doing a code review.
For the developer it feels super annoying that somebody nitpicks about things which the developer has probably already gone through in their head and ended up with the resulting solution. Just like Martin in the post, who complains and reacts passive-aggressively towards reviewers mentioning these things.
For the code reviewer it feels like the developer is being sloppy or doesn't care about our common way of doing things thus increasing the need to nitpick and explain so that the developer understands how it should be done.
The solution for this is actually quite easy: document in the comments and in the Pull Request _why_ you're breaking the team's / company's guidelines and why you think it's warranted in this case. This seems to remove quite a lot of friction.
Oh, and of course it doesn't mean that the guidelines should be broken all the time just because one thinks this kind of stuff is for "idiots|assholes". When working as a team, we need to adhere to a common way of working most of the time.
The real question is how we prevent best practices from being perverted, and I fear the answer is having the right people in the right places. The one best practice to rule them all: have knowledgeable, balanced people who know when to break the rules.
That was always a bad idea. Often the best choice in one context would be a bad choice in other contexts. You don't want an engineer in a 20-person startup making decisions like they're at Google, or vice-versa. You have to take responsibility for deciding what's best for a particular problem in a particular context, without looking for the cover of "everybody does it this way."
But as I got more senior, when asked by less experienced developers the best way to do something, my answers tended to increasingly start with: "Well, it depends...".
And that's the thing, there is no universal best practice for everything, there are solutions which are more often than not good enough, but as all solutions tend to be a trade off favouring a particular scenario over another, sometimes they're also highly inappropriate to the particular circumstances.
Another term for someone trying to blindly apply supposed best practices is "cargo culting".
In summary, there is lots of nuance to software development and the particular circumstances it's being applied in, meaning that you need to understand the trade-offs of particular solutions to see which one makes the most sense for a particular situation.
1. Advice which worked in one situation - “we tried Agile, everyone should use it!”
2. Proven stuff like code review, which you call best practices when begging your org to implement it: “please let’s do it, I can clearly see how this will improve our org.”
These 2 examples represent locations on a spectrum: let’s call it “provenness”.
The author’s problem boils down to subjectivity - everyone positions different practices in different places on the provenness axis. The upshot of that is when one person says “we should do this, it’s obvious and/or it’ll definitely help” another person hears “we tried it once, you should try it too!” and then everyone has a bad time.
Then it gets confounded by everyone calling everything best practices - no matter how long ago or unproven the practices might be.
What would be handy is some generally agreed-upon yardsticks for graduating practices into or out of best-practice status, plus better terminology to cover the spectrum of provenness, so more sophisticated discussions can be had that account for the nuance and we don't all talk past each other.
But then analyst companies wouldn’t get to make their glossy 2x2 charts, so it probably won’t happen.
Only in the last case can you either understand the reason or ignore the rule because it doesn't apply to your situation.
Telling other people what to do is human's favorite thing. Give a man an opportunity to make and/or enforce rules, and you have a very happy man. People dedicate their whole lives to reaching stage after stage in their rule-making game.
And note I don't mean stable as in, not crashing. I mean it as not changing.
For a while, this was doable with java. For its warts, it gave a good foundation. Industry practice got caught up in start up enthusiasm, though, and that went out the window.
Similar could probably be said for Windows. I was not a fan of its model, but it provided a stable base for business apps for a long time.
If we can agree that most large, financially successful software projects are of questionable quality, then either
- they used best practices and yet they still suck, OR
- they did not use best practices, but are wildly successful anyway.
So no matter how you look at it, software best practices just haven't panned out.
I still like Postel's law, but don't imagine for a second it's a 'law' with the kind of authority the article implies, and didn't enjoy the article's slime about people who like it!
It is an error to throw out every best practice and reconsider everything, just as it is an error to blindly follow best practices.
IMO it’s best to apply the best practices by default and question them as they are used. Essentially trust but verify. Assuming they are right most of the time you get the benefit of speed. Assuming some of them are wrong, it still leaves room to course correct.
In an always-evolving technology landscape, it feels like a better representation of what we're doing, and it helps prevent the kind of dogmatic stubbornness that forms around the word "best".
The problem is that a lot of true things in the world are counter-intuitive. So insisting that all the rules "make sense" in an immediate way is clearly a non-starter. In the safety industry there are many examples of best practices that are bred from experience but end up being counter-intuitive to some. For instance, it might not make intuitive sense that a pilot who has gone through a take-off procedure thousands of times needs a checklist to remember all the steps, but we know that it actually helps.
It's hard because there is usually some information loss in summarisation, but we also have limited memory, so we can't really expect people to remember every case study that led to the distilled advice.
As a chemical engineer by training, though, I have constantly been amazed at how resistant software people are to the idea that their industry could benefit from the kind of standardisation that has improved my industry so much.
There are too many languages, too many tools, too many (conflicting) conventions (especially ones designed by committee), and too many options.
Having systems, tools, and components that rarely change with respect to compatibility, and that are formally verifiable far beyond the rigor of seL4 so that they are (basically) free of (implementation) error, would be far more valuable than tools that lack even basic testing or self-tests and lack digital signatures that would prove chain of custody. Being able to model and prove a program or library to a level where its behavior can be checked far more deeply for correctness, with both "whitebox" and "blackbox" methods, would let some code stand the test of time. By choosing smaller numbers of standard language(s), tool(s), and component(s), it becomes cheaper and easier to attempt this.
Maybe in 100 years, out of necessity, there will be essentially 1 programming language that dominates all others (power law distribution) for humans, and it will be some sort of formal behavioral model specification language that an LLM will generate tests and machine code to implement, manage, and test against.
I ask, because the intra-organizational dynamics of software have been ugly for standardization. Vendor lock-in, submarine patents, embrace-and-extend, etc. have meant naive adoption of "best practices" meant a one-way ticket to an expensive, obsolete system, with an eventually insolvent vendor.
The magnitude of impact also means that many industrial plants fall under government regulation, and in the safety field specifically there is a lot of knowledge sharing.
I think there is also a component about the inflexibility of real matter that factors into this. It's much harder to attach two incorrectly sized pipes together than it is to write a software shim, so the standardisation of pipe sizes gets pushed up to the original manufacturers, where it also happens to be more economical to produce lots of exact copies than individually crafted parts.
I suspect we would have a defined Software Engineering profession if there were only a few dozen vertically integrated firms.
The author seems to be arguing for nuance: that these “laws” require context and shouldn’t be applied blindly. I agree.
However they shouldn’t be rejected out of hand either and people recommending them aren’t idiots.
Update: one problem with “best practices” that I think the article might have unwittingly implied is that most software developers aren’t aware of SWEBOK and are repeating maxims and aphorisms they heard from others. Software development is often powered by folklore and hand-waving.
https://ieeecs-media.computer.org/media/education/swebok/swe...
But, you know, I want the whole ordeal. I want the SWEBOK, not the "how to read the SWEBOK". Where can I find it?
> In 2016, the IEEE Computer Society kicked off the SWEBOK Evolution effort to develop future iterations of the body of knowledge. The SWEBOK Evolution project resulted in the publication of SWEBOK Guide version 4 in October 2024.
So the thing called "SWEBOK Guide" is actually the reference text for SWEBOK.
The actual BOK isn't supposed to have a concrete representation. It's not supposed to be standardized either, but standard organizations always ignore that part.
What people usually call "state of the art" is the best knowledge that is reasonably well known. That is out of scope. If you take a look at this one, it's full of stuff that we knew not to use in the 20th century. This is typical.
Most best practices that I have been told about were low local maxima at best, and very harmful at worst.
If someone quotes a best practice to you and can't cite a convincing "why", you should immediately reject it.
It might still be a good idea, but you shouldn't seriously consider it until you hear an actually convincing reason (not a "just so" explanation that skips several steps).
This whole thing is really silly and obvious.
Of course you shouldn't blindly follow advice without thinking. But not following advice just because it might not always be right is also a bad idea.
My advice: In general, you should follow good advice from experienced people. If enough experts say this is the best way to do something, you should probably do that, most of the time.
But that advice will never trend on HN because it isn't clickbait or extreme, and requires using your noggin.
Whenever a "best practice" or "convention" has been presented to me, that is how it has been framed. (...it is best practice, therefore, it will definitely benefit you to follow it)
In many workplaces people either do not have the time, or at least think they do not have the time, to think things through 100% for themselves from first principles, so they depend on best practices instead.
That makes sense to me, and I would expect better results on average from using best practices than from rejecting them in the above context.
That said I try to work on things where I am not always in the above context, where thinking things through end to end provides a competitive advantage.
There are plenty of them that help us write concurrent code that avoids common deadlock situations without having to resort to writing proofs every time. Someone already did the work and condensed it down into a rule to follow. Even if you don’t understand the underlying proof you can follow the rule and hope that everything will shake out.
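(A minimal sketch of one such rule - consistent lock ordering - with invented names; the point is that you can follow the ordering rule and get deadlock freedom without re-deriving the proof yourself.)

    import java.util.concurrent.locks.ReentrantLock;

    class Account {
        final long id;
        final ReentrantLock lock = new ReentrantLock();
        long balanceCents;

        Account(long id, long balanceCents) {
            this.id = id;
            this.balanceCents = balanceCents;
        }
    }

    class Transfers {
        // Rule: always acquire locks in a global order (here, by ascending account id),
        // so two concurrent transfers can never wait on each other in a cycle.
        static void transfer(Account from, Account to, long amountCents) {
            Account first = from.id < to.id ? from : to;
            Account second = from.id < to.id ? to : from;
            first.lock.lock();
            try {
                second.lock.lock();
                try {
                    from.balanceCents -= amountCents;
                    to.balanceCents += amountCents;
                } finally {
                    second.lock.unlock();
                }
            } finally {
                first.lock.unlock();
            }
        }
    }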
What I find we struggle most with is knowing when we actually need to write the proof. Sometimes we bias ourselves towards best practices and intuition when working it out formally would be more prudent.
It’d be ideal if you could identify when it doesn’t work. But in the absence of that, applying it everywhere is still a net positive.
This matches my experience, though sometimes they indeed will be helpful, at least after some consideration.
> If someone quotes a best practice to you and can't cite a convincing "why", you should immediately reject it.
In certain environments this will get you labeled someone who doesn't want to create quality software, because obviously best practices will lead to good code and not wanting to follow those practices or questioning them means that you don't have enough experience or something. Ergo, you should just apply SOLID and DRY everywhere, even if it becomes more or less a cargo cult. Not that I agree with the idea, but that way of thinking is prevalent.
(not that I agree with that, people just have that mindset sometimes)
Never mind that AWS recommends what is good for AWS, not us.
If everyone follows that then every decision will be bikeshedded to death. I think part of the point of the concept of "best practices" is that some ideas should be at least somewhat entrenched, followed by default, and not overturned without good reason.
Ideally your records of best practices would include a rationale and scope for when they should be reexamined. But trying to reason everything out from first principles doesn't work great either.
Although I also do work in fintech and well... card payment systems are messy. The legal framework covers liability for when actors send bad data but your system still has to parse/process/accept those messages. So you need some leniency.
It does drive me up the wall sometimes when people will hand-wave away details and cite maxims or best-practices... but those are usually situations where the details matter a great deal: security, safety, liveness, etc. People generally have the best intentions in these scenarios and I don't fault them for having different experience/knowledge/wisdom that lead them to different conclusions than I do. They're not idiots for suggesting best practices... it's just a nuisance.
That's what I mean about the rejection being too strong. It should be considered that best practices are often useful and helpful. We don't have to re-develop our intuitions from first principles on every project. It would be tedious to do so. But a healthy dose of skepticism should be used... especially when it comes to Postel's Law which has some decent research to suggest avoiding it.
There was recently a HN thread about it: https://news.ycombinator.com/item?id=41907412
Also it shouldn't be taken for granted that best practice is always "best/good" - there definitely are idiots recommending best practices.
SWEBOK seems the opposite of that. A body of knowledge is not at all the same thing as a best practice. The only unapologetic best practice in SWEBOK is that professionals should be familiar with every topic in SWEBOK. Definitely not that you _should_ do everything in the book.
The book is quite sophisticated in this. It explicitly separates the technical knowledge from the judgments of which, when, and where to apply it. Most occurrences of "best practices" in the text are quoted, are references to other works, and describe the need to choose between different best-practice libraries depending on context. Others are part of a meta-conversation about the role of standards in engineering. Very little of SWEBOK is promoted as a "best practice" in itself.
Here's a quote from SWEBOK v4, 12-5
> Foremost, software engineers should know the key software engineering standards that apply to their specific industry. As Iberle discussed [19], the practices software engineers use vary greatly depending on the industry, business model and organizational culture where they work.
In my view best practices emerge from a body of knowledge (or sometimes from the practice and wisdom of others that haven't been documented/accepted/etc yet) and are "shortcuts."
I'm not defending Postel's Law; I agree that, after years of practice and research, it leads to security issues and surprises.
However, the point is that these kinds of principles don't arise out of people's heads and become accepted wisdom for nothing; they're usually built off of an implied (or explicit) body of knowledge.
Does that make sense?
But SWEBOK is very clear that "best practices" are context specific - they are radically different forces and solutions in video games as compared to chemical engineering control systems. There's no such thing as a "best practice" absent a context. The footnotes in SWEBOK point off in a million directions saying "go look over there for best practices for YOUR context".
And to be fair, the best practices for designing a bridge or a skyscraper are not the same ones for designing a doghouse.
Recently I was told that Hungarian notation was "best practice" and I must use it.
The author might go on to make other points that are worth discussing, but lays out his supporting arguments clearly in the opening paragraph. Best practices do not necessarily do harm because they offer bad advice, they do harm because they are advocated for by zealots and the inexperienced.
My first reaction is how unfortunate it is that this particular developer has found himself in the presence of bad engineers and the inexperienced.
But then, the argument is automatically self-defeating. Why is the rest of the article even worth reading, if he states upfront what his arguments are and those arguments are very easy to refute?
It is deeply irrational to judge the merits of an idea based solely on who is advocating for that idea.
My advice to the author is to reflect on the types of positions that he accepts, the ones that have him so put off by the people that he works with that he is openly writing about abandoning what he admits could be sound engineering practice, solely based on who that advice is coming from and how it is being delivered.
Developing software is complicated. It is constant problem solving. When solutions to problems come about, and we abstract those solutions, it is quite easy for individuals to misapply the abstraction to an inappropriate concrete. To drop context and try to retrofit a lousy solution because that solution was appropriate to a slightly different problem. But at the end of the day, these abstractions exist to try and simplify the process. Any time you see a "best practice" or design pattern acting as a complicating force, it is not doing its job. At that point you can either be objective and exercise some professional curiosity in order to try and understand why the solution adopted is inappropriate ... or you can take the lazy way out and just assume that "best practices" are the opinions of zealots and the inexperienced who blindly follow because they don't know any better.
It's not very hard to weigh a suggestion: speculate about its costs, benefits and risks.
I think the point is that blindly suggesting "best practices" often is bad advice.
It's a common form of bikeshedding—it allows someone to give their casual two cents without doing the hard work of thinking through the tradeoffs.
I think that’s what this article is basically saying. And I agree.
In every case where you want to say “best practice” there is a better alternative, which is to say “practice.” The concept of best is never needed or warranted, because practices are not subject to rigorous testing and review.
I have been an independent consultant and trainer since 1999 and not once have I taught or recommended a best practice.
I do have many opinions. I call them: opinions. I think my opinions are the best, but I can’t think of any reason that anyone else beyond my wife and dog should think so.
> “Don’t Repeat Yourself” (DRY) is basically good advice, but sometimes just copy/pasting things is just the more pragmatic thing to do, and not really a big deal.
Duplicating code on purpose is not about being pragmatic, it's about recognizing when DRY would violate the single responsibility principle.
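(A hedged sketch with invented names: two totals that are byte-for-byte the same today but answer to different concerns, so extracting a shared helper would couple code that changes for different reasons.)

    class InvoiceTotals {
        // Shown to the customer on the invoice.
        static long invoiceTotalCents(long[] lineItemCents) {
            long total = 0;
            for (long cents : lineItemCents) total += cents;
            return total;
        }
    }

    class TaxReportTotals {
        // Reported for tax purposes; the same arithmetic today, but it will drift
        // (rounding rules, exempt items) independently of the invoice display.
        static long taxableTotalCents(long[] lineItemCents) {
            long total = 0;
            for (long cents : lineItemCents) total += cents;
            return total;
        }
    }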
The ability to weigh tradeoffs in context is what makes some engineers better than others.
It sounds to me like they did understand the tradeoffs. But that they were being brow-beaten to apply "best practices" that were inapplicable because of the tradeoffs.
https://respectfulleadership.substack.com/p/dan-morena-is-a-...
My summary of his idea:
No army has ever conquered a country. An army conquers this muddy ditch over here, that open wheat field over there and then the adjoining farm buildings. It conquers that copse of lush oak trees next to the large outcropping of granite rocks. An army seizes that grassy hill top, it digs in on the west side of this particular fast flowing river, it gains control over the 12 story gray and red brick downtown office building, fighting room to room. If you are watching from a great distance, you might think that an army has conquered a country, but if you listen to the people who are involved in the struggle, then you are aware how much "a country" is an abstraction. The real work is made up of specifics: buildings, roads, trees, ditches, rivers, bushes, rocks, fields, houses. When a person talks in abstractions, it only shows how little they know. The people who have meaningful information talk about specifics.
Likewise, no one builds a startup. Instead, you build your startup, and your startup is completely unique, and possesses features that no other startup will ever have. Your success will depend on adapting to those attributes that make it unique.
Many of my interactions are with electronic systems deployed by companies or the state. It's rare that I deal with an actual person a lot of the time (which is sad, but that's another story).
They're following best practices for aiming, shooting, walking, communicating, hiring (mercs), hiding, etc...
The people that are in the weeds are just doing the most simple things for their personal situation as they're taking over that granite rock or "copse of lush oak trees".
It's easy to use a lot of words to pretend your point has meaning, but often, like KH - it doesn't.
When all things are the same, the army with more will win.
When all things are not the same, there are little bonuses that can cause the smaller/poorer, malnourished army to win against those with machine guns. Often it's just knowing the territory. Again though, these people are individually making decisions. There isn't some massively smart borg ball sending individual orders to shoot 3 inches to the left to each drone.
> That doesn't defeat my point- is the smaller/poorer army using best practices?
I don't agree, but neither do I disagree. But I do think it is ambiguous enough that it is not using best practices to illustrate the point you intend.

> malnourished army to win against those with machine guns
With my example I meant literal birds.

I'm pretty sure building an organization on a free-for-all principle is anathema to the idea of an organization.
"Do X because it's best practice" is very different than "do X because you were commanded by your rightful authority to do so."
> No army has ever conquered a country
Napoleon and his army would like to have a word with you…

I get the analogy, but I think it can be made a lot better, which will reduce the number of people who dismiss it because they got lost where the wording doesn’t make sense. I’m pretty confident most would agree that country A conquered country B if country B was nothing but fire and rubble. It’s pretty common usage actually. Also, there’s plenty of examples of countries ruled by militaries. Even the US president is the head of the military. As for army, it’s fairly synonymous with military, only really diverging in recent usage.
Besides that, the Army Corps of Engineers is well known to build bridges, roads, housing, and all sorts of things. But on the topic of corps, that’s part of the hierarchy. For yours, a battalion, regiment, company, or platoon may work much better. A platoon or squad might take control of a building. A company might control a hill or river. But it takes a whole army to conquer a country, because it is all these groups working together; even if often disconnected and not in unison, even with infighting and internal conflicts, they rally around the same end goals.
But I’m also not sure this fully aligns with what you say. It’s true that the naive only talk at abstract levels, but it’s common for experts too. Yet experts almost always let specifics leak in, because the abstraction is derived from a nuanced understanding. We need to talk in both abstractions and in details. The necessity for abstraction only grows, but so does the whole pie.
>> Also, there’s plenty of examples of countries ruled by militaries. Even the US president is the head of the military
Maybe I should have reversed the order of these two. I didn't intend to use the US as an example of a country ruled by a military but rather that military is integral and connected directly to the top.

I think we can all agree that if that is the case, you’ve in fact conquered nothing.
Edit: Since we say opposite things, maybe we wouldn’t agree.
For a startup, winning "battles, not wars," is what you need, because you have finite resources and have an exit in mind before you burn through them. For a large enterprise, "winning wars not battles" is important because you have big targets on your back (regulators, stock market, litigation).
One might paraphrase the whole shooting match with the ever-pithy statement that premature optimization is the root of all evil.
Most things of any value are abstractions. You take a country by persuading everyone you've taken a country, the implementation details of that argument might involve some grassy hill tops, some fields and farm buildings, but its absolutely not the case that an army needs to control every field and every grassy hill top that makes up "a country" in order to take it. The abstraction is different to the sum of its specific parts.
If you try to invade a country by invading every concrete bit of it, you'll either fail to take it or have nothing of value at the end (i.e fail in your objective). The only reason it has ever been useful or even possible to invade countries is because countries are abstractions and it's the abstraction that is important.
> The real work is made up of specifics: buildings, roads, trees, ditches, rivers, bushes, rocks, fields, houses.
Specifics are important - failing to execute on specifics dooms any steps you might make to help achieve your objective, but if all you see is specifics you won't be able to come up with a coherent objective or choose a path that would stand a chance of getting you there.
There was a culture of best practices zealots that had an uncanny resemblance to organized religion. All the answers have been written down for us, and everything that goes wrong is because of arrogant, misguided people who decided to trust their own fallible thoughts instead. Their faith was strong enough to survive any project, no matter the outcome.
- small localized team vs big distributed team
- bug fixes and incremental improvements vs green field poc
- saas vs system scripts
Context matters, and if people aren't giving you the context in which they deem practices to be "best", they are myopically wrong
I am not seeing this issue with programmers in general or with my coworkers, with the exception of those who in general have a hard time collaborating with others.
So my question was/is: if you discount the above exception, are people seeing a problem with programmers/coworkers not taking context into account? I have not noticed a widespread issue and I am interested in how prevalent you, and others, perceive the issue to be.
I'd like to point out I never called it a problem. I said that was a judgement call for you to make. We all have harmless biases.
But yeah, it can be a problem. If I have an engineer derailing my team because of his insistence on Svelte, and he can't read the room, i.e. can't take any of the context of the business, stack, domain, or team into his consideration, then yeah, it becomes a problem. Time is money.
(svelte isn't a good example, it's not a best practice per se. s/svelte/TDD/)
I would describe this someone who does not know how to collaborate, maybe they don't know the balance they need between give and take, maybe they do not know how to format their ideas so they are understood by the group, maybe there is some fundamental misunderstanding. Since the tool of collaboration is not working for them, they reach for other tools to leverage and achieve their goals, like argument by authority via a convenient best practice.
The best practice/standard was not the issue; the lack of context for the best practice was the issue; the lack of collaboration, or the ability to collaborate, was the issue.
I'm actually not even convinced that this is a good rule. When it's explained it makes sense on the surface. I also think that since beginners to programming use global variables, they come to the conclusion that it must be universally bad since that's what their beginner self did.
Having worked on codebases with most state stored as global variables (in a single namespace at that), I've found them manageable at worst and easy to develop on at best, assuming certain conventions are followed.
Prospects and customers desperately want to know our "best practices" and then complain when we say "it depends" or that some experimentation is required, as if we are hiding secret teachings from them.
For me this is more a personality test: people who just want solutions on a silver platter vs DIYers who want to tinker and distrust black boxes.
One thing I learned after many years working in consulting is that, more often than one would believe, best practices are just a compilation of whatever could be found (hopefully at least common practices, more often "things I could find that were minimally documented to be reusable"), with no serious analysis of their claim of superiority other than them being common.
So, first thing: learn to challenge the claim of "best". Best for whom? Under what context? What other not-so-good practices are out there, and why is this the best?
Second: if it's documented and evident enough to be treated as a best practice, it's probably fairly common knowledge already. Barring the commonality of really bad things being done out there, don't expect that you'll become much more than mediocre by adopting best practices. By the time they get to be called that, they are no longer any competitive advantage, more a basic thing you should be doing already - assuming they are indeed best practices (as per my previous point).
It's not that I'm against best practices as a concept, or compiled bodies of knowledge. Just don't expect them to do more than keep you somewhere in the middle. True leadership and innovation lies where best practices have not been established yet - together with all the dangers and mistakes you can make on uncharted waters.
In a golden path, lots of others have gone before you and figured out all the nuance. But this doesn't mean the path is the best one for you; it does mean you should have a good reason for straying from it.
Buddy that’s not a reason, that’s a rationalization.
In many cases, being predictable is better for future maintenance than forging your own path, even if the custom solution is "better" against all (current) metrics.
Put another way, knowledge is knowing best practices, but wisdom is knowing where and when to apply them. Unfortunately, most people building software have only the knowledge, and there is too little consideration for the fact that structure is not free; consideration must be given to when velocity is the primary objective versus safety and adherence to best practices.
It all comes down to architecture design in the end.
https://en.wikipedia.org/wiki/Wikipedia:Emerson_and_Wilde_on...
(This is relevant to the extent that programming is as much art as science/engineering.)
[1]https://www.joelonsoftware.com/2009/09/23/the-duct-tape-prog...
I've worked too often with people who think they know better
They do not
Straw men do not change the fact that "best practices" , especially the ones quoted, are very good.
No sensible person is saying "never use globals". We caution you to think very carefully before you do, only for truly global state.
I am suffering from inherited code, written by a very good programmer who got lazy with globals and comments. Too many of the former, too few of the latter. What a nightmare.
This article is arrant nonsense
Maybe so, but still, plenty of people are saying it.
So even your comment disagrees with your claim that "best practices" are very good.
I tried to have the grammar checked by chatgpt but it was too challenging
It annoys me to no end when devs talk about some specific technical change "increasing accessibility". The accessibility best practices are used as a checklist where more checks = more accessibility points = better. It results in people gaming the score with meaningless repetitive metadata or low-impact tweaks, rather than actually improving the descriptive/alternative/touch/visual interface - usually without ever even trying an alternative method of interacting with the interface.
The best practice is "always include metadata", but it leaves off "... that adds context about the element rather than noise, and integrates with a surrounding application that uses consistent metadata labelling. Remember, this is a portion of a complete descriptive interface someone has to use."
These best practices being driven into people's brains verbatim means conversations devolve into inane on-or-off taxonomy discussions like "is this colour accessible?" or "have we added accessibility? Do you need a ticket for that?" where pushing back isn't seen as caring about users, it's seen as being "against accessibility".
https://graypegg.com/2023/11/25/the-private-definition-of-ac...
If you take it as a given that some number of people are going to get an idea lodged in their head, treat it like gospel, and beat as many other people in the head with it as they can... the best strategy you can adopt is to have the ideas in their head be at least somewhat useful.
Yes, reasonable people understand that "best practices" come with all sorts of context and caveats that need to be taken into account. But you won't always be dealing with reasonable people, and if you're dealing with an asshole, zealot, or idiot, I'd sure as hell prefer one who blindly believes in, say, test-first development versus believing that "test code isn't real code, you should spend all of your time writing code that ships to users" or some other even worse nonsense.
If much of our industry is new, evangelizing these rules as harder and faster than they are makes a lot of sense to get people ready for the next stage. Then they learn the context and caveats over time.
In my mind, this author is merely signaling software counter-culture, some of which I agree with, others I don't. And the people whom you describe above are signaling software culture, in a hostile and domineering way.
And of course, these two sides are not impermeable, forever persistent: culture and counter-culture shift constantly, often outright reversing from one another on a roughly 30 year timeline. And, they both have important things to offer. Best practices are best practices for a reason, and also telling stuffy people to chill out when they're focused so hard on best practices that they lose the plot of what the org is actually attempting to accomplish is also important.
"Best practices" is a really good tool if you use it in the correct context.
Most of these "best practices" have valid opposing camps anyway. There's DRY, but there's also "a little copying is better than a little dependency". Both are valid in different contexts.
I don't think that the issue is with "best practices," or any other type of dogma.
I think the main issue is that companies tend to hire folks who aren't especially skilled at what they do and rely on excessive structure to compensate, or that they don't stay around long enough to get comfortable with the structure.
This can apply to both newer folks, who don't understand the system well enough to effectively deviate, and ones with a lot of experience, who have allowed themselves to get so hidebound, they are afraid to deviate.
As I have gotten older, wiser, and more battle-scarred (often, from self-inflicted injuries), I have learned that "It Depends™" is the only true mantra for my work.
Usually, best practices/dogma/structure becomes important, when the codebase is being worked on by a team, and when there's the need to coordinate work between teams.
There's some type of work that just can't be done, without structure. I've done that type of work. Other work can be killed by too much structure. I've done that kind of work, as well.
- Use source control?
- Have build automation?
- (at least some) automated testing?
- Protect data structures accessed concurrently with a mutex?
- Have backups?
I wouldn't say there isn't some imaginary situation where you'd do something different, but it's safer to fall back to doing the above in most situations.
That said many people make up "best practices" or use them without understanding the circumstances to which they apply.
Books like Code Complete are useful tools but not bibles.
Just like everything on the internet: it's just another person's opinion. What matters is what works (i.e. makes you money).
for example:
- use source control
- don't put spaces in file names
- camelcase is better than underscores in variables
- vi not emacs
some are really best practices, some are controversial choices someone is pushing.
Some places I've worked at with good policies deliberately left some open, and made choices early on for troublesome ones, like spaces not tabs.
I think that choosing or defining these, versus leaving them open, is part of what companies should do to define themselves.
But if I can't understand why I should do that instead of something else I've thought of, then I'll do it my way thank you.
As always it's a matter of context. If you have a low level language like C without any automatic cleanup having an out label to goto which cleans up in all cases makes a lot of sense.
There's always some reasoning behind those rules, sadly rarely talked about directly. You should check whether that reasoning applies to you and weigh the benefits against the costs. A lot of the time it's just doing what everyone else is doing, which is a perfectly fine reason, as it makes onboarding new developers easier.