However, in practice, most people figured out an IDE setup on their own time and would run the code on their own computer.
Why it's mostly skipped, I think, is that most platforms that provide the tools also try to provide sane defaults for this stuff, so you rarely need it. You don't need to invoke ld if GCC can do it for you, etc.
It can get a bit more complicated if you want to use DSOs (e.g. dynamically loading DLLs at runtime rather than linking at link time), but not a whole lot more.
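Since the thread is about linking and loading anyway, here's a minimal sketch of what runtime loading looks like on a POSIX system with dlopen/dlsym; the library name "./libplugin.so" and the symbol "plugin_entry" are hypothetical:

    /* Sketch: load a shared object at runtime and call a function from it.
       "./libplugin.so" and "plugin_entry" are hypothetical names. */
    #include <stdio.h>
    #include <dlfcn.h>

    int main(void) {
        void *handle = dlopen("./libplugin.so", RTLD_NOW);
        if (!handle) {
            fprintf(stderr, "dlopen: %s\n", dlerror());
            return 1;
        }

        /* dlsym returns void *; the cast to a function pointer is the
           usual POSIX idiom. */
        int (*entry)(void) = (int (*)(void))dlsym(handle, "plugin_entry");
        if (!entry) {
            fprintf(stderr, "dlsym: %s\n", dlerror());
            dlclose(handle);
            return 1;
        }

        int rc = entry();
        dlclose(handle);
        return rc;
    }

Build with something like gcc main.c -ldl (recent glibc versions have folded libdl into libc, so the -ldl may be unnecessary).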
Compilation? You probably want to understand GNU make pattern rules: https://www.gnu.org/software/make/manual/html_node/Pattern-R...
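If it helps, here's a minimal sketch of what a pattern rule buys you; the file names and flags are made up for illustration:

    # A minimal sketch of a GNU make pattern rule (names and flags made up).
    # Note: recipe lines must start with a tab character.
    CC     = gcc
    CFLAGS = -Wall -O2

    OBJS = main.o util.o

    app: $(OBJS)
    	$(CC) -o $@ $(OBJS)

    # '%' matches any stem, so this one rule builds every .o from its .c.
    # $< is the first prerequisite, $@ is the target.
    %.o: %.c
    	$(CC) $(CFLAGS) -c $< -o $@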
Loading? That's generally the OS kernel's responsibility; as long as you're executing a properly formatted executable that has been compiled and linked correctly, you shouldn't have to do anything magic.
There's really not a whole lot to it, but you're correct that the information is not always provided in one place, since not every build environment for every platform uses GNU make. However, it really is pervasive in the embedded space and in many OSS C/C++ projects, so it's well worth learning. Most other C/C++ build systems aim to improve on it, but I usually can't be bothered with the additional build dependencies.
As a curious systems-minded person, at one point I wrote my own port of the Amiga exec.library and a few device drivers, written in Microsoft C with a bit of inline assembly, that ran in x86 real mode. It was necessary to do some linker magic to get things in the right place to be loaded from a custom boot sector. The whole thing did all the task switching and message passing that was done on the Amiga, and was a great learning experience. I spent many, many hours disassembling the Amiga's ROM and other libraries to follow the code paths and reverse engineer exactly how the system worked. There were lots of cool details in the internals that dynamically discovered drivers and libraries at run time inside the ROM. The Amiga was a great learning platform with preemptive multitasking, Autoconfig and so many other nifty features.
Learning GNU binutils came about in part from working on the Linux kernel, as well as building a system from scratch during the a.out to ELF transition in the mid 1990s. The info pages are very thorough. That experience was invaluable, and combined with everything else was a great way of gaining the skill set needed to work on embedded systems in my later career.
IDEs are a great tool, but nothing beats the experience of putting the bits in the right spot yourself. The university compilers and OS courses will cover a lot of concepts, but not necessarily all of the open source tools available. If that isn't something you've gone through, I'd recommend looking at Linux From Scratch as a good starting place if that piques your interest. Knowing how to use command line tools like awk and sed, editors like vim and emacs, bash, tcsh, ssh, make.... They all take time to learn how to use, but they're powerful tools that let you do things that would take forever using a GUI (I find controlling a computer with the keyboard is faster than taking my hand off the keyboard to move the mouse).
There's no better way to solidify learning about programming than to try to build something from nothing, like writing a compiler for a language with a hand-written recursive descent parser or implementing a routing protocol based on an IETF RFC. Every project contributes to a body of knowledge about what approaches to writing software work well and what becomes a huge time sink. You have to have experience to intuit what data structures fit when solving a particular problem.
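To give a flavor of the recursive descent approach, here's a toy sketch in C that parses and evaluates integer expressions; the three-level grammar is my own illustration, not from any particular project:

    /* Toy recursive descent parser/evaluator for expressions like
       "1 + 2 * (3 - 4)".  Illustrative grammar:
         expr   := term   (('+' | '-') term)*
         term   := factor (('*' | '/') factor)*
         factor := NUMBER | '(' expr ')'                              */
    #include <ctype.h>
    #include <stdio.h>
    #include <stdlib.h>

    static const char *p;                 /* cursor into the input */

    static long expr(void);              /* forward declaration */

    static void skip_ws(void) { while (isspace((unsigned char)*p)) p++; }

    /* factor := NUMBER | '(' expr ')' */
    static long factor(void) {
        skip_ws();
        if (*p == '(') {
            p++;                          /* consume '(' */
            long v = expr();
            skip_ws();
            if (*p == ')') p++;           /* consume ')' */
            return v;
        }
        return strtol(p, (char **)&p, 10);
    }

    /* term := factor (('*' | '/') factor)* */
    static long term(void) {
        long v = factor();
        for (;;) {
            skip_ws();
            if (*p == '*')      { p++; v *= factor(); }
            else if (*p == '/') { p++; v /= factor(); }
            else return v;
        }
    }

    /* expr := term (('+' | '-') term)* */
    static long expr(void) {
        long v = term();
        for (;;) {
            skip_ws();
            if (*p == '+')      { p++; v += term(); }
            else if (*p == '-') { p++; v -= term(); }
            else return v;
        }
    }

    int main(void) {
        p = "1 + 2 * (3 - 4)";
        printf("%ld\n", expr());          /* prints -1 */
        return 0;
    }

Each nonterminal becomes one function, and operator precedence falls out of which function calls which; that one-to-one mapping is what makes the technique so approachable.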
For me many of these experiments were done in C because that's what existed in the Linux open source ecosystem. With the power of computers today compared to 30 years ago, languages like Python are Good Enough for many problems, but Python is never going to expose you to what really goes on under the hood. It was somewhat easier growing up in the 1980s as we were forced to use a whole bunch of different platforms, from the 8-bit 6502 used in the Apple ][, Vic 20 and Commodore 64 to the 16/32-bit 68000 in the Amiga and then on to the 32-bit i386/i486, and there were great magazines like Byte that had loads of in-depth technical content on these platforms as they were introduced every month. There's a lot more information available at our fingertips today via the internet, but the curated magazines of the 1980s and early 1990s exposed one to many things that you wouldn't necessarily think of looking for.
Ah.... I'm getting old and reveling in the nostalgia of youth!
So it was either inspired by or designed for the PDP-11; making it cross-platform was an afterthought.
It wasn’t designed from first principles, and it’s unable to express different, newer, and/or more advanced low-level features.
> there is no way to ensure code written today will work exactly as expected 20 years down the road
It’s possible with a language where you specify computations in a declarative way, instead of using increments/decrements, pointer arithmetic, branching, and loops.