C is not a Low Level Language
18 points by skadamat 7 days ago | 10 comments
  • Narishma 7 days ago |
    (2018)
  • pull_my_finger 6 days ago |
    I have taken a few stabs at programming with C, and every time I get the language concepts down fine but hit a brick wall dealing with linking/loading/compilation. I've finally found a book that deals with linking and loading in detail, but I've always wondered why this crucial step is almost always completely handwaved, or, if it's mentioned at all, you're given commands to run with little to no explanation. There is a whole ecosystem of toolchains, many old and very particular in their incantations, that are REQUIRED to do anything with C (and friends); it's crazy to me that it never gets the attention it needs or has really improved at all. Is this something they cover in depth at universities, or do they just tell you to let the IDE take care of it?
    • cuteboy19 6 days ago |
      IDEs? Mine just used online compilers
    • trealira 6 days ago |
      At my university, when they taught us C++ (and later C), they told everyone to log into this central server running Linux. They taught everyone to use Emacs and write Makefiles, and then we were supposed to be able to do all our assignments like that. They did briefly explain object files and linking while explaining how to write Makefiles.

      However, in practice, most people figured out an IDE setup on their own time and would run the code on their own computer.

    • sim7c00 5 days ago |
      There are good books on linking and loading, but figuring out how the toolchains really work (gcc, ld, etc.) can definitely be a pain. The tools do provide all you need for it, but you have to work out how to extract that from them. A lot also depends on your knowledge of the platform, ABI, file formats, etc., so reading the specifications for those helps decipher the meaning of the various commands, linker files, etc.

      Why it's mostly skipped, I think, is that most platforms that provide the tools also try to provide sane defaults for this stuff, so you rarely need it. You don't need to invoke ld if gcc can do it for you, and so on.
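
      As a rough sketch of what the driver is doing behind the scenes (the file names and the greet() function here are just made-up examples):

          /* greet.c -- one translation unit */
          #include <stdio.h>
          void greet(const char *name) { printf("hello, %s\n", name); }

          /* main.c -- another translation unit that calls greet();
             the prototype would normally live in a shared header */
          void greet(const char *name);
          int main(void) { greet("world"); return 0; }

          /* Spelled out, the toolchain steps are:
           *   gcc -c greet.c -o greet.o     # compile only, no linking
           *   gcc -c main.c  -o main.o
           *   gcc greet.o main.o -o hello   # gcc drives ld for you, adding the
           *                                 # C startup files and libc by default
           * or you can let the driver do everything in one go:
           *   gcc greet.c main.c -o hello
           */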

    • nineteen999 5 days ago |
      Linking? Open the Makefile and examine the LDFLAGS variable. There you will usually see the "-L" flag specifying the path to the libraries to be linked, and the "-l" flag specifying each library to link. You can specify multiples of both of these. The compiler/linker are usually configured by default to search certain paths automatically (e.g. /lib, /usr/lib), so you usually don't need to specify those paths with -L.

      It can get a bit more complicated if you want to use DSOs (e.g. dynamically loading DLLs at runtime rather than linking at link time), but not a whole lot more.

      Compilation? You probably want to understand GNU make pattern rules: https://www.gnu.org/software/make/manual/html_node/Pattern-R...

      Loading? That's generally the OS kernel's responsibility; as long as you're executing a properly formatted executable that has been compiled and linked correctly, you shouldn't have to do anything magic.

      There's really not a whole lot to it, but you're correct that the information is not always provided in one place, since not every build environment for every platform uses GNU make. However, it really is pervasive in the embedded space and in many OSS C/C++ projects, so it's well worth learning. Most other C/C++ build systems aim to improve on it, but I usually can't be bothered with the additional build dependencies.
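
      As a minimal sketch of how the -L/-l flags show up in practice (the file name is made up; libm is just an example library to link against):

          /* hypot_demo.c -- links against the math library */
          #include <math.h>
          #include <stdio.h>

          int main(void) {
              /* hypot() lives in libm on most Unix systems, so this needs -lm */
              printf("%f\n", hypot(3.0, 4.0));
              return 0;
          }

          /* In a Makefile you would typically see something like:
           *   LDFLAGS = -L/usr/local/lib -lm
           * which ends up on the link line roughly as:
           *   gcc hypot_demo.c -o hypot_demo -L/usr/local/lib -lm
           * (the -L path is redundant here; it's only shown to illustrate the flag)
           */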

    • bcrl 3 days ago |
      Linking is a skill that's worth picking up if you ever intend to do system level things. I ended up learning the basics in a few different projects: back in the days of 8 bit systems using DOS and real mode 8086 applications, segment overlays were a common way of dealing with the lack of RAM for holding binaries and data. As a result, you had to learn how to tell the linker which pieces of code would overlay others and what was shared. One of the projects written in QuickBASIC that I was exposed to had to do this because of the amount of business logic that existed in that space.

      As a curious systems minded person, at one point I wrote my own port of the Amiga exec.library and a few device drivers that ran in x86 real mode in Microsoft C with a bit of inline assembly. It was necessary to do some linker magic to get things in the right place to be loaded from a custom boot sector. The whole thing did all the task switching and message passing that was done on the Amiga, and was a great learning experience. I spent many, many hours disassembling the Amiga's ROM and other libraries to follow the code paths and reverse engineer exactly how the system worked. There were lots of cool details in the internals that dynamically discovered drivers and libraries at run time inside the ROM. The Amiga was a great learning platform with preemptive multitasking, Autoconfig and so many other nifty features.

      Learning GNU binutils came about in part from working on the Linux kernel, as well as building a system from scratch during the a.out to ELF transition in the mid 1990s. The info pages are very thorough. That experience was invaluable, and combined with everything else was a great way of gaining the skill set needed to work on embedded systems in my later career.

      IDEs are a great tool, but nothing beats the experience of putting the bits in the right spot yourself. The university compilers and OS courses will cover a lot of concepts, but not necessarily all of the open source tools available. If that isn't something you've gone through and it piques your interest, I'd recommend Linux From Scratch as a good starting place. Knowing how to use command line tools like awk and sed, editors like vim and emacs, bash, tcsh, ssh, make.... They all take time to learn, but they're powerful tools that let you do things that would take forever using a GUI (I find controlling a computer with the keyboard is faster than taking my hand off the keyboard to move the mouse).

      There's no better way to solidify learning about programming than to try to build something from nothing, like writing a compiler for a language with a hand-written recursive descent parser or implementing a routing protocol based on an IETF RFC. Every project contributes to a body of knowledge about what approaches to writing software work well and what becomes a huge time sink. You have to have experience to intuit which data structures fit when solving a particular problem.

      For me, many of these experiments were done in C because that's what existed in the Linux open source ecosystem. With the power of computers today compared to 30 years ago, languages like Python are Good Enough for many problems, but Python is never going to expose you to what really goes on under the hood. It was somewhat easier growing up in the 1980s, as we were forced to use a whole bunch of different platforms, from the 8-bit 6502 used in the Apple ][, Vic 20 and Commodore 64 to the 16/32-bit 68000 in the Amiga and then on to the 32-bit i386/i486, and there were great magazines like Byte that every month had loads of in-depth technical content on these platforms as they were introduced. There's a lot more information available at our fingertips today via the internet, but the curated magazines of the 1980s and early 1990s exposed you to many things you wouldn't necessarily think of looking for.

      Ah.... I'm getting old and reveling in the nostalgia of youth!

  • nivertech 6 days ago |
    C is a high-level assembly language for the PDP-11
    • bcrl 3 days ago |
      This is a comment I've seen a number of times, but one I really don't agree with. All C is constrained by the compiler that you are using, and there is no way to ensure code written today will work exactly as expected 20 years down the road. The only thing constant in the tech industry is change, and using computers means adapting to that constant change.
      • nivertech 3 days ago |
        C pointer arithmetic is a copy of the PDP-11 assembler's addressing modes [1]

        So it was either inspired by or designed for the PDP-11; making it cross-platform was an afterthought.

        It wasn't designed from first principles, and it's unable to express different, newer, and/or more advanced low-level features.
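
        The usual illustration of that lineage is the textbook string-copy idiom (just the classic K&R example, shown here as a sketch):

            /* The idiom *dst++ = *src++ maps almost one-to-one onto the
               PDP-11's autoincrement addressing mode, which is the
               correspondence usually being pointed at. */
            void copy_string(char *dst, const char *src) {
                while ((*dst++ = *src++) != '\0')
                    ;   /* empty body: all the work happens in the condition */
            }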

        > there is no way to ensure code written today will work exactly as expected 20 years down the road

        It's possible with a language where you specify computations in a declarative way, instead of using increments/decrements, pointer arithmetic, branching, and loops.

        —-

        1. https://en.m.wikipedia.org/wiki/C_(programming_language)#New...