• WillAdams 6 days ago |
    The book itself is currently being discussed at:

    https://news.ycombinator.com/item?id=42157558

    Is there a reason why the link goes to the discussion at the bottom of that page rather than the beginning?

    Could this be folded into the other discussion? (I don't see that the link has been posted there yet)

  • bombela 6 days ago |
    The link takes around 10s to render. That's excessive for a text article.
    • defanor 6 days ago |
      And if you have JS disabled by default, it redirects to a page on a different domain name, so you cannot easily allow it in NoScript just for that website, even if you want to. I gave up on that though; judging by the title, the article is going to be about modelling all the things as functions, as is commonly done with other objects (e.g., sets, categories), which I wanted to confirm, and maybe to nitpick on this perspective and/or the title (i.e., it is not quite correct to declare everything a function just because you can model or represent things that way).
    • rufius 6 days ago |
      I mean it’s Notion. That’s par for the course.

      What if your text editing and presentation experience was slow and laggy? That’s Notion.

      • yazzku 6 days ago |
        Notion. Delivering value right at your fingertips.
        • hinkley 6 days ago |
          Is that a clever way of saying it’s about as fast as braille?
      • ishtanbul 6 days ago |
        Whats the best corporate wiki platform?
        • rufius 6 days ago |
          Probably a hard question to answer. IME, cultural norms around documentation vary pretty wildly.

          Some orgs I've worked for were very "wiki" driven - there's a big expectation of using Confluence or Notion to navigate documentation. This applies to both big (5000+) and small (50+) organizations for me.

          Other organizations I've worked in were very document centric - so you organize things in folders, link between documents (GDoc @SomeDocument or MSFT's equivalent). Those organizations tend to pass around links to documents or "index" documents. Similarly, this applies for both big and small organizations in my experience.

          Of the two, I tend to prefer the latter. Without dedicated editors, the wiki version seems to decay rapidly, especially once the org grows above some size.

          Knowledge management is hard...

    • yazzku 6 days ago |
      46 domains blocked by UBlock Origin, 3 by my own NoScript filter. Seems about right for a "modern" website.

      Edit: also, the pop-up menu on the right side that completely breaks your scrollbar. Putting that UI/UX degree to use.

      • lelandfe 6 days ago |
        Page weight is 7.2MB compressed, 25.6MB uncompressed. 110MB heap size. Such extravagant wastefulness.
        • jancsika 6 days ago |
          I wonder if that's large enough to contain an old Linux running an old version of Firefox, and to feed that the page content.
        • zahlman 6 days ago |
          And this is for what, a ~100KB header image (most of which is bounding-boxed away) and 24KB of actual text (Markdown source would be only slightly larger)?
        • meiraleal 6 days ago |
          They (tech companies) lay off people to save money, but they should be hiring seniors to save them some cloud bills.
    • anonzzzies 6 days ago |
      Notion. Why do people use that stuff? Especially for tech text articles.
      • dustingetz 6 days ago |
        wysiwyg document authoring experience, afaik there are still no alternative publishing platforms with both the flexibility and point click content authoring UX of Notion. Change my view, I’m in the market!
        • hinkley 6 days ago |
          I’m also on the market and this conversation took Notion out of the running.
    • bicx 6 days ago |
      Well, it’s a published Notion site, and Notion is a powerful doc creation platform. It’s not really intended to be a performant publishing tool.
      • llamaimperative 6 days ago |
        Or a performant anything else, AFAICT
      • criddell 6 days ago |
        It’s a performant publishing tool (depending, of course, on your expectations) but it’s not a high performance publishing tool.
        • TeMPOraL 6 days ago |
          It's a performant publishing tool and perhaps even high performance publishing tool - in terms of user effort. What it's not is performant displaying the thing it published.
          • criddell 6 days ago |
            That’s fair. Viewers who don’t know what is serving the page will be disappointed. If you know it’s Notion, then it works about as expected which satisfies the definition of performant.
        • hinkley 6 days ago |
          “Just because you are bad guy doesn’t mean you are bad guy.”
      • hinkley 6 days ago |
        Yeah, the guy at my last place who was proud of serving < 2 req/s/core liked to use the word “powerful” too. It’s like it was his favorite word. And he’s on the short list of people I refuse to work with again. What a putz.
        • bicx 6 days ago |
          Well Notion usually exceeds at least 3 req/s/core, so nothing to worry about there
          • hinkley 6 days ago |
            <snerk>

            Well then that’s a relief.

        • AtlasBarfed 6 days ago |
          Powerful, lightweight, configurable, performant.

          These are some of the biggest weasel words of IT. Every one of them is implicitly comparative, and yet the comparison, or any sort of hard metric, is always completely absent from their use.

          • hinkley 6 days ago |
            Yarp.

            Infinite configurability means infinite validation time.

      • zahlman 6 days ago |
        >a powerful doc creation platform

        Which, based on what I see in the rendered archive.is version, is being used to do nothing outside of the normal use of a standard Markdown-based SSG like Nikola or Jekyll.

        Not that doing more would be a good idea anyway.

    • deadbabe 6 days ago |
      That’s why no one reads articles, just headlines.
    • andai 6 days ago |
      https://archive.ph/kcZcY

      Archive seems to "bake" JS sites to plain HTML.

    • ristos 6 days ago |
      The arrow and page up/down keys don't work in any predictable pattern for me; it's really weird. At first I thought it only scrolled with the arrow keys if I pressed them 4 times, but then the page up/down keys didn't work no matter how many times I pressed them. Then I focused on the page and they worked, but the arrow keys took 6 presses before moving, and when I tried the same pattern again, the arrow keys took 11 presses before they started moving. Usually a lot of modern apps predictably break the back/forward history buttons and tab focus, but I've never seen anything quite like this. I guess it must still be delivering value even if the product isn't polished.
      • hinkley 6 days ago |
        I can’t use the scroll to the top gesture in iOS either.

        I guess that just goes to show that the author’s mind was, in fact, blown.

    • uptownfunk 6 days ago |
      Wow it’s really bad.
    • debo_ 6 days ago |
      Maybe it's made entirely of functions.
    • dang 6 days ago |
      "Please don't complain about tangential annoyances—e.g. article or website formats, name collisions, or back-button breakage. They're too common to be interesting."

      https://news.ycombinator.com/newsguidelines.html

  • richrichie 6 days ago |
    Has anyone read the new SICP with Javascript as language of choice?
    • wrycoder 6 days ago |
      Yes. But, I prefer the regularity of the Lisp syntax.
    • 0xpgm 6 days ago |
      Isn't scheme, with its close-to-zero syntax, easy to learn?

      Why did someone think it was a good idea to switch to JavaScript?

      I think the person who'll get value out of SICP will not have any problem picking up scheme syntax on the fly.

      • liontwist 6 days ago |
        I agree. Being self-contained helps make it timeless. In contrast are books with a CD in the back containing an outdated Java compiler you will never be able to set up. And then you have to migrate the snippets yourself.

        If you study any other related field like math or physics you become accustomed to learning a formal system for the context of a particular problem.

        CS students tend to have this weird careerist view where every page must directly help them get a job.

        • SoftTalker 6 days ago |
          Most undergrad CS students want a practical/engineering curriculum. They are not really there for theory, but for a long time that's how CS departments operated, unless maybe you were at an engineering school.

          Schools are so desperate to keep up enrollment numbers today that many have capitulated and are giving students what they want instead of what the faculty thinks they need.

          • tharne 6 days ago |
            > Most undergrad CS students want a practical/engineering curriculum.

            If all someone wants is the practical benefits of programming and has no interest in the underlying theory, they shouldn't waste their time and money on a CS degree. All the practical information is available for free or at very low cost.

            • SoftTalker 6 days ago |
              But, a lot of employers demand a degree.
              • tharne 6 days ago |
                Maybe so, but we shouldn't be doubling down on expensive and time consuming degrees in the name of ill-conceived credentialism. That hurts everyone except the universities profiting off of it.
                • xelamonster 6 days ago |
                  How does that mean anything to the people who need to be employed to continue living? We're not the ones with the ability to change this.
            • lupire 6 days ago |
              The same applies to CS, so you're missing something else -- skilled tutors and the campus experience.
              • tharne 6 days ago |
                At least in the U.S., many students are paying upwards of $100k for a four-year degree. That better be one hell of a "campus experience" and some next-level "skilled tutors".

                Call me a hopeless optimist, but I think there's a better way out there.

                • galaxyLogic 6 days ago |
                  How about an AI-tutor? Actual professors don't have time to adapt their teaching to every individual student's knowledge background. But AI might.

                  Universities should start their own AI-tutor development programs, in co-operation with others, because the only way AI-tutors can become better is by practice, practice, practice.

                  So I'm not sure if this is a new viewpoint or not, but it is not only students who need training; it is also teachers who need to be trained more in teaching. AI is all about "training", and understanding is about training. Training is the new paradigm for me.

          • liontwist 6 days ago |
            There is a big difference between being practically minded and having an allergy to learning anything which doesn’t translate to resume keywords. SICP will teach you more about JavaScript, Python, etc. than most anything else.
          • JTyQZSnP3cQGa8B 6 days ago |
            > They are not really there for theory

            Is that why they are so bad at adapting to foreign languages and frameworks? Maybe they should go back to the basics.

          • wonnage 6 days ago |
            > Most undergrad CS students want a practical/engineering curriculum.

            Somewhat understandable considering that student loans put you into indentured servitude unless you have rich parents. Although I still think they're shortsighted. A good CS graduate should understand that programming languages are just syntactic sugar over the underlying concepts and have little trouble translating/picking up the basics of new languages.

        • lupire 6 days ago |
          You are comparing mathematicians to programmers.

          A more fair comparison is engineering or applied math major, not pure math at MIT.

          • liontwist 6 days ago |
            I don’t think so. SICP isn’t abstract algebra, it’s just unlikely to be the exact syntax you will use at your job.

            Engineers rarely do laplace transforms by hand either.

            The book is written for 1st year stem undergrads at MIT. So maybe 2nd or 3rd year at state school.

      • SoftTalker 6 days ago |
        Because knowing scheme isn't going to get you a job at most places. Employers overwhelmingly want JavaScript or Python these days. Trailing that would probably be Java, C++ and C#, and regular old C.

        When I did my undergrad CS degree, the fact that scheme was so heavily used was a common complaint they received from students. It just wasn't a marketable skill.

        • Jtsummers 6 days ago |
          Four year CS degrees usually require something around 20 (maybe even more) CS courses. Are you saying that all of those courses at your school were taught in Scheme? You never had a chance (in the classes, ignoring hobby or internships) to use other languages? That'd be a pretty unique school.

          But even if that were true and you did take 20+ classes in Scheme, you're still a college educated computer scientist. You can't pick up JavaScript or Python in time for a job interview for an entry level job? They're easy languages to learn. If you survived four years of exclusively being taught with Scheme, they'd be a breeze to pick up.

          • SoftTalker 6 days ago |
            No, not all scheme. That's an example. The intro course and the programming languages course were in scheme. There were a number of other languages used. I guess I should have been more nuanced: a number of students wanted to be taught the currently popular programming languages so they could use them on a resume. They complained about using scheme (or whatever "teaching" language a professor might require) and did not yet appreciate that the concepts/theory they were learning applied to any programming language they might need to use.

            They wanted a trade school/practical education in something immediately marketable, not a theoretical education.

            The reason I remember this is that in my "exit interview" as a senior I mentioned that I appreciated the exposure to these languages and theory and my advisor remarked "we don't hear that very often, the usual feedback is that we don't teach the languages employers want"

      • wonnage 6 days ago |
        JS is easier to read IMO. And of the widely-used interpreted languages I can think of, it's actually got the least confusing implementation of first-class anonymous functions. Python lambdas are limited to one expression, Ruby has that confusing block vs. proc vs. lambda problem, etc.

        I do feel like the value of using Scheme is teaching students early on that syntax doesn't really matter. Those that are actually interested in CS theory will find this enlightening, those that are simply in it because investment banking is so 2007 will churn out.

      • zahlman 6 days ago |
        It's also useful to be able to understand how the idioms map into the syntax of programming languages that one is actually going to use going forward. The point of SICP isn't what language you use, but how you use it, and how you think about the process of using it. Lisp itself exists because someone had the idea of taking the theoretical abstraction and actually realizing it, in notation similar to what the theorists were already using. But that similarity isn't actually relevant to core concepts like "functions as first-class objects", or referential transparency, or the substitution model of computation, or the complexity introduced by mutable state, etc. (Or, dare I say it: to the mind-expanding effects of contemplating the Y combinator.) These ideas can make you a better programmer in any programming language.

        Nor is there any good reason to filter people out preemptively. If seeing `foo(x)` instead of `(foo x)` makes the student more receptive to a proper understanding of recursion, that's just fine.
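
        (For the curious: the Y combinator's strict-evaluation variant, usually called the Z combinator, fits in a few lines of Python. This is just an illustrative sketch, not from the article.)

          # Z combinator: ties the recursive knot without any named function
          Z = lambda f: (lambda x: f(lambda v: x(x)(v)))(lambda x: f(lambda v: x(x)(v)))
          fact = Z(lambda rec: lambda n: 1 if n == 0 else n * rec(n - 1))
          print(fact(5))  # 120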

      • richrichie 6 days ago |
        Then it does not matter what language SICP chooses to illustrate timeless concepts? Even if some JS stuff changes down the line, people should be able to adapt what's in the book on the fly?
    • easeout 6 days ago |
      I haven't, but you can compare editions with this SICP Comparison Edition:

      https://sicp.sourceacademy.org/

  • marvinborner 6 days ago |
    They give a nice introduction to encoding state as pure functions. In fact, there are many more purely functional encodings for all kinds of data like trees, integers, sum/product types, images, monads, ...

    The encodings can be a bit confusing, but really elegant and tiny at the same time. Take for example a functional implementation of the Maybe monad in javascript:

      Nothing = nothing => just => nothing
      Just = v => nothing => just => just(v)
      
      pure = Just
      bind = mx => f => mx(mx)(f)
      
      evalMaybe = maybe => maybe("Nothing")(v => "Just " + v)
      console.log(evalMaybe(bind(Nothing)(n => pure(n + 1)))) // Nothing
      console.log(evalMaybe(bind(Just(42))(n => pure(n + 1)))) // Just 43
    • hinkley 6 days ago |
      [flagged]
      • 6510 6 days ago |
        If there is a wrong way to do something someone will do it.
        • hinkley 6 days ago |
          There’s an old saying attributed to the Inuit: everyone enjoys the smell of their own farts.
      • marvinborner 6 days ago |
        I think it's all right if you're used to the notation. The first two lines are tagged unions and will be recognisable as such if you're familiar with encodings like Scott/Church pairs/lists/numbers. Once you understand the structure, the definition of `bind` becomes obvious, as its two arguments represent the cases "is nothing" and "is just", where in the first case Nothing is returned, and in the second case the function is applied to the value inside the Just.

        I think that writing such code, if only for educational purposes, can be really helpful in actually understanding how the state "flows" during the monadic bind/return. Typical monad instantiations of Maybe do not give such deep insight (at least to me).
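
        For anyone who wants to poke at it in the course's language, the same encoding fits in Python's single-expression lambdas. This is just a transliteration of the JavaScript above, with names adjusted to Python style:

          nothing = lambda on_nothing: lambda on_just: on_nothing
          just = lambda v: lambda on_nothing: lambda on_just: on_just(v)

          pure = just
          bind = lambda mx: lambda f: mx(mx)(f)  # mx is passed as the "nothing" branch, so Nothing propagates

          eval_maybe = lambda m: m("Nothing")(lambda v: "Just " + str(v))
          print(eval_maybe(bind(nothing)(lambda n: pure(n + 1))))   # Nothing
          print(eval_maybe(bind(just(42))(lambda n: pure(n + 1))))  # Just 43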

        > Just because you can do a thing doesn’t mean you should.

        Of course you should, where would be the fun in that?

        • salawat 6 days ago |
          >I think it's all right if you're used to the notation.

          Higher mathematics in a nutshell.

          >Of course you should, where would be the fun in that?

          Also higher mathematics in a nutshell.

          Narrator asks: Who should we put in charge of <<thing that will affect people in a tangible way>>?

          Not the mathematicians! echoes the crowd in unanimity.

          Narrator asks: Who will we delegate the task of <<abuse of notation>> to?

          The crowd grumbles, arguing amongst themselves whether such a question even warrants an answer. A mathematician stands up, proclaiming "We'll take it!", following up with, "Once you understand the notation involved in my previous statement, you will understand why this outcome is inevitable."

          The crowd, seeing the wisdom of not even embarking on that tribulation, assents to the delegation, given the task of undoing the abuse of notation for the legibility of the layperson is also delegated to the aspiring mathematician.

          Scene opens on current day...

        • tmtvl 6 days ago |
          > The first two lines are tagged unions

          Are they? But in the Nothing you have 2 identical members (`nothing' without arguments), won't that throw an exception?

          To borrow Rust syntax (pun intended):

            enum Nothing {
              nothing,
              just,
              nothing
            };
          
          That's just weird.
          • marvinborner 6 days ago |
            When encoding tagged unions as lambdas, the tags are arguments. In this case `Nothing` has two available tags (`nothing` and `just`) and uses the tag `nothing`. `Just` does the same with the tag `just`, only that the tag gets an additional argument (as does its constructor `Just`), such that the value can be extracted afterwards - just like in an enum:

              enum Maybe<T> {
                Nothing,
                Just(T),
              }
        • IshKebab 6 days ago |
          "It's alright once you get used to it" usually means it isn't alright, in my experience. There are exceptions of course.
          • fiddlerwoaroof 6 days ago |
            This is a big reason why legacy production code bases are such a nightmare to work with: developers refuse to learn anything beyond the minimum necessary to pile on yet another band-aid fix, and the code base turns into a disorganized ball of mud.
            • ethbr1 6 days ago |
              Alternative: successive developers each piled on their own, different coding preferences, leading to frankencode that requires keeping every paradigm at once in working memory
      • williamcotton 6 days ago |
        It’s definitely easier to read in an ML language, that’s for sure!
    • SkiFire13 6 days ago |
      You can see this as replacing an inductive type with its recursor's function type. It's pretty cool in type theory, but not so good for actually programming stuff.
      • gleenn 6 days ago |
        Honest question: why is that bad for actual programming stuff? Is it because the type theory is interesting but doesn't really help? Performance?
        • jrvieira 6 days ago |
          i am guessing that most people think that the cognitive load cost is usually not worth the benefits.

          i agree that the cognitive load in a language like js which is not prepared to accommodate this paradigm is not worth it

          even when deciding to use Haskell we need to weigh the pros and cons wrt the project's goals

        • SkiFire13 5 days ago |
          In this particular case IMO it's bad because it essentially removes nominal typing for arguably no benefit.

          Even in Lean, a dependently typed language where recursors can be made explicit, people prefer using pattern matching instead of them. There is even sugar for transforming some recursors-like functions into pattern matching like syntax. FYI in Lean recursors are marked as non-computable due to performance concerns, so you can use them to write proofs but not programs.

          Seen from yet another point of view, this is transforming inductive types into a function corresponding to a visitor. And yet functional programming folks spent years trying to convince people to replace visitors with proper inductive/algebraic data types and pattern matching, so this idea is a step backwards even for them.

    • solomonb 6 days ago |
      You can derive these implementations from the recursion principle for your type:

        data Maybe a = Nothing | Just a
      
        foldMaybe :: (Unit -> r) -> (a -> r) -> Maybe a -> r
      
      The two higher order functions passed into `foldMaybe` are your `Nothing` and `Just` (modulo I added the Unit param to the Nothing case to be a little more precise).
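
      A sketch of that correspondence in Python, mirroring the JavaScript above (names are illustrative): the encoded value is just the fold with its data baked in.

        # recursion principle: fold_maybe(on_nothing)(on_just)(m)
        fold_maybe = lambda on_nothing: lambda on_just: lambda m: m(on_nothing)(on_just)

        # the encoded constructors are values "waiting to be folded"
        nothing = lambda on_nothing: lambda on_just: on_nothing(None)  # None stands in for Unit
        just = lambda v: lambda on_nothing: lambda on_just: on_just(v)

        print(fold_maybe(lambda _: "Nothing")(lambda v: "Just " + str(v))(just(42)))  # Just 42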
    • akira2501 6 days ago |
      It may be elegant mathematically, but conveyed through a language that is strictly in the ASCII character set, without any alignment or internal justification, it's really just painful to look at.
    • tempodox 5 days ago |
      In short, the untyped lambda calculus is Turing-complete.
  • wslh 6 days ago |
    > Everything Is Just Functions...

    I'd iterate on that and say: everything is just languages and dialogues, with functions being one component of them. Over time, we’ve evolved from machine languages to higher-level ones, but most popular languages today still focus on the "how" rather than the "what".

    Programming paradigms, even those like functional and logic programming, require the "how". My rant is this: the next major iteration(s) in programming languages should shift focus to the "what". By abstracting away the "how", we can reach a higher-order approach that emphasizes intent and outcomes over implementation details.

    I don't want to constrain this idea to Z3, LLMs, or low/no-code platforms, but rather to emphasize the spirit of the "what". It’s about enabling a mindset and tools that prioritize defining the goal, not the mechanics.

    I know this contradicts our work as software engineers where we thrive on the "how", but maybe that’s the point. By letting go of some of the control and complexity, we might unlock entirely new ways to build systems and solve problems.

    If I should be plain realistic, I'd say that in the middle, we need to evolve by mixing both worlds while keeping our eyes on a new horizon.

    • SoftTalker 6 days ago |
      > programming languages should shift focus to the "what"

      SQL is an example of a language that is at least somewhat like that.

          SELECT foo FROM qux WHERE bar = baz
      
      Doesn't really say "how" to do that, it only defines what you want.
      • wslh 6 days ago |
        Incorrect: you need to know the "how" to create more complex and optimal queries. Your example is like saying, in Python, you just need to write print("Hello World!") to print something.
        • moomin 6 days ago |
          That’s every programming language abstraction. All of them break when you get a fair amount of complexity or performance requirements.
          • wslh 6 days ago |
            Imagine this concrete example: you are the best developer in the world in some specific area(s), except for UX/UI. If you wanted to create a relatively simple yet secure site with user authentication, even if described declaratively as “create a secure site with user authentication,” it would still take a significant amount of time to learn technologies like React and put everything in place. There are zillions of development teams doing the same work around the world.
        • lkuty 6 days ago |
          I wouldn't say that, since SQL was an improvement over previous ways to query data, which were more concrete, like writing C code to get what you need. As such we are at a higher level of abstraction. Thus SQL specifies the "what", not the "how", with respect to those previous methods. However, in complex queries, since we are constrained by the relational model (PK/FK), we may have a feeling of having to specify too many details.
        • SoftTalker 6 days ago |
          That's why I said "somewhat."

          You aren't telling the database how to get those results from the files on the disk. You are telling it what values you want, matching what conditions, and (in the case of joins) what related data you want. If you want an aggregation grouped by some criteria you say what values you want summed (or averaged, etc.) and what the grouping criteria are, but not how to do it.

          Not a perfect example and it breaks entirely if you get into stuff like looping over a cursor but it is why SQL is usually called a declarative language.

    • fsndz 6 days ago |
      isn't that what declarative programming frameworks do already ?
      • lupire 6 days ago |
        Yeah, but a new generation is coming of age, whose teachers only learned these ideas through books, not experience. They are rediscovering computer science one blog post or tweet at a time, because books and classes are obsolete.
      • wslh 6 days ago |
        They don't do it "already", but they are one of the approaches taken. If you build state-of-the-art web UI/UX, you know that it is not just dragging and dropping objects on the screen, even though it is perfectly possible to build a tool like that.
    • anon-3988 6 days ago |
      Then nothing in this programming paradigm can interact with anything else. If I have a subroutine that does X and another that does Y, how do I compose them? Composability is basically saying "you can then do Z using X and Y", which is already explaining how it's done. So you end up with a language where no subroutine can reference another, because that would constitute the "how".
      • wslh 5 days ago |
        No, it is complementary. For example, the quicksort algorithm/function you are indirectly using is available at just another level of abstraction. Clearly we are far from writing algorithms using a "what" approach. It may even be impossible, but I wrote about this concept because we have a codebase of algorithms and higher-level frameworks that enable us to focus on the "what". I could never have imagined a working paradigm like this decades ago.
  • tasty_freeze 6 days ago |
    Neat article. But it was very difficult to navigate for me because 99% of the time I use the keyboard up/down arrows to scroll the page as I'm reading. This page swallows those keystrokes, apparently. Page up/down work, but only sometimes. I never use page up/down while reading because I'll be in the middle of a sentence of a paragraph at the bottom, hit page down, and then I need to scan my eyes back to the top of the page. First, it introduces a hiccup in the middle of a sentence, and second, because of the hiccup I often want to go back a line or two to reestablish context, but it is now offscreen. Grr.
    • smusamashah 6 days ago |
      For me it was "Your browser is not compatible with Notion." on Android with Hack's (a Hacker News client) built-in browser, which I guess is just a stripped-down WebView.
      • c64d81744074dfa 6 days ago |
        For me it was "JavaScript must be enabled in order to use Notion" (I'm a NoScript user). But it had already redirected me to another domain to show this page. How am I supposed to enable JS for the actual domain of the page? I have ways of course, but it seems like notion is deliberately flipping the bird to people like me...
        • MathMonkeyMan 6 days ago |
          For me it was perpetual loading spinner.
          • mannycalavera42 6 days ago |
            in the meantime surely plenty of telemetry being ingested. The pleasure of TheModernWeb
  • liontwist 6 days ago |
    The cons/car/cdr implementation as lambda was magical the first time I saw it. But it just shows that the language runtime must implement key/value dictionaries and you are able to borrow that implementation to make other data structures.
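
    For reference, the trick itself is tiny; a Python sketch of the SICP-style closure pair (the names are the usual ones):

      # cons closes over x and y; car/cdr just ask the closure for one of them
      cons = lambda x, y: lambda pick: x if pick == 0 else y
      car = lambda pair: pair(0)
      cdr = lambda pair: pair(1)

      p = cons(1, 2)
      print(car(p), cdr(p))  # 1 2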
    • hinkley 6 days ago |
      I find the destructuring logic in elixir much more interesting, and the watered down version in ES6 much more practical.

      In elixir you can pop off as many as you like.

      • liontwist 6 days ago |
        Can you share any resources about it?
        • hinkley 6 days ago |
          I’m a bit of an elixir noob, but Enum functions like slice let you cut up lists in various ways, and you can pattern match values in maps in function definitions:

          http://www.skuunk.com/2020/01/elixir-destructuring-function....

          Which can let you unroll function preambles, or apply different rules if for instance an admin user runs a function versus a regular user.

          • liontwist 6 days ago |
            I think this is a little different. Pattern matching gives you car and cdr, but not cons.
            • hinkley 6 days ago |
              That exists for maps and lists.
    • EuAndreh 3 days ago |
      Not key value dictionaries, just pointers are needed.

      A closure with no behaviour is just a pointer to the enclosed variable. A closure with 2 pointers is a pair, from which you can get the car and cdr.

      The runtime needs to make the pointee available outside its definition, so escape analysis, garbage collection, etc. But no dictionary is needed.

  • pjmlp 6 days ago |
    Actually it is more like Algorithms + Data Structures = Programs.
    • agumonkey 6 days ago |
      for some reason I resonate more with the fp/math/linguistic side of this coin. you don't even think about programs in the end

      that said, since I've been reading about kanren and prolog I'm about to say "everything is a relation" :)

  • behnamoh 6 days ago |
    Shameless plug: https://aplaceofmind.notion.site/It-s-Lambdas-All-the-Way-Do...

    I got to the same conclusion a while ago, except that I found that it's lambdas all the way down.

    • zahlman 6 days ago |
      "Lambdas" and functions are not different things, in a functional-programming perspective (i.e. where you're operating with referential transparency and immutable objects anyway). The lambda syntax is just function-definition syntax that doesn't include an implicit name binding.
      • behnamoh 5 days ago |
        exactly, I found that you don't even need labels for functions to be able to do programming.
  • bob1029 6 days ago |
    • qrush 6 days ago |
      this guy has Chris Fleming energy.
    • vmilner 6 days ago |
      • jazzyjackson 6 days ago |
        Cool, available on archive.org too

        Did you read the book in isolation or was it a part of a class / MOOC ?

        https://archive.org/details/all-the-mathematics-you-missed

        • vmilner 4 days ago |
          I just found it by chance somehow when trying to review gaps in my maths education - it’s excellent for reviewing complex analysis (say) if you did it years ago - or less obviously - doing a quick pass over a topic before you learn it for the first time. Then I found the excellent math sorcerer channel and his view is much the same as mine.
    • rapnie 6 days ago |
      Fellow is doing an aggressive function. I wouldn't dare put contrary functions against his output.
  • hyperbovine 6 days ago |
    Reading this brings back fond memories of taking CS61a with prof Brian Harvey at UC Berkeley some 25 years ago. Same book, same level of mind=blown, and very similar instruction style. we spent a semester instead of a week and if memory serves tuition was about the same, but they threw in some English and history courses as well :-)
    • ksd482 6 days ago |
      Same. For me it was 15 years ago, but was with Prof. Brian Harvey in Pimentel hall with the rotating stage.

      Nice memories.

      I fell in love with scheme eventually as it was such a simple syntax. Getting used to parentheses did take some time though.

    • romanhn 6 days ago |
      Same memories, and even the same timeline :) I still recall being blown away by the concept of "code is data", the magic of which I haven't encountered in professional development, alas.
    • kurinikku 6 days ago |
      OP here. Thank you for the kind words! For those who enjoyed this, I would also point out Eli Bendersky's excellent SICP series https://eli.thegreenplace.net/tag/sicp
  • revskill 6 days ago |
    I think it's a cool book for students.

    But for real world programming, the tedious parts are related to validation, parsing and other business logic.

    So I'd prefer a book that helps teach CS by using a real-world codebase to solve the real-world everyday problems of a software engineer instead.

    You can have your cake and eat it.

    • lupire 6 days ago |
      That's like teaching physics via car repair. You'll learn a few ideas, but not much of the science.

      It's practical and productive and profitable, which is great, but not really the original goal.

      • revskill 6 days ago |
        It's not a surprise that most students fail and hate abstract algebra, right? I mean, to learn a concept, you need to know more about the concept itself in a real-world context.
        • HKH2 6 days ago |
          Highly intelligent people can learn without any real world context (and I'm not one of them). Obviously there are problems with unapplied learning (or learning for learning's sake), but it is certainly possible.
    • cess11 6 days ago |
      SICP shows a real world code base. It's real world programs that build up to implementing real world programming languages.

      Why would you validate if you can parse? If you have a decent chunk of experience in implementing business logic then you know that your quality of life will be destroyed by switches and other inscrutable wormhole techniques up until the point where you learn to use and build around rule engines. SICP shows you how you can tailor your own rule engine, so you won't have to get the gorilla and the jungle when you reach for one in an enterprisey library.

  • ysofunny 6 days ago |
    alternative take: everything is just sets

    both can be a foundation for mathematics, and hence, a foundation for everything

    what's interesting is how each choice affects what logic even means?

    • MathMonkeyMan 6 days ago |
      I learned functions in terms of sets. Domain and codomain are sets. Function is a set of ordered pairs between them.

      How could we go the other way? A set can be "defined" by the predicate that tests membership, but then how do we model the predicates? Some formalism like the lambda calculus?

      • reuben364 6 days ago |
        Not sure of details to make it a mathematical foundation but:

        A category can be defined in terms of its morphisms without mentioning objects and a topos has predicates as morphisms into the subobject classifier.

      • ysofunny 4 days ago |
        i think predicates are functions returning booleans

        lambda calculus would provide a computational way to determine the truth value of the predicate, any computable predicate that is.

  • blackeyeblitzar 6 days ago |
    A computer fundamentally isn’t functions though. That’s not how a processor works. If functions are a useful abstraction, why haven’t functional languages taken off?
    • doublepg23 6 days ago |
      It seems functional language experts are too busy rewriting SICP instead of actually useful programs.
      • blackeyeblitzar 6 days ago |
        I just haven’t seen anything concrete as to why SICP’s materials are useful in either the real world or academia. Sometimes these discussions talk about how it is useful for computer science and for theory but even that seems like a claim without evidence. Is this just people reminiscing about their first introduction to programming or a favorite professor?
    • antonvs 6 days ago |
      > A computer fundamentally isn’t functions though. That’s not how a processor works. If functions are a useful abstraction, why haven’t functional languages taken off?

      If computers and their processors are a useful abstraction, why don't we write everything directly in machine language - or microcode for that matter?

      This is more about computing than about computers. As Dijkstra put it, "Computer science is no more about computers than astronomy is about telescopes."

      Computing involves languages, including many languages that are not machine languages. Every language that's higher level than machine code requires translation to actually execute on the particular machines that we've developed as a result of our history and legacy decisions.

      The lambda calculus is a prototypical language that provides very simple yet general meanings for the very concept of variables - or name-based abstraction in general - and the closely related concept of functions. It's a powerful set of concepts that is the basis for many very powerful languages.

      It also provides a mathematically tractable way to represent languages that don't follow those principles closely. Compilers perform optimizations like static single assignment (SSA), which are fundamentally equivalent to a subset of the functional concept of continuation passing style (CPS). In other words, mainstream languages need to be transformed through functional style in order to make them tractable enough to compile.
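
      As a toy illustration of continuation-passing style (just the shape, not the compiler machinery):

        # direct style
        def add(a, b):
            return a + b

        # CPS: the "rest of the computation" is passed in as an explicit function k
        def add_cps(a, b, k):
            return k(a + b)

        def square_cps(x, k):
            return k(x * x)

        # compute (2 + 3)^2 by chaining continuations
        add_cps(2, 3, lambda s: square_cps(s, print))  # prints 25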

      The mapping from a lambda calculus style program to a CPU-style register machine is quite straightforward. The connection is covered in depth in Chapter 5 of SICP, "Computing with Register Machines." Later work on this found even better ways to handle this, like Appel's "Compiling with Continuations" - which led to the SSA/CPS equivalence mentioned above.

      There's a lot to learn here. It's hard to recognize that if you know nothing about it, though.

    • dahart 6 days ago |
      Just to pick some nits with those claims… CPUs do have hardware support for functions in the form of a stack and CALL/RET instructions. Functions are a useful abstraction since more or less all software uses them. Functions and functional languages are two related but different things, and the usefulness of functions as an abstraction doesn’t depend on whether functional languages have taken off. And last, I’d say functional languages have gained ground over time, as well as semi-functional languages like, say, Python and JavaScript. Even C++ is gaining more functional language features over time.
  • fifilura 6 days ago |
    So an integer is represented by how deep in the stack you are?

    How do you represent an irregular float?

    • marcosdumay 6 days ago |
      Probably by using IEEE 754.

      Which will make any function that uses floating point numbers mind-blowingly complex. But there's probably an easier way by creating some transformation from (Integer -> a) to (F64 -> a) so that only the transformation gets complex.

      Anyway, there are many reasons people don't write actual programs this way.

  • zetranrt 6 days ago |
    David Beazley is using Scheme! That is a nice shift towards a civilized language. I hope he scraps the scheme-in-python section, but perhaps that is intended as an exit drug for Python addicts.
    • surfingdino 6 days ago |
      I'd rather see FP addicts exit Python ecosystem. I had dubious pleasure of working with Python codebases written by FP enthusiasts. It was hell. Simple, well-understood problem domains were made way more complex by introduction of GraphQL and FP resulting in about a dozen levels of nesting for something that could have been a simple FastAPI app.
  • js2 6 days ago |
    There's a typo in the code in "the substitution model" section:

      ("+", ("fib", ("-", "n", 2)), ("fib", ("-", "n", 1))),
    
    The two calls to `fib` are surely meant to be `fibonacci` since the latter is defined, but not the former. Indeed, the code is correct in the github repo:

    https://github.com/savarin/pyscheme/blob/0f47292c8e5112425b5...

    • kurinikku 6 days ago |
      OP here. Thank you!
  • Animats 6 days ago |
    "Everything is just" approaches usually result in hammering things that don't fit into fitting. That often ends badly. Computing has been through, at least:

    - Everything is just a function (SICP)

    - Everything is just an object (Smalltalk, and to some extent Java)

    - Everything is just a closure (the original Common LISP object system)

    - Everything is just a file of bytes (UNIX)

    - Everything is just a database (IBM System/38, Tandem)

    • Maxatar 6 days ago |
      None of the things you mention ended badly though. I think all of those approaches you list are incredibly useful and important concepts and I am very happy that I not only know them, but that because of how universal they are I can leverage my knowledge of one approach to learn or apply another approach.
      • thethimble 6 days ago |
        Another angle on this is that there’s many formal axiomatic ways to define computing.

        Everything is just a Turing machine. Everything is just a function. Everything is just Conway's Game of Life.

        The fact that all of these forms are equally expressive is quite a surprise when you first discover this. Importantly, it doesn’t mean that any one set of axioms is “more correct” than the other. They’re equally expressive.

        • brudgers 6 days ago |
          Everything is just a Turing machine.

          That one ends in a tarpit where everything is possible but nothing of interest is easy.

          • Animats 6 days ago |
            That's the generic problem with "Everything is a ...". Trying to force things into a paradigm that doesn't fit well complicates things.
            • brudgers 6 days ago |
              The generic problem is every generation thinks they invented sex.

              https://www.cs.yale.edu/homes/perlis-alan/quotes.html

            • grugagag 6 days ago |
              It is a simplification that makes it easier to grasp a paradigm. Sure, it could be taken to extremes, pretending nothing else exists outside this "everything is a ..." bubble. Luckily we can learn from others' mistakes and not fall into traps too often.
          • zahlman 6 days ago |
            >where everything is possible but nothing of interest is easy.

            Real development IMX is not much different. People just have low standards for "interesting" nowadays, and also have vastly increased access to previous solutions for increasingly difficult problems. But while modern programming languages might be more pleasant to use in many ways, they have relatively little to do with the combined overall progress developers have made. Increased access to "compute" (as they say nowadays), effort put into planning and design, and the simple passage of time are all far more important factors in explaining where we are now IMO.

      • hobs 6 days ago |
        I would go further and say that each one of these were so useful that they presented entirely new sets of problems to attempt to solve, because of how many other problems they directly addressed.

        It's like being mad that the hammer was so successful we invented the screw to improve on its greatest hits.

    • kurinikku 6 days ago |
      OP here. I would add "All you need is NAND".
      • philipov 6 days ago |
        "All you need is Transistor" - can't believe how badly that ended!
        • esalman 6 days ago |
          It's because all you need is Ohm's law.
          • rusk 6 days ago |
            James Clerk Maxwell has entered the conversation
          • osigurdson 6 days ago |
            Ohm's law doesn't really work for transistors though.
    • jdougan 6 days ago |
      - Everything is just a filesystem (Plan9/Inferno)

      - Everything is just a buffer (K&R C and many of its descendants)

      - Everything is just a logical assertion (Prolog)

      I look at the list and I see a bunch of successes, though some of them are niche.

      • galaxyLogic 6 days ago |
        Everything (expressed with language) is just a model of something else.

        By making the model follow some simple rules, which we think the real thing follows as well, we can reason about what happens when some inputs to the real thing being modeled change, by running our model (-simulation).

        Thus you could add to your list: "Everything is just a simulation".

        Except the real thing of course :-)

    • osigurdson 6 days ago |
      How about everything is just state and transformations.
    • hehehheh 6 days ago |
      Everything is just a turing machine after all* **

      * modulo infinity

      ** except a small number of languages that are not

    • dadadad100 6 days ago |
      Everything is a rule (Pega)
    • jancsika 6 days ago |
      > - Everything is just a closure (the original Common LISP object system)

      What doesn't fit into this particular "everything?"

  • vonnik 6 days ago |
    Imagine if that statement applied to every non-digital thing as well.
  • asah 6 days ago |
    Not a fan of everything-is-a-function because it's oversimplistic and often unhelpful. Some of the issues:

    - functions that don't fit in cache, RAM, disk, etc.

    - functions that have explosive big-O, including N way JOINs, search/matching, etc.

    - functions with side effects, including non-idempotent. Nobody thinks about side channel attacks on functions.

    - non-deterministic functions, including ones that depend on date, time, duration, etc.

    - functions don't fail midway, let alone gracefully.

    - functions don't consume resources that affect other (cough) functions that happen to be sharing a pool of resources

    - function arguments can be arbitrarily large or complex - IRL, there are limits and then you need pointers and then you need remote references to the web, disk, etc.

    (tell me when to stop - I can keep going!)

    • nephanth 6 days ago |
      Oversimplifying can be great at times. In this case, the lambda-calculus model (which is the base for this type of "everything is just a function" approach) is a great model of computation because it is so simple, while being easy to handle /reason about (compared to eg. Turing machines), which is why it is at the base of most computer logic/proof systems
    • zelphirkalt 6 days ago |
      Half of what you call functions in that comment, are not actually functions and in the FP world many would not call them functions. Rather they are procedures. Functions are procedures, but not all procedures are functions.
      • SilasX 6 days ago |
        Which is also a problem with thinking this is a helpful abstraction: apparently, not everything you need to do can be captured by functions (in that sense)!
      • magicalhippo 6 days ago |
        I'm just used to the words from Pascal. What's the definition of a procedure in the FP world?
        • zelphirkalt 6 days ago |
          Afaik, the definition is just "a sequence of steps/instructions". That is of course much broader than a (simple) function, which, among other things, gives you one output for one input, and always the same output for the same input. A procedure can do that too, but it can also give you different outputs for the same input, due to side effects.
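
          A quick Python illustration of the distinction (names made up for the example):

            import random

            # function: same input, same output, no side effects
            def double(n):
                return 2 * n

            # procedure: a sequence of steps; the same input may give different outputs
            def roll(sides):
                return random.randint(1, sides)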
          • magicalhippo 6 days ago |
            So from that perspective functions are pure and procedures are impure?

            One of the things I miss from my C++ days was the ability to mark functions as const, which made them fairly pure.

            Being able to clearly mark which is which, but also to combine them easily, was very productive.

            • zelphirkalt 5 days ago |
              Yes, that is my understanding of the terminology, and how I try to use the terms (not always succeeding).
    • anon-3988 6 days ago |
      I think it's the only hope for theoretical purposes.
    • lmm 6 days ago |
      Many of those things can be modelled as functions, they just need to actually be written that way (e.g. if you have a function that requires some resource, maybe it should require that resource! If it depends on the date/time, maybe it should depend on the date/time! If it returns a nondeterministic value, maybe it should return a nondeterministic value!). I think the functional programming approach shines partly because it forces you to take these things seriously: if you want to use e.g. implicitly shared resources, you need to model that, and "functions" that rely on implicitly shared resources are going to be explicitly distinct from actual function functions.
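
      A small Python sketch of what "making the dependency explicit" can look like (the names here are illustrative, not from any particular library):

        from datetime import datetime, timezone

        # implicit dependency on the clock: not really a function of its arguments
        def greeting_implicit(name):
            return f"Good {'morning' if datetime.now().hour < 12 else 'evening'}, {name}"

        # explicit dependency: the current time is just another argument
        def greeting(name, now):
            return f"Good {'morning' if now.hour < 12 else 'evening'}, {name}"

        print(greeting("Ada", datetime.now(timezone.utc)))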
  • gugagore 6 days ago |
    I recently came across the notion that you need inductive data types (and can't just use Church encodings) if you want to do theorem proving, like proving that `0 != 1`.

    I threw up some content up here: https://intellec7.notion.site/Drinking-SICP-hatorade-and-why... , along with an unrelated criticism of SICP.

    I'd like to better understand what the limitations are of "everything is just a function".
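
    For concreteness, the Church encoding in question represents a number as "apply f that many times"; a minimal Python sketch:

      zero = lambda f: lambda x: x
      succ = lambda n: lambda f: lambda x: f(n(f)(x))

      to_int = lambda n: n(lambda k: k + 1)(0)
      print(to_int(succ(succ(zero))))  # 2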

    • schoen 6 days ago |
      I think you could prove 0 ≠ 1 if you had some other concrete fact about inequality to make use of. You could reason from the theorem "f = g -> f x = g x" to create your inequality fact on the right side and then take the contrapositive.

      It seems correct to me that you can't directly prove inequality between Church numerals without starting with some other fact about inequality. Whereas with inductive data types, a proof system can directly "observe" the equality or inequality of two concrete instances of the same inductive type, by recursively removing the outermost constructor application from each instance.

      • schoen 6 days ago |
        Looking at the linked references, I'm not sure what else is being assumed to be available or not available when considering proofs using the Church numerals. I guess that will matter a lot, and I don't know enough to make more general statements about what is or isn't sufficient here.
    • gugagore 6 days ago |
    • solomonb 6 days ago |
      Well for theorem proving you need Sigma and Pi types plus some notion of equality. Can you achieve those with Scott or Church encoding?
  • lifeisstillgood 6 days ago |
    David Beazley is a bit of a legend in the Python world, and honestly this course seems a surprising idea, but it took about two seconds' thought before it seemed a perfect match, and I have signed up for the next one.

    The relevant part is that this is basically what "software engineers' continual education" is going to look like.

    • AtlasBarfed 6 days ago |
      "legend in the python world"

      That's a fun statement.

    • wrycoder 6 days ago |
      If he didn’t cover Chapter five on compiling, he didn’t cover SICP’s best part.
  • shdh 6 days ago |
    From my perspective all software is essentially applying transformation functions to some data.

    There are externalities like networking and storage, but still data transformation in a way.

  • fngjdflmdflg 6 days ago |
    Surprised "mind-blowing" is not in the HN clickbait filter.
    • dang 6 days ago |
      It is now.
  • elcritch 6 days ago |
    And functions are just numbers combined with if/else's and a pretty name at the end of the day.

    If this number, jump to numberA otherwise jump to this numberB. Also if numberC store numberD at numberE. ;)

  • zahlman 6 days ago |
    I've watched the actual SICP lectures before (the 1986 recordings on MIT OCW). They're often praised for their information density, but they actually still waste a lot of time on listening to students' Q&A, on the lecturers drawing the class's attention to various attempts at "multimedia" presentation in the classroom, on simply not having the entire lesson plan worked out in advance (i.e., not being able to preempt the Q&A), etc. For that matter, the sheer amount of time spent writing things on a chalkboard really adds up.

    And of course the order of the material could be debated and rearranged countless ways. One of my future planned projects is to do my own video series presenting the material according to my own sensibilities.

    It's nice to hear that the course apparently still stays true to its roots while using more current languages like Python. Python is designed as a pragmatic, multi-paradigm language and I think people often don't give it enough credit for its expressive power using FP idioms (if not with complete purity).

    • rjagy 6 days ago |
      The course is using Python to implement a Scheme, then uses Scheme to implement a Scheme. Python could and should be removed from the course.

      Python has very poor support for functional programming. Lists are not cons based, lambdas are crippled, pattern matching is horrible and not even expression based, namespaces are weird.

      Python is not even a current language, it is stuck in the 1990s and happens to have a decent C-API that unfortunately fueled its growth at the expense of better languages.

      • hackboyfly 6 days ago |
        Interesting, I think this is the first time I have seen anyone bash Python this hard.

        Why would a decent C-API fuel its growth? Also can you give me some examples of better languages?

        Am no senior developer but I find python very elegant and easy to get started with.

        • linguae 6 days ago |
          I’m not the parent poster, but I’ve seen two major spurts of Python’s popularity: (1) the mid-2000s when Python became a popular scripting language, displacing Perl, and (2) beginning in the first half of the 2010s when an entire ecosystem of Python APIs backed by code written in C, C++, and even Fortran made up the infrastructure for machine learning code (e.g., NumPy, SciPy, scikit-learn, Pandas, etc.). If Python didn’t have a good way of interfacing with code written in languages like C, then it might not have been as popular among machine learning researchers and practitioners, who needed the performance of C/C++/Fortran for numerical computing but wanted to work with higher levels of abstraction than what is provided in those languages.

          What drew me to Python back in 2006 as a CS student who knew C and Java was its feeling like executable pseudocode compared to languages that required more “boilerplate.” Python’s more expressive syntax, combined with its extensive “batteries included” standard library, meant I could get more done in less time. Thus, for a time in my career Python was my go-to language for short- and medium-sized programs. To this day I often write pseudocode in a Python-like syntax.

          Since then I have discovered functional programming languages. I’m more likely to grab something like Common Lisp or Haskell these days; I find Lisps to be more expressive and more flexible than Python, and I also find static typing to be very helpful in larger programs. But I think Python is still a good choice for small- and medium-sized programs.

          • hackboyfly 6 days ago |
            Ah thanks, that makes sense!
          • taeric 6 days ago |
            I'm convinced Python's main asset for its growth was how ubiquitous it was. It was basically preinstalled everywhere. With the batteries-included idea, you were mostly good with the basics, too.

            This changed with heavy use, of course, such that packaging is now a main reason to hate Python. Comically so.

            • pjmlp 6 days ago |
              In the 2000s you still had to install it yourself on most UNIXes.
              • taeric 6 days ago |
                That may be. I recall Perl being bigger back then. I mainly consider Python to have grown around the 2010s, and by then it was basically already on most machines.

                It is only in the later versions, where they have pushed compatibility boundaries, that this has gotten obnoxious.

          • pjmlp 6 days ago |
            Forgetting about Zope.
        • fiddlerwoaroof 6 days ago |
          I try to avoid python in production code bases as much as possible: dependency management issues alone are a good reason to do so for anything that will last longer than a year or two.
        • bitwize 6 days ago |
          It was relatively easy to lash Python as a higher-level orchestration layer to popular number crunching libraries, yielding NumPy and similar, which made Python popular for machine learning applications.

          If you're used to Scheme, Common Lisp, or Haskell, Python's arbitrary decisions about e.g. lambda or how scopes work may be grating. But Python is the BASIC of the modern day, and people laughed at BASIC in the 80s too... except businesses ran on BASIC code and fortunes had been made from it.

      • tikhonj 6 days ago |
        Not very Schemey, but at least modern Python has basically full-on algebraic data types thanks to type hints, immutable dataclasses and structural pattern matching.

        It's still not great for functional programming, but far, far better than it used to be.
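        A minimal sketch of that style (my own illustration, not from any particular codebase; needs Python 3.10+ for match):

          from dataclasses import dataclass

          @dataclass(frozen=True)
          class Leaf:
              value: int

          @dataclass(frozen=True)
          class Node:
              left: "Leaf | Node"
              right: "Leaf | Node"

          def total(tree) -> int:
              # structural pattern matching over the two immutable cases
              match tree:
                  case Leaf(value=v):
                      return v
                  case Node(left=l, right=r):
                      return total(l) + total(r)

          print(total(Node(Leaf(1), Node(Leaf(2), Leaf(3)))))  # 6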

        • nextos 6 days ago |
          IMHO the main problem is the fact that lambda expressions have been deliberately crippled. Ruby is often described as a good-enough Lisp despite not being homoiconic. That's because, like all modern Lisps, it makes pervasive use of blocks, procs and lambdas. Python could have been a very similar language, but Guido held a vocal anti-FP stance. Perhaps this can be addressed now, as other interesting features like the ones you outlined have been added to the language, but it'd have a very deep impact on nearly every API.
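          (A concrete illustration of the restriction, for anyone who hasn't hit it: a lambda body must be a single expression, so anything with statements needs a full def. Names below are made up.)

            square = lambda x: x * x        # fine: one expression

            # not allowed, SyntaxError: statements can't appear in a lambda body
            #   bad = lambda x: y = x * x

            def square_and_log(x):          # the workaround is always a named def
                y = x * x
                print(y)
                return y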
          • User23 6 days ago |
            Back in the day I used to jokingly describe Perl5 as a "lisp-5ish." Lisp-1s like Scheme have a single namespace for all functions and values. Lisp-2s like Common Lisp allow a symbol to have separate bindings for functions and values. Perl symbols on the other hand have at least: scalar, array, hash, and code (it's been a long time since I did XS). I'm fairly confident I'm forgetting at least one or two on top of that.

            If you avoided the many footguns the language offered, you could actually write pretty clean functional code. Of course you had to be extremely diligent with your testing because the interpreter did not provide much help even with warnings and strict enabled.

          • pjmlp 6 days ago |
            Most of those Ruby capabilities were already present in Smalltalk, but unfortunately the only thing many people know about Smalltalk is that it gave birth to OOP and GUIs.
      • nyrikki 6 days ago |
        While I am a huge Lisp fan, oh... the irony of saying that Python is stuck in the 1990s when CONS, CAR and CDR are artifacts of the IBM 704 and Fortran :)

        While I do find it annoying that Python used 'list' to mean 'dynamic array', it is a lot better than a ton of Church encoding in the other common teaching language, Java.

        Linked lists may not be native in Python, but it is trivial to implement them.
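        For example, a minimal cons-cell sketch (just an illustration, not from any course material):

          from dataclasses import dataclass
          from typing import Any, Optional

          @dataclass(frozen=True)
          class Cons:
              car: Any
              cdr: Optional["Cons"] = None

          def from_iterable(xs):
              # build the list back-to-front so the head ends up first
              head = None
              for x in reversed(list(xs)):
                  head = Cons(x, head)
              return head

          def to_pylist(cell):
              out = []
              while cell is not None:
                  out.append(cell.car)
                  cell = cell.cdr
              return out

          print(to_pylist(from_iterable([1, 2, 3])))  # [1, 2, 3]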

      • melodyogonna 6 days ago |
        Calling Python old in the same comment that talks about a Lisp dialect is funny.

        Anyway, Python is intentionally not functional because Guido dislikes functional programming.

    • CodeArtisan 6 days ago |
      There is also the course from ArsDigita University. The site is offline now but the courses are available on archive.org.

      https://en.m.wikipedia.org/wiki/ArsDigita#ArsDigita_Foundati...

      https://archive.org/details/arsdigita_01_sicp/

      They were selling USB keys with the entire curriculum; if someone could upload an ISO, that would be amazing. https://web.archive.org/web/20190222145553/aduni.org/drives/

    • zelphirkalt 6 days ago |
      FP in Python is rather weak; even JS does a better job there. Some of the code exercises will need completely different solutions than in Scheme, because Python has no TCO. What do instructors do when their 1:1 translated code fails? Tell the students that, due to the choice of language, it does not work that way and that they simply have to believe it? Or do they treat it all as an "externalize the stack" problem and solve it that way?
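      To make the TCO point concrete, here is the kind of 1:1 translation that fails in CPython (illustrative only, names made up):

        import sys

        def count_down(n):
            # a tail call in Scheme; CPython still grows the stack for it
            return "done" if n == 0 else count_down(n - 1)

        print(sys.getrecursionlimit())   # typically 1000
        print(count_down(100))           # fine
        # count_down(100_000)            # RecursionError: maximum recursion depth exceeded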

      It seems rather silly to force SICP into Python.

      • agent281 6 days ago |
        Pyret would be a good alternative. It's designed for education by racketeers.

        https://pyret.org/

      • FreakLegion 6 days ago |
        CPython doesn't do TCO, but you can implement it yourself in Python.

        This version is one of my all-time favorite StackOverflow answers: https://stackoverflow.com/questions/13591970/does-python-opt...
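        The general shape of it is a trampoline; a minimal sketch (not the code from that answer, just the idea, with made-up names):

          def trampoline(fn, *args):
              # keep forcing thunks until we get a real (non-callable) value back
              result = fn(*args)
              while callable(result):
                  result = result()
              return result

          def count_down(n):
              # return a thunk instead of making the recursive call directly
              return "done" if n == 0 else (lambda: count_down(n - 1))

          print(trampoline(count_down, 100_000))  # no RecursionError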

        • zelphirkalt 6 days ago |
          Looks like trampolines inside a Y combinator. Interesting approach, though of course it comes with its own costs. I think some of the Scheme-in-JS implementation(s?) use trampolines as well, or at least considered doing so.
    • jyounker 6 days ago |
      The issue with Python and most other non-Lisp-family programming languages is that they don't let you easily treat programs as data. Things that are simple to express in Scheme become complicated exercises in most other languages. Instead of focusing on the underlying concept, students end up having to focus on the implementation language's details in a way that they don't with Scheme.
  • pinerd3 6 days ago |
    Just fyi (perhaps @dang) this jumps to the Postscript of the blog post due to the anchor/hash in the URL. I was a bit confused initially.
    • layer8 6 days ago |
      The title is also wrong. I was wondering if the submitter maybe did mean to link to the postscript, but it doesn’t fit the title any better.
  • rockwotj 6 days ago |
    Everything is just assembly!
  • emmanueloga_ 6 days ago |
    Warning: Diving into SICP/Lisp/Scheme can transform how you think about programming... food for thought is always welcome! But applying those ideas wholesale to OOP codebases often backfires or gets pushback from teammates. Languages handle paradigms differently, and going against the grain usually leads to worse code.

    Example: After Lisp, you might replace every for loop with forEach or chain everything through map/reduce. But unless you’re working in a language that fully embraces functional programming, this approach can hurt both readability and performance.
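    A contrived Python sketch of that point (made-up numbers, purely illustrative):

      from functools import reduce

      data = [3, 1, 4, 1, 5, 9]

      # "Lisp brain": everything chained through filter/map/reduce
      total = reduce(lambda acc, x: acc + x,
                     map(lambda x: x * x, filter(lambda x: x % 2, data)), 0)

      # idiomatic Python: a comprehension says the same thing more plainly
      total2 = sum(x * x for x in data if x % 2)

      assert total == total2  # both are 117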

    At the end of the day, it’s grounding to remember there’s mutable memory and a CPU processing the code. These days, I find data-oriented design and “mechanical sympathy” (aligning code with hardware realities) more practical day-to-day than more abstract concepts like Church numerals.

    • hmmokidk 6 days ago |
      The thing is, I find immutable is safer. Less side effects is more predictable. OO is often used when things could just be functions. I love gamedev, and OO makes a lot of sense there because all of the computation and unsafe code is kind of okay. But for webdev, FP makes so much more sense. For SaaS, something like Elixir enables me to write more reliable / less buggy / better tested code.
      • anhner 6 days ago |
        How do you deal with lack of types? (I know Elixir is adding types, but from my understanding it's nothing like e.g. TypeScript.)

        I'm thinking about learning Elixir, but the lack of types is kind of a turn-off for me.

        • cess11 6 days ago |
          In Elixir specifically you've got structs that are somewhat similar to types, %MyStuff{}, and if you need to you can use Ecto to get some guarantees. You also tend to focus more on the shape of data than the name, e.g. with pattern matching in places like function declarations and case expressions, which emulates quite a bit of what you typically use a type system to accomplish.

          Here's a summary of the type system they're exploring for Elixir: https://hexdocs.pm/elixir/main/gradual-set-theoretic-types.h...

        • TacticalCoder 6 days ago |
          > How do you deal with lack of types?

          Not GP but I'm using Clojure for both the front-end (ClojureScript) and the "back-end" (server running Clojure), sharing Clojure code between the two.

          Clojure is not typed but I use Clojure specs. It's not a type system, but it's really nice. You can spec everything, including spec'ing functions: for example, you can do stuff like "while in dev, verify that this function indeed returns a collection that is actually sorted every time it is called". I'm not saying "no types + Clojure specs" beats types, but it exists and it helps to solve some of the things types are useful for.

          https://clojure.org/guides/spec

          • anhner 6 days ago |
            > _for example, you can do stuff like "while in dev, verify that this function indeed returns a collection that is actually sorted every time it is called"._

            This sounds interesting. Do I understand correctly that this checks the "spec" at runtime? What happens if a spec fails?

        • shawa_a_a 6 days ago |
          Not the other commenter, but my team has been using Elixir in production (soft real-time distributed systems) for several years with great success. The approachable syntax has been great for folks new to the language coming on board and sort of not realising they're "doing FP".

          Generally I'd say Elixir's lack of "hard" static typing is more than made up for by what you get from the BEAM VM, OTP, its concurrency model, supervisors, etc.

          That said, if you're interested in leveraging the platform whilst also programming with types, I'd recommend checking out Gleam (https://gleam.run), which I believe uses an HM type system.

      • delusional 6 days ago |
        If I could get one wish, it would be to ban "makes sense" from any engineering discussion ever. Facts do not "make sense"; the world is not a "sense-making" machine. We make sense of the world and of the facts. Something feeling intuitive, which is what people often mean when they claim something "makes sense", just means it slots into your existing experience.
        • fud101 5 days ago |
          "Young man, in mathematics you do not understand things. You just get used to them.” - John von Neumann
      • oersted 6 days ago |
        Even in GameDev there is a strong trend towards data-oriented programming and stateless logic, led by the ECS (Entity-Component-System) paradigm.

        ECS is all about composition rather than inheritance, and decoupling logic from data. It is not strictly immutable, for performance reasons, but it has a similar character to the immutable functional state-management frameworks in WebDev (Redux, Elm and co).

        It's not just about maintainability; it can actually be awkward to fit certain patterns into ECS. But it has significant advantages in terms of performance (particularly CPU cache-friendliness) and the ability to massively parallelize safely without having to think too much about it. It can also be a helpful abstraction for distributed computing and networking in multiplayer.
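        A toy sketch of the composition idea in Python (nothing like a real ECS engine, just the shape of it; names are made up):

          # components are plain data keyed by entity id; entities are just ids
          positions = {0: (0.0, 0.0), 1: (5.0, 5.0)}
          velocities = {0: (1.0, 0.0)}          # entity 1 has no velocity component

          # a "system" is logic that runs over whichever entities have the components it needs
          def movement_system(positions, velocities, dt):
              for entity, (vx, vy) in velocities.items():
                  x, y = positions[entity]
                  positions[entity] = (x + vx * dt, y + vy * dt)

          movement_system(positions, velocities, dt=0.5)
          print(positions)  # {0: (0.5, 0.0), 1: (5.0, 5.0)}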

      • tempodox 5 days ago |
        Fewer side effects.
    • llm_trw 6 days ago |
      >At the end of the day, it’s grounding to remember there’s mutable memory and a CPU processing the code.

      And yet goto is even less popular than Lisp, despite being the only control flow that CPUs actually implement.

      What's even more bizarre is that, despite caches being the only way to get any performance out of modern CPUs, we still don't have a language that treats the memory hierarchy as a first-class citizen. The closest is Linux-flavored C with a sea of underscores.

      • taeric 6 days ago |
        Lisp has goto, if you want it. :)

        I confess it has made transcribing some algorithms a bit easier.

      • renox 5 days ago |
        I don't find it bizarre: even assembly language doesn't give you much control over the memory hierarchy; there are prefetch loads and nothing else...
        • llm_trw 5 days ago |
          So much the worse for assembly language.

          Lest we forget that modern x86 machine code is itself effectively an assembly language, and the real machine code, or microinstructions in newspeak, is completely hidden from view.

    • pjmlp 6 days ago |
      Like Common Lisp Object System?

      Besides, stuff like forEach and map/reduce was already present in Smalltalk collections, and was copied into Object Pascal and C++, even if without first-class grammar for it.

      Exactly because the underlying memory is mutable, ML languages provide mechanisms for mutation when it is really needed.

  • User23 6 days ago |
    I think SICP is wonderful.

    As I've learned more and studied more math though, I've come to the conclusion that the relation really is the more fundamental primitive. Every function can be represented as a kind of restricted relation, but the converse is not true, at least not without adding considerable extra gadgetry.

    While of course relational databases and SQL are the best known examples of relational programming, and are highly successful, I still believe it's a largely untapped space.

    However, my interest currently is less in the space of programming language design and more in teaching very young children the basics of math. And for whatever reason it's considerably easier to teach a predicate like "is big" as a 1-ary relation and "is bigger than" as a 2-ary relation than trying to capture the same concepts as functions.
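    A tiny sketch of what I mean, with relations as plain sets of tuples (illustrative only; names made up):

      # "is bigger than" as a 2-ary relation: just a set of pairs
      bigger_than = {("elephant", "mouse"), ("mouse", "ant"), ("elephant", "ant")}

      # a function is the restricted case where each input appears in exactly one pair
      double = {(1, 2), (2, 4), (3, 6)}

      def is_function(relation):
          firsts = [a for a, _ in relation]
          return len(firsts) == len(set(firsts))

      print(is_function(bigger_than))  # False: "elephant" relates to two things
      print(is_function(double))       # True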

  • upghost 6 days ago |
    Dave is a stellar instructor with a bottomless well of knowledge. One of the few people I go to when I have a Python question I can't figure out. I've taken every course he has available: compilers, lisp, raft, and advanced python salary*. I can say definitively that my salary has doubled twice thanks to his courses, so well worth the effort in my opinion.

    That being said... SICP, Compilers, and RAFT all left me with the gnawing feeling that there was more juice to be squeezed from computer science than I was able to understand.

    Why were my parsers and compilers not bidirectional? Why am I writing a utilitarian OO parser for an elegant FP language? Why is there no runtime polymorphic isomorphism for my client/server RAFT nodes?[1]

    Drowning in my own sorrows, I began binge drinking various languages, drifting from one syntax to the next. CL, Scheme, Java, C#, Haskell, Forth, TypeScript, Clojure... ahh, long love affair with Clojure. But in the end, even Clojure, for all of its elegance, could not solve the original problem I was facing and at the end of the day in many ways was just "a better Python".

    I was nearly ready to call it quits and write my own language when all of a sudden I discovered... Prolog.

    Not just any Prolog -- Scryer Prolog. Built in Rust. Uncompromising purity. And Markus Triska's Power of Prolog videos on YouTube.[2]

    My God. They should have sent a poet.

    Bidirectional parsing? No -- N-dimensional parsing via definite clause grammars, shipped with the standard library. First-class integer constraint solvers. And it is the first language I have ever worked with that takes the notion of "code is data" seriously.

    Scryer is still a young language, but it's going big places. Now is a great time to get involved while the language and community is still small.

    I will end this love letter by saying that I owe my career and much of my passion to Dave, and because of him Python is still how I earn my bread and afford expensive toys for my Yorkie.

    You rock, Dave!

    [*]: this was a typo, it should be Advanced Python Mastery, but in many ways advanced python salary is actually more accurate, in my case anyway.

    [1]: If issues like this don't bother you, and you haven't been coding for at least 10-15 years, maybe come back later. Learning Prolog and Scryer in particular is really hard at first, even harder than Haskell/Clojure.

    [2]: https://youtube.com/@thepowerofprolog

    • gottacodethem 4 days ago |
      What kind of things are you using Prolog for? General-purpose computing? How does it fare vs Clojure with regard to libraries?
      • upghost 4 days ago |
        Great questions. What initially attracted me to it was Definite Clause Grammars[1]; easier to use than regular expressions and Turing complete. This solves my bidirectional parser problem and my elegance problem.

        At first I did not expect that I would find Prolog useful for general-purpose computing; it turns out Prolog has some surprising properties that I wish were present in other languages such as Clojure.

        Arguments passed to Prolog "predicates" (the unit of work, similar to functions) are NOT evaluated UNLESS you use "call" (similar to eval) on them directly. That may seem crazy at first, but the implication is that it completely erases the line between code and data. Yes, Lisp has macros, but there is a heavy cultural and technical line between a function and a macro. In Prolog, there is no distinction and no stigma around seamless metaprogramming.

        Regarding the library story, it depends on how you look at it. Very soon Scryer Prolog will finish being embeddable in other languages, so in a sense it will have access to every library from every language. But from a vanilla perspective, it has incredibly powerful first-class constraint solvers, which I've never seen in any other general-purpose language.

        [1]: https://youtu.be/CvLsVfq6cks