https://plato.stanford.edu/entries/church/
Though I heard that it also neglects some major parts of his work. I assume it was too philosophical for mathematicians and too technical for philosophers.
[I have known the above quote forever, but I can't find an original source]
edit: might be from Guy Steele: "And some people prefer not to commingle the functional, lambda-calculus part of a language with the parts that do side effects. It seems they believe in the separation of Church and state"
Those are the ones that make me wish people knew just enough Computer Science to get them :)
Which is a reference to the "two hard problems" jokes, the most common being "There are two hard problems in Computer Science: cache invalidation, naming things, and off-by-one errors"
But there is also "Two hard problems in distributed systems: Exactly-once delivery, guaranteed order of messages, and exactly-once delivery".
It's just as funny today as when I first heard it in 2002/2003... to which my professor at the time would add:
"But really, naming things is colouring. And cache invalidation is colouring. So really there's only one problem in computer science, colouring and counting... which doesn't sound so hard after all".
You may get it.
1: >Niklaus Wirth's quote
https://news.ycombinator.com/item?id=42054371
2: >the joke about there being 10 kinds of people.
those who understand binary and those who don't.
https://people.csail.mit.edu/gregs/ll1-discuss-archive-html/...
ha, good one.
reminds me of the joke about niklaus wirth's name.
[ Quotes about Niklaus Wirth
Whereas Europeans generally pronounce his name the right way ('Nick-louse Veert'), Americans invariably mangle it into 'Nickel's Worth.' This is to say that Europeans call him by name, but Americans call him by value.
Introduction by Adriaan van Wijngaarden at the IFIP Congress (1965). ]
from:> The name lambda comes from the mathematician Alonzo Church's notation for functions (Church 1941). Lisp usually prefers expressive names over terse Greek letters, but lambda is an exception. A better name would be make-function. Lambda derives from the notation in Russell and Whitehead's Principia Mathematica, which used a caret over bound variables: x̂(x + x). Church wanted a one-dimensional string, so he moved the caret in front: ^x(x + x). The caret looked funny with nothing below it, so Church switched to the closest thing, an uppercase lambda, Λx(x + x). The Λ was easily confused with other symbols, so eventually the lowercase lambda was substituted: λx(x + x). John McCarthy was a student of Church's at Princeton, so when McCarthy invented Lisp in 1958, he adopted the lambda notation. There were no Greek letters on the keypunches of that era, so McCarthy used (lambda (x) (+ x x)), and it has survived to this day.
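To make the "make-function" point concrete, here's a minimal Common Lisp sketch (the values are just for illustration): lambda builds an anonymous function you can call on the spot or pass around.

    ;; calling an anonymous function directly
    ((lambda (x) (+ x x)) 21)                ; => 42
    ;; or passing one to a higher-order function
    (mapcar (lambda (x) (+ x x)) '(1 2 3))   ; => (2 4 6)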
So, yes, on the topic of this post - Church pops up in loads of Lisp retrospectives. Maybe he's "forgotten" by people with very little engagement in the history of computing.
[1] https://raw.githubusercontent.com/emintham/Papers/master/Chu...
But this https://www.classes.cs.uchicago.edu/archive/2007/spring/3200... mentions "THE CALCULI OF LAMBDA-CONVERSION" linked here https://compcalc.github.io/public/church/church_calculi_1941...
I have a memory of a paper by McCarthy himself where he recounts that the first implemented Lisp had a notation close to FORTRAN. S-expressions were only intended for the theory.
>In computer programming, M-expressions (or meta-expressions) were an early proposed syntax for the Lisp programming language, inspired by contemporary languages such as Fortran and ALGOL. The notation was never implemented into the language and, as such, it was never finalized https://en.wikipedia.org/wiki/M-expression
It was never implemented.
It was M-expressions which users were to interact with, and S-expressions were what the machine would use.
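Roughly, and hedging since the notation was never finalized, the same definition in the two syntaxes looks like this (S-expression given in modern Common Lisp spelling):

    ;; M-expression, as users were meant to write it:
    ;;   ff[x] = [atom[x] -> x; T -> ff[car[x]]]
    ;; S-expression equivalent, which is what actually got implemented:
    (defun ff (x)
      (cond ((atom x) x)
            (t (ff (car x)))))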
See the full email from Steve Russell on the topic on this page https://www.iwriteiam.nl/HaCAR_CDR.html, and here's the relevant quote:
> "After several months and giving a few classes in LISP, we realized that "first" and "rest" were better names, and we (John McCarthy, I and some of the rest of the AI Project) tried to get people to use them instead.
Alas, it was too late! We couldn't make it stick at all. So we have CAR and CDR."
Personally I don't mind them, they're nicely introduced in "A Gentle Introduction to Symbolic Computation" and they stuck easily enough for me.
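For what it's worth, modern Common Lisp ended up standardizing both pairs as synonyms, so the renaming did eventually stick, just without displacing the originals:

    (car '(1 2 3))    ; => 1
    (first '(1 2 3))  ; => 1
    (cdr '(1 2 3))    ; => (2 3)
    (rest '(1 2 3))   ; => (2 3)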
IPL also used the same list structure; it used the terms SYMB and LINK.
The original Lisp CAR and CDR operations used the machine-level instructions with those names on the IBM 704.
Cons cells were stored in single words of the 36-bit memory and 36/38-bit registers, and the CAR and CDR operations accessed the "address" and "decrement" 15-bit parts of a word, respectively.
The Address and Decrement terms and the fields of the 36-bit word they denote do come from the architecture.
There are instructions like LXA and LXD -- load index from address, load index from decrement.
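A rough Common Lisp sketch of that layout, treating a cons cell as a single 36-bit word; the bit positions are my reading of the 704 word format (address in the low 15 bits, a 3-bit tag above it, then the 15-bit decrement), so take them as an assumption:

    ;; hypothetical illustration only -- no modern Lisp stores conses this way
    (defun word-car (word) (ldb (byte 15 0) word))   ; the "address" field
    (defun word-cdr (word) (ldb (byte 15 18) word))  ; the "decrement" field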
The linked page with details was interesting, thanks!
> And then a few weeks later I had a user population of about a dozen, most of them friends, and I didn't want to screw up my embedded base. The rest, sadly, is history.[1]
In the short video of Scott discussing it (https://inv.nadeko.net/watch?v=juXwu0Nqc3I) he says clearly that Church never discussed the lambda calculus while he was at Princeton, and that he thought Church was bitterly disappointed that his system was proven to be in some way inconsistent.
I wonder if Church named it after the Russell and Whitehead notation, and later wanted to distance himself from the whole thing so dismissed the notion. I had a quick look for the "unpublished letter to Harald Dickson" mentioned in the wikipedia there and can't find anything. Hmm.
> Dana Scott, who was a PhD student of Church, addressed this question. He said that, in Church's words, the reasoning was "eeny, meeny, miny, moe" — in other words, an arbitrary choice for no reason. He specifically debunked Barendregt's version in a recent talk at the University of Birmingham.
As a French native, I like to point to the expression "personne lambda", a way of saying a layman, that is, an anonymous person, which matches anonymous functions pretty well. More generally, in French the adjective lambda means "usual/common", and you might know the letter lambda sits in the middle of the Greek alphabet, so it does make sense for representing an average, common thing.
[1] https://math.stackexchange.com/questions/64468/why-is-lambda...
I followed the two links from the comment on SE making the claim that Church's choice of lambda was completely arbitrary (pun intended). The first one doesn't seem to work, and the second one is a 2m youtube clip of Dana Scott talking about the subject.
I watched the video (the audio is a bit hard to make out in parts), and I'm left thinking the SE commenter interpreting Dana Scott, whom you quote in full there, is overstating the case somewhat. Perhaps the claim should be moved from "likely true" to "somewhat uncertain", but it is not in any way "debunked" as the commenter says. Debunking means providing conclusive evidence against a claim, evidence any reasonable person would be obliged to accept. Here we have an interesting and relevant anecdote, but it's still anecdotal, unfortunately.
Scott says a couple of things which are clearly audible that are relevant:
1. He asked John McCarthy personally why Church chose lambda, and McCarthy said he'd no idea,
2. Church never talked about the lambda calculus when he (Scott) was at Princeton, and
3. John Addison, Church's son-in-law, told Scott that he (Addison) wrote Church a postcard in which he asked him where he got lambda from, and that Church sent the whole postcard back to him with "eeny, meeny, miny, moe" written in the margins.
So I'm very happy you shared some more information on the subject, but I feel a conscientious biographer, for example, would be forced to include Scott's and Barendregt's theories and say that without confirmation from Church himself the matter is hard to decide. If anyone has a relevant quote from Church, I'd love to see it, but I presume it mustn't exist if Scott is so convinced there's no link.
I'm tempted to also point out more generally that all symbols are esoteric if you go back far enough, so I don't know if your particular quest could ever have been satisfied. In any case, I learned French before Lisp, so I did have the experience of going, "oh, like in French? Why is that, I wonder?".
>I'm tempted to also point out more generally that all symbols are esoteric if you go back far enough
It certainly all depends on how we define "esoteric" and what counts as far enough. :) What I meant here was in contrast to a choice like "anonymously".
Since you speak French, you might be interested to watch "L’odyssée de l’écriture - Les origines ": https://www.youtube.com/watch?v=8P2nAj50LdY
I see the point of the documentary, it would have absolutely floored me a couple of months ago. However, I have very recently spent a month learning Serbian, which included learning (Serbian) Cyrillic, which led to me finally asking some obvious questions - where does Cyrillic come from, where does Greek come from, where does the Latin alphabet come from, etc. So I've binged similar information of the sort that documentary describes, I think.
I still might watch it later though and see if there's more juice in there, thank you very much!
Side remark: "Please don't take the bait" is a good analogue to "please don't feed the trolls".
We've taken the bait out of the title now, but when a thread has already filled up with comments reacting to it, rather than anything interesting, that's bad for HN threads.
Here's a link to "On Computable Numbers" for easy reference for anyone who wants to read it/read it again. It's a cracker https://www.cs.virginia.edu/~robins/Turing_Paper_1936.pdf
While specialists in his field know his work and his name (but not even everyone in software does), the public do not.
While your parents and friends see the dramatised exploits of Turing in films like The Imitation Game, or his face on the currency, the same cannot be said for Church.
Every field has its public heroes, usually related to the stature of their work - Fleming and Jenner, Marconi, Ford, Bell. Turing.
Anyone will at least recognise these names, but not so for Church.
I think the reality is that only a very small number of scientists, mathematicians, and similar are household names.
Benedic Cumbersome was a good actor, but it's important to remember Michelangelo actually didn't look like Kirk Douglas.
[1] https://www.theguardian.com/tv-and-radio/article/2024/may/29...
[1] Massive in the case of Euler obviously.
Related:
Alonzo Church, 92, Theorist of the Limits of Mathematics (1995) - https://news.ycombinator.com/item?id=12240815 - Aug 2016 (1 comment)
Gian-Carlo Rota on Alonzo Church (2008) - https://news.ycombinator.com/item?id=9073466 - Feb 2015 (2 comments)
[edit] ... and if you really must use "AI" generated imagery then at least caption it as such.
it is a sacred idea in the sense that merely executing it does not really help you understand that it is equivalent to any and all computation
I think it's wrong to see it as a competition, because their collaboration achieved so much. Computer Science has "two dads" and that feels appropriate for several reasons, including how Turing died. It is somewhat important to Computer Science that Church and Turing met in the middle, one with the deep raw mathematical theory background and the other with the practical (but at the time unacknowledged) background.
Also, it would be somewhat remiss to ignore that Turing did care about implementation and wanted to get back to practical implementation but wasn't allowed. There's a deep tragedy there and a lot of questions about what would have been different if things hadn't been classified in the same way by the UK government. Though also in that case perhaps it would have cost us the Church collaboration that helped firm up the theory so well in our timeline.
On a separate note, I don't think lambda calculus is really maths. It's more like clever notation. The merits of any notation are subjective. It's interesting how Church never cared about how that particular idea of his inspired the design of certain programming languages - which are ultimately notations, and therefore subjective in their merits.