- Python: slicing, full reflection capabilities, and its use as an interface to almost anything [1], not to mention its role as an embedded interpreter (beyond the GIL debate).
- Z3: one of the closest languages I know to letting you define the ‘what’ rather than the ‘how.’ I got lost with Prolog when I tried to add more complex logic with ‘hows’, though I was a complete newbie with Prolog. Z3, however, really boosted my beginner capabilities.
- OMeta: allows me to write fast parsers without spending time resolving ambiguities, unlike ANTLR and other popular parser generators built on classical techniques.
- Smalltalk/Squeak: everything can be re-crafted in real-time, a blend of an OS and a programming language. The community vibe is unique, and I even have friends who implemented an operating system using Squeak. The best example? TCP/IP implemented in Smalltalk/Squeak! [3]. David Weil and I also created an early proof of concept [4] that used Linux behind the scenes, aiming to ‘compete’ with the QNX floppy disk.
- AutoLISP: a programming language embedded in AutoCAD, which I discovered in high school in the late ’80s—only to find out just now that its first stable release is considered to be around 1995 [5].
- REXX on the Commodore Amiga: not only could applications be extended with this language, but they could also interact with each other. A decade later, when I used REXX on an IBM Mainframe, I still preferred the Amiga’s approach.
- Racket: I can draw in the REPL. For example with Racket Turtle.
- C++: object orientation "for C". I was not mesmerized by templates.
- Clipper: first database usage and relatively simple to create business apps.
- Objective-C: the first implementation of GCD [1], providing a C/C++ style language with fast performance, a clean event-loop solution, and reflective capabilities. However, I never liked the memory handling.
- Machine Code/Assembler on the Apple IIe: I could jump directly into the assembler monitor from BASIC.
- APL: a powerful yet complex way to program using concise expressions. Not ideal for casual use—best suited if you really understand the underlying concepts.
By the way, it’s time for Apple to embrace programming languages on iOS and iPadOS!
[1] http://www.garret.ru/dybase.html
[2] https://wiki.squeak.org/squeak/1762
[3] http://swain.webframe.org/squeak/floppy/
[4] http://toastytech.com/guis/qnxdemo.html
Aardappel ABC Beatrice Charity Esterel FL Fractran GPM Hope Lean MCPL NESL Oz ProTem Рапира rpython SLC Squiggol UNITY USELESS
GNU/Linux: Set computer software free so people could discover and learn. 30 years of success and counting. Mind blown.
C: lean, rock-solid durability for 50 years and counting. The connected world runs on C. Mind blown.
AWK, sed, grep, Perl: lean, powerful, rock-solid durability in text and data processing for over 30 years and counting. Mind blown.
SQL and relational data: Querying big data for 50 years and counting, now powering the world in the 21st century. Mind blown.
Old IS Gold!
Thank you! I wasn’t aware of Instaparse or its use of PEGs [1], which gives you the same peace of mind about parsing ambiguities.
> REXX - I thought this was ingenious specifically for file/text processing
Formally, the REXX on the Amiga was called ARexx and included extensions [2]. REXX [3] itself is not specifically for file/text processing but enables you to connect different environments [4].
[1] https://en.wikipedia.org/wiki/Parsing_expression_grammar
[2] https://en.wikipedia.org/wiki/ARexx
[3] https://en.wikipedia.org/wiki/Rexx
[4] https://www.ibm.com/docs/en/zos/3.1.0?topic=environments-hos...
- MacBASIC: Mac GUI programming w/o Pascal or C https://www.folklore.org/MacBasic.html (which is something I'll never forgive Bill Gates for)
- HyperCard (also on that list, and I agree with almost all points made): it was magic to get into developer mode and create stacks --- it's unfortunate that Asymetrix ToolBook didn't get further, that the Heizer stack for converting HyperCard stacks to ToolBook wasn't more widely available, and a shame that Runtime Revolution, which became LiveCode, reneged on their opensource effort --- hopefully someone will make good on that: https://openxtalk.org/
Unfortunately, I never got anywhere w/ Interfacebuilder.app or Objective-C....
- Lua: real variables in TeX! (and LaTeX), and one gets METAPOST as well --- even recursion becomes simple: https://tex.stackexchange.com/questions/723897/breaking-out-...
- OpenSCAD: Make 3D things w/o having to use a full-fledged CAD program
- BlockSCAD: Make 3D things w/o typing: https://www.blockscad3d.com/editor/ (though to be fair, https://github.com/derkork/openscad-graph-editor also allows that)
- PythonSCAD: variables and file I/O for OpenSCAD (though to be fair, RapCAD had the latter, it was just hard to use it w/o traditional variables) https://pythonscad.org/
Still working through a re-write of my OpenSCAD library in Python: https://github.com/WillAdams/gcodepreview and am hopeful that a tool like https://nodezator.com/ will make that graphically accessible (w/o going into OpenSCAD mode to use OSGE).
Then Rust took it a few levels beyond Go.
It’s a really good list, though I suspect the languages you’d add are going to depend on your experience. So I wouldn’t feel too sad if your favorite language isn’t on that list. The author lists Java as a language with an amazing standard library, but something tells me a lot of people will have C#, Go or similar as their Java, and that’s fine!
In other words, Rust doesn't have a type that means "either IO Error or Network Error", so it can't just compose those two behind the scenes into something that you can later decompose. You have to create some tag, and tell the compiler how to apply the tag, because each tag is different.
After a couple of decades in the industry I really prefer explicit error handling, because while powerful, implicit error handling is also easy to get wrong. Which means I have to go through ridiculous chains of exceptions to find some error someone introduced years ago, instead of being able to head directly to it and immediately understand what is going on. So I guess it’s a little contradictory that I prefer the Rust approach, but I think the compromise of added complexity is small enough to be made up for by the added safety and “manoeuvrability”.
The flip-side is of course that Go’s simplicity is part of the reason it’s seeing a lot of adoption while most other “new” languages aren’t. (Depending on where you live).
I know the biggest complaint with checked exceptions is that people tend to just use catch-all exceptions, but that’s just sloppy work. If they’re a junior, that’s when you teach them. Otherwise, people who do sloppy work are just sloppy everywhere anyway.
Simplicity works because it’s made for the real world. As I said I personally think Rust did it better, but if you asked me to come help a company fix a codebase I’d rather do it for a Go one than Rust.
Been on teams where every individual was free to use their best judgement, we didn’t have a lot of documented processes, and… nothing ever went wrong. Everyone knew sloppy work would come back and bite, so they just didn’t ever do it. Deadlines were rarely a problem because everyone knew that you had to estimate in some extra time when presenting to stakeholders. And the team knew when to push back.
On the other hand, I’ve been on teams where you felt compelled to define every process with excruciating detail and yet experienced people somehow screwed up regularly. We didn’t even have hard deadlines so there was no excuse. The difference between implicit and explicit error handling would have not mattered.
At the end of the day, some of these teams got more done with far fewer failures.
“if err != nil {return nil, err}” is the opposite of this philosophy. If you find yourself passing err up the call stack most of the time, and most calls may return an err, it’s still exception-driven code, just without the ergonomics.
It’s not simplicity, it’s head butt deep in the sand.
The general idea is that you should pay the cost of handling an error right there where you receive one, even if you're just going to return it. This reduces the incremental cost of actually doing something about it.
If you're given an easy way out, a simpler way of just not handling errors, you either won't even consider handling them, or you'll actively avoid any and all error handling, for fear of polluting your happy path.
I can't say I agree with the approach all the time, but I know I'm much more likely to consider the error flow doing it the Go way, even if I'm constantly annoyed by it.
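To make that concrete, a minimal Go sketch (my own illustration; readConfig and the file name are hypothetical, but fmt.Errorf's %w verb and errors.Is are the real stdlib mechanisms): the error is paid for where it's received, even when it's just passed up, so doing something more later is cheap.

    package main

    import (
        "errors"
        "fmt"
        "io/fs"
        "log"
        "os"
    )

    // readConfig handles the error right where it surfaces: even though it
    // just returns it, it first wraps it with context via %w.
    func readConfig(path string) ([]byte, error) {
        data, err := os.ReadFile(path)
        if err != nil {
            return nil, fmt.Errorf("read config %q: %w", path, err)
        }
        return data, nil
    }

    func main() {
        data, err := readConfig("app.json")
        if errors.Is(err, fs.ErrNotExist) {
            data = []byte("{}") // actually handled: a missing file gets a default
        } else if err != nil {
            log.Fatal(err) // anything else stops here, with full context intact
        }
        fmt.Println(len(data))
    }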
Doesn't go offer the simplest way of all to "just not handle errors"? Just ignore them. It is an option. You can ignore them on purpose, you can ignore them by mistake, but you can always simply ignore them.
But ignoring by mistake gets caught by linters, in practice.
And then doing it on purpose is almost as noisy as bubbling it, and sure to raise an eyebrow in code review.
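For illustration, both flavors in Go (a generic sketch; errcheck is a real linter commonly used for this):

    package main

    import "os"

    func main() {
        // On purpose: the blank identifier makes the discard visible,
        // and is exactly the thing that raises an eyebrow in review.
        _ = os.Remove("stale.lock")

        // By mistake: the compiler accepts the silently dropped error,
        // but linters such as errcheck flag this line.
        os.Remove("stale.lock")
    }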
My experience is with exceptions in Java/C#, Go errors, and C++ absl::StatusOr. In theory, I'd favor checked exceptions. In practice, I find that I'm always running away from polluting the happy path and coming up with contrived flow when I want to handle/decorate/whatever some errors but not others, and that giving types to checked exceptions becomes a class hierarchy thing that I've also grown to dislike. Both Go and C++/Abseil are indifferent if noisy to me (and C++ annoys me for plenty other reasons). Maybe elsewhere I'd find options better. Maybe.
Zig's way is better.
Error returning is explicit, error returning is declared in the function signature. Error returning plays nice with defer (there is errdefer) and there is limited sugar (try keyword) that makes life easier.
I personally like it better than exceptions (even if it's much noisier for the 95% common case as another poster put it), both of which I've used enough to appreciate the pros/cons of. But that's about it.
I'll probably never use Zig enough to find out.
    void oops(error_t error) { exit(1); }

    int sum = addints(a, b, .error = oops);
As others have identified, not in the Go language. In other languages which support disjoint unions with "right-biased" operations, such as Haskell's Either[0], Scala's version[1], amongst others, having to explicitly check for error conditions is not required when an Either (or equivalent) is used.
0 - https://hackage.haskell.org/package/base-4.20.0.1/docs/Data-...
1 - https://www.scala-lang.org/api/3.x/scala/util/Either.html
Processes IRL don’t look like write() three times. They look like foo(); bar(); baz(); quux();, calls which weren’t designed to cooperate under a single context. If only there were an implicit err-ref argument which could be filled by some “throw” keyword, with the rest auto-skipping to the “catch”.
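Go does in fact ship that implicit channel, as panic/recover; a sketch with hypothetical steps:

    package main

    import "fmt"

    func foo()  {}
    func bar()  {}
    func baz()  { panic("disk full") } // any step may "throw"
    func quux() {}

    // process carries the implicit err-ref: a panic fills it, the remaining
    // steps auto-skip, and the deferred "catch" converts it to an error.
    func process() (err error) {
        defer func() {
            if r := recover(); r != nil {
                err = fmt.Errorf("process: %v", r)
            }
        }()
        foo()
        bar()
        baz()
        quux() // skipped
        return nil
    }

    func main() {
        fmt.Println(process()) // process: disk full
    }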
After a few programs I realized I’d never really thought about error handling - Go forces you to decide where you want to handle your error, where to inform about it, etc.
It is still annoying (especially the boilerplate) but it had a net positive value for me in terms of learning.
Shades of "boring technology"[0] which might better be described as "choose technology that is a good fit for the problem, proven, reliable and proportionate. Spend your innovation tokens sparingly and wisely".
- running on a VM (now boringly normal)
- using a pure-functional language with immutable terms (the pure-functional language fad has since then come and gone and this is now archaic and passé if anything)
But languages only get stereotyped once. At any rate, it's pretty boring and old.
Say your code divides by 0; the runtime system can't handle 1/0 but you the programmer may have anticipated this and know what to do. This is just someplace you write code right there to handle the case (catch an exception, pattern match the 0 case beforehand, whatever).
Your error, on the other hand, means something you expected to hold didn't; recovery inline from unknown unknowns is a fool's errand. Instead you give up and die, and the problem is now an exception you know how to handle: "my child process died".
In brief, a checked exception is part of a method signature, "function wash throws NoSoapException", and the compiler enforces that whoever writes code calling that signature must make some kind of overt decision about what they want to do when that exception comes up. That may mean recovery, wrapping it inside another exception more-suitable to their own module's level of abstraction, deliberately ignore/log it, or just throw a non-checked exception.
So in a sense checked exceptions suit those "error" cases where you do expect the programmer consuming your function to at least decide whether they can handle it, and if they don't you still get all those wonderful features like stack traces and caused-by chaining. In contrast, regular (unchecked) exceptions have the opposite expectation, that the caller probably won't be able to handle them anyway and must opt-in to capture them.
That’s OK. In the same way that when you start to read books, you might like Blyton, Dahl or diving into a Hardy Boys, by the time you’ve been reading fiction for a few decades your tastes might become a bit more philosophical.
The trick is to introduce these things to people when they’re ready for them, and to not treat those who haven’t experienced them as in some way inferior.
I’m not going to shove Proust down someone’s neck or make them feel dumb for not having read any of his work, in the same way I am going to think carefully about whether somebody would benefit from seeing some Prolog.
What does interest me a lot about this list is that it’s not just a “well, look, I found Clojure and think it’s better than Python, YMMV”, or “if you don’t like or understand Haskell maybe you are just too dumb to grok eigenvectors” brag piece.
There is a thought about what you might get from that language that makes it worth exploring. As a result I might revisit OCaml after a brief and shallow flirtation some years ago, and go and take a look at Coq which I doubt I can use professionally but sounds fascinating to explore.
also, what happened to opalang that he mentioned? iirc i read about it back in the day.
Then I had to do things in OS/2, and then it was REXX's turn. Shell scripting didn't have to be as basic as what I knew from DOS. Some years later I moved to bash, and from complex scripts to long but very powerful one-liners, so that was another shock.
I was still working with OS/2 when I learned Perl, and regexes, and its hashes, all with an interesting semantic approach. For many years it was my "complex" shell scripting language. Python was not as mind-blowing, or at least didn't have the same kind of impact on me.
Mainstream CPUs expose one or more explicit stack pointers, and in assembly you use that all the time, right?
Instead of "an ilusion", I'd say it's "a convension".
Prolog works bottom-up.
Emacs LISP works inside-out.
    if a < b → c := true
    □ a ≥ b → c := false
    fi

To an APL dfn with "guards":

    c ← {
      a < b : true
      a ≥ b : false
    }
As in GCL, if none of the guards hold true, the dfn (the braces) will terminate without a return value, and thus the code will abort with an error. (It seems a degenerate, empty dfn also behaves differently?)
Imagine going from the pre-World Wide Web - and for many companies pre-email - age of local area networks (for file sharing only) and 38.4K modems directly to group and user oriented, email integrated, distributed, replicated, graphics and multimedia integrated, no-code, document oriented, user programmable, fully GUI, secure, distributed database applications.
Lotus Notes blew my mind completely. A truly amazing application at the time.
Common Lisp, which at the moment is the only language I can see myself using unless I decide to make one myself, never really blew my mind, although the little things it does make it hard for me to imagine switching away from it.
> Just that the concepts either didn’t resonate with me or, more often than not, that I had already discovered these same concepts with other languages, just by the luck of the order in which I learnt languages.
X |> one |> two |> three
versus the more conventional
three(two(one(X)))
These days, base R already includes a native pipe operator (and it is literally `|>`, rather than magrittr's `%>%`).
value = table:function1():function2():function3()
let x1 = one(x); let x2 = two(x1); let x3 = three(x2);
The other advantage of being explicit is you can set breakpoints and see the intermediate values when debugging.
Granted, the same could be said about `three(two(one(X)))`, so it's not specifically a pipe operator problem. It's just that I see it a lot more ever since pipe operators (or their cousins "streams", "extension methods", etc) have been popularized.
My guess is it's because `three(two(one(X)))` across multiple lines would need indentation on each nested function, and end up looking obviously obtuse with too many levels of nesting. Pipes make it look cleaner, which is great at some level, but also more tempting to get caught up in "expression golf" leading to extremely concise but incomprehensible code.
Just some additional commentary too - I think this post somewhat misrepresents Prolog with some of the comparisons.
Prolog at its core is SLD Resolution [1] (a form of search) over Horn Clauses [2] (first-order logic). Queries posed to the runtime are attempts to find a set of values which will satisfy (cause to be true) the query – whilst SQL is founded on relational algebra, which is more closely aligned with set theory.
Whilst there's probably some isomorphism between satisfying/refuting a logical predicate, and performing various set operations, I'd say it's a bit of a confusion of ideas to say that SQL is based on 'a subset of Prolog'. The author might be thinking about Datalog [3], which is indeed a syntactic subset of Prolog.
[1]: https://en.wikipedia.org/wiki/SLD_resolution
[2]: https://en.wikipedia.org/wiki/Horn_clause
[3]: https://en.wikipedia.org/wiki/Datalog
1) Formal Syntax and Semantics of Programming Languages: A Laboratory-Based Approach by Ken Slonneger uses Prolog to design/implement languages - http://homepage.divms.uiowa.edu/~slonnegr/ and https://homepage.cs.uiowa.edu/~slonnegr/plf/Book/
2) Defining and Implementing Domain-Specific Languages with Prolog (PhD thesis of Falco Nogatz) pdf here - https://opus.bibliothek.uni-wuerzburg.de/opus4-wuerzburg/fil...
3) Use Prolog to improve LLM's reasoning HN thread - https://news.ycombinator.com/item?id=41831735
4) User "bytebach" gives an example of using Prolog as an intermediate DSL in the prompt to an LLM so as to transform English declarative -> Imperative code - https://news.ycombinator.com/item?id=41549823
5) Prolog in the LLM Era (a series by Eugene Asahara) - https://eugeneasahara.com/category/prolog-in-the-llm-era/
You mentioned you're looking for something new that doesn't have to be related to data analytics; well, constraint programming (and similar) is basically the mirror problem. Instead of data you have the rules of the "game", and the solver's job can be to: find a solution, find the optimal solution, find all solutions, or prove there is no solution.
Things like scheduling problems, resource allocation problems, etc. A real-world example would be finding the most efficient cell placement when developing a microchip, this is basically an advanced rectangle packing puzzle.
Much like prolog you define the rules (constraints) and the solver takes over from there. Part of the fun is figuring out the most effective way to model a real-world problem in this manner.
The closest thing to Prolog in this domain would be ASP, with Clingo/Clasp being the best solver available. But you also have regular constraint programming (look into MiniZinc or Google's OR-Tools), mixed-integer programming which is mainly for hardcore optimization problems (commercial solvers sell for tens of thousands of dollars), satisfiability modulo theories (often used for software verification and mathematical proofs), etc.
The mind-blowing bit is that this sort of problem is NP-complete but these solvers can find solutions to utterly massive problems (millions of variables and constraints) in milliseconds sometimes.
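The modeling flavor, in a toy Go sketch (my own illustration, not MiniZinc or OR-Tools: you state the rules, the search finds the values; real CP/SAT solvers add propagation, learning, and heuristics on top of this brute-force core):

    package main

    import "fmt"

    type constraint func(x, y, z int) bool

    func main() {
        // The "rules of the game": constraints over three variables in 1..9.
        rules := []constraint{
            func(x, y, z int) bool { return x+y == z },
            func(x, y, z int) bool { return x < y },
            func(x, y, z int) bool { return z%2 == 0 },
        }
        // The "solver": exhaustive search over the domains. A real solver
        // prunes this space aggressively instead of enumerating it.
        for x := 1; x <= 9; x++ {
            for y := 1; y <= 9; y++ {
                for z := 1; z <= 9; z++ {
                    ok := true
                    for _, c := range rules {
                        if !c(x, y, z) {
                            ok = false
                            break
                        }
                    }
                    if ok {
                        fmt.Println("solution:", x, y, z) // e.g. 1 3 4, among others
                    }
                }
            }
        }
    }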
* C++: The feeling that I can make a-n-y-t-h-i-n-g. (not true but still mostly true)
* Ruby: elegant, OO, who-needs-compile-time-if-you-have-unit-tests
* Haskell: hard to learn but so worth it, by doing so I became a follow-the-types person
* Elm: not hard at all, a well picked subset of Haskell for browser apps only
* Kotlin: a well typed Ruby, with the ecosystem of the JVM at your disposal
* Rust: when every bit of runtime performance counts (but now without sacrificing safety like with C/C++)
BASIC, but specifically on a TRS-80. I can just type something in and run it. I don't have to wait a week for the teacher to take the card deck over to wherever the mainframe is.
Pascal. That's it, all of it, in those ~4 pages of railroad diagrams expressing the BNF.
C. Like Pascal, but terser and with the training wheels removed. It just fits the way my mind works.
Java. The language itself is kind of "meh", but it has a library for everything.
Perl. Regular expressions and default variables.
And I'd also like to know how different Coq and Lean are. I'm not a mathematician. I'm just a software developer. Is there a good reason for me to pick one over the other?
Lean is designed to be a general purpose programming language that is also useful for mathematical proofs. From that perspective it’s kind of trying to be “better haskell”. (Although most of the focus is definitely on the math side and in practice it probably isn’t actually a better haskell yet for ecosystem reasons.)
If you try either, you’ll likely be playing with a programming paradigm very different than anything you’ve used before: tactic oriented programming. It’s very cool and perfect for math, although I don’t think I’d want to use it for everyday tasks.
You won’t go wrong with either, but my recommendation is to try Lean by googling “the natural number game”. It’s a cool online game that teaches you Lean and the basics of proofs.
By the way, don’t be scared of dependent types. They are very simple. Dependent types just mean that the type of the second element of a tuple can depend on the value of the first, and the return type of a function can depend on the value passed into it. Dependent types are commonly framed as “types that have values in them” or something, which is a misleading simplification.
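For instance, a small Lean 4 sketch of exactly those two definitions (my own illustration, not something from the parent comment):

    -- A vector type indexed by a value: its length.
    inductive Vec (α : Type) : Nat → Type where
      | nil  : Vec α 0
      | cons : α → Vec α n → Vec α (n + 1)

    -- A function whose return type depends on the value passed in.
    def defaultOf (b : Bool) : if b then Nat else String :=
      match b with
      | true  => 0
      | false => "zero"

    -- A dependent pair: the second component's type depends on the first.
    def tagged : (n : Nat) × Vec Nat n := ⟨1, .cons 42 .nil⟩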
C++: pointers, OOP and so much more - good and bad - all in one package
Fortran90: vector programming for humans
Python: general programming for humans
Biggest disappointment: Scratch. Why isn't visual programming more mind blowing?
Anyway, I'm always looking out for these well-regarded "mind-blowing" languages, but somehow they never show up in really mind-blowing projects? It would be interesting in this respect to have a list of mind-blowing projects that were only possible due to a mind-blowing language.
One issue that I have experienced, which may be related, is that it's hard to have a little bit of controlled type dynamism. That is, once you have a type that is a union with many possibilities, you need to be vigilant not to do operations on it that make type inference give up and return `Any`. This still bites me. There are some solutions to that in packages (e.g. the relatively new Moshi.jl), but I do miss language-level support.
And there is the XL language, which has a very interesting approach to extending the language, but sadly the compiler is not in a runnable state; I could only assess it from the docs.
In recent years, Go blew my mind with its simplicity and how strong a case it makes for writing boring code.
It's been decades since I wrote C/ASM, and my guess is what little I remember isn't applicable anymore, but I plan on diving in again at some point, if only to better understand Go.
I'm not a huge fan of videos as content delivery mechanisms for simple facts, but this is a talk - an argument made with the intent of convincing you about something that you may find counter-intuitive. What's the point of summarising that, if it loses the granularity of the argument, its persuasive power? "Static types bad, Clojure good?"
Nuance is not the same thing as a talk designed to build an argument bit by bit and be persuasive. You could summarise an hour-long closing argument for the defence in a jury trial as 'My client is not guilty', but doing so is rather missing the point.
> Just say that no, you can't effectively summarize this video, instead of doing whatever it is that you are doing here.
x == y, but with the added implication that SBF's take on longform is not actually something to aspire to.
“I don’t want to say no book is ever worth reading, but I actually do believe something pretty close to that. … If you wrote a book, you f'ed up, and it should have been a six-paragraph blog post.” -Sam Bankman-Fried
> "Static types bad, Clojure good?"
sure, this kind of summary is useless, but then it is simply too short
I’ve actually struggled to get people on my team to understand what is so great about this language or the ideas behind it. Pointing people at hour-long YouTube videos, where you almost need to understand the source language to see examples of what he’s talking about, hasn’t been working.
I’ll think hard on how to summarize this without needing the context I have and come back to this comment. It won’t be what he’d say but I’ll share my view
I saw a big uproar in certain strongly typed FP communities around it, but I think it's more like different problem domains having different experiences. A lot of software operates in a closed world where it controls everything, while other software has to communicate with software that may change on a different schedule, is owned by a different team, etc.
I wrote a bit more about it here: https://news.ycombinator.com/context?id=42020509
While I see the merit in his arguments, I believe his approach to data is "Clojure-like" or Lisp-like, in that it discourages explicit enumeration of all states and their possible configurations for the tradeoff of being able to sculpt these records. But, as a Haskell dev, I do not want to sculpt the records in the way he describes. I want to explicitly enumerate all the possibilities. This is, in my mind, a difference in cognitive preference. There is no right or wrong way and I think what he proposes is elegant. It is ultimately a matter of up front reasoning about the space compared with reaching a synthesis of the problem domain over time through wrestling with the problem itself. It is a statement about the efficacy of saying what can be known up front and to what extent, how much accuracy, and what utility.
I am about to launch a SaaS that pairs languages to these cognitive patterns as explicit understanding of such information can help bring together people who think similarly about information (team-building) while also opening up the possibility of expanding ones insights with alternative ideas and approaches (as his talk has done for me). The intent is to help hiring teams find devs that match their culture's programming and problem solving styles (whether that be reinforcing or doubling down on ways of thinking and acting).
Summary
Rich Hickey discusses the complexities of optionality in programming, particularly in Clojure’s spec system, emphasizing the need for clear schemas and handling of partial information.
Highlights
* Community Engagement: Acknowledges the presence of both newcomers and regulars at the event.
* Fashion Sense: Introduces a humorous take on the programming roadmap focused on fashion.
* Language Design: Explores the challenges of language design, especially regarding optionality in functions.
* Null References: Cites Tony Hoare’s “billion-dollar mistake” with null references as a cautionary example.
* Spec Improvements: Discusses plans to enhance Clojure’s spec system, focusing on schema clarity and usability.
* Aggregate Management: Emphasizes the importance of properly managing partial information in data structures.
* Future Development: Outlines future directions for Clojure’s spec, prioritizing flexibility and extensibility.
Key Insights
* Community Connection: Engaging with both veteran and new attendees fosters a collaborative environment, enhancing knowledge sharing and community growth.
* Humorous Approach: Infusing humor into technical discussions, like fashion choices, can make complex topics more relatable and engaging.
* Optionality Complexity: The management of optional parameters in programming languages is intricate, requiring careful design to avoid breaking changes.
* Null Reference Risks: Highlighting the historical pitfalls of null references serves as a reminder for developers to consider safer alternatives in language design.
* Schema Clarity: Clear definitions of schemas in programming can significantly improve code maintainability and reduce errors related to optional attributes.
* Information Aggregation: Understanding how to manage and communicate partial information in data structures is crucial for creating robust applications.
* Spec Evolution: Continuous improvement of the spec system in Clojure will enhance its usability, allowing developers to better define and manage their data structures.
There was a very similar complaint [1] ("OOP is not that bad") favorably comparing the flexibility of OOP classes in Dart vs typeclasses in Haskell. But as pointed out by /u/mutantmell [2], this is again about lack of 'structural typing' at the level of modules with exported names playing the role of fields. First class modules in functional languages like ML (backpack proposal in Haskell or units in Racket) allow extensible interfaces and are even more expressive than classes in most usual OOP languages. First-class modules are equivalent to first-class classes [3] ie. a class expression is a value, so a mixin is a regular function which takes a class and returns a class. Both are present in Racket.
[1] https://news.ycombinator.com/item?id=41901577
[2] https://www.reddit.com/r/haskell/comments/1fzy3fa/oop_is_not...
[3] https://docs.racket-lang.org/guide/unit_versus_module.html
Could you please say where in the video Rich talks about this?
I don't want to be typing at the high level the same way I'm typing at the byte level.
Prepare to have your mind blown: https://ratfactor.com/forth/the_programming_language_that_wr...
PostScript is also RPN-based.
And btw, the HP 11C "calculator" is also RPN-based :) . I used it to solve some matrix things in university.
Turbo Pascal. Woah this is fast. And Mum is impressed when code sells for money.
Visual Basic. I'm living in the future, Ma. Dentist has a side business sell flowers and needs a program that can handle root canal and roses? I got you.
Perl. Suddenly system administration is fun again.
Java. My mind is blown, in a bad way. This is what I have to write to keep a roof over my head? Ugh.
Go. Aaahhh. Feels like reading W. Richard Stevens again. Coming home to unix, but with the bad parts acknowledged instead of denied.
Later: Lisp again, because of closures, and because CLOS lets you program how method dispatch should work and CLCS lets you resume from just before an error. Haskell, because you can lazily consume an infinite structure, and every type contains _|_ because you can't be sure that every function will always return a value. Java, back when the language was poor but the promise was "don't send a query, send code that the backend will run securely" (this was abandoned).
This is back now with WASM!
Was that the case for you? I'm especially curious about the exams if there were any, because it's probably hard to keep the whole context in mind as a student and to evaluate students for the teacher.
- tables (hashmap)
- regex
- pass by reference (caused a lot of confusion and frustrating bugs)
- metatables
- goto and labels
- multiple return values
After Lua I did want to try a language that's fast, something compiled to a small native binary. My choice was Nim. And Nim got my mind blown to pieces:
- static types
- function declarations I can understand how to use without reading their code
- functional paradigm
- batteries included standard library
- compile-time functions
- generics
- templates
- macros
- exploring and manipulating AST
- const, let, var (immutability)
- pointers and references
- compiler optimizations
- move semantics
- memory management
- I can compile code to both javascript and C?!
- C, C++ and JS interop (ffi)
- I can read and understand source code of standard library procedures?!
- huh, my computer is fast, like crazy fast!!
- and probably more, but that's what I can recall now..
    local x = 0
    function f(byref arg)
      arg = 1
    end
    f(x) -- x == 1

Except that there’s no “byref” in Lua and no way to access the original argument’s location (commonly “lvalue”). Passing a table to a function and not receiving a deep/shallow copy is still “by value”, not “by reference”, because “by” here is about variables (lvalues), not values.
Edit: removed more confusing parts
> Tables, functions, threads, and (full) userdata values are objects: variables do not actually contain these values, only references to them. Assignment, parameter passing, and function returns always manipulate references to such values; these operations do not imply any kind of copy.
I've also seen this described as "reference semantics" and "value/copy semantics"; maybe that would be a better term?
When we say “pass[ing] [an argument] by reference”, it usually means passing an argument by a reference to its location (as opposed to its contents). I think we just avoid using that exact form when we want a different meaning; there’s no proper form, because applying it to values rather than arguments rarely makes sense. Copy semantics are non-trivial in general: shallow copying may not be sufficient to create a proper copy, and deep copying may copy half of the VM state; think f(_G, debug.getregistry()).
The Lua manual probably had the same issue with wording it.
Again, C++ with copy constructors is an exception here. You usually receive a copy (shallow or deep depends on the constructor) if no reference, pointer, or move semantics are given. That was the edit-removed part above.
In old versions the JVM does jsr and ret. In new versions it just duplicates the code. I don't understand what's mind blowing about it?
TS + PixiJS is a reasonable replacement for some of it now, but I still sometimes miss having compile-time warnings.
https://github.com/Jack000/PartKAM
a vector drawing tool and CAM (Computer Aided Manufacturing) program.
Are there large models that fully learn all programming language design principles and all user code?
That would be truly achieving natural language programming.
Why?
I also remember finding the interrupt/syscall system surprisingly stupid and simple. Reading the kernel's source did broaden my horizons!
1. Basic on ZX Spectrum around 5-6 years old. Draw circle and a box and a triangle and... Look ma, a rocket!
2. Pascal around 10-12. I can write a loop! and a function. But anything longer than 100 lines doesn't work for some reason. C is also cool but what are pointers?
3. PHP around 14-15. The Internet is easy!
4. Perl around the same time. I can add 5 to "six" and get "5six". Crazy stuff.
5. C around 16-17. Real stuff for real programmers. Still didn't understand pointers. Why can't it be more like Pascal?!
6. C++ around 18-19. I can do ANYTHING. Especially UIs in Qt. UI is easy! Now, let's make a Wolf3d clone...
7. Common Lisp + Emacs Lisp around 20. No compilation?! REPLs? What are atoms? Cons cells? Live redefinition of everything? A macro? A program image?
8. Python around 22. Ok, so languages can be readable.
9, 10, 11, ... StandardML for beautiful types and compiler writing, vanilla C and x86 assembly/architecture the deep way, back to Lisps for quick personal development...
Looking back, it's Lisps and vanilla C that stuck.
As strange as it may sound, I do know quite a lot about Smalltalk implementation details without ever using the language. Self (a Smalltalk dialect) papers were big in the jit compiler writer community in late 90s and early 2000s. A couple of classic general programming books used Smalltalk in examples.
You can access Objective-C on Linux via GNUstep.
GraFORTH (build the program up rather than down, mindbogglingly fast 3D graphics on the II)
TransFORTH (same thing, with floats)
Pascal and C (build up or down, convenient syntax)
APL (because it's extremely concise, and taught me the value of understandable code - focus on why, the program already says what and how)
Actor (like Smalltalk, but with a less surprising syntax)
Smalltalk (because the syntax IS important)
Visual Basic (because an integrated interface designer is immensely helpful)
Python (because batteries included)
Haskell (everything is clear once you rotate the program 90 degrees on the time axis)
Rust (because the compiler saves me from myself)
Practical Phase:
$Basic - look, I can print things to the screen
$Assembly - I have no clue what's going on
$Matlab - Ok, this mostly makes sense and I can chart graphs and make functions
$Python - ok I can do all kinds of automation now. It would be great to create an executable to give someone, but this apparently requires traversing the inner circles of hell.
$SQL - declarative languages are weird, but it's easy to get up to speed. I'm doing analysis now over multi-TB datasets.
$GAMS - a declarative language for building linear programming and other kinds of mathematical optimization models (algebraic modeling). This language is weird. Ok. Very weird. I'll go back to the Python API.
$Unix/Bash/Awk - ok, I'm doing everything through the command line and it is beautiful, but after a short while I have to switch to Python.
$Powershell - kind of like the Linux command line, but way more powerful and a lot slower. Can write more complex code, but often the cmdlets are too slow for practical use. Have to move back to Python a lot.
Exploration:
$Lisp - this is supposedly the best language. Read several books on the language. Everything seems less readable and more complicated than Python. Aesthetics are fine. A lot of power here. Limited libraries.
$Haskell - this seems like it is trying too hard. Too much type theory.
$APL - this is really cool. Typing with symbols. Scalars, vectors, and matrices...yeah!
$Prolog - very cool, but ultimately not that useful. We have SQL for databases and other ways to filter data. Prob a good niche tool.
$Forth - this is amazing, but I'm not a low level hardware coder, so it has limited value. Hardly any libraries for what I want.
$Smalltalk - the environment and GUI is part of the application...what?
$Rebol - way too far ahead of its time. Like lisp, but easier to use, and dozens of apps in a few lines of code. Amazing alien technology.
$Java - OMG...why does everything require so much code? People actually use this every day? Aghhh.
$C - where is my dictionary/hash type? What is a pointer? A full page of code to do something I can do in 3 loc of Python. At least it's fast.
$Perl5 - cool. It's like Python, but more weirdness, fewer numerical libraries, and a shrinking community.
$Perl6(raku) - They fixed the Perl5 warts and made a super cool and powerful language. Optimization seems like it will take forever though to get it performant.
$OCaml - pretty and elegant. They use this at Jane Street. Why is it so hard to figure out how to loop through a text file (this was 10 years ago)?
$8th - a forth for the desktop. Commercial, but cheap. Very cool ecosystem. Small and eccentric community. I wish this would be open sourced at some point.
$Mathematica - by far the most powerful programming tool/environment I've used for practical engineering and science work. Like lisp, but no environment/library problems. Commercial and pricey. I can create a directed graph and put it on a map of my home town, train a neural network, do differential equations, and create a video out of my graphs all in a single notebook with built-in functions. Nice!
$Swift - can I not get this on Windows?
$F# - oh, it's OCaml on .Net, but the documentation is a little hard to follow. Everyone assumes I know C# and .Net.
$Clojure - Rich Hickey is a genius. A lisp IT won't fire me for installing. Oh wait...I don't know the JVM. Workflow kind of soft-requires emacs.
Not entirely https://eyg.run/
I happened to be learning functional programming, and really struggling with a different way of thinking, at the same time as I saw the movie Arrival, with the alien language as its main plot point.
It struck me that what programming language we pick, fall in love with, dig into, does seem to change our thinking and ways of approaching problems. It can actually change our brain.
    Squares = [X*X || X <- [1,2,3,4]].

(This can be read as: Squares equals a list of X*Xs such that each X is a member of [1,2,3,4].)

Going from that to then realizing that you can use list comprehensions to implement quicksort in basically 2 lines of code:

    qsort([]) ->
        [];
    qsort([H | T]) ->
        qsort([ X || X <- T, X < H ]) ++ [H] ++ qsort([ X || X <- T, X >= H ]).

These examples are written in Erlang, though list comprehensions are found in many other languages, Python and Haskell to name a couple.

Strand is a logic programming language for parallel computing, derived from Parlog[2] which itself is a dialect of Prolog[3]. While the language superficially resembles Prolog, depth-first search and backtracking are not provided. Instead execution of a goal spawns a number of processes that execute concurrently.
http://www.call-with-current-continuation.org/strand/strand....
Having done all the functional, lisp-like, logic, stack-based, etc., "non-mainstream" languages, the one that most blew my mind so far is Verilog. Everything executing all the time, all at once, is just such a completely different way of thinking about things. Even though the syntax is C-like, the way you have to think about it is completely different from C. Like it looks imperative, but it's actually 100% declarative: you're describing a circuit.
Everything else maps pretty close to minor extensions over OOP after you've worked with it for a while. Except logic languages, which map like minor extensions over SQL. Verilog to me was in a class of its own.
And then Awk, and associative arrays. You can literally create any data structure you need using associative arrays. It won't be fast, but if I could have only one container type, this would be it.
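A sketch of that trick transplanted to Go (my illustration, with a Go map standing in for an Awk associative array): one flat map with composite keys encodes a whole binary tree.

    package main

    import "fmt"

    func main() {
        // One associative array; composite keys of the form node + "," + field.
        tree := map[string]string{
            "root,value": "50",
            "root,left":  "a",
            "root,right": "b",
            "a,value":    "30",
            "b,value":    "70",
        }
        // Follow the left child of the root.
        left := tree["root,left"]
        fmt.Println(tree[left+",value"]) // prints 30
    }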
And then TCL, which is where I really learned to appreciate meta-programming. (I wrote Tcl's "Snit" object system in pure Tcl; it got quite a lot of use once upon a time.)
I make it a point to check out new languages on a regular basis, just to keep myself fresh (most recently, Rust and Haskell).
Here are three DSLs that left lasting and positive impressions:
1) Mathematical programming languages, first GNU MathProg, then later python pulp and pyomo. These launched a subcareer in applying LP and MIP to practical, large-scale problems. The whole field is fascinating, fun, and often highly profitable because the economic gains from optimization can be huge. Think of UPS.
2) Probabilistic programming languages, first BUGS (winbugs, openbugs, JAGS), later Stan, pymc, and various specialized tools. These helped me become a practicing Bayesian who doesn't much like p-value hypothesis testing.
3) The dplyr | ggplot universe. These are really two independent eDSLs but they're from the same folks and work great together. Writing awkward pandas|matplotlib code is just soul-wrecking tedium after using dplyr|ggplot.
No mention of Reflection? Try-catch/Exceptions?