Rochus 4 hours ago

> Usability is similarly ill-defined and hard to measure.

Human factors are very well studied and standardized, and there is a well-established discipline called "Human Factors Engineering", which also provides established test and evaluation methods. Human Factors research is considered solid and well-established because it has been built on rigorous experimental psychology and engineering principles developed over more than a century, with systematic methodology and empirical validation. Even if much of it is unknown or ignored by computer science or programming language design, there are many disciplines where Human Factors Engineering is critical (see e.g. ANSI/AAMI HE75).

Usability is therefore neither ill-defined nor hard to measure. Several ISO 9241 series standards address textual and command-based interaction directly relevant to programming language assessment. ETSI EG 202 116 provides extensive human factors guidelines for command language style. ITU-T Recommendation Z.361 provides guidelines for Human-Computer Interfaces in telecommunications management, incorporating the ISO 9241-10 dialogue principles. ISO/IEC TR 24772 addresses programming language vulnerabilities specifically from a human factors perspective.

E.g. Ada did have substantial human factors considerations documented in its design rationale, directly contradicting the notion that such principles don't apply to professional developers or programming languages. It's rather that computer science seems to continue ignoring established fields ("Not invented here"). Human factors in software development have been overlooked by researchers in the software engineering and development research areas, despite their obvious relevance. So what is lacking is primarily interest or willingness, not the fundamentals and means.

  • saghm 2 hours ago

    Honest question: given the relatively niche status of Ada compared to other systems programming languages, and the similarly diminished popularity of other languages mentioned in this thread (Pascal, Perl), does this really prove that these principles are particularly effective for programming languages? I understand that this is a relatively small sample size, but I feel like it could just as well be used to argue that factors other than the more rigorous approach to human usability were more important to the successes of languages designed like Ada. It doesn't feel obvious to me that we know these principles actually work well for programming languages but choose to ignore them, rather than their not fitting the domain particularly well at all, or maybe something in between, where we don't really know yet whether they apply well, and more attempts at using them might not be as successful as one might hope.

    • Rochus 2 hours ago

      I didn't claim that usability features were "important to the successes of languages designed like Ada". But Ada at least explicitly considered human factors in its design (even if mostly based on expert judgment and established principles, not practical studies), which also seems very appropriate given the criticality of most applications written in Ada. But as my ergonomics professor at ETH Zurich, Helmut Krüger (successor to the renowned Étienne Grandjean), used to say: people get used to even the most ergonomically terrible systems. The "level of suffering" experienced by most people is probably simply not great enough to systematically take such aspects into account. But there are still industries where it is important to reduce the human tendency to make mistakes by taking appropriate measures. Ada was created for such an industry from the very beginning.

      • saghm an hour ago

        > I didn't claim that usability features were "important to the successes of languages designed like Ada".

        This kind of seems like it's focusing too much on my exact word choice and less on the actual intent of my question behind it. The question I have is why following established principles should matter; I don't think it should be particularly surprising that someone might assume that making a language more usable for humans would be related to the number of humans who end up deciding to use it, and if that's not the case, I wanted to understand why my intuition is wrong.

        > there are still industries where it is important to reduce the human tendency to make mistakes by taking appropriate measures. Ada was created for such an industry from the very beginning

        This is a good point that I hadn't considered; it definitely makes sense to me that some domains might be less tolerant of human error than others, and those domains would better reflect how well-designed a language is for humans.

        > The "level of suffering" experienced by most people is probably simply not great enough to systematically take such aspects into account. But there are still industries where it is important to reduce the human tendency to make mistakes by taking appropriate measures.

        Reading this part a couple of times, I think this might be where the nuance lies. My colloquial understanding of what it means for something to be ergonomic (and even of what "level of suffering" would mean) isn't quite the same as a measurement of how likely something is to induce human error. This might just be a case where the common use of the term isn't the same as how it's used inside the field of study, but I would have expected the ergonomics of a language and the measurement of the "level of suffering" to be with respect to the programmer, not the person using the software that's developed as a result. That isn't to say I disagree with the idea that the end-user experience should ultimately be more important, but I think that might account for the disconnect between what you're describing here and what I would have expected from a discussion of "programming language ergonomics" (which might also explain the difference between Ada and the other languages mentioned in this thread).

        • Rochus 9 minutes ago

          > The question I have is why following established principles should matter

          Apparently I still don't understand your question, sorry. For what I understand, following established principles is part of the engineering profession; it has proven to be the right thing to do over decades, and it is part of engineering education.

          > I would have expected that the ergonomics of a language and measurement of the "level of suffering" would be with respect to the programmer, not the one experiencing the use of the software that's developed as a result.

          In human factors engineering it is usually not the "level of suffering" that is measured, but the time a typical representative of a test group needs to perform typical tasks, and the degree to which those tasks are fulfilled. You do that with different designs and can then conclude which one meets the performance requirements. Human factors typically enter a specification as performance requirements (what functions the system shall implement and how well). Given a programming language, you could measure how long a typical programmer requires to complete a task and how many errors the implementation has in the first version.

  • mfru 3 hours ago

    Apart from Ada, what are languages / stacks with most aspects of "Human Factors Engineering" considered?

    • Rochus 3 hours ago

      Niklaus Wirth designed Pascal (1970) with explicit pedagogical and human factors goals documented in his seminal paper "On the Design of Programming Languages". Wirth explicitly stated his belief that "insights gained from educational considerations could benefit programming language design in general, and that the simplicity and clarity he was striving for should be a guiding principle for all language design, serving equally pedagogical and development purposes". But - as with Ada - I'm not aware of any notable human factors engineering studies to validate design decisions.

      Also Alan Kay and the Xerox PARC team designed Smalltalk (as Papert did before with Logo) with profound human-centered considerations, and they even "tested" their early concepts with children.

      Also some other languages explicitly state human-centered design goals (e.g. Python, Eiffel), but as with Pascal or Ada the approach was more based on expert judgment, formal analysis, and established principles, not practical studies.

      • bluGill an hour ago

        Human factors studies on programming languages are really hard to do right. It is easy to study someone seeing a language for the first time. However, programming well requires a high level of expertise, so the real question isn't how easy it is for a beginner, it is how easy it is for someone who has been doing it for years. Or maybe how different someone is after a week vs a month vs a year (that is, at what point do the experience gains plateau). In all this you have to be careful to assign different tasks, since good programmers will abstract when they need to do something twice.

        • Rochus 36 minutes ago

          I think it's not more difficult than identifying representative user groups (at least five members per group) and defining typical tasks for a usability test, focussing on the specific features in question. It's just a lot of work and requires experience and training.

    • fuzztester 3 hours ago

      Perl is one obvious one, as anyone will realize, if one has read some of Larry Wall's articles about his (and later the team's) motivations for the (syntax and semantics) design decisions they made about the language. For example, in his annual State of The Onion talks, he often discusses these points.

      They may not have used standards such as the gp comment mentions, but they definitely considered human factors a lot.

      E.g. TIMTOWTDI - There Is More Than One Way To Do It.

      But that's not the only area in which they applied it.

      • Rochus 3 hours ago

        Perl presents a fascinating counterexample: Larry Wall, trained as a linguist, explicitly cared about human factors, but his linguistic philosophy produced a language that empirical tests show performs poorly on readability and learnability measures (see e.g. http://dx.doi.org/10.1145/2534973 or https://doi.org/10.1145/2089155.2089159).

        • bluGill an hour ago

          I haven't used Perl much, but my impression is that it is much easier to learn if you already know awk and sed: his original target was people who knew awk and sed well but were running into limitations using the two as separate tools. However, the language quickly spread to people who don't know either of the previous tools, and for them it is difficult to learn.

        • pjmlp 2 hours ago

          Just like human dialects, across the same main language, :)

    • pjmlp 2 hours ago

      I think Eiffel would be one of them.

hinkley 7 hours ago

Once you understand a thing, you know what it’s capable of.

A lot of my early expertise in performance analysis was heavily informed by my SIGPLAN membership. Many of the improvements showing up in compilers and interpreters would show up in some form there, and of course those developers were either involved in the papers or had access to the same research. So when some new version came out with a big explanation of what it did, I already had a reasonably good notion of how it worked.

It was a dark day when they got rid of the paper proceedings.

  • bdcravens 3 hours ago

    > Once you understand a thing, you know what it’s capable of.

    Did you.... just quote Blade? :-)

doyougnu 6 hours ago

I still like Olin Shiver's take on this: https://www.ccs.neu.edu/home/shivers/papers/why-teach-pl.pdf

  • zweifuss 5 hours ago

    There is something to the existence of fads and fundamentals. When I started, it was Object-Oriented-Programming (with multiple-inheritance and operator overloading, of course), Round-Trip Engineering (RTE), XML, and UML.

    IMHO, it wasn't the ideas that were bad, but the execution of them. The ideas were too difficult, unfinished, or not battle-tested at the time: a desire for premature optimisation without a full understanding of the problem space. The problem is that most programmers are beginners, many teachers are intermediate programmers at best, and managers don't understand what programmers actually do. Skill issues abound. "Drive a nail with a screwdriver" indeed.

    Nowadays, Round-Trip Engineering might be ready for a new try.

  • mekoka 3 hours ago

    > Java and its OO relatives capture a communications-oriented model of computation where the fundamental computational elements are stateful agents that compute by sending one another messages;

    I wish even only half the OOP world actually understood it as the above.

keyle 8 hours ago

It tries to answer the question "Why do we design new programming languages?" but it forgets the simplest of answers:

Because we can. Because a compiler is nothing more than a fancy text translator.

  • WalterBright 7 hours ago

    Because what we know about programming has progressed, and new languages appear to take advantage of that.

    • pjmlp 7 hours ago

      Many new languages are still recycling ideas from the 1970's research labs.

      Outside affine types, all the praise for Rust's type system traces back to Standard ML from 1976.

      The heresy of doing systems programming in GC-enabled programming languages (GC in the CS sense, including RC) goes back to research at Xerox PARC, DEC and ETHZ, late 1970's, early 1980's.

      Other things that we know, like dependent types, effects, formal proofs, capabilities, and linear and affine type systems, are equally a few decades old, dating from the 1980's and early 1990's.

      Unfortunately, while we have progressed, it seems easier to sell stuff like ChatGPT than better ways to approach software development.

      • 1718627440 4 hours ago

        It seems like most things boil down to ideas from the 70's. The internet, distributed computing, AI, etc...

        • GuB-42 an hour ago

          Yes, but ideas are (mostly) worthless. I mean, they are necessary, but that's the easy part, building the technical foundation that make it possible is the hard part.

          The internet needs wires and routers, distributed computing needs a good network (i.e. the internet), and current-day AI needs GPUs, which need silicon chips that defy the laws of physics. Really, looking at the EUV lithography process makes all of computer science feel insignificant by comparison; everything about it is absurd.

          The real progress is that now we can implement the ideas from the 70s, the good ones at least. I don't want to diminish the work of the founders of computer science; there is real genius there. But out of the billions of people on this planet, individual geniuses are not in short supply. The real progress comes from the millions of people who worked on the industrial complex, supply chains, and trade that led to modern GPUs, among everything else that defines modern computing.

          • 1718627440 an hour ago

            By that measure, software is worthless as well, since it is basically fully specified ideas.

        • pjmlp 3 hours ago

          Indeed, everything old is new again.

        • Rochus 3 hours ago

          > most things boil down to ideas from the 70's

          Rather from the sixties. E.g. OOP including dynamic dispatch, late binding, GC etc. appeared 1967 with Simula.

          • pjmlp 2 hours ago

            GC was a bit earlier. :)

            • Rochus 2 hours ago

              Earlier than the sixties? All elements of OOP were known before 1967, but their combination, which we still use today under the title OOP, appeared in Simula 67 for the first time. I think the first appearance of a GC in literature was in 1960.

              • AnimalMuppet 2 hours ago

                Did Lisp have GC from the beginning? If so, that would be 1958.

                • Rochus 2 hours ago

                  I think the first mark‑and‑sweep collector was published in McCarthy's 1960 Communications of the ACM paper "Recursive Functions of Symbolic Expressions and Their Computation by Machine, Part I". It's reasonable to assume that they already had it when Steve Russell implemented the first Lisp evaluator, but we don't know exactly when it was added.

                  • kragen an hour ago

                    No, that paper doesn't describe the garbage collector. I do think it is true that before it was published Slug Russell had implemented the GC, but I think it's correct that we don't have listings from that early.

                    • Rochus 7 minutes ago

                      Isn't the "Free-Storage List" in section 4c, starting on page 26, a mark & sweep collector? They didn't use the term then, but I think it describes one. I'm not aware of any earlier publication.

                    • andsoitis 35 minutes ago

                      The concept of automatic memory management dates back to the 1950s.

                      However, John McCarthy, the creator of Lisp, introduced the first widely recognized garbage collection mechanism around 1959/1960.

                      • kragen 19 minutes ago

                        If so, he didn't write about it in that paper, and I don't think he introduced it at all; I think Slug Russell did.

      • eptcyka 5 hours ago

        Aside from its one groundbreaking feature, Rust doesn't bring in any other groundbreaking changes. It could be argued that bringing lesser-known features to more people is a good thing in its own right.

    • keyle 7 hours ago

      Hey Walter, while we have you. Have you ever had the itch to make a newer D that takes all the fancy constructs while retaining the good parts?

  • TomasBM 5 hours ago

    I'm not a 'native' programmer, but even I can agree with that.

    As it became easier to abstract away from pure assembly, people did just that. Some ideas stuck, some converged, and the ones with the most tenacious, adaptable and attractive communities remained.

    Before I learned to code, no programming language was even remotely readable to me. But the more I learned, the more I could shed the notion that this was purely my fault, and accept that sometimes things are a certain way because someone found it interesting or useful. Applies to natural languages and mathematics, too.

  • ModernMech 38 minutes ago

    Because we can, because it's fun, and because some of us are compulsively obsessed.

  • pjmlp 7 hours ago

    Really fancy one, when it gets to what translation actually means in implementation effort.

    • keyle 7 hours ago

      Hehe, it was tongue-in-cheek. It takes a buttload of effort to make a good one.

    • darig 7 hours ago

      [dead]

  • agumonkey 6 hours ago

    linguistic constructs will direct your brain when solving problems and may help you discover solution spaces you wouldn't with another set of idioms

    not to promote FP but imperative stateful vs closures/function oriented is quite a strong example of that

    a different paradigm can really be a massive intellectual tool

ramon156 6 hours ago

When studying programming languages you really have to focus on the why's.

When I was learning Rust I started out just associating patterns with lib types. Need to dynamically hold items? Vec. Need a global mutex? install lazy_static.

This is fine if you're beginning, but at some point you need to read about why people choose this. 9/10 times there's a more elegant option you didn't know about because you just did what everyone else does. This separates programmers from coders.

The only reason I learned this was because my current company has programmers, not coders. I learned a ton from them.

lock1 7 hours ago

For me, I find reviewing and analyzing programming languages to be a fun activity. Writing esolangs is also fun, you can write it without needing to care about things like backward compatibility or practicality.

This reminds me of recreational math & gamedev, you simply do whatever you feel is fun and design it exactly as you'd like it to be.

rednafi 5 hours ago

The appeal of experimenting with languages varies widely between academics and industrial practitioners. When you’re in the business of creating languages, exploration often takes on the character of an art project.

However, as a trench-line coder, I enjoy dabbling in languages to learn different techniques for achieving a similar set of goals without sacrificing pragmatism. In that sense, I rarely have the luxury to explore purely for exploration’s sake. So I wouldn’t describe abstraction, performance, or usability as “aesthetics,” nor would I spend time on a frivolous language that I know won’t gain much traction outside academia.

I like reading the perspectives of academics just to see how wildly different they are from those of the people I work with in the industry. This is probably a good thing.

le-mark 3 hours ago

One of PG's essays hit on this when he talked about "blub". Building a language for a specific solution/domain was (and is) expensive, but using a language to build abstractions or an internal domain-specific language has been much more tractable. As such, some languages are more or less suited for this activity; Lisp being the king, then Ruby, and so on.

So to me the study of languages was interesting from this DSL perspective.

DarkNova6 7 hours ago

There absolutely lies value in studying programming languages, but maybe not in reinventing ideas of the past.

> I encourage everyone to create the most absurd, implausible, and impractical languages. Chasing the measurable is often useful, expressing the expressible is insightful, but never forget the true goal of language design: to explore and create what isn’t.

Sorry, but this sounds more like an arts class to me. Don't get me wrong, there was a point in time where exploration of the unknown was the only way to move forward. But these days we would need greater insights into higher-level language semantics and inherent tradeoffs to guide language design and language evolution.

There is plenty to choose from and one can learn already so much just by reading up on the Java-EG mailing lists. Brian Goetz has a true academic mindset and I frequently feel inspired when I read his reasoning which is both highly structured and accessible.

Otherwise we would just be left with another compiler class. Compiler basics really aren't that difficult.

  • cmontella 17 minutes ago

    > this sounds more like an arts class to me.

    Indeed, it is, and that's the point! Being interfaces to computers for humans, programming languages sit at the intersection of computer science and humanities. Lots of people like to treat programming languages like they're math class, but that's only half the picture. The other half is usability, ergonomics, learnability, and especially community. Not to mention the form of the language is all about aesthetics. How many times has someone on Hacker News called a language "beautiful" or "ugly" referring to the way it looks? When people praise Python they talk about how easy it is to read and how pleasant it is to look at compared to C++. Or look at what people say about Elm error messages versus C++ template errors. Actually a lot of what's wrong with C++ could have been averted if the designers had paid more attention in art class.

    > But these days we would need greater insights into higher-level language semantics and inherent tradeoffs to guide language-design and language evolution.

    Here's a talk that argues there's much more fertile ground for language ideas outside of the "programming languages are math" area, which has been thoroughly strip-mined for decades:

    https://medium.com/bits-and-behavior/my-splash-2016-keynote-...

    This author takes the perspective that programming languages are much greater than the sum of the syntax + semantics + toolchain + libraries, and treating them as such is limiting their potential.

shevy-java 6 hours ago

Old languages died or are barely in use anymore.

I think it is more interesting to see which languages are still used today and how popular these are. Because this is also tied to the human user/developer.

For instance, I used BASIC when I was young - not as a professional but as a hobbyist. I liked it too. I wouldn't use BASIC today because it would be entirely useless and inefficient.

  • binaryturtle 6 hours ago

    But is that a problem of the language itself, or just a problem of available toolchains? E.g. if the gcc compiler collection came with BASIC support and you could just type something like "gbasic -O3 foobar.bas -o foobar" to get a properly optimised executable out of your BASIC source code file, then some people would still use BASIC today, I guess?

    I started with BASIC too. Also enjoyed BlitzBasic2 for a long time on the Amiga. That's where I learned programming… back then when programming was still fun.

    • MoltenMan 6 hours ago

      Realistically? No, not at all. The reason there are no toolchains for BASIC is because nobody uses BASIC (because it's not functional in our modern world), not the other way round.

      • vbezhenar 5 hours ago

        Why do you think BASIC is not functional? Our modern world does not differ from the 1980 world at all. Variables are variables, subroutines are subroutines.

        It fell out of fashion, along with Pascal, Perl, and Ruby, but that's just fashion.

        • zozbot234 35 minutes ago

          > Why do you think BASIC is not functional?

          Because BASIC simply doesn't have first-class functions, and they would be quite hard to represent in a BASIC-like syntax while keeping the language idiomatic. Even the unreasonably clunky C pattern of having a pointer to a function taking void* as its first argument (to account for closure captures) gets you a whole lot closer to functional programming than even the fanciest BASICs.

          • ModernMech 31 minutes ago

            Here, "functional" is being used to mean "ability to function", not "relating to the functional programming paradigm".

        • f1shy 4 hours ago

          I have even seen pretty darn impressive things done with VisualBasic back in the day. And those were not hobby things; I've seen it used in very important mission-critical telecommunication equipment. The compiler was custom, not the one from Microsoft. After all, the language had pretty much anything other languages had.

          How can a language be "inefficient"? You could say it lacked expressiveness. Maybe it was too verbose? But I would not place BASIC in the "verbose" category.

        • biofox 4 hours ago

          I hate that languages have become fads. The concepts have not changed, but there is a constant churn of languages.

          I don't have to relearn natural language every 5-10 years, but for some reason I'm expected to when it comes to programming.

  • pjmlp 2 hours ago

    Depends pretty much on what BASIC compiler one is talking about.

  • jonathanstrange 6 hours ago

    I was never more productive than with REALBasic, it was a true RAD tool, and I'd happily use Xojo nowadays if it wasn't so expensive. That's a "modern" structured BASIC so I guess not what you had in mind.

    One thing I've learned over the years is that the language is almost irrelevant, the tooling and 3rd-party library support are much more important.

mellosouls 7 hours ago

(2022) This is particularly important here as the essay makes no mention of LLMs or coding agents (which were still in their infancy in development environments; this article is post original copilot/codex but pre ChatGPT).

  • aDyslecticCrow 3 hours ago

    > This is particularly important here

    No, actually. Why is that important? I don't quite see why that is relevant. Could you elaborate?

  • cubefox 3 hours ago

    Yeah. Now the trend goes in an entirely different direction: telling an LLM in natural language what to achieve using those programming languages. How quickly times change.

    Edit: I assume this comment gets downvoted because people don't like where we are heading, not because they really think LLM programming capabilities won't continue to improve at a staggering pace.

    • aDyslecticCrow an hour ago

      > LLM programming capabilities won't continue to improve at a staggering pace.

      The error rate of models makes language design, tooling, testing methodology, and human review more important than ever before. This demands language evolution. You could get far with lax testing and language tooling with enough caution and skill, but when LLMs enter the picture, that no longer flies.

      We need tooling, static analysis, testing paradigms, and language design that restrict how dangerously the LLM is allowed to act.

      Natural language is far too fuzzy to replace programming (system specification is already famously hard to do right). If you think it truly will replace code, I highly suspect you work in web design, where testing and reliability were always a secondary concern.

      And even then, I think we're already on the convergence plateau of LLM code. The companies are raising prices amid diminishing improvements and ballooning compute costs.

awesome_dude 8 hours ago

I think one of the things they neglect to mention about why we invent new languages, and it's probably the most important thing, is that people want new ways to express concepts.

It's super important because those concepts get measured, and absorbed into existing languages (as best they can), but that wouldn't have happened without the new languages

New concepts like Rust's "ownership model", Smalltalk's "Object Orientation", Lisp's "Functional programming", Haskell's "Lazy evaluation", Java's "Green threads"

  • pjmlp 7 hours ago

    While those languages made the concepts mainstream, they weren't the ones coming up with them.

    Rust's "ownership model" is a simplification of Cyclone, AT&T's research on a better C, based on a mix of affine and linear type systems.

    https://en.wikipedia.org/wiki/Cyclone_(programming_language)

    Haskell's "Lazy evaluation" was present in Miranda, before all related researchers came up with Haskell as common playground.

    https://en.wikipedia.org/wiki/Miranda_(programming_language)

    "History of Haskell"

    https://www.microsoft.com/en-us/research/wp-content/uploads/...

    Java's "Green threads" go back to systems like Concurrent Pascal.

    https://dl.acm.org/doi/10.1145/775332.775335

    • Rochus 3 hours ago

      Cyclone was indeed a very interesting language. It's always surprising what ultimately prevails.

      • pjmlp 3 hours ago

        I guess it is always a matter of being there at the right time, or having the luck to spot the right audience.

        • AnimalMuppet 12 minutes ago

          At the right time (or just before the right time). Spotting the right audience (or stumbling onto what a significant audience needs).

          And, I think, being better for what that audience is trying to do than existing tools. (But maybe that was implied in your statement.) This also implies adequate tooling and libraries.

          And publicity, to reach that audience (though viral is better than corporate).

    • exasperaited 2 hours ago

      And while Smalltalk may be the first quite "pure" OO language, it by no means invented the concept (you could argue it has a purer implementation of messaging but even some of that was in Simula 67, five years earlier).

  • DarkNova6 7 hours ago

    I don't think we have a lack of concepts, but a lack of concept connectivity.

    To avoid feature bloat of unconnected pieces of ad-hoc syntax Java does the right thing and focuses on expressing easily-composable language building blocks.

    As a late-adopter language, Java has the luxury of cherry-picking good features from other languages, but I think their line of thinking should be the reference moving forward.

  • mrheosuper 7 hours ago

    I'm pretty sure they mentioned

    >we create programming languages to experience new ideas; ideas that would have remained inaccessible had we stayed with the old languages.

  • exasperaited 2 hours ago

    > New concepts like Rust's "ownership model", Smalltalk's "Object Orientation", Lisp's "Functional programming", Haskell's "Lazy evaluation", Java's "Green threads"

    I sincerely want to ask if this was an ironic comment given the topic? Because obviously none of these core concepts were really new to the languages you ascribe them to.

cubefox 3 hours ago

> Why do we design new programming languages?

The main answer is that we have only a limited ability to modernize existing programming languages. For example, most languages are not null safe, because most languages are old and we can't make them null safe without breaking backward compatibility with most existing code. And we can't break backward compatibility for practical reasons. So Java will never be null safe, PHP will never be strongly or statically typed, etc.
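What "null safe" means here can be sketched in a language that was designed with it from the start; e.g. Rust encodes possible absence in the type system, so the compiler forces the missing case to be handled (an illustrative sketch, not a claim about how Java or PHP could be retrofitted):

```rust
// In a null-safe design, "maybe absent" is a distinct type
// (Option<T>), and the compiler rejects any attempt to use the
// value without first handling the None case.
fn first_word(s: &str) -> Option<&str> {
    s.split_whitespace().next()
}

fn main() {
    match first_word("hello world") {
        Some(w) => println!("first word: {}", w), // value is present
        None => println!("empty input"),          // absence must be handled
    }
}
```

Retrofitting this onto an existing language would change the static type of every nullable expression in every existing program, which is exactly the backward-compatibility break described above.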

So for fundamental language features, replacing older languages is the only way to achieve progress. Unfortunately that's a very slow process. Python, currently the most popular language, is already over 30 years old.

  • AnimalMuppet 23 minutes ago

    It's not just 30 years old. It's 30 years of people building libraries of useful code. If you "modernize" it so that much of that 30 years of work is thrown away, that's really expensive - maybe more expensive than continuing to build on a less-than-perfect foundation.

    But that turns into the trap of short-term thinking - eventually you reach the point where you would have been better off throwing it away and starting over. You don't reach that in the year you throw it away, though, nor in the year after.

    • zozbot234 17 minutes ago

      Python is one language that famously broke backwards compatibility with the 2.x/3.x split.

constantcrying 7 hours ago

>Common answers to this question will include words like abstraction, performance, convenience, usability etc. The problem with these answers is that apart from the measurable, they are all subjective, aesthetic choices.

That just is not true at all. These are all legitimate engineering tradeoffs, which any serious project has to balance. Calling this "aesthetics" is completely dishonest. These aren't arbitrary categories, these are meaningful distinctions engineers use when evaluating tools to write software. I think the students better understand what programming languages are than the teacher.

If you accept that a programming language is a tool and not just an academic game of terms, then all these questions have clear answers.

  • Rochus 4 hours ago

    > These are all legitimate engineering tradeoffs

    Agree, and we actually have both the standards and established methods to conduct representative tradeoff studies. But this knowledge is mostly ignored by CS and programming language design. Even for Ada, there was little empirical evidence for specific design decisions. A systematic survey found only 22 randomized controlled trials of textual programming language features conducted from the early 1950s through 2012, a span of six decades. This staggering scarcity explains why language designers rely heavily on intuition rather than evidence (see e.g. https://www.cs.cmu.edu/~NatProg/programminglanguageusability...).

ryandv 5 hours ago

Programming languages are obsolete in the LLM era. What current generation AI has revealed is that English is actually the ultimate representation of computer programs and systems, which is both sufficiently terse and precise to economically describe the operation of arbitrarily complex programs.

There is no reason to study programming languages in 2025, other than as a historical curiosity - the same way one may study languages equally as pitiable as e.g. COBOL, Lisp, or MIPS assembly.

  • cmontella 5 minutes ago

    I think it's the exact opposite -- LLMs have revealed the precise utility of programming languages. For decades, "English as a programming language" has been the holy grail of language designers. From COBOL to SQL to AppleScript, it was the hope that one day we'd be able to program a computer just as easily as we can instruct a person.

    Well, LLMs finally offer that, and what they are proving is what programmers have known for decades -- natural language is a terrible way to specify a program to a computer. So what is happening in the LLM world is that they are reinventing programming languages and software engineering. They're just calling it "prompt engineering" and "context engineering".

    What this tells us is that natural languages are not only insufficient for the task of programming; to make them sufficient, you need to bring back all the properties you lost by ditching the programming language. Things like reliability, reproducibility, determinism, and unambiguity are thrown away when you use an LLM, and context engineering / prompt engineering are ways of trying to get them back. They won't work well. What you really want is a programming language.

  • nobleach 3 hours ago

    Good luck debugging and fixing the multitude of errors introduced by your LLM. ESPECIALLY if you're outputting some low-level assembler or machine code. At least when we have our AI outputting Java/Typescript/Python/etc, it's easy (for a programmer) to see where it's run afoul.

    • ryandv 3 hours ago

      I have every confidence that even current-generation state-of-the-art artificial intelligence will have far superior facilities for debugging and managing the complexity of even its own bare-metal programming output than humans do.

      Why? Consider Gemini's recent performance at the International Collegiate Programming Contest [0], in which it solved a problem that no human team was able to solve.

      Wetware intelligence is itself obsolete, at least as concerns the domain of computing.

      [0] https://deepmind.google/discover/blog/gemini-achieves-gold-l...

  • christophilus 5 hours ago

    Sarcasm, right?

    • ryandv 5 hours ago

      The best way to get a strong answer is to post boldly incorrect statements. Then I get other people to write my actually-desired arguments for me.

      I call it, "wetware LLM prompt engineering".

      • 1718627440 4 hours ago

        And the wetware often refuses to do that and simply downvotes you. How dare they.

        • ryandv 4 hours ago

          Like real LLMs it's a bit of a slot machine, but my comment history will assure you that the jackpots are far more attainable.