If you think things suck now, just make it better! The world is your playground. Nobody makes you use YAML and Docker and VS Code or whatever your beef is. Eclipse is still around! There's still a data center around the corner! Walk over and hang a server in the rack, put your hardly-typechecked Java 1.4 code on there and off you go!
Nobody, except your future employment prospects.
There's good reasons and bad reasons for a lot of technical options; "can I hire people to do it?" is a very good reason, but it does directly lead to CV-driven-development, where we all chase whatever tech stack the people writing the job adverts have decided is good.
The same people who capitalise "MAC" in "MAC & PC", the same people who conflate Java with JavaScript, the same people who want 10 years experience in things only released 3 years ago.
I think what people who write essays like this wish for is the smaller scale of things. Industrial steel production is objectively better than old-timey blacksmithing, but hammering out sets of cutlery certainly feels a lot more personal. Steelmaking wasn't better before all these fancy mechanisms and danger and pollutants and logistics, but when it was craft, it was more satisfying to work on.
As an aside: if we say k8s, we should also say j8t.
The idea that most employers make terrible decisions now but made amazing decisions back in the day is plainly false. The author vividly recollects working at a decent Java shop. Even there, I strongly doubt everything was as amazing as they describe, but it does sound decent. Plenty of businesses at the time used C++ for no reason better than inertia, usually in Windows-only, Visual C++ 6-specific dialects. Their "build server" ran overnight: you'd check in your code in the late afternoon and get your compile errors back in the morning. The "source control" worked with company-wide file locks, and you'd get your ass phoned back to the office if you forgot to check in a file before leaving. Meanwhile, half the web was written in epic spaghetti plates of Perl. PHP was a joy to deploy, as it is now, but it was also a pain to debug.
If you care deeply about this stuff, find an employer who cares too. They existed back then and they exist now.
This. Let's keep things in perspective when people complain about long Rust compile cycles. And even that's a whole lot better than filling in paper-based FORTRAN or COBOL coding forms to be punched into cards in the computing room and getting back line-printed program output (or a compiler error) the next week.
Until you have to compile the program without prior build cache or start a build in a CI pipeline.
You're describing it yourself: things in programming were bad, then suddenly everything was on the upswing, UNTIL it started coming down.
That's not a refutation of the problem. It's picking a moment when both were bad, and, like all the discussions about tech, MASSIVELY ignoring that a MASSIVE internet with MASSIVE money and MASSIVE backing is worse than before.
It's like people complaining that pistols in the Wild West killed just as nuclear weapons do, ignoring the massive difference in scale and blast damage.
We love the idea that we don't have any agency in this field and we're constantly being pushed by the mean baddies at the top.
People would compile their local unit locally, of course (a "unit" being a bunch of files grouped together in some hopefully-logical way). But they wouldn't be 100% sure it compiled correctly when integrated with the larger codebase until the nightly build ran. So if you didn't change the .h files you were pretty sure to be in the clear, but if you did, you had to be careful, and worst-case scenario you'd do a 1-day-per-step edit-compile-test loop for a week or so. I'm not entirely sure how they managed to keep these compile failures from hurting other teams, but they didn't always (I think they had some sort of layered build server setup, not too dissimilar from how GH Actions can do nightlies of a "what if this PR were merged with main now").
Visual Studio 6 itself was pretty OK actually. Like the UI was very limited (but therefore also fast enough), but compiling smallish projects went fine. In fact it was known to be a pretty fast compiler, I didn't mean to suggest that VC++6 implies overnight builds. They just coincided. In fact better-structured big-ish C++ projects (pimpl pattern anyone?) could probably recompile pretty quickly on the computers of the day.
Linux kernel compiles in the 1990s were measured in hours, and that codebase was tiny compared to many. So, yep, builds were slow, slow enough to have an entire xkcd comic written about them.
Have you ever considered that you don’t understand why those decisions were made and that’s why you think they were made for the wrong reasons?
Javascript?
That would solve the trademark problem...
It’s that one extra spoken syllable that pushes it into k8s I guess. ¯\_(ツ)_/¯
Sounds like “coso” (“thingy”, mildly derogatory).
Clearly it’s “Kates”. Like “sk8er boy”.
Something like slurring the word “internationalization” combined with 18. “Ill-ateen-yon”
(In general I hate these numeric abbreviations, they’re terribly opaque to anyone who’s not clued in)
Most people are confused by it. When I start to explain it, I sound like a jerk who hates foreigners.
The programmers on the street point and smile at it. If they work in `l14n`, they come up and give me a hug.
But it's always Dennis.
This completely ignores the fact we're doing this for our boss, not for ourselves
That's fair. When I see bad code today, and try to explain to myself what's bad about it, I realize that people totally did the same things 25 years ago.
Nowadays there is just so much more code, and we stand on taller piles of "architecture" trying to scale higher heaps of expectations. The thing is, the effect of the bad stuff seems to compound more readily than the effect of the good stuff. And meeting the demand for more code involves broadening the base of people doing the coding.
> If you think things suck now, just make it better! The world is your playground.
I agree with this. A lot of the modern expectation is artificial, emphasizing form over function. Even where it isn't, a lot of the modern technique is unnecessary cargo-culting. You can do a lot locally if you believe in your machine (https://thundergolfer.com/computers-are-fast ; https://www.youtube.com/watch?v=ucWdfZoxsYo).
For PAPER I'm targeting Python 3.6+ (where they added `pathlib.Path` and f-strings, and upgraded the SSL version) with the intent to support it indefinitely (which involves forking certain dependencies).
I started my first programming job in 2003. Some things were better, others were worse. But the only way we can improve the "craft" of software development is through introspection. It's kind of what happened during the "software crisis" of the 80s, and why "waterfall" was replaced by "iterative" software development.
Now, with a change as impactful as AI-aided coding, it's a great opportunity to see if we can evolve anything in the building process.
From a technological view, we're at a point where your development stack doesn't matter all that much.
Not VS Code, but maybe YAML and Docker if your company is trying to align what tools it uses. C# places might still force you to use Visual Studio proper. Everyone says use the right tool for the job, but for bog standard CRUD web development, we do have a shitload of tools that work and there's multiple ways to get to a fast, working product.
I still chuckle that my laptop is 3 times as fast as the cloud thing that serves our CRUD app, and that we pay many times more for it, while also knowing full well I do not ever want to be RDP'ing into a production box again and poring through IIS or Windows logs.
What I definitely do see is a degradation in making choices about when to adopt a more complicated technology because of the incentives of the hiring market.
People have loudly beaten the drum to keep your skills up to date and so people choose stuff that's popular because it benefits them personally, even when the product doesn't need it. This in turn leads companies to only select people who know that stack, and the more that companies do that, the more people make technical choices to get them the best job that they can handle.
We absolutely, very much 100% see that happening now with LLM AI if you ever needed a bigger piece of proof. Pretty much everything that is happening now has just been a louder example of every bad practice since the run up to the dotcom bust.
Because of that, I'd frankly never suggest running on-prem or building a local-only app unless there was a much bigger reason (legal, security, whatever) especially if the other products in the company have chosen the cloud.
Why? Because convincing the next job that that would have been the right choice is too hard.
Edit: and to someone else's point, I made the choice to be in the Microsoft/Azure/Windows hell hole but digging myself out and moving to something else is practically working a second full-time job and holding 2 ecosystems in my head at once
On the one hand: yes, this dev has clearly chosen a career/language specialization that puts him knee-deep in the absolute worst tooling imaginable... I cannot fathom a workflow this fucking miserable, and if this were my day to day, I would be far, far more depressed than I already am.
AND the fact that so very, very much of our industry runs, if perhaps not all of this workflow, then a significant amount of one not awfully different from it, is, IMO, an indictment of our trade. To invoke the immortal sentiment of the hockey coach from Letterkenny, this shit is FUCKING embarrassing.
So much major software that ships in a web browser because writing for Windows, Mac and Linux is just too hard you guys, it's simply too much for a sweet little bean like Microsoft ($3.62 trillion) to manage as they burn billions on AI garbage, is FUCKING embarrassing.
Half the apps on my phone are written this way, which is why they can barely manage 30 Hz on their animations, die entirely when S3 goes down, and, when they are working, make my phone hot. To run an app that lets me control my thermostat from my desk. That's FUCKING embarrassing.
And my desktop is only saved by virtue of being magnitudes more powerful than my original one back in the 90's, yet it seems only scarcely more capable. In the early 00's I was sitting on Empire Earth and chatting with people over TeamSpeak. My computer can still do this, with the added benefit that Discord can stream my game so my friends can all watch each other, and that's cool, except that I lose about 10 fps just by having Discord open. And when I'm traveling? Oh god, forget it. Discord flounders to death on hotel wifi despite perfectly cromulent DSL speeds. Not BLAZING, surely, but TeamSpeak handled VOIP over an actual DSL connection, with DSL latency, in the 00's. That's FUCKING embarrassing.
All our software now updates automatically by default, and it's notable when that's a GOOD thing. Usually what it actually means is the layout of known features changes for seemingly arbitrary reasons. Other times more dark patterns are injected. And Discord, not to pick on them, but they're the absolute fucking worst for this. I swear they push an "update" every time one of their devs sneezes, I usually have to install 18 such updates on each launch, and I run it very regularly! And for all that churn, I couldn't tell you one goddamn thing they actually added recently. FUCKING embarrassing.
And people will say "oh they could be better," "we know we can do it better," "these aren't the best companies or apps"; okay, but they are BIG ones. If the mean average car in America got awful fuel economy, needed constant service, was ill-designed for its purpose, and cost insane amounts of money...
Oh, that happened too. I think I just made my metaphor more embarrassing for another industry.
Sometimes you can avoid contact with this mediocrity, but often you can't, and you're forced to play in this swamp.
Sometimes the programmer doesn't get to choose. I do find the business drive toward mediocrity quite maddening.
I do agree that comparing the past with the present is fraught with complicated nuances, and people do tend to see the past through rose-tinted glasses. But I read Talwar's blog post more as a personal reflection on the experiences they're facing than as some kind of scientific treatise on what went wrong.
If this were titled “Java/JavaScript peaked” or “my reflections on XYZ” and written like that, I wouldn’t have given it a second thought. But claiming programming peaked 15 years ago leads me to not feel bad about my summarization.
We have records from many periods in history of old men crowing about how society is collapsing because of the weak new generations. Thing is, maybe they were always right, and the new generations just had to grow up and take responsibility? And then again, maybe sometimes they were a little too right and society did in fact collapse, if only locally.
“Hard times create strong men. Strong men create good times. Good times create weak men. And, weak men create hard times.”
― G. Michael Hopf, Those Who Remain
I bring this example up every time, but I'm a baseball fan. And seemingly every generation of fans has said there are more people striking out than there used to be. Is that just part of getting old as a baseball fan? No! It's really happening. Strikeouts have quite literally been going up from one decade to the next for basically a century.
So sometimes it's because the thing is really happening. Environmental ecosystem collapse? Real. People having shorter attention spans every next generation? Real! Politics getting worse? Well, maybe the 1860s were worse, but the downward trajectory over the last 50 years seems pretty real. Inefficiency of increasingly automagic programming paradigms? Real!
Sometimes things are true even if old people are saying them.
copying only the conclusion for a tl;dr: "The only way that the aphorism explains history is by reinforcing confirmation bias - by seeming to confirm what we already believe about the state of the world and the causes behind it. Only those worried about a perceived crisis in masculinity are likely to care about the notion of "weak men" and what trouble they might cause. Only those who wish to see themselves or specific others as "strong men" are likely to believe that the mere existence of such men will bring about a better world. This has nothing to do with history and everything with stereotypes, prejudice and bias. It started as a baseless morality tale, and that is what it still is."
https://old.reddit.com/r/AskHistorians/comments/hd78tv/does_...
It’s essentially a truism warning people that problems you ignore don’t fix themselves, and has nothing to do with gender or gender stereotypes; that’s a linguistic misunderstanding. In this context, “men” is gender-neutral and means “people.” In Old English, the word “men” is explicitly gender-neutral, and there was a different word, “wēr”, for male people, which is still used in some contexts, e.g. “werewolf” means wolf-man.
Abundance allows comfort, comfort enables complacency, and complacency can weaken the social fabric by encouraging short-term gratification over long-term maintenance.
People worry about masculinity because masculinity requires structured, pro-social outlets to not be toxic. An aimless or misdirected male population is an incredibly corrosive and/or dangerous thing. It can rot a society out from within, or make a society susceptible to subversion from without.
Societies use rhetoric about strength because if a society does not maintain systems that cultivate competence, responsibility, purpose, and pro-social ambition (especially in its most impulsive members), it becomes brittle.
Abundance, by contrast, allows seed saving, food storage for winter, spare resources to use on washing and hygiene and medicine and recovering from illness, rule of law and enforcement, time away from subsistence farming and scavenging for food to enable things like developing metalworking skills, inventing, practicing archery, spending time on other society-building rituals like building churches and going to church.
> "A aimless or misdirected male population is an incredibly corrosive and/or dangerous thing"
If they are "incredibly dangerous" does that not make them "strong"? These are supposed to be the "weak men" created by "good times", aren't they? Are they strong men created by weak times who are themselves creating weak times by rotting society? Or are they strong because they are men, independent of the times? Does this fit into the saying at all?
Famine is not isomorphic to “hard times”, and particularly not what the aphorism is referring to: self-created hard times, wherein a society’s ability to self-sustain and compete externally is needlessly curtailed.
> If they are "incredibly dangerous" does that not make them "strong"?
I said corrosive and/or dangerous, and weakness can be both corrosive and dangerous.
What you linked to was not a debunking. It was a political viewpoint. Reasonable arguments exist for a different one.
Nobody claimed it was
> "particularly not what the aphorism is referring to"
The aphorism does not say what it is referring to, you are making this up so it says what you want it to say (which is bias). This wouldn't be a problem if you used that to make a point and argue your point, but it is a problem when you just go "I imagine that it means something else, so you're wrong". Self-created hard times such as ... what? If laziness in farming doesn't create famine in winter... what hard times are more relevant than that for a society in 0 AD? "Needlessly curtailed" by who or what effect?
> "I said corrosive and/or dangerous, and weakness can be both corrosive and dangerous."
Can it. Is there any way to measure this weakness? Is it actually a thing?
In which case it makes no falsifiable claims. If “hard times,” “weak men,” and “strong men” have no stable meaning, the cycle can’t explain anything and can be retrofitted to any narrative. There would be nothing to argue for or against.
That isn’t the case. “Hard times” in this context means the cumulative internal consequences of institutional decay, complacency, and short-termism. Not natural disasters. That’s why your famine example is irrelevant.
> Is there any way to measure this weakness? Is it actually a thing?
Sociological concepts are evaluated by their broad effects, not by a single scalar value. Declining institutional competence, eroded norms, reduced accountability, and loss of collective purpose are both observable and historically recurrent.
I’m no historian, but, for example, I could challenge the idea that rhetoric about strength and a masculine ideal for the young male population was nonexistent under European feudalism, where only the nobility had the privilege of fighting and 90% of the population were farmers. Or point out that 2000 years ago Jesus already challenged the idea that men needed to be strong in the traditional sense, teaching that real courage was loving and forgiving, among other things. I could go on with fashion and clothes, but maybe just look at a West European king's portrait to reevaluate what masculinity is supposed to look like traditionally.
My understanding is that your rhetoric appears only recently (and is therefore not traditional), coinciding with the rise of nationalism and the need for bodies to throw into the total-war (another modern invention) meat grinder.
You can disagree, and I’m open to hearing your counter arguments, because I’m not dismissing you as biased.
One self-described historian. On a Reddit post. Let’s not pretend this is the unified or authoritative voice of the discipline.
> I could go on with fashion and clothes but maybe just look at a West European king painting to reevaluate what masculinity is supposed to look like traditionally.
You’re conflating aesthetic masculinity with functional masculinity, and that’s a category error. The aphorism isn’t about how men dressed in the 17th century or how they signaled status — it’s about what kind of men can sustain a civilization.
In this context, “strong men” refers to individuals who demonstrate the discipline, competence, long-term responsibility, and willingness to bear risk that are required to build, maintain, and defend the institutions that keep a society stable — especially when conditions are difficult. It’s a sociological concept, not an aesthetic one, and it has nothing to do with your personal distaste (or favor) for particular cultural aesthetic expressions of masculinity.
Or I can reframe it one more way: If good times create weak men, then all the rich people currently running things corruptly and soaking up whatever 90% of the wealth, are weak, and all the discipline and virtue in society are among the rest of us. Cultivate competence, responsibility, purpose and pro-social ambition in the super-rich and you might have something there.
Like, when React was new I had total Delphi deja vu. And then they went about reinventing MVC (not the Rails MVC, real MVC) and calling it "unidirectional data flow" instead of just MVC, and feeling all smart about themselves and doing proud conference talks, and I was like "this is just MVC but with worse naming".
But React also made it so that every component is designed to be reusable. Like, in Delphi you had a "Form" on which you dropped "Controls" and then you could also create your own controls if you were really advanced. But most people didn't feel like they were advanced enough, so code reuse was a mess. React made it so that every control (cough component) is reusable, because using components is the same as making components. That's a good idea! Purely functional UI, that's also a good idea! Then they threw OO out instead of fixing it, that was a terrible idea, but bottom line it's still great! Plus, Delphi didn't have to deal with the horrible mess that is HTML and CSS so it had it easy.
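A minimal sketch (the component names are invented) of why reuse became the default: writing your own control and consuming it are the same act, so there's no "am I advanced enough?" barrier.

```tsx
import * as React from "react";

// A custom "control" is just a function; consuming it looks identical
// to consuming a built-in element, so reuse is the path of least resistance.
function LabeledInput(props: {
  label: string;
  value: string;
  onChange: (v: string) => void;
}) {
  return (
    <label>
      {props.label}
      <input value={props.value} onChange={(e) => props.onChange(e.target.value)} />
    </label>
  );
}

function Form() {
  const [name, setName] = React.useState("");
  // Reusing the custom control is no harder than writing <input> directly.
  return <LabeledInput label="Name" value={name} onChange={setName} />;
}
```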
But yeah lots of people my age saw the same, saw how it was just Delphi all over again but with different mistakes, and focused on the mistakes. It really is purely an attitude thing.
I'm having a lot of fun with signals and SolidJS and observables now and it baffles me that something so elegant and fast took this long to be discovered (or more like, to get ergonomic and mainstream enough).
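For anyone who hasn't tried it, here's a minimal sketch of the signals model using Solid's real createSignal/createEffect/createRoot primitives (the counter itself is invented):

```ts
import { createRoot, createSignal, createEffect } from "solid-js";

createRoot(() => {
  // A signal is a getter/setter pair; an effect re-runs automatically
  // whenever any signal it reads changes. No virtual DOM diffing involved.
  const [count, setCount] = createSignal(0);

  createEffect(() => {
    console.log("count is now", count()); // runs once, then on each update
  });

  setCount(1);            // logs: count is now 1
  setCount((c) => c + 1); // logs: count is now 2
});
```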
To my ears this is a hilariously naive statement. It sounds to me like roughly the equivalent of a 7-year-old saying "Adults have boring jobs where they sit at a desk all day. I hate it. I promise when I'm old I'm gonna be an astronaut or play Major League Baseball."
It's not that they don't mean it, it's that one shouldn't make promises about a situation they can't yet understand. While some of those kids probably did end up being astronauts or baseball players, 99%+ of those who made that promise didn't. It turns out that being an adult gives them perspective that helps them realize the reasons they want a desk job even if they don't like it, or, for many, they actually enjoy their desk job (e.g., they like to program).
So it's the same here: if a million young people all thought similarly, and then magically changed their view when they got there, don't promise you're going to be the one who breaks the streak.
You might turn out to be an astronaut, but many people listening, based on good previous evidence will rightly assume you won't.
you’re wrong right here — I understand. you’re just speaking nonsense and bullshit; you sound naive to me
Do you expect to learn? Get wiser?
If you do, you will eventually develop wisdom that younger people don’t have yet - or may never get. Younger people find new ways to do many things better, but regress in other ways. Lacking your (and your generation’s common) experiences.
Which is why the only old people who can’t see any real regression are … well I have yet to meet that kind of old person, other than those unfortunate to have dementia.
Also, every new better (or perceived better) way to do things has to reinvent many obvious things all over again. Things many won’t realize were already solved by previous practices. Which takes time.
So meanwhile, regressions.
And there is no assurance that new ways will really be better, after all regressions are addressed. Because it is impossible to see all the implications of complex changes.
Anyone who isn’t aware that the amount of today’s glue code, the rewriting of common algorithms for slightly different contexts, the mishmash of different tools, platforms, and dependencies, and all their individual quirks, was a non-optimal outcome…
But the current pain points will drive a new way. And so it goes.
Progress is not smooth or monotonic.
It’s a compliment to assume you’ll eventually notice it too. Not a critique.
you’ve also just said a ton of stuff I don’t disagree with, but I’m not sure what discussion you’re trying to have here
I do regret the time spent reading this article and participating in this comment section; that was naive of me!
The word “naive” was a strong one to throw at you, but I think in the context it reflected irony and humor, not disrespect.
Anyone who learns anything looks back on a naive version of themselves. I remember thinking a lot like you, too.
(I don’t think things are collapsing, but do see significant unnecessary regressions in the state of programming: Madness, insanity, everywhere! :)
I'm so glad that for the most part in my early internet days (early 2000s), I was pseudonymous. I tended to have very strong opinions about stuff I had barely just learned and didn't have experience to get nuances. My political opinions have completely flipped and I look back on my young firebrand days and unfortunately see lots of young people repeat the same vapid shit that I believed because I was ignorant but convinced it all followed from simplistic crap ideas I was raised with.
I don’t do this; among other differences with y’all in this thread, it’s why I’m not worried about the caution presented here
"...it’s one of my least favorite types of people; and that’s precisely my point, old men have been saying society is collapsing since ancient times, yet here we are, with things better than ever"
Which is a pretty strong opinion. Also, pretty much all societies that ever existed have collapsed. When that happens, life generally sucks and lots of people die. I'm not just talking about ancient Rome or Greece, or Easter Island, or the fall of dozens of different empires, or more recently South Sudan or Haiti.
Other people in thread called you naive. I won't insult you like that but just given the statements here, there's a whole lot of familiar-sounding overconfidence that reminds of things I'd have said in my 20s.
If you don't think they were at least sometimes right, to what do you instead attribute the various cases of socio-economic collapse documented throughout history?
When people say they see history repeating itself, it's worth hearing them out.
(broken clock right twice a day)
I’m all for hearing well-reasoned arguments; presenting pithy quotes as fact is absurd; claiming programming peaked 15 years ago is absurd
But not every generation has old men screaming "society is collapsing" at the same rate. There's always a baseline of people with a "get off my lawn" mentality, but if you factor that out, the occurrence of people actually pointing out that the world is on a dangerous path isn't uniform from one era to the next. Very few people, if any, were seriously making such an argument 30 years ago.
People who are genuinely making reasoned arguments, and not just complaining about things being outside their comfort zone, should absolutely be taken seriously.
> claiming programming peaked 15 years ago is absurd
Well, what are you measuring? It certainly peaked in some dimensions 15 years ago. Whether you personally see those dimensions as important is of course a subjective question.
Actually it's the opposite.
Strong men create hard times by trying to show to each other how strong they are. Hard times create weak men because during hard times strong men kill each other, thus mostly weak men remain. Weak men create good times because instead of trying to show their strength they just build stuff so that the world is easier for them. In good times people breed and the population returns to the mean with just enough strong men to start the cycle again.
WWII was the last time strong men created hard times. We are overdue for another round and it shows.
Putting things on a screen was (is) stupidly simple.
That's the whole thing. It's not the types or npm or whatever. It's that you could start with C/Python/Java and spend your first 6 months as a coder printing values and prompting for input on a dark terminal (something a newbie might never even have interacted with before), or you could go the HTML/CSS/JavaScript route and have animations moving on your screen that you can show to your friends on day one.
Everything flows from that human experience. UIs are Electron because creating UIs for a browser is a far more universal, easy, and creative-friendly experience than creating UIs for native apps, particularly if JS is your native language for the previously stated reason.
The industry failed to adapt to the fact that the terminal wasn't any longer the reality of computer users, and the web kinda filled that niche by default.
Also, React was an answer to the advent of mobile phones, tablets, smart tvs, and basically all the explosion of not-a-desktop computer form factors. You could no longer assume your html was a proper format for everything. So you need an common API to be consumed by a mobile app, a tv app, a tablet app... react was the way to make the web another of the N apps that use the api, and not get special treatment. The idea made sense in context, back then.
This is why HTML is still a great language for building UIs, and it's why Visual Basic had huge success in the early 90s: drag UI components onto a panel, write the callbacks for clicks, save, and distribute the exe. Almost anybody could do it.
React and its siblings are much more complicated than that.
We had Windows 2000, a server operating system that worked well as a desktop. Both Visual Basic 6 (aka VB6) and Borland Delphi allowed drag and drop GUI application development. Microsoft Office Professional supported most of the features of VB6 while controlling Spreadsheets, Databases, and other documents.
Anyone skilled in a domain other than computing could spend time and put together a reasonably decent application to help with their jobs, and it just worked. You could then call in a professional to clean up edges and make it faster/more reliable if it needed to be scaled.
The documentation was available for pretty much everything, in print, and on screen, with working examples for almost every single function.
It was before the whole .NET distraction, and forcing web pages into everything.
It definitely wasn't perfect... we didn't have widespread version control. No Mercurial or Git. Mostly, it was numbered PKZIP files stored on floppy disks.
We still don't have reliable secure operating systems, I think we've missed that window. Genode was my hope, but it remains a collection of ingredients instead of a daily driver.
In Applesoft BASIC:
- HPLOT x, y
- HPLOT x1, y1 TO x2, y2
In QuickBASIC on MS-DOS:
- PSET (x, y), color
In contrast I don't find HTML to be "stupidly simple" at all.
I'm still waiting for some GUI-based terminal to appear
I want to call git in Scratch!
You literally laid out your UI in a WYSIWYG, drag and dropped connections to your code, and had something working on day one.
It was even easier than the web, because what you were laying out were full components with code and UI and behavior. The web still hasn’t caught up with that fully.
When I see comments like these, I better understand why old timers shake their fist at the youngsters reinventing the wheel badly because they don’t understand what came before.
If you are looking for somebody to shake your fist at, try the market protection agencies all over the world.
I don’t disagree; that option for starting to program just wasn’t available when my generation started learning around 2005. If it still existed, it was way too niche for me to know about as a beginner.
And I don’t include myself in the generation that started programming through JS, I went the console route a bit earlier. But I have seen friends enter programming later on and it’s clear why that is the main choice.
At least in my corner of the world they were practically nonexistent. The first time I saw macOS at all was in 2012, and it was on my PC, after managing to turn it into a hackintosh with help from some random Eastern European guy on a message board.
Programming was more exciting when you had amazing things to imagine having, like a 256-colour screen, or BASIC that ran as fast as machine code (Acorn Archimedes), or the incredible thought of having a computer powerful enough to play a video, and even enough storage to hold a whole video!
2400bps! Checking email once a day
Everything came true. My screen has 16 million colours and I don't notice. I have 12 cores and 64GB of memory and fibre-optic internet. The extremely exciting future arrived. That mind-expanding Byte article I read about object-oriented programming came true enough for us to see that it wasn't the answer to everything.
There are two problems now. Nothing one does is unique: you can't make a difference because someone else has already done whatever you can think of doing. And of course there's AI, which is fun to use but threatens to make us even more useless.
I just cannot really get my sense of excitement back - which may just be because I'm old and burned out.
In an ironic twist of life, this is almost what I'm back doing right now. I turned off notifications and pull messages years ago because of all the messages I'm getting for a dozen different systems. I check mail at most a few times per day and that's it. I wouldn't be able to work if I'd have to actively keep an eye on them. I can get away with it because everybody is using Slack for work or WhatsApp for personal life, so there is no urgency to check mail. I'm on Slack too, so I see if I have messages there but WhatsApp is silenced and I allow no notification of any sort on the lock screen of my phone.
It's always surprising to me when I see people being nostalgic for the old days. Yes, things seemed simpler, but it was because there was less you could do.
I'll always fondly remember my attempt to get on GeoLink with a 300 baud modem, and then my parents realizing that the long distance calls made it far, far too expensive to use, and returning it. Sure, I was disappointed at the time, but it wasn't too much later that 56k modems existed and we had a local unlimited internet provider. And now it's a fun story.
But I was actually just as frustrated at the time as I am now, just for different reasons. Change exists, and that's good.
I agree that it feels harder to make your mark today. But I don't think it's actually harder. There are plenty of fun things that people would love to see done. Just yesterday I found out about the Strudel musical programming language and watched an amazing video of someone making trance with it. And those kinds of discoveries happen constantly now, where they were pretty seldom 30 years ago.
We're at the point that anyone can make a game, app, webpage, etc if they put enough effort into it. The tools are so easy and the tutorials are so plentiful and free that it's really just about effort, instead of being blocked from it.
I've been saying "we live in the future" about once a month for years now. It's amazing what we have today.
And because computing was less mature and a younger field overall. If computers had remained stagnant for 500 years, fixed as they were in 1980, I bet programming would have become increasingly complex anyway, just to enable doing more with less.
haha :-) It was of course 2400 baud and we were using FidoNET which was very very exciting at that time in Zimbabwe. We'd spend 10 minutes trying to get a dial tone sometimes but it was magic when you connected and saw something was coming in. International telephone calls were so expensive that we talked to my brothers overseas once or twice a month at best. With email we could chat every day if we wanted.
The limitation then was information - no internet, no manuals no documentation. I wrote a text editor and did my best to make it fast with clever schemes but it always flickered. Many years later a pal at university in South Africa casually mentioned that graphics memory was slow so it was actually best to write to memory and then REP MOVSB that to the graphics memory. I cursed out loud at this and asked him how he knew that?! Well, he lived in a more modern country and could buy the right books. Nowadays you really can be a linux kernel programmer in the Congo if you want to.
But what makes me happy to hear is that - on the other side of the planet - random kids were also plugging in a modem, to get connected with each other and press at the edge of the future.
2MHz 8-bit -> 3.5GHz 64-bit multi-core CPU
1KB -> 32GB RAM (a factor of 32 million times more memory !!)
audio casette storage -> 4TB HDD
16x48 char display -> GTX 980 Ti 2560x1600 gfx (+ 6 TFLOPs)
offline -> 9600 baud BBS -> 1Gbps fiber + internet
From hand assembling on paper (or just entering memorized hex opcodes directly into memory) to vibe coding "build me an app to do xxx", or talking to Gemini on my iPhone (would have looked like an alien artifact in 1978) asking it pretty much anything.
What will the next 50 years bring? Will it be as amazing as NASCOM-1 -> iPhone + Gemini? I used to think so, and certainly in 50 years I'd expect full-blown human-level AGI to be here, but will it feel that much different ?
Get off my lawn!
And yes, the whole "when I was young" saga starting in ... 2010 ... made me pause too.
Take sound, for example. Going from "no sound" to "sound" was huge. Going from just beeps to IBM PC sound was a real step. CD-quality was a real step. Going from CD-quality to 64-bit samples at a 1 MHz sample rate is a yawn. Nobody cares. The improvement on CD quality isn't enough to be interesting.
I have high enough bandwidth. Enough screen resolution. Enough RAM, enough CPU speed, good enough languages freely available, enough data.
The problem is, everything that was an easy win with all that has already been done. All that's left is things that aren't all that exciting to do.
(I don't think this is permanent. Something will come along eventually - a new paradigm, a new language, a new kind of data, something - that will open a bunch of doors, and then everything will be interesting again.)
I started my first real, full-time job in 1992. Back here we write C, or maybe C++ if we're feeling cutting-edge. The Sparc 10 can get a bit slow when we're all on it, but I have a shelf full of O'Reilly X Windows books to look through if I can't figure something out. My mate in London sent me a QIC tape with something called "gcc" on it: sounds exciting, but before I can install it I have to find a spare day to update SunOS first.
This 2010 programming setup seems pretty amazing tbh... can't wait to get me some of that. Nice languages and tooling, no more having to edit makefiles by hand in emacs or laboriously debug in gdb. Bet they don't even use sourcesafe anymore.
I reckon by 2025 they'll have god-like stuff: fast, reliable hardware with more memory and storage than you can eat; powerful development and collaboration tools; lots of ways to find answers without having to ask that guy over in the other building. And a lot of it will be basically free! I wonder how they'll feel about all that awesome dev power, and whether they'll still use X terminals.
Programming has evolved several times since the early 90s (when I got in this business) and I had the impression it had already evolved several times by the 90s (especially talking with old mainframe or COBOL programmers).
It's evolving again now, and that process is painful. Nobody knows what the future holds.
Actually, where I was sitting on a decent PC with broadband Internet at the time, everything was much, much faster. I remember seeing a video on here where someone actually booted up a computer from the 2000's and showed how snappy everything was, including Visual Studio, but when I search YouTube for it, it ignores most of my keywords and returns a bunch of "how to speed up your computer" spam. And I can't find it in my bookmarks. Oh well.
Was this what you were referring to?: https://jmmv.dev/2023/06/fast-machines-slow-machines.html
I do. It's not without its problems but currently it's the least bad solution.
Was it this one? Casey Muratori ranting about Visual Studio Debugger slowness, and he shows a video of Visual Studio opening and debugging faster on a single core Pentium 4 from 20 years earlier - https://youtu.be/GC-0tCy4P1U?t=2160
Or this one? Roger Clark developing Notepad in C++ on Windows 2000 and commenting how fast Visual Studio opens: https://youtu.be/Z2b-a4r7hro?t=491
Given the refinements to the hardware, the modern scale of manufacturing and accessible market, and the sheer amount of engineering manpower a tech company can bring to bear nowadays, you'd think standards would have risen into the stratosphere, but instead the tech consumer is cowed into accepting slow, buggy, abusive, invasive trash.
Many websites are so shitty they can't even manage to display static text without making you download tons of their JS BS.
I do remember the UIs of applications like Microsoft Word constantly freezing, though.
You can fight this machine.
You can view Shorts on desktop. In a browser, when you see one in your subscriptions tab (or on the front page, if you still use that), you can right-click and open it in a new tab. Replace "shorts/" with "watch?v=" in the URL and the same video loads in the regular UI. You don't have to scroll down.
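If you do this a lot, the rewrite is a one-liner to script; a sketch (the URL shapes described above are the only assumption):

```ts
// Turn a Shorts URL into the regular watch-page URL, e.g.
// https://www.youtube.com/shorts/abc123 -> https://www.youtube.com/watch?v=abc123
function shortsToWatch(url: string): string {
  return url.replace("/shorts/", "/watch?v=");
}
```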
We have super powerful editors, powerful languages, a gazillion of libraries ready for download.
I used to write video games, and it took months (yeah, months of hobby time) to put a sprite on a screen.
And for Java, yeah, things have improved so much too: the language of today is better, the tooling is better, the whole security is more complex but better, the JVM keeps rocking...
And now we've got Claude.
I'm really happy to be doing this now.
I'm not sure how that happened - in DOS you could copy things to the framebuffer. There were libraries like Allegro which came with a million features including sound/UI/sprite rendering/animation/effects etc. out of the box.
But anyways copying a sprite to a screen is not hard even if you don't use a single piece of foreign code - you can read in a BMP file and just copy it row-by-row to the screen.
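As a sketch of how little code the row-by-row copy is (assuming the BMP's pixels are already decoded into a flat 32-bit buffer; no clipping):

```ts
// Copy a sprite into a framebuffer one scanline at a time.
// Both buffers are flat arrays of 32-bit pixels, width * height long.
function blit(
  fb: Uint32Array, fbWidth: number,
  sprite: Uint32Array, spriteWidth: number, spriteHeight: number,
  destX: number, destY: number,
): void {
  for (let row = 0; row < spriteHeight; row++) {
    const line = sprite.subarray(row * spriteWidth, (row + 1) * spriteWidth);
    fb.set(line, (destY + row) * fbWidth + destX); // one copy per scanline
  }
}
```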
At that time you didn't even have tools to cut sprites from an image in the first place. That's why it took a lot of time (and I was 12 years old, and my dad was pretty far from being an engineer, which didn't help).
Or are you 60 years of age now, having started at 40, and have you been coding long enough to see editors progress?
It's easy enough to avoid the NPM circus as well. Just don't put JavaScript on your resume and don't get anywhere near frontend development.
TypeScript recently invited people to try their new compiler which is 7-10x faster. [1]
VS Code is popular, but personally I write TypeScript in Vim, which is definitely faster and less resource-intensive than Eclipse. But thanks to VS Code's invention of the Language Server Protocol and its subsequent adoption in many editors, I can get 'compile the code as you type' functionality. [2]
And then there's runtime speed. Java used to have legendarily slow startup times, and they're still an issue today. JavaScript's modern runtimes are optimized to start fast. On the other hand, Java still holds the crown for throughput, so it's a mixed bag. But if you want even better throughput without sacrificing too much usability, today we have Rust, as well as a vastly improved C++. (Though both of those languages have terrible compile times.)
[1] https://devblogs.microsoft.com/typescript/progress-on-typesc...
[2] To be fair, a memory usage comparison should compare Eclipse not to Vim itself, but to the whole [Vim + coc.nvim + TypeScript compiler] stack, of which most of the memory usage comes from the TypeScript compiler. I'm not sure how that compares to Eclipse. But I'm more concerned about typing latency, where I _do_ only care about Vim itself because all the LSP stuff is asynchronous.
Amen. GitHub (just to cite one example) has become much less usable since they switched to React. And it's clearly not a site that needs React, since it worked fine for 17 years without it.
It's just that the landscape has changed significantly compared to 2010.
Back then you didn't have today's smartphones and tablets. On desktop you could support Windows only (today you probably have to support macOS as well). In the browser you didn't have to support Safari. You didn't have to worry much about screen DPI, aspect ratio, landscape vs. portrait, or small screen sizes. Your layout didn't have to be adaptive. There wasn't much demand for SPAs. And there were far fewer people worldwide with access to the internet.
It’s the Future - https://blog.paulbiggar.com/its-the-future/
And now I see that's from June 2015 -- it's over 10 years old now! Wow
I'm not sure we're really in a better place in the cloud now ... The article says
I’m going back to Heroku
and in 2025, I think people still want that
I need a map of key ranges to values with intelligent range merging; it's right there on crates.io to import, and has been there since 2016. Maven Central didn't get one until 2018. In the olden days, either it was in Apache Commons or it didn't exist. Halcyon days, those.
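For anyone wondering what "intelligent range merging" means here, a toy sketch of the coalescing behavior such a library provides (names invented; simplified to merging sorted, half-open spans that touch and carry equal values):

```ts
type Span<V> = { start: number; end: number; value: V }; // half-open [start, end)

// Merge overlapping or touching spans that carry the same value.
function coalesce<V>(spans: Span<V>[]): Span<V>[] {
  const sorted = [...spans].sort((a, b) => a.start - b.start);
  const out: Span<V>[] = [];
  for (const s of sorted) {
    const last = out[out.length - 1];
    if (last && last.end >= s.start && last.value === s.value) {
      last.end = Math.max(last.end, s.end); // extend the previous span
    } else {
      out.push({ ...s });
    }
  }
  return out;
}

// coalesce([{ start: 0, end: 5, value: "a" }, { start: 5, end: 9, value: "a" }])
// => [{ start: 0, end: 9, value: "a" }]
```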
Fix this and your life will be much better: generate testable preview apps on every PR. This feedback loop is a velocity killer. Management for some reason never wants to prioritize speeding it up, even though it slows down every single project, so my advice is to just do it and not ask permission.
Building was introduced as a temporary measure to handle cross-browser awkwardness (grunt and stuff like that). People overused it. We totally don't need it anymore. TypeScript is awesome, but it's a major blocker to this return to a nimbler ecosystem.
People in the 2000s discovered that mixing code with HTML tags was bad, a big complexity demon mansion. By the end of the 2000s this was fixed in the tools of that time. I consider JSX a best-practices regression. It feels like ASP.NET, but the kids don't notice because they have never seen ASP.NET.
For a while, we also saw npm as temporary. A better thing, more web-friendly, would appear. That never happened.
JavaScript could have been great.
My mind struggles with this reality everyday... the "cloud" has to be the most successful rebrand of all time. In 2005: "Be very careful what data you share on the internet". In 2025: "Yeah, I just put all my shit in the cloud".
If you pine for the days of Java and Maven, you can still do that. It’s all still there (Eclipse and NetBeans, too!)
If you don’t like using Node and NPM, that’s totally valid, don’t use them. You can spin up a new mobile app, desktop app, and even a SaaS-style web app without touching NPM. (Even on fancy modern latest-version web frameworks like Hanami or Phoenix)
If you don’t want everyone to use JS and NPM and React without thinking, be the pushback on a project at work, to not start there.
Everything is better and keeps getting better if you make good decisions instead of following the lowest-common-denominator tech.
If programming peaked, it certainly wasn't in 2010.
Hard disagree. I'm not going to argue that Java debugging was the best, however:
1. You could remote debug your code as it ran on the server.
2. You could debug code which wouldn't even compile, as long as your execution path stayed within the clean code.
3. You could then fix a section of the broken code and continue, and the debugger would backtrack and execute the code you just patched in during your debugging session.†
This is what I remember as someone who spent decades (since Java 1.0) working as a contract consultant, mainly on server side Java.
Of course this will not convince anyone who is determined to remain skeptical, but I think those are compelling capabilities.
† Now I code in Rust a lot, and I really enjoy it, but the long compile times and the inability to run broken code are two things I really miss from those Java days. And the modern 2025 debugger for it is often unable to inspect some of the variables for some reason, a bug I never encountered with Java.
For 1: not really applicable in my case. For 2: I didn't know this. For 3: yes, but it worked only for a subset of issues, and it's honestly much more usable with Clojure and Scala.
I primarily worked with Hadoop and ETLs, you probably won't be able to convince me to be fair.
https://lib.rs/crates/subsecond
(Note, it was created by Dioxus, but it's usable in any Rust project)
In Java? How?
The world is not worse, you just ended up stuck in the wrong tech stack with the wrong company.
I do agree that the JS ecosystem is terrible though, and I was a full-time JS dev before (now I only do some frontend from time to time; we have a dedicated frontend team).
I am building an AI coding tool that doesn't eat RAM, there are super lightweight alternatives to Electron.
I've often thought about how certain properties of humans impact the tech we make and accept.
For instance, to a human, something happening in a couple of seconds is quick, and in several seconds is fairly quick. Hence, build steps etc tend to creep up to those sorts of numbers.
That obviously has severe limitations, but it's ideal if you don't like the frontend framework scene and want to put all your logic in the backend. And there I find it a bit easier to navigate.
RIP guys in corporate that have things imposed on them
I don't need my software to eat the world, I'm perfectly content with it just solving someone's problems.
I find IntelliJ a great IDE, modern frameworks are fun to use, and AI helps me do the things I don't want to do or just need done (like generating a good README.md for other people).
Containers are great and mine build fast.
My startup has an HA setup with self-healing thanks to k8s, and everything is in code, so I don't need to worry about backing up some random config files.
Hardware has never been this cheap: NVMe, RAM, and compute. A modern laptop today has a brilliant display, is quiet, can run everything, and has long battery life.
Traffic? No brainer.
WebSphere was a monster with shitty features. I remember when all the JEE servers finally had startup times of just a few seconds instead of minutes. RAM finally got cheap enough that you could run Eclipse, a web server, etc. locally.
Java was verbose, a lot more verbose than today.
jQuery was everywhere.
At times it seems like he's learning, but then he leans back into GPT and it feels like a lost cause. "Yo bro, it's crazy, I just deleted 2k lines of code," as if that were normal, with me raising eyebrows and asking how many of those modules were outright deleted because they never needed to be there in the first place.
The problem is, GPT and company will happily throw a lot of puke at the project: trust me bro, it's essential that we plan every single possible feature first and foremost, instead of just testing the basics and building on top of them. We have literally two requirements, so it's taking two months to do something I could do in two weeks, not because I'm more experienced, but because we should be doing what I tell him to do, not discussing whatever bullcrap the artificial manager is suggesting, which I would love to have the power to outright ban from our network.
The junior and I are more or less the same age, just with very different life journeys that led to programming.
"Javascript" programming peaked.
This is exactly what 10 years of experience did for you. Why complain?
> And here’s the funny thing: it never broke, because the person who built it did it well, because it wasn’t taxed within an inch of its life, and because we were keeping an eye on it.
Sure, if you say so buddy...
> We used Eclipse, which was a bit like VS Code.
This made me laugh. You can't possibly be a serious person if you think Eclipse was better in any way shape or form than VS Code.
I don't have a great memory, but one thing I can absolutely still remember with 100% clarity is how bloated and memory hungry Eclipse was. It was almost unusable.
I started on C++ on Windows, using MFC, and also using Visual Basic 5.0 when it came out. VB made my eyes bleed, but a lot of people are nostalgic about it. Visual C++ did not have many fans, but it had a lot of users. I got the first taste of the Microsoft treadmill during those days, where they would get you to use a new thing, and then in a few months they hardly used it anymore, and they were promoting the next new thing. As soon as you got comfortable with MFC, they were pushing ATL, and then .NET, etc. Really, the people who were happy with what they had and never upgraded were better off.
I narrowly missed the big-manual days of programming, where it was unlikely you had online resources to help you through it. The Turbo Pascal folks and the early Mac folks remember that well. Instead, we had the big online help file (CHM) and search engines like Altavista. Code examples were few and far between. We often spent a lot of time just figuring out how to make the right incantations to get things to do what we wanted. There was a happy path, like with MFC if you make the same exact application over and over again. And then there is the difficult path, where you want it to be a little innovative.
I came across Squeak Smalltalk and used it a lot for my own personal exploration, so I always felt like there was something missing from the world that actually came to be. Still, working alone on Squeak is only fun as long as you don't get bored and don't need something Squeak is too slow to handle.
Like the author, I was into pair programming (eXtreme Programming, actually). I never understood its detractors. Working that way was pretty fun.
I never liked the MS ecosystem so I enthusiastically accepted Java. IBM offered a lot of support for Linux and there were good applications waiting to be written. It had its growing pains but quickly settled in to being very productive. In this period, Intellisense and similar technologies were becoming commonplace and the Refactoring Browser had been developed (for Smalltalk but then basically for Java). IntelliJ IDEA was released and was, honestly, revolutionary. The previous IDEs were just not serious until they caught up with the developer support in IDEA.
I figured out that being a professional programmer is not for me, because I don't enjoy working on projects, and I went back to school, eventually becoming a veterinarian. So my professional career kind of ends there.
I enjoy programming, but as a casual programmer it is hard to work on something and come back to it every few months. Things do seem to rot. What compiled before doesn't now. Library use changes radically. If you started a React project before hooks, you know what I mean. Sure, you can still do it without hooks, but nobody does so you're on your own.
What AI does is it makes exploration and problem solving, as well as understanding what I did a few months ago so much easier. I don't have anyone to pair-program with. But the AI makes it easier to be the programmer who's not in the driver seat. I like that, and I think it could lead to good things in the field. The big risk is that LLMs are not very good at future things. If things are out of their training window, they make lots of annoying mistakes. For example, Debian Bookworm and Debian Trixie are somewhat different, and Claude doesn't know what it's doing yet with Trixie. Claude thinks the most recent version of Python is 3.11 or something. With LLMs you have to be comfortable working on yesterday's code. But for most of us, that's OK.
You can still code the old way just like you can still put up your old website. No one is forcing you to use AI or VS Code or even JavaScript.
Of course, you might not get those choices at work, but that's entirely different, since you're paid to do a job that benefits your employer, not paid to do something you enjoy.
Have fun.