245 points by alexwennerberg 3 days ago | 149 comments
jlipps 3 days ago
I love the references to Jacques Ellul's ideas, which I think are interesting to reflect on in an AI age. It helps make clear that what is fundamentally at stake in much technological "progress" is an (often only tacitly acknowledged) elevation of "efficiency" to the place of highest value.

What's fascinating is that this value elevation seems to have gone largely unchallenged, despite being in essence an arbitrary value choice. Other choices were possible, and, I hope, still are, despite what the bigcorps declare must be the case in order to maximize shareholder returns.

tikhonj 3 days ago
Efficiency isn't even the best thing to optimize for if you want (expected) shareholder returns! Efficiency fundamentally trades off against adaptability and resilience.
jlipps 3 days ago
Yes, good point! Further underscoring the fetishistic nature of efficiency as highest value ;-)
sothatsit 3 days ago
Efficiency is easy to measure. And whatever is measured becomes the goal.

It is harder to measure craft, care, or wonder. My best proxy is emails from real people, but those are sporadic, unpredictable, and a lot harder to judge than analytics screens that update every minute.

willj 3 days ago
100%. This is what I posted about on Hacker News [1] (where it got no traction) and Reddit [2] (where it led to a discussion but then got deleted by a mod).

[1] https://news.ycombinator.com/item?id=46705588

[2] https://www.reddit.com/r/ExperiencedDevs/comments/1qj03gq/wh...

will__ness 3 days ago
> But there are serious limits. [Your coding agent] will lie to you, they don't really understand things, and they often generate bad code.

I think that really high-quality code can be created via coding agents. Not in one prompt, but through an orchestration of planning, implementing, validating, and reviewing.

It's still engineering work. The code still matters. It's just a different tool to write the code.

I'd compare the difference between manually coding and operating a coding agent to the difference between a handsaw and a chainsaw - the end result is the same but the method is very different.
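To make the shape of that orchestration concrete, here is a minimal sketch of such a plan/implement/validate/review loop. Everything in it (`StubAgent`, `Patch`, the method names) is a hypothetical stand-in, not any real agent API; a real setup would call an LLM at each step.

```python
# Hypothetical sketch of a plan -> implement -> validate -> review loop.
# StubAgent and its methods are made-up stand-ins, not a real API.
from dataclasses import dataclass, field

@dataclass
class Patch:
    code: str
    notes: list = field(default_factory=list)

def orchestrate(task, agent, max_rounds=3):
    """Iterate until the validation step passes, then do a review pass."""
    plan = agent.plan(task)
    patch = agent.implement(plan, feedback=None)
    for _ in range(max_rounds):
        ok, feedback = agent.validate(patch)  # e.g. run tests and linters
        if ok:
            return agent.review(patch)        # final refinement pass
        patch = agent.implement(plan, feedback=feedback)
    raise RuntimeError("no passing patch within the round budget")

class StubAgent:
    """Toy agent whose first attempt fails validation and second passes."""
    def __init__(self):
        self.attempts = 0
    def plan(self, task):
        return f"plan for: {task}"
    def implement(self, plan, feedback=None):
        self.attempts += 1
        return Patch(code=f"attempt {self.attempts}")
    def validate(self, patch):
        failed = patch.code == "attempt 1"
        return (not failed, "tests failed" if failed else "")
    def review(self, patch):
        patch.notes.append("reviewed")
        return patch
```

With the stub, `orchestrate("add a feature", StubAgent())` returns the second attempt with a review note attached; the point is only that validation feedback flows back into the implement step rather than everything happening in one prompt.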

acedTrex 3 days ago
> the end result is the same but the method is very different.

I don't think anyone really cares at all about LLM code that is the exact same end result as the hand-written version.

It's just that in reality the LLM version is almost never the same as the hand-written version; it's orders of magnitude worse.

mishkovski 2 days ago
So far, I haven't seen any comparison of AI code (using the best available models) and hand-written code that illustrates what you are saying, especially the "it's orders of magnitude worse" part.
will__ness 2 days ago
> it's orders of magnitude worse

This is not my experience *at all*. Maybe models from like 18+ months ago would produce really bad code, but in general most coding agents are amazing at finding existing code and replicating the current patterns. My job as the operator then is to direct the coding agent to improve whatever it doesn't do well.

Cthulhu_ 3 days ago
In the limited use cases I've used it, it's alright / good enough. But it has lots of examples (of my own) to work off of.
elzbardico 3 days ago
But a lot of people don't think like this, and we must come to the unavoidable conclusion that the LLM code is better than what they are used to, be it their own code or their colleagues'.
nubg 3 days ago
Speak for yourself.
acedTrex 3 days ago
I mean yes, I am speaking for myself. I am drowning in mountains of LLM slop patches lol. I WISH people were using LLMs as "just another tool to generate code, akin to a vim vs emacs discussion."
SchemaLoad 3 days ago
I'm so sick of having 1000-line diffs dumped on me by coworkers who have generated whole internal libraries that handle very complicated operations that are difficult to verify. And you just know they spent almost no time properly testing and verifying, since it was zero effort to generate it all in the first place.
simplify 3 days ago
LLMs are an amplifier. The great get greater, and the lazy get lazier.
Madmallard 3 days ago
Considering the seemingly increasing frequency of high-severity bugs at FAANG companies in the last year, I think perhaps the great getting greater is not actually the case.
phito 3 days ago
That's assuming FAANG engineers are actually great.
Madmallard 3 days ago
They're far more likely to be above average I would say.
salawat 2 days ago
Above average in tolerance for immoral business models, certainly.
dns_snek 3 days ago
I happen to think that's largely a self-delusion which nobody is immune to, no matter how smart you are (or think you are).

I've heard this from a few smart people whom I know really well. They strongly believe this, they also believe that most people are deluding themselves, but not them - they're in the actually-great group, and when I pointed out the sloppiness of their LLM-assisted work they wouldn't have any of it.

I'm specifically talking about experienced programmers who now let LLMs write the majority of their code.

ekidd 3 days ago
All on my own, I hand-craft pretty good code, and I do it pretty fast. But one person is finite, and the amount of software to write is large.

If you add a second, skilled programmer, just having two people communicating imperfectly drops quality to 90% of the base.

If I add an LLM instead, it drops to maybe 80% of my base quality. But it's still not bad. I'm reading the diffs. There are tests and fancy property tests and even more documentation explaining constraints that Claude would otherwise miss.

So the question is if I can get 2x the features at 80% of the quality, how does that 80% compare to what the engineering problem requires?

okamiueru 3 days ago
I was somewhat surprised to find that the differentiator isn't being smart or not, but the ability to accurately assess what you actually know.

From my own observations, the types of people I previously observed to be sloppy in their thinking and their work correlate almost perfectly with those who seem most eager to praise LLMs.

It's almost as if the ability to identify bullshit makes you critical of the ultimate bullshit generator.

will__ness 2 days ago
This is very true. My biggest frustration is people who use LLMs to generate code and then don't use LLMs to refine that code. That is how you end up with slop. I would estimate that as an SDE I spend about 30% of my time reviewing and refining my own code, and I would encourage anyone operating a coding agent to still spend 30% figuring out how to improve the code before shipping.
nielsbot 3 days ago
> Not in one prompt, but instead an orchestration of planning, implementing, validating, and reviewing

Lots of times I could just write it myself and be done with it.

fartfeatures 3 days ago
Sure, and lots of times I can walk places. That doesn't mean bikes, cars, trains, and planes aren't incredibly useful. They let me achieve things I can't in other ways, for example transporting cargo without a team of people to help me. Just like AI coding.
trollbridge 3 days ago
Yet replacing walking with cars is often cited as one of the reasons for many of society's ills.
theshrike79 2 days ago
There is a middle road.

America went full car to the point where just going to the shops from the suburbs is a car drive. Crossing the ROAD needs a car in way too many places.

There are cities where you can find a shop for essentials within walking distance; bigger shops need a short-to-medium drive, but can still be walked to if you really want.

jappgar 3 days ago
Yet no one seriously declares motor vehicles as useless.
worksonmine 3 days ago
Would you still use your car if you ended up in the wrong destination half the time?
XenophileJKO 3 days ago
Yes, because I can drive to the other end of the state in an afternoon. Then if I get lost, I can just course correct.
makapuf 3 days ago
Generating lots of pollution, cost, jams, noise and accidents globally. Not all cities need to be made for cars, right tool for the job etc.
worksonmine 3 days ago
Have fun getting stuck in a loop when it insists your destination exists in a place it doesn't.
stevenhuang 3 days ago
Would you use your car if you ended up in the right destination 100% - epsilon of the time? Yes, you would.

Or do you suppose this is the best AI will ever get?

nielsbot 3 days ago
Maybe your analogy holds if driving and walking took the same amount of time.

Plus "planning, implementing, validating, and reviewing" would be a bit like walking anyway in your analogy.

solomatov 3 days ago
>I think that really high quality code can be created via coding agents. Not in one prompt, but instead an orchestration of planning, implementing, validating, and reviewing.

Do you have any advice to share (or resources)? Have you experienced it yourself?

storystarling 3 days ago
The practical limit is the latency and inference cost. A full planning and validation loop burns a lot of tokens, and waiting for that cycle breaks flow compared to just writing the code.
jappgar 3 days ago
Only if your flow is writing the actual code.

If your flow state involves elaborating complementary specifications in parallel, it's marvelous.

mbesto 3 days ago
> high quality code

What does high quality code look like?

> The code still matters.

How so?

will__ness 2 days ago
Great questions. For me, high-quality code is code that: 1) works (is functional, no bugs); 2) is secure (no security vulnerabilities); 3) is extendable (I can quickly and easily build new features with limited refactors).

I argue the code still matters for those same 3 reasons. If the code doesn't work, your product won't work. If it's not secure, there are obvious consequences. If you can't build new features quickly, you will end up wasting money/time.

delbronski 3 days ago
Most software engineers were not “crafting” before AI. They were writing sloppy code for the sake of profit, getting a pay check, and going home. Which is why AI also outputs the same crappy code.

Rumor has it there were a few elite crafters among the lot. Software wizards who pondered about systems and architecture as they had a $10 espresso macchiato.

skybrian 3 days ago
Enterprise software tends to be particularly bad because it's being sold to managers who won't use it themselves. Consumer software tends to be more user-friendly (or it won't sell), but popular software isn't always what you want.

When writing software for yourself, there is a bias towards implementing just the features you want and never mind the rest. Sometimes the result can be pretty sloppy, but it works.

However, code health is a choice. You just need to know what to ask for. A coding agent can be used as a power washer to tidy up a project. This won't result in great art, but like raking leaves or cleaning your steps or plowing a driveway, it can be satisfying.

Just as you wouldn't use a power washer to clean a painting, maybe there's some code that's too delicate to use a coding agent on? But for a project that has good tests and isn't that delicate, which I believe includes most web apps, nobody's going to want to pay for you to do it by hand anymore. It would be like paying someone to clear the snow in a parking lot with a shovel rather than hiring someone with a plow.

SchemaLoad 3 days ago
Enterprise software is also particularly bad because many of the customers get to demand that things work the way they want. That leads to a million weird functions, toggles, and configuration options, because some manager in charge of making a big purchase demanded that first it must do X; many of these features are left without even a single user after the original requester leaves. With consumer software, the individual consumers just get what they are given, and a single product manager/team decides what's best.
pixl97 3 days ago
>many of the customers get to demand that things work the way they want

This, so much. When some group paying you millions says they want a feature or they will look at competitors, all kinds of crap ends up in the software.

cess11 3 days ago
Right, and then you also have public sector software, and transnational software generally, where the provider actually needs to cater to a plethora of rules and regulations.
ozim 3 days ago
> Enterprise software tends to be particularly bad because it's being sold to managers who won't use it themselves.

Don’t forget that managers have different goals than rank-and-file employees.

For the SaaS I work on, we get requirements like required fields for a process, because a manager needs correct data and insight into the business process.

After we deliver the software, we get support tickets from the employees using the system, complaining that “it takes too much time to fill in all this data” and that we should “fix our shitty system”.

They don’t care, and they don’t have full knowledge of why the data is required, which is the fault of managers who aren’t training their people and explaining “why”.

Oh, and of course they have to copy-paste shit over and over, because their company won’t budget for us integrating with their CRM, and we won’t invest in something that benefits only a single customer who might not renew the license next year. But they also don’t want to make a commitment, like a 5-year contract, where we could do some investment. Of course there are some that invest in connecting the CRM, but it is mostly the exception rather than the rule.

enra 3 days ago
Consumer software can be good, but it's often also optimized for max engagement, not for the actual value or functionality.

Enterprise software can be good when there isn't an incentive mismatch: a good solution is more valuable for the customers, it will sell better, and they're willing to pay for it.

But like you say, a lot of enterprise software is bad because it's optimized for the payer, not the user, and it's often shoehorned into the weird workflows of a particular enterprise.

slotrans 3 days ago
The AI code takeover will not free engineers up to do craftsmanship. It will annihilate the last vestiges of craftsmanship forever.
0xbadcafebee 3 days ago
New technology does not eliminate old technology or craftsmanship. It just shifts who uses it and what for.

- Power tools didn't annihilate the craftsmanship of hand-tool woodworking. Fine woodworkers are still around and making money using hand tools, as well as hobbyists. But contractors universally switched to power tools because they help them make more money with less labor/cost/time.

- A friend of mine still knits on a loom because she likes using a loom. Some people knit by hand because they like that better. Neither of them stopped just because of large automated looms.

- Blacksmiths still exist and make amazing metal crafts. That doesn't mean there isn't a huge market for machine cast or forged metal parts.

In the future there'll just be the "IDE people" and the "Agent Prompt people", both plugging away at whatever they do.

hackyhacky 3 days ago
You give examples where crafts based on pre-industrial technology still exist. You're right, but you're proving the GP's point.

200 years ago, being a blacksmith was a viable career path. Now it's not. The use of hand tools, hand knitting, and hand forging is limited to niche, exotic, or hobbyist areas. The same could be said of making clothes by hand or developing film photographs. Coding will be relegated to the same purgatory: not completely forgotten, but considered an obsolete eccentricity. Effectively all software will be made by AI. Students will not study coding, the knowledge of our generation will be lost.

0xbadcafebee 3 days ago
I know people who make their living doing those niche things. So what if they're niche? Enterprise Software Architect is niche. Aerospace Engineer is niche. Hell, finding somebody under the age of 40 who can write Assembly is niche.

Everything gets worse over time. Even before AI, I was constantly complaining about how technology is enshittifying. I'm sure my parents complained about things getting worse, and their parents too. Yet here we are, the peak achievement of living beings on this planet, making do. I think we will be OK without typing in by hand a thing that didn't even exist 70 years ago.

hackyhacky 2 days ago
> So what if they're niche? Enterprise Software Architect is niche.

It's a question of supply and demand in the labor market. Right now, we are paid well and afforded respect because demand for our service is higher than the supply. When anyone can use AI to do our job, the supply will exceed the demand.

There are blacksmiths still working today. Their work is niche. And although blacksmithing today requires no less skill than it did 200 years ago, there is significantly less demand, and very few can make a living at it.

jappgar 3 days ago
I doubt hobbyists would describe their hobby as purgatory.

I doubt the laborer would describe their toil as "craft".

hackyhacky 2 days ago
> I doubt hobbyists would describe their hobby as purgatory.

Programmers have become accustomed to a lot of cultural and financial respect for their work. That's about to disappear. How do you think radio actors felt when they were displaced by movies? Or silent film actors when they were displaced by talkies?

> I doubt the laborer would describe their toil as "craft".

Intellectual labor is labor. I'm a laborer in programming and I definitely consider it a craft. I think a lot of people here at HN do.

hxugufjfjf 2 days ago
And they were and are of course right to feel those feelings, but it doesn't change the fact that the world is changing. Rarely do large changes benefit everyone in the world.
hackyhacky 2 days ago
> And they were and are of course right to feel those feelings, but it doesn't change the fact that the world is changing. Rarely do large changes benefit everyone in the world.

I'm not sure who you are arguing against. No one here said that the world isn't changing. But it seems to me that the people who are disadvantaged by AI, which is potentially everyone who doesn't own a data center, should take efforts to ensure their continued survival, instead of merely becoming serfs to the ruling oligarchs.

tripledry 3 days ago
I don't think that's a good comparison, though. We shouldn't compare AI/software to handcrafting one item; we should compare it to handcrafting the machine that crafts the items.

If I knit a hat, I can sell it once, but if I make a game, I can run or sell it repeatedly.

However, I still agree with the outcome: if AI becomes even better and is economically viable, the number of people handcrafting software will drop drastically.

Ronsenshi 3 days ago
> Effectively all software will be made by AI. Students will not study coding, the knowledge of our generation will be lost.

Given the echo chamber of HN when it comes to AI, that certainly seems inevitable. The question is: who would work on novel things or on further AI model improvements if the knowledge of writing software by hand disappears?

hackyhacky 3 days ago
I can answer your question in two ways:

1. AI will work on AI.

2. People will work on AI, but as a rare niche, not a mainstream area of software development.

pjmlp 3 days ago
A select few, just like some mechatronics engineers get to develop new factory robots, and a few lucky ones stay around to do the manual tasks they can still perform, or to press the big red button when something goes wrong.
trollbridge 3 days ago
The examples given are using tools to do well-defined, repeatable processes. So far, despite many attempts by upper management to make software the same way, it hasn't happened, and AI doesn't appear to be any different.

I don't see a huge difference between people writing in a high-level language and people writing complex prompts.

pjmlp 3 days ago
As someone coding since 1986, I certainly see it on the time to get something done.

AI agents aren't like coding in a home-made Common Lisp macro DSL; it's me doing in one hour something that could have taken a couple of days, even if I have to fix some slop along the way.

Thus I can already see the trend that started with MACH architecture and SaaS products going even further, decreasing the team sizes required for project delivery.

Projects I used to be part of with a 10-person team are starting to be sized at 5 or fewer.

Cthulhu_ 3 days ago
That's a very doomer statement, built on the false premise that craftsmanship is already down to its last vestiges. It's not that bad.
ontouchstart 3 days ago
I am in the middle of reading a fascinating book about the early days of computing: Turing’s Cathedral by George Dyson. It really opened my eyes to American engineering craft post-WWII.

We seem to take everything for granted now and forget what real engineering is like.

This review is itself 13 years old:

https://www.theguardian.com/books/2012/mar/25/turings-cathed...

FarmerPotato 3 days ago
+1 for the mention of Forth. I use it often. LLM answers are possible now, but they read like translated C. It’s very bad style.

The standard: Forth words should be a few lines of code with a straightforward stack effect. Top level of a program might be 5 words.

An LLM will generate some subroutines and one big word of 20-50 lines of nested IF..THEN..ELSE and DO..WHILE, just as if it were writing C.

basilgohar 3 days ago
I feel like one of the things that's not said enough, and which I think is inflating the perceived effectiveness of AI in the eyes of actual software engineers, is that, for the most part, most code produced IS lousy. The craft of programming has been watered down so much in favor of results, and so much code is disposable or write-once-use-once, that quality just became less relevant.

I remember when I first started out programming 20 years ago, there was time to craft good quality code. Then there were more and more pushes to get more code out faster, and no one really cared about the quality. Bugs became part of the cost of doing business. I think GenAI for code fits well in the more recent paradigm, and comparing it with hand-crafted code of yore is a bit disingenuous, as appealing as it may be, because most code hasn't been that good for a long time.

I am sad to admit it, but AI is just fitting in where poor coding practices have already existed, and flourishing in that local maximum.

pjmlp 3 days ago
Indeed this is offshoring taken to the next level.

Business doesn't care about the craft, they care that the use case is solved, even if the code is crap under the hood.

ChrisMarshallNY 3 days ago
I will say that offshoring has developed a bad name, maybe not unreasonably. There are some terrible outsourcing shops.

But there’s also a few that are really good. I know people that used Romanian and Polish shops that did great work. They weren’t super-cheap, but still cheaper than American contract developers.

I assume that the bad offshoring orgs are pretty nervous about AI. I also assume that the better ones are learning to incorporate AI as a force multiplier.

Interesting times, ahead…

pjmlp 3 days ago
They certainly are, https://www.theregister.com/2026/01/19/hcl_infosys_tcs_wipro...

We usually call that nearshore over here in Europe, exactly to differentiate the expectations of what gets delivered.

sodapopcan 3 days ago
As someone who resists agents, I agree with all of this 100%, especially that it's not said enough, because this is exactly what I've railed against most of my career. There is a HUGE "LOL all code sucks" sentiment across the majority of our industry and I friggin' hate it.
ako 3 days ago
I think we should embrace AI to craft better software. You have a lot of control over the code generated by AI, so all your designs, patterns, best practices can be used in the generated code. This will make us better software craftsmen.

A nice example is guitar building: there's a whole bunch of luthiers who stick to traditional methods to build guitars, or even limit themselves to Japanese woodworking tools.

But that is not the only way to build great guitars. It can be done by excellent luthiers building high-quality guitars with state-of-the-art tools, for example Ulrich Teuffel, who uses all sorts of high tech like CAD systems and CNC machines to craft beautiful guitars: https://www.youtube.com/watch?v=GLZOxwmcFVo

Unfortunately, craftsmanship does not come cheap, so most customers will turn to industrially created products. Same for software.

Cthulhu_ 3 days ago
But your comparison is a bit off; you mention CNC machines and the like to build guitars, but those are tools that are still exactly programmed by humans. LLMs, on the other hand, are probabilistic: you prompt "write me a set of gcode instructions for a CNC to build a guitar body" and wait / hope.

Sure, LLMs as a tool probably have a place in software development, but the danger lies in high volume, low oversight.

But there are people using it at large scale to build large applications; time will tell how they work out in the end. Software engineering is programming over time, and the "over time" for LLM-based software engineering hasn't been long enough yet.

ako 3 days ago
You have a lot of control over what the LLM creates: the way you phrase your requirements, the guidance you give it over architecture, testing, UX, and libraries to use. You can build your own set of skills to outline how you want the LLM to automate your software process. There's a lot of craftsmanship in making the LLM do exactly what you think it needs to do. You are not a victim at the mercy of your LLM.

You are a lead architect, a product manager, a lead UXer. You don't have 100% control over what your LLM devs are doing, but more than you think. Just like normal managers don't micromanage every action of their team.

NilMostChill 3 days ago
> You have a lot of control over what the LLM creates.

No, you don't, you have "influence" or "suggestion".

You can absolutely narrow down the probability ranges of what is produced , but there is no guarantee that it will stick to your guidelines.

So far, at least, it's just not how they work.

> You don't have 100% control over what your LLM devs are doing, but more than you think. Just like normal managers don't micromanage every action of their team.

This overlooks the role of actual reasoning/interpretation that is found when dealing with actual people.

While it might seem like directing an LLM is similar in practice to managing a team of people, the underlying mechanisms are not the same.

If you analyse based on comparisons between those two approaches, without understanding the fundamental differences in what's happening beneath the surface, then any conclusions drawn will be flawed.

---

I'm not against LLM's, i'm against using them poorly and presenting them as something they are not.

ako 3 days ago
I think I have enough control, probably more than when working with developers. Here's something I recently had Claude Code build: https://github.com/ako/backing-tracks

If you check the commit log, you'll see small increments. The architecture document is what I have it generate to validate the created architecture: https://github.com/ako/backing-tracks/blob/main/docs/ARCHITE...

Other than that, most changes start with the AI generating a proposal document that I will review and improve, and then have it built. I think this was the starting proposal: https://github.com/ako/backing-tracks/blob/main/docs/DSL_PRO...

This started as a conversation in Claude Desktop, which then summarized it into this proposal. This I copied into Claude Code to have it implemented.

NilMostChill 2 days ago
> I think i have enough control.

This is probably just a disagreement about the term "control", so we can agree to disagree on that one, I suppose.

The rest of the reply doesn't really relate to any of the points I mentioned.

That it's possible to successfully use the tool to achieve your goals wasn't in dispute.

I'll try to narrow it down:

---

> You are not a victim at the mercy of your LLM.

Yes, you absolutely are, it's how they work.

As I said, you can suggest guidelines and directions, but it's not guaranteed they'll be adhered to.

To be clear, this applies to people as well.

---

Directing an LLM (or LLM based orchestration system) is not the same as directing a team of people.

The "interface" is similar in that you provide instructions and guidelines and receive an attempt at the wanted outcome.

However, the underlying mechanisms of how they work are so different that the analogy you were trying to use doesn't make sense.

---

Again, LLM's can be useful tools, but presenting them as something they aren't only serves to muddy the waters of understanding how best to use them.

---

As an aside, IMO, the sketchy-salesman approach of over-promising on features and obscuring the limitations will do great harm to the adoption of LLMs in the medium to long term.

The misrepresentation of terminology is also contributing to this.

The term AI is intentionally being used to attribute a level of reasoning and problem-solving capability beyond what actually exists in these systems.

ako 2 days ago
Looks like we just have different expectations: I don't want to micromanage my coding agents any more than I micromanage the developers I work with as a product manager. If the output does what it is supposed to do, and the software is maintainable and extendable by following certain best practices, I'm happy. And I expect that goes for most business people.

And in practice I have more control with a coding agent than with developers, as I can iterate over ideas quickly: "build this idea", "no, change this", "remove this and replace it with this". Within an hour you can iterate an idea into something that works well. With developers this would have taken days if not more. And they would've complained that I need to better prepare my requirements.

NilMostChill 2 days ago
TL;DR:

If it's working for you, great, but presenting it like it's a general direct replacement for development teams is disingenuous.

---

> Looks like we just have different expectations: I don't want to micromanage my coding agents any more than I micromanage the developers I work with as a product manager. If the output does what it is supposed to do, and the software is maintainable and extendable by following certain best practices, I'm happy. And I expect that goes for most business people.

None of what I said implied any expectations about the process of using the tools, but if you've found something that works for you, that's good.

On the subject of maintainability and extension, that is usually bound to the level of complexity of the project and the increase in requirements is not generally linear.

I agree, many business people would love what you've described, very few are getting it.

> And in practice i have more control with a coding agent than with developers as i can iterate over ideas quickly: "build this idea", "no change this", "remove this and replace it with this". Within an hour you can quickly iterate an idea into something that works well. With developers this would have taken days if not more. And they would've complained i need to better prepare my requirements.

Up to a point, yes.

If your application of this methodology works well enough before you hit the limitations of the tooling, that's great.

There is, however, a threshold of complexity where this starts to break down. This threshold can be mitigated somewhat with experience and a better understanding of how to utilise the tooling, but it still exists (currently).

Once you reach this threshold the approaches you are talking about start to work less effectively and even actively hinder progress.

There are techniques and approaches to software development that can further push this threshold out, but then you're getting into the territory of having to know enough to be able to instruct the LLM to use these approaches.

miningape 3 days ago
> You have a lot of control over what the LLM creates. The way you phrase your requirements, give it guidance over architecture, testing, ux, libraries to use. You can build your own set of skills to outline how you want the LLM to automate your software process

Except for the other 50% of the time where it goes off the rails and does what you explicitly asked it not to do.

leekrasnow 3 days ago
I am a craftsman of fine puzzles made from wood and CNC machined metal. I use LLMs in lots of ways to help on individual parts of bigger puzzle design projects, for example to create custom puzzle solver software which can search through large sets of possible notching patterns on wooden sticks in order to find ones that meet some criteria or are optimized in whatever manner I find aesthetically pleasing.

I’ve been writing various single-purpose software tools of these sorts for decades. I would not want to go back to hand-writing them now that I can have agents (cursor, claude code, etc) lay down the algorithmic architecture that I vibe at them, now that I know how to “speak that language” and reliably get the software outcomes that I seek.
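A brute-force sketch of the kind of notching-pattern search described above. The details are hypothetical: six notch positions per stick and a "no two adjacent cuts" rule stand in for whatever criteria a real solver would use.

```python
from itertools import product

# Hypothetical sketch: each stick has 6 notch positions, each either
# cut (1) or uncut (0). Enumerate every pattern and keep the ones that
# satisfy an example aesthetic criterion: exactly three notches, with
# no two adjacent cuts.
POSITIONS = 6

def candidate_patterns():
    for bits in product((0, 1), repeat=POSITIONS):
        if sum(bits) != 3:
            continue  # want exactly three notches
        if any(a and b for a, b in zip(bits, bits[1:])):
            continue  # reject patterns with adjacent cuts
        yield bits

patterns = list(candidate_patterns())
print(len(patterns))  # 4 of the 64 possible patterns survive
```

At this scale exhaustive enumeration is trivial; with more positions or multiple interlocking sticks the filter predicates are where the real design work lives.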

I find this similar to how I would not want to spend all day turning the crank handles on a manual milling machine when I can have a CNC mill do it, now that I know how to use various CAM systems well and have the proper equipment.

Given that my overall craft is not limited to just writing code or turning crank handles, I readily embrace any improvements of my workshop “technology stack” so that I can produce higher quality artwork.

fauigerzigerk 3 days ago
I agree. The article's logic is incoherent. It conflates the choice of tools with the decision what product to make and what level of quality to aim for.

If AI can be used to make bad (or good enough) software more cheaply, I have no problem with that. I'm sure we will get a huge amount of bad software. Fine.

But what matters is whether we get more great software as well. I think AI makes that more likely rather than less likely.

Less time will be spent on churning out basic features, integrations and bug fixes. Putting more effort into higher quality or niche features will become economically viable.

3vidence 3 days ago
I wonder if that's only really true for "pre-LLM" engineers though. If all you know is prompting, maybe there's no higher quality that can really be achieved with more focus.

It might just all meld into a mediocre soup of features.

To be clear, I'm not against AI-assisted coding; I think it can work pretty well. I'm just thinking about the implications for future engineers.

fauigerzigerk 2 days ago
>If all you know is prompting, maybe there's no higher quality that can really be achieved with more focus.

That's true of any particular individual but not for a company that can decide to hire someone who can do more than prompting.

>It might just all meld into a mediocre soup of features

I don't think the relative economics have changed. Mediocre makes sense for a lot of software categories because not everyone competes on software quality.

But in areas where software quality makes a difference, it will continue to make a difference. It's not a question of tools.

sacha1bu 3 days ago
This really nails the core issue: AI thrives in environments where software is treated as “good enough” optimization rather than craft. It’s not replacing great engineers so much as exposing how much of modern software has already become rote, metric-driven work. The Arts & Crafts parallel feels especially apt: as mass-produced code gets cheaper, human judgment and taste become the real scarce resources.
reedlaw 2 days ago
Great article, but it doesn't address the fundamental issue: defining quality. Other than some objective metrics like code coverage, there is little agreement about what constitutes good code. The closest thing to a consensus might be the rules encoded in linters/formatters. Each RuboCop or ESLint rule had to go through code review and public scrutiny to be included and maintained. Most often the rules are customized per project/team. Of course this runs into the same problem the article mentions: narrowness of vision. It seems the only way to achieve a high-minded ideal is the BDFL model of software development.
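For a concrete sense of that per-project customization, a team's `.rubocop.yml` overrides individual cops. The cop names below are real RuboCop rules, but the thresholds are hypothetical, exactly the kind of values a team argues over in review:

```yaml
# Hypothetical per-project overrides; each deviation from the defaults
# encodes a team decision about what "good code" means here.
Metrics/MethodLength:
  Max: 15                        # team-chosen limit on method size
Style/StringLiterals:
  EnforcedStyle: double_quotes   # consistency choice, not correctness
Layout/LineLength:
  Max: 100
```

Every line of a file like this is a small, negotiated definition of quality; none of it generalizes beyond the team that wrote it.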
iberator 3 days ago
AI takes jobs faster than creating new ones.

That's the problem.

mohsen1 3 days ago
I don't get why the author assumes AI-assisted coding can never produce elegant software. I am learning new things from AI in almost every interaction.
alexwennerberg 2 days ago
AI can produce elegant software but not on its own. It requires a human with taste to direct and guide it.
wormik 3 days ago
Stop arguing about whether AI ruins craft. Fix the incentive structures so that increased productivity buys humans time, not disposability. Then see what kind of craft emerges.
uejfiweun 3 days ago
Unfortunately it seems the incentive structures are ultimately set by global competition and the security dilemma. So short of total world disarmament and UBI, fundamentally the incentive will always be to work harder than your opponent to preserve your own security.
swordsith 2 days ago
"But attempts to generalize their capabilities have largely failed and produced code that is novel and impressive only in its monstrosity[link gastown]" was very funny, thanks
Herring 3 days ago
This argument is basically just the 1800s Luddite vs Industrialist argument recast for a new age. Group A thinks quality is about human agency, and that machines are being used to bypass the apprenticeship system and produce inferior goods. Group B thinks efficiency is the highest priority, and craft is just vanity. Of course as we know we went a third way, and human roles just shifted.

I think one promising shift direction is humans do NOT like to talk to bots, especially not for anything important. It's biological. We evolved to learn from and interact with other humans, preferably the same group over a long time, so we really get to understand/mirror/like/support each other.

convolvatron 3 days ago
I don't think it's the same at all. When weaving was displaced, yes, some people were pissed about losing their livelihood, but the quality of the cloth didn't diminish.

when CNC came for machining, no one really bitched, because the computers were just removing the time consuming effort of moving screws by hand.

when computers write code, or screenplays, the quality right now is objectively much worse. that might change, but claims that we're at the point where computers can meaningfully displace that work are pretty weak.

sure that might change.

gbear605 3 days ago
Cloth absolutely has gotten worse over the last two hundred years since industrialization. It's also orders of magnitude cheaper, making it worth it, and certainly new types of cloth are available that weren't before, but we're not better off in every possible way.
WarmWash 3 days ago
>but we're not better off in every possible way

I'd argue that we are, because cloth of higher quality than anything that has ever existed before is available today, it's just really expensive. But high quality cloth was also expensive back then.

I think you are making the error of comparing cheap clothes of today with expensive clothes of the past, rather than cheap clothes with cheap clothes and expensive with expensive. People of the past might have had higher quality clothes on average, but its because those clothes were expensive despite being the cheapest available. Trust me, if Shein was around in 1780, everyone would be wearing that garbage.

mh2266 3 days ago
Is there any type of clothing that existed in the 1800s that you could actually not buy or have custom made today?

On the other hand, you could not buy a Gore-Tex Pro shell or an ultralight down jacket for any price in 1800.

gbear605 3 days ago
Dhaka muslin most famously isn't producible today due to a lack of knowledge. More broadly a lot of weaving techniques have been lost since they don’t make sense to do with machinery.

I don’t disagree that there are a lot of gains, including on net, just that it hasn’t been a pareto improvement with no losses at all.

trollbridge 3 days ago
We're definitely worse off now that fabric is mostly synthetic, flooding the environment with microplastics and lasting a much shorter time. Of course, that's good for the fashion industry since it means they can sell more often.
odo1242 3 days ago
The result being worse generally doesn't stop humans from being displaced. Clothes made today are notably worse than the handmade ones.
bfung 3 days ago
Are clothes today really worse?

We have clothes and materials like Gore-Tex now that block rain and snow; no handmade jacket could ever hope to perform at the same level, being both lightweight AND dry.

anhner 3 days ago
> We have clothes and materials like Gore-Tex now that block rain and snow; no handmade jacket could ever hope to perform at the same level, being both lightweight AND dry.

At a massive cost to environmental, animal, and human health.

fwip 3 days ago
The available quality of cloth did, in fact, diminish.
Terr_ 3 days ago
Hold up: why it changed matters to the parent poster's argument. Consider the difference between:

1. "The technology's capability was inferior to what humans were creating, therefore the quality of the output dropped."

2. "The costs of employing humans created a floor to the price/quality you could offer and still make a profit. Without the human labor, a lower-quality product became possible to offer."

The first is a question of engineering, the second is a question of economic choice and market-fit.

fwip 3 days ago
Some of both.

The fabric and clothes were worse, and cheaper. This put many traditional workers out of business, making actually good clothes scarcer, and eventually, more expensive than they previously were.

Terr_ 3 days ago
I think the poster's "LLMs are not like textile machines" point hinges on whether a step down in quality is required due to engineering issues or not, at least for an equivalent product. (E.g. bulk cloth, rather than fine embroidery.)
fwip 3 days ago
I'm talking about equivalent products. The cloth made by machine during the Industrial Revolution was meaningfully worse in quality than the hand-made stuff.
ekianjo 3 days ago
Not really. Polymers in clothes are everywhere and they have very desirable properties compared to pure cotton. Untreated cotton had many problems.
trollbridge 3 days ago
Materials other than cotton (like wool and leather) existed.
ekianjo 3 days ago
yup, but polymers are much, much cheaper to produce. And some have properties that no natural fabric can offer.
anhner 3 days ago
> properties that no natural fabric can offer

like polluting every inch of the Earth with microplastics!

lkey 3 days ago
sighs pulling out this quote again:

"Luddites were not opposed to the use of machines per se (many were skilled operators in the textile industry); they attacked manufacturers who were trying to >>circumvent standard labor practices<< of the time."

Luckily, the brave government's troops, show trials, and making "machine breaking" (i.e. industrial sabotage) a capital crime solved the crisis of these awful, entitled workers' demands once and for all and across all time.

I'm sure that any uppity workers in our present age can also be taught the appropriate lessons.

rapind 3 days ago
I wonder if the workers of the time were as responsible for the propaganda as we are... It seems like the ultimate heist when capital can get labour to propagate their own messaging.
Herring 3 days ago
Ordinarily yes I’d love to overthrow the bourgeoisie (check my history, I live in flagged threads), but this time I think this thread is really just about the evolution of the profession.
lkey 3 days ago
To be clear, I'm an accidental member of the haute bourgeoisie, and under normal circumstances, I cannot be harmed directly by this, or any, new evolution of labor relations.

I was mostly annoyed because Luddites were an early labor movement and their demands were, by modern standards, normal, but they are continually invoked like they believed and demanded things they did not.

I do agree the profession, if we dare diminish coding to exactly one thing, is changing, but I believe the direction of that evolution, unchecked, will exacerbate the ongoing attenuation of the power of labor in the US, even as workers individually become more 'productive'.

Quoting you from elsewhere: "Reminder that the most reliable way to prevent the rise of the far right is to implement robust safety nets and low inequality, to reduce status anxiety and grievance."

We agree here, and likely elsewhere, so keep fighting the good fight on this orange hell site (and also outside if you're able).

Ronsenshi 3 days ago
This evolution comes at a cost: if one senior can suddenly do their own work plus the work of 5 juniors, why would the company keep those juniors? It won't; the moment the C-suite realizes they don't need extra people, they will be gone. But at some point senior engineers will retire or find new, better-paying jobs, and the company will need a replacement. In the past that replacement could come from one of the juniors who had worked at the company for a while and been mentored by the senior. Not so much when there are no more juniors, thanks to AI.

Not so much evolution here, I'm afraid. Just a plain redistribution of wealth upwards thanks to new tools that made a large chunk of the workforce obsolete. How this will affect the industry in 15 years, nobody seems to think about.

billy99k 3 days ago
The same could be said about the mass adoption of open source.

Why hire an experienced coder to create project X, when you can just use an open source project and hire cheaper and less experienced coders to make updates? I've been part of many of these conversations with business leaders and management over the years.

Developers have been giving away their work and devaluing their profession for decades, and it has basically turned into digital factory work.

It's why I stopped writing software professionally almost a decade ago.

AI is using all of this open source to train and will eventually put you out of a job.

hackyhacky 3 days ago
The Luddites indeed lost their jobs to machines, but they could find other jobs, and their children adapted to the changed world.

Dario Amodei, CEO of Anthropic, thinks that this disruption will be different from that one. From his article The Adolescence of Technology, currently on HN's front page:

> AI will be capable of a very wide range of human cognitive abilities—perhaps all of them. This is very different from previous technologies like mechanized farming, transportation, or even computers. This will make it harder for people to switch easily from jobs that are displaced to similar jobs that they would be a good fit for. For example, the general intellectual abilities required for entry-level jobs in, say, finance, consulting, and law are fairly similar, even if the specific knowledge is quite different. A technology that disrupted only one of these three would allow employees to switch to the two other close substitutes (or for undergraduates to switch majors). But disrupting all three at once (along with many other similar jobs) may be harder for people to adapt to. Furthermore, it’s not just that most existing jobs will be disrupted. That part has happened before—recall that farming was a huge percentage of employment. But farmers could switch to the relatively similar work of operating factory machines, even though that work hadn’t been common before. By contrast, AI is increasingly matching the general cognitive profile of humans, which means it will also be good at the new jobs that would ordinarily be created in response to the old ones being automated. Another way to say it is that AI isn’t a substitute for specific human jobs but rather a general labor substitute for humans.

pydry 3 days ago
It has nothing to do with luddites.

Software quality is about speed of delivery and lack of bugs.

If you're fine with software which gets a little bit harder to work on every time you make a change and which might blow up in unexpected ways, AI is totally fine.

I've yet to meet many AI champions who are explicit about their desire to make that trade-off though. Even the ones who downplay software quality aren't super happy about the bugs.

bfung 3 days ago
> If you're fine with software which gets a little bit harder to work on every time you make a change and which might blow up in unexpected ways,

Human-written code is similarly fine, save for very few human individuals.

trevwilson 3 days ago
> If you're fine with software which gets a little bit harder to work on every time you make a change and which might blow up in unexpected ways, AI is totally fine.

While the speed and scale at which these happen is definitely important (and I agree that AI code can pose a problem on that front), this applies to every human-written piece of software I've ever worked on too.

adithyassekhar 3 days ago
> I think one promising shift direction is humans do NOT like to talk to bots, especially not for anything important. It's biological.

Let me tell you why I like shopping from amazon instead of going to a super market...

But also the older I get I keep wanting to visit the store in person. It's not to see the other human, I just want to hold the thing I want to buy and need it immediately instead of waiting. I feel like there isn't enough time anymore.

SchemaLoad 3 days ago
You talk to bots on Amazon? If say Hacker News was entirely just bots, why would you bother commenting, why would you bother reading the comments?
adithyassekhar 3 days ago
This is exactly why I come here. I just woke up and didn't feel like dealing with people today and made that comment. Now you got me thinking, I do like to be social. Just not the kind where the other person is actively trying to sell me something often by being dishonest.

I remember a sales person in a store actively trying to shame me for purchasing from a brand he didn't prefer. I gave him a lot of chances to get off of me, respectfully and firmly. Some really are like blood sucking leeches, they don't come off. He was probably paid to do that.

pjmlp 3 days ago
It helps with getting acceptance of talking to bots when using voice instead of typing book-sized prompts into tiny chat windows.
bitwize 3 days ago
Maybe kids will end up preferring to talk to bots, much like the generations after my own actually preferred digital compression artifacts in their music.
Herring 3 days ago
Can it get me a job if I get laid off (networking)? Can I crash on its couch for a while? It might displace tv/netflix, which yes is a huge market, but I don't think much more than that.
ekianjo 3 days ago
> It's biological.

Nonsense. We never evolved to send text messages and yet here we are with social networks, chat systems and emails used everywhere for everything.

kumarvvr 3 days ago
On a side track, I wish to express my fears regarding AI

Unfortunately for the general populace, most technological improvements in information technology over the past 5 decades have led to loss of political control and lessened their leverage for political change.

With AI, this change is going to be accelerated a hundredfold.

With current AI slop and, more importantly, AI-based content almost indistinguishable from reality, the populace is slowly learning to reject what they see and what they hear from mass media.

AI has muddied the pool so much that every fish, us, cannot see the whole pool. This will let political figures and bad actors, much more easily, almost with no effort at all, create isolation among people.

No event will create a mass uprising, because no event can be believed by a common mass. It will be easy to generate an alternative reality using the same AI.

Now, the political class and the billionaire class, are free to act with impunity, because the last check on their power, the power of mass media to form public opinion, to inspire the masses to demand change or accountability, has eroded to the point of no return. (They have already captured the institutions of public power)

I fear for the future of humanity.

Edit: There are already troubling signs from the billionaire class regarding this. There is a narrative to "ensure guardrails" for AI, sort of giving the populace the idea that once that is done, AI is acceptable. This is like saying, "better put a sheath on the knife, so that no one can cut with it, but use it as a prop in a movie."

They are creating this narrative that AI is inevitable.

They are fear-mongering that AI is going to take jobs, which it will, but it also goads the capable ones to get onto the bandwagon and advance AI further.

chickensong 3 days ago
Mass media has been going to shit for a long time anyway. People have always chosen to believe tons of insane things.

AI will cause plenty of problems of course, because it puts a powerful tool in the hands of those who shouldn't use it, but also, many people shouldn't drive cars or use the internet.

AI is inevitable, just as calculators and computers were. I suspect that no humans will write code by hand in a not so distant future. The machines will be far too effective.

Personally, I love the acceleration we're seeing since the coming of computers. The internet is a big black mirror, scary, beautiful, and ugly, just like us. AI feels similar to me.

Will some things get worse because of AI? Probably. But maybe it'll also help to save us from ourselves. If nothing else, it will probably force some investment into long overdue security, identity, and trust issues.

northfield27 2 days ago
Agreed. AI-generated code is "mid" by nature. You won’t feel amazed seeing AI-generated code because its "reasoning" competes with that of a potato.

I have recently started exploring AI coding -- note that I said AI coding and not vibe-coding, because that is for the brain-dead.

By AI coding, I mean I know the inputs, outputs, structures the code should have and the necessary context to write the code. Then, articulating the requirements in English as best as I can and feeding it to agents.

Needless to say, the code is pathetic; it chooses to implement meaningless abstractions even after I explicitly provide the design and plan to follow.

I don’t understand how we, as a collective species, agreed to believe the criminally wrong lies of tech CEOs: instead of implementing a “reliable system by hand,” we choose to convey our ideas and instructions in an “ambiguous,” “inconsistent,” and “context-dependent” language (English), which is then passed through a probabilistic system to generate the reliable system.

swordsith 2 days ago
I can't say my experience is at parity with this. What model were you using?
northfield27 2 days ago
For AI coding, I use Copilot with whatever default model the free plan gives.
baroudi 3 days ago
> But there are serious limits. [Your coding agent] will lie to you, they don't really understand things, and they often generate bad code.

As for lies and bad code, those didn't appear with AI. Humans lied and produced bad code before AI.

How does the author empirically know AI does not understand? And if it does not understand right now, is a machine fundamentally unable to understand? Is understanding an exclusively human ability? Is it because machines lack a soul? It sounds quite dualistic (Descartes' view that mind and body are fundamentally different).

Don't get me wrong, I think right now AI is less good at understanding humans than other humans (or even dogs) in many contexts, because it has no access to non-verbal signals. But in the context of building software, it is good enough, and I don't see why a machine should not be able to understand humans.

jfyi 3 days ago
I have had an interesting experience just recently.

I hired back on at a company I used to work at and found they had contracted work to another former employee who was handed the code from a rest api I had written and a web app that used it. The task was to write an android app that interacted with the api.

He ran it through an agentic coding assistant and got out api scaffolding and basic UI.

Looking it over, I couldn't shake the feeling I was looking at my code, just ported to kotlin. I was seeing my idiosyncrasies everywhere. It was kind of surreal.

I was familiar with the work of the dev who did it, and it was nothing like his prior work, but it's been years since I've seen him, so who knows.

I couldn't help but admit it was a good foundation to start building on.

I told the pm they were likely overpaying significantly for an agentic coding assistant and only getting access to it for a few hours a month. This same organization recoiled in terror when I pointed out the cost of a claude code subscription once.

sigi64 3 days ago
AI wrote better code than most of my colleagues.

Especially with my rules:

- Prefer simple, boring solution

- Before adding complexity to work around a constraint, ask if the constraint needs to exist.

- Remember: The best code is often the code you don't write.
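Rules like these typically live in a project-level instructions file that the agent reads on every run; the filename varies by tool (Claude Code, for example, reads CLAUDE.md from the project root). A hypothetical sketch of such a file:

```markdown
# Project rules for the coding agent

- Prefer the simple, boring solution over the clever one.
- Before adding complexity to work around a constraint, ask whether
  the constraint needs to exist at all.
- The best code is often the code you don't write: propose deleting
  or reusing existing code before adding new code.
```

The value is less the wording than the repetition: the agent re-reads the file in every session, so the bias toward simplicity is applied consistently instead of depending on each prompt.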

jasonm23 2 days ago
Many advances have their slop version; look at any trend which began as broadly positive and became infested with a tacky parade of bandwagon pretenders.

AI is just the next example, and the internet is particularly troublesome at filtering up the most tacky.

Always remember the broadly positive aspects remain, they're simply obscured by dusty clouds from the bandwagon.

Use tools to expand your capability.

pplonski86 3 days ago
What if AI starts to have a sense of craft? We're just missing the verifier and critique models that will tell other models what looks good.
jumploops 3 days ago
> People have said that software engineering at large tech companies resembles "plumbing"

> AI code [..] may also free up a space for engineers seeking to restore a genuine sense of craft and creative expression

This resonates with me, as someone who joined the industry circa 2013, and discovered that most of the big tech jobs were essentially glorified plumbers.

In the 2000s, the web felt more fun, more unique, more unhinged. Websites were simple, and Flash was rampant, but it felt like the ratio of creators to consumers was higher than now.

With Claude Code/Codex, I've built a bunch of things that usually would die at a domain name purchase or init commit. Now I actually have the bandwidth to ship them!

This ease of dev also means we'll see an explosion in slopware, which we're already starting to see with App Store submissions up 60% over the last year[0].

My hope is that, with the increase of slop, we'll also see an increase in craft. Even if the proportion drops, the scale should make up for it.

We sit in prefab homes, cherishing the cathedrals of yesteryear, often forgetting that we've built skyscrapers the ancient architects could never dream of.

More software is good. Computers finally work the way we always expected them to!

[0]https://www.a16z.news/p/charts-of-the-week-the-almighty-cons...

danpalmer 3 days ago
> joined the industry circa 2013, and discovered that most of the big tech jobs were essentially glorified plumbers

Most tech jobs are glorified plumbers. I've worked in big tech and in small startups, and most of the code everywhere is unglamorous, boring, just needs to be written.

Satisfaction with the job also depends on what you want out of it. I know people who love building big data pipelines, and people who love building fancy UIs. Those two groups would find the other's job incredibly tedious.

mattgreenrocks 3 days ago
The right job for a person depends on whether they can rise above the specific flavor of pain that the job dishes out. BigTech jobs strike me as having an inextricable political element to them: do you enjoy jockeying for titles and navigating constant reorgs?

The pay is nice but I find myself…remarkably unenvious as I get older.

danpalmer 3 days ago
Big companies are political and re-orgs lead to layoffs. Startups are a constant battle for funding and go out of business. Small companies mean a lot of exposure to bad management and budget issues. Charities are highly regulated and audited environments. Government jobs have no perks and entrenched middle management.

Every type of work has its idiosyncrasies, which people will either get on with or not. Mentioning one without the others is a bit disingenuous, or it's whatever the opposite of the grass-is-greener bias is.

strken 3 days ago
Plumbing has certification and industry best practices, and its leaks generally affect a few blocks at most rather than spraying across the entire internet.
trollbridge 3 days ago
Er... we sit in prefab homes? Trailers are generally considered to be the worst possible quality of home construction and actually lose value instead of the normal appreciation real estate has.
kxbnb 3 days ago
The framing of craft vs. slop misses something important: most production software quality problems aren't about aesthetics or elegance, they're about correctness under real-world conditions.

I've been using AI coding tools heavily for the past year. They're genuinely useful for the "plumbing" - glue code, boilerplate, test scaffolding. But where they consistently fail is reasoning about system-level concerns: authorization boundaries, failure modes, state consistency across services.

The article mentions AI works best on "well-defined prompts for already often-solved problems." This is accurate. The challenge is that in production, the hard problems are rarely well-defined - they emerge from the interaction between your code and reality: rate limits you didn't anticipate, edge cases in user behavior, security assumptions that don't hold.

Craft isn't about writing beautiful code. It's about having developed judgment for which corners you can't cut - something that comes from having been burned by the consequences.

rescbr 3 days ago
> Craft isn't about writing beautiful code. It's about having developed judgment for which corners you can't cut - something that comes from having been burned by the consequences.

That's why I'm of the opinion that for senior developers/architects, these coding agents are awesome tools.

For a junior developer? Unless they are of the curious type and develop the systems-level understanding on their own... I'd say there's a big chance the machine is going to replace their job.

j33dd 3 days ago
Most people using LLMs don't have this craft... which raises the question: should they be using LLMs in the first place? Nope. But given that it's rammed down their throats by folks internally and externally, they will.
rednafi 3 days ago
It’s easy to forget that any artifact - painting, music, text, or software - that appeals to a large number of people is, by definition, an average on the spectrum of quality.

Popular music tends to be generic. Popular content is mostly brainrot these days. Popular software is often a bloated mess because most users’ lives don’t revolve around software. They use software to get something done and move on.

I never understood the appeal of “craft” in software. Early computer pioneers were extremely limited by the tech of their time, so the software they hacked together felt artsy and crafty. Modern software feels industrial because it is industrial - it’s built in software factories.

Industrial software engineers don’t get paid to do art. There are research groups that do moonshot experiments, and you can be part of that if it’s your thing. But lamenting the lack of craft in industrial software is kind of pointless. Imagine if we’d stopped at crafty, handmade auto engines and never mass-produced them at scale. We don’t lament “crafty engines” anymore. If you want that, go buy a supercar.

Point is: AI is just another tool in the toolbox. It’s like Bash, except calling it that won’t pull in billions of dollars in investment. So “visionaries” call it ghost in the machine, singularity, overlord, and whatnot. It produces mediocre work and saves time writing proletariat software that powers the world. Crafty code doesn’t pay the bills.

But I’m not saying we shouldn’t seek out fun in computing. We absolutely should. It’s just that criticizing AI for not being able to produce art is an old thing. The goalpost keeps shifting, and these tools keep crushing it.

I don’t use AI to produce craft, because I don’t really do craft in software - I have other hobbies for that. But I absolutely, proudly use it to generate mediocre code that touches millions of people’s lives in some way.