AI threatens to harm a lot about programming, but not the existence/necessity of programmers.
Particularly, AI may starve the development of open source libraries. Which, ironically, will probably increase the need for employed programmers as companies accrue giant piles of shoddy in-house code that needs maintaining.
I can’t wait for my future coworkers who will be coding with AI without actually understanding the fundamentals of the language they’re coding in. It’s gonna get scary.
I guarantee you have coworkers right now coding without understanding the fundamentals of the language they’re coding in. Reusing code you don’t understand is the same whether you stole it from Stack Overflow or from Chat-GPT9.
The code on SO is rarely specific to my use case, IMHO. Any code I’ve gotten from there has had to be reworked to fit what I’m doing. Plus I can’t post some stuff on SO for legal reasons, but I can on an internal ChatGPT portal.
Trust me, it’s gonna get a lot worse.
Matter of fact, I look forward to the security breaches of developers posting company code into ChatGPT for help lol. We already had that issue with idiots posting company code into the public GitHub.
The amount of code I’ve seen copy-pasted from StackOverflow to do things like “group an array by key XYZ”, “dispatch requests in parallel with limit”, etc. when the dev should’ve known there were libs to help with these common tasks makes me think those devs will just use Copilot instead of SO, and do it way more often.
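For context, the kind of snippet that keeps getting copy-pasted is usually just a few lines like this (a minimal sketch; lodash ships essentially the same thing as `_.groupBy`, which is the point — the library already existed):

```javascript
// Hand-rolled "group an array by key" — the snippet devs keep
// re-pasting per project instead of reaching for a library.
function groupBy(items, keyFn) {
  const groups = {};
  for (const item of items) {
    const key = keyFn(item);
    // Create the bucket on first sight of this key, then append.
    (groups[key] ??= []).push(item);
  }
  return groups;
}

const pets = [
  { name: "Rex", kind: "dog" },
  { name: "Tom", kind: "cat" },
  { name: "Fido", kind: "dog" },
];

const byKind = groupBy(pets, (p) => p.kind);
// byKind.dog holds Rex and Fido; byKind.cat holds Tom
```

The snippet is trivially correct in isolation, which is exactly why an AI assistant will happily regenerate it inline forever instead of pointing you at the dependency.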
I think that undersells most of the compelling open source libraries though. The one line or one function open source libraries could be starved, I guess. But entire frameworks are open source. We’re not at the point yet where AI can develop software on that scale.
I agree wholeheartedly, and I think I failed to drive my point all the way home because I was typing on my phone.
I’m not worried that libs like left-pad will disappear. My comment that many devs will copy-paste stuff for “group by key” instead of bringing in e.g. lodash was meant to illustrate that devs often fail to find FOSS implementations even when the problem has an unambiguously correct solution with no transitive dependencies.
Frameworks are, of course, the higher-value part of FOSS. But they also require some buy-in, so it’s hard to knock devs for not using them when they could’ve, because sometimes there are completely valid reasons for going without.
But here’s the connection: Frameworks are made of many individual features, but they have some unifying abstractions that are shared across these features. If you treat every problem the way you treat “group by key”, and just copy-paste the SO answer for “How do I cache the result of a GET?” over and over again, you may end up with a decent approximation of those individual features, but you’ll lack any unifying abstraction.
Doing that manually, you’ll quickly find it to be so painful that you can’t help but find a framework to help you (assuming it’s not too late to stop painting yourself into a corner). With AI helping you do this? You could probably get much, much farther in your hideous hoard of ad-hoc solutions without feeling the pain that makes you seek out a framework.
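To make the “no unifying abstraction” point concrete, here is a sketch (the endpoint and function names are hypothetical) of the copy-paste pattern repeated per feature, next to the single abstraction a framework would hand you instead:

```javascript
// The copy-paste pattern: every feature grows its own one-off cache.
let usersCache = null;
async function getUsers(fetchUsers) {
  if (usersCache === null) usersCache = await fetchUsers();
  return usersCache;
}

let ordersCache = null;
async function getOrders(fetchOrders) {
  if (ordersCache === null) ordersCache = await fetchOrders();
  return ordersCache;
}

// The unifying abstraction: one memoizer covers every such feature.
function cached(fn) {
  let value = null;
  // Run fn once, then serve the stored result on every later call.
  return async () => (value ??= await fn());
}
```

With `cached`, `getUsers` and `getOrders` collapse into `cached(fetchUsers)` and `cached(fetchOrders)`; a framework’s caching layer is this idea plus the invalidation, TTLs, and keying that the pasted one-offs never grow.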
I think there will be (and there already has been) significant downsizing over the next few years as businesses leverage AI so the same work can be done by fewer people, paid less.
But the job can’t go away completely yet. It needs supervision by someone who can spot the bullshit it often spits out and correct it.
But, if I’m honest, software development seems to be the target when I think designers and technical writers should be equally scared. Well, that is if businesses work out that AI isn’t just ChatGPT. A GPT or other LLM could be trained on a company’s specific designs and documentation, and then yes, designers and technical writers could be scaled right back too.
Developers are the target because that’s what they see ChatGPT doing.
In real terms, a lot of back-office jobs and skilled writing and development jobs are on the line here.
The work can’t be done by someone paid less. The work can be done by highly skilled, experienced developers with fewer junior resources. The real death comes 60 years later when there are no more developers because there is no viable path to becoming a senior.
Technical writers you may be correct about, because translating text is one of the primary use cases for AI.
Well that’ll be junior developers, until they get hauled over the coals for producing highly repetitive code rather than refactoring out common themes.
Who do they think will be using the AI?
Imagine programming a computer without understanding the machine code that tells the CPU what to do.
You don’t have to wait, they’re doing it now.
I feel attacked.
j/k. I’m happy in the education sector. The code I write won’t be seen by anybody but me.
Why do you think AI will starve open source?
Bad devs will continue being bad devs, shocker
Here’s the thing: pay for work isn’t based on skill alone. It’s driven by scarcity within a given demographic (skill makes up just part of that).
If the overall number of people in software development worldwide is cut, then scarcity at every level will fall, and I reckon pay will fall with it.
I think our pay will start to diminish.
My pessimistic take is that everyone in society will get recast as the “human feedback” component of whichever flavor of ML takes over their domain.
8 hours a day of doing your domain’s equivalent of captchas.
That’s a worst case. I think, at the moment at least, GPT-type AI isn’t good enough to be anything more than a tool.
But yeah with some improvements we’ll end up being quality control for automated systems.
Ah, but the AI won’t know to haul them over the coals. Utopia achieved! /s