deleted by creator
Yeah hows that goin’?
Extremely misleading title. He didn’t say programmers would be a thing of the past, he said they’ll be doing higher level design and not writing code.
Even so, he’s wrong. This is the kind of stupid thing someone without any first hand experience programming would say.
Yeah, there are people who can “in general” imagine how this will happen, but programming is exactly 99% not about “in general” but about specific “dumb” conflicts in the objective reality.
People think that what they generally imagine as the task is the most important part, and since they don’t actually do programming or anything requiring to deal with those small details, they just plainly ignore them, because those conversations and opinions exist in subjective bendable reality.
But objective reality doesn’t bend. Their general ideas without every little bloody detail simply won’t work.
Not really, it’s doable with chatgpt right now for programs that have a relatively small scope. If you set very clear requirements and decompose the problem well it can generate fairly high quality solutions.
This is incorrect. And I’m in the industry. In this specific field. Nobody in my industry, in my field, at my level, seriously considers this effective enough to replace their day-to-day coding beyond generating some boilerplate ELT/ETL-type scripts, which it is semi-effective at. It still contains multiple errors 9 times out of 10.
I cannot be more clear. The people claiming this is possible are not tenured or effective coders, much less 10x devs in any capacity.
People who think it generates code of high enough quality to be effective are hobbyists: people who dabble with coding and understand some rudimentary coding patterns/practices, but are not career devs, or not serious career devs.
If you don’t know what you’re doing, LLMs can get you close, some of the time. But there’s no way it generates anything close to code of high enough quality for me to use without the effort of rewriting, simplifying, and verifying it.
Why would I want to voluntarily spend my day trying to decipher someone else’s code? I don’t need ChatGPT to solve a coding problem. I can do it, and I will. My code will always be more readable to me than someone else’s. This is true by orders of magnitude for AI code gen today.
So I don’t consider anyone who considers LLM code gen a viable path forward to be a serious person in the engineering field.
It’s just a tool like any other. An experienced developer knows that you can’t apply every tool to every situation. Just like you should know the difference between threads and coroutines and know when to apply them. Or know which design pattern is relevant to a given situation. It’s a tool, and a useful one if you know how to use it.
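To make the threads-vs-coroutines point concrete, here’s a toy Python sketch of the distinction that comment is gesturing at (the function names are illustrative, not from anything in this thread):

```python
import asyncio
import threading
import time

results = []

def blocking_io():
    # Thread: preemptively scheduled by the OS, so it's the right tool
    # for wrapping blocking calls you can't rewrite.
    time.sleep(0.01)
    results.append("thread")

async def cooperative_io():
    # Coroutine: cooperatively scheduled, yields control at each await,
    # so thousands can share a single thread for concurrent I/O.
    await asyncio.sleep(0.01)
    results.append("coroutine")

t = threading.Thread(target=blocking_io)
t.start()
t.join()  # wait for the thread before moving on

asyncio.run(cooperative_io())
```

Same surface effect, very different machinery, and picking the wrong one for the situation is exactly the kind of tool misuse the comment is describing.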
This is like using a tambourine made of optical discs as a storage solution. A bit better, because punctured discs are no good anyway.
A full description of what a program does is the program itself; have you heard that? (Except for UB, libraries, etc., but an LLM is no better than a human at those either.)
right now, not a chance. it’s okay-ish at simple scripts. it’s alright as an assistant for getting a buggy draft of anything even vaguely complex.
ai doing any actual programming is a long ways off.
That guy has never seen AI code before. It regularly gets even simple stuff wrong. Where it’s especially bad is when it gives you made-up crap. Or it tells you about a method or function you can use but doesn’t tell you where it got it. And then you’re like “oh wow, I didn’t realize that was available,” then you try it and realize it’s not part of the standard library, and you ask it “where did you get that?” and it’s like “oh yeah, sorry about that, I don’t know.”
My absolute favorite is when I asked copilot to code a UI button and it just pasted “// the UI element should do (…) but instead it is doing (…)” a dozen times.
Like, clearly someone on Stack Overflow asked for help, got used as training data, and confused Copilot.
The job of CEO seems far easier to replace with AI. A fairly basic algorithm with weighted goals and parameters (chosen by the board), plus an LLM and a character avatar, would probably perform better than most CEOs. Leave out the LLM if you want it to spout nonsense like this Amazon Cloud CEO.
Cheaper too I bet.
Also lol for the AI coder 😁 good luck with that 😂
It’s a good thing. After all, I don’t care when Amazon goes down.
Until an AI can get clear, reasonable requirements out of a client/stakeholder our jobs are safe.
So never right?
If the assumption is that a PM holds all the keys…
The sentiment on AI over the span of 10 years went from “it’s inevitable, it will replace your job” to “nope, not gonna happen.” The difference is that back then, the jobs it was going to replace were not tech jobs. Just saying.
From the very beginning people were absolutely making connections between ai and tech jobs like programming.
The fuck are you talking about? Are you seriously trying to imply that now that it’s threatening tech jobs (it’s not), the narrative around how useful it will be has suddenly changed (it hasn’t)?
When exactly do you have in mind? I’m talking about automation: around 2010, the discourse was primarily centered on blue-collar jobs. The discussion was about those careers becoming obsolete if AI ever advanced to the point where the tasks involved little to no human labor.
Back then, AI with regard to white-collar jobs was nowhere near the primary focus of the discourse, much less programming.
Tech nerds back then were all gung-ho about it making entire careers obsolete in the near future. Truck driving was supposed to be a dead career by now. They absolutely do not hold the same enthusiasm now that the same thing is being said about their own careers.
You’re way off the mark. Save your outrage.
I wish.
They could churn out garbage and scams for the idiots on Facebook, sure.
It’s worth noting that the new CEO is one of the few people at Amazon to have worked their way up from PM and sales to CEO.
With that in mind, while it’s a hilariously stupid comment to make, he’s in the business of selling AWS and its role in AI. Take it with the same level of credibility as that crypto scammer you know telling you that Bitcoin is the future of banking.
Speaking as a wage slave with no bitcoin or crypto: the technology has been hijacked by these types when it could otherwise have been useful.
I’m not entirely sold on the technology, especially since immutable ledgers have been around long before the blockchain, but also due to potential attack vectors and the natural push towards centralisation for many applications. But I’m just one man, and if people find uses for it, then good for them.
I guess an additional bonus for crypto would be not burning the planet, and actually having real value, not an imagined one.
PM and sales, eh?
So you’re saying his lack of respect for programmers isn’t new, but has spanned his whole career?
Wasn’t it the rabbit 1 scammer who said programmers would be gone in 5 years, like 3 years ago?
Spoken like someone who manages programmers instead of working as one.
It’s not like jobs will disappear in a single day. Incremental improvements will render lower-level tasks obsolete; to a degree, they already have.
Someone will still need to translate the business objectives into logical structure, via code, language, or whatever medium. Whether you call that a “coder” or not is kind of irrelevant. The nerdy introverts will need to translate sales-douche into computer one way or another. Sales-douches are not going to be building enterprise apps from their techbro hype-speak.
That is what happens when you mix a fucking CEO with tech: “How many workers can I fire to make more money and boast about my achievements at the annual conference of mega-yacht owners?” Whereas the correct question should obviously have always been (unless you are a psychopath): “How can I use this tech to boost the productivity of my workers so they can produce the same amount of work in less time and have more personal time for themselves?”
Also, these idiots always forget the “problem solving” part of most programming tasks, which is still beyond the capability of LLMs. Sure, have LLMs do the mundane stuff so that programmers can spend time on work that is more rewarding? No, instead let’s try to fire everyone.