• Che Banana@lemmy.ml · 31 points · 9 months ago

      Yes, but your bosses don’t know/understand that, why pay you when they can have 3 interns & AI for freeeeeeeeeeee???

        • Ephera@lemmy.ml · 29 points · 9 months ago

          Our team lead recently sent out two fresh juniors to tackle a task, with no senior informed. And of course, they were supposed to build it in Python, even though they had no experience with it, because Python is just so easy. Apparently, those juniors had managed to build something that was working …on one machine, at some point.

          On the day when our team lead wanted to show it to the customer, the two juniors were out of house (luckily for them) and no one knew where a distribution of that working state was. The code in the repo wouldn’t compile and seemed to be missing some commits.

          So, a senior got pulled in to try to salvage it, but the juniors hadn’t set up proper dependency management, unit tests, logging, distribution bundling, nor documentation. And the code was spaghetti, too. Honestly, could have just started over fresh.

          Our team lead was fuming, but they’ve been made to understand that this was not the fault of the juniors. So, yeah, I do think on that day, they found some new appreciation for seniors.

          Heck, even I found new appreciation for what we do. All of that stuff is just the baseline from where we start a project and you easily forget that it’s there, until it’s not.

    • space@lemmy.dbzer0.com · 29 points · 9 months ago

      Writing the actual code is the easy part. Thinking about what to write and how to organize it so it doesn’t become spaghetti is the hard part and what being a good developer is all about.

      • AggressivelyPassive@feddit.de · 8 points · 9 months ago

        Question is: how many developers are actually good? Or better, how many produce good results? I wouldn’t call myself a great programmer, just okayish, but I certainly pushed code I knew was absolute garbage, simply because of external pressure (deadlines, legacy crap, maybe just a bad day,…).

      • explodicle@local106.com · 3 points · 9 months ago

        I’m more of a mechanical engineer than a coder, and for me it’s been super helpful for writing the code. The rest of our repo is clear enough that even I can understand what it actually does just by reading it. What I’m unfamiliar with is the syntax, and which nifty things our libraries can do.

        So if you kinda understand programs but barely know the language, then it’s awesome. The actual good programmers at my company prefer a minimal working example to fix over a written feature request. Then they replace my crap with something more elegant.

      • fuzzzerd@programming.dev · 5 points · 9 months ago

        That sounds awful. Imagine going back and forth requesting changes until it gets it right. It’d be like chatting with OpenAI, only it’s trying to merge that crap into your repo.

    • jwt@programming.dev · 4 points · 9 months ago

      It would probably mean the amount of coding work that companies want done would multiply 10 fold as well. I’m sure the content of the work developers do will change somewhat over time (analogous to what happened during the industrial revolution), but I doubt they’re all out of a job in the near future.

      • AggressivelyPassive@feddit.de · 2 points · 9 months ago

        Where I’m really not sure is: what percentage of the software written today actually needs human work?

        I mean, think about all the basic form rendering, input masks, CRUD apps. There’s definitely a ton of work in them and they’re widely used, but I’m pretty sure that a relatively basic AI-assisted framework could recreate most of these apps with hardly any actual coding. Sure, it won’t be super efficient or elegant, but let’s be honest: nobody cares about that if they’re good enough.

        Just look at Wix, Wordpress, Squarespace etc. Website builders basically imploded the “low effort” web design market. Who would pay hundreds for a website made by a human, if you can just click together something reasonably good looking in 2h?

        • MajorHavoc@programming.dev · 4 points · 9 months ago

          There’s definitely a ton of work in them and they’re widely used, but I’m pretty sure that a relatively basic AI-assisted framework could recreate most of these apps with hardly any actual coding

          Any shop that’s not incompetent switched to using frameworks for that stuff 10-20 years ago, so there’s hopefully very little work left there for the AI.

          Even at a company where it’s a massive amount, that company “benefitting” from AI really just deferred its “use a framework” savings by 20 years.

          • AggressivelyPassive@feddit.de · 3 points · 9 months ago

            Frameworks still require work, and tons of it: defining all the form fields, adding basic validation, writing all the CRUD code, tests, etc.
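            As a minimal illustration (the field names and validation rules are invented for the example), even a two-field form means hand-writing this kind of definition and validation, framework or not:

```python
from dataclasses import dataclass

@dataclass
class SignupForm:
    # Every field, type, and rule below is hand-written per app,
    # even when a framework renders and persists it for you.
    email: str
    age: int

    def validate(self) -> list:
        """Return a list of human-readable validation errors."""
        errors = []
        if "@" not in self.email:
            errors.append("email: invalid address")
        if not (0 < self.age < 150):
            errors.append("age: out of range")
        return errors

# A bad submission surfaces both errors:
print(SignupForm("not-an-email", 200).validate())
```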

  • MajorHavoc@programming.dev · 161 points · 9 months ago

    We do this every 15 years. For anyone less than 15 years into their career, welcome to the party.

    Let’s see if I can save you some energy:

    • Yes, it made my job massively easier.
    • No, it didn’t replace me.
    • Yes, it allowed a bunch of new people to also do the job I do. Welcome newbies!
    • No, my salary didn’t go down, relative to inflation.

    It turns out that the last mile to a successful product delivery is still really fucking hard, and this magic bullet tool also didn’t solve that.

    Now… Am I talking about…?

    • AI?
    • Web frameworks?
    • English like programming language syntax?
    • A compiler with built-in type checking?
    • All of the above.

    Edit: Formatting for readability.

    • Donkter@lemmy.world · 28 points · 9 months ago

      I mean, honestly, for things like tech, the jobs are going away due to these innovations, just piecemeal. Each of these innovations has shaved hours off of projects. Someone’s salary might be the same, and they might still have to go into the office 40 hrs a week (or be just as productive working from home, go figure), but the actual work they’re doing is that much easier than it used to be; they might only have to work 4 hours a day now to accomplish what might have taken 2 days in the past.

      Sure, certain companies put more demand on employees than others, and as you mentioned there are still human components to the system that remain untouched by technology. But if the tech world were honest with itself, tech employees do far less work now than they did 10-20 years ago, disregarding the general expansion of the tech industry. I’m just talking about individual jobs.

      Of course I don’t think those employees should be making less. I think if we innovate so much that a person’s job disappears we should be able to recognize that that person still deserves to be clothed and fed as if they still had that job.

      • masterspace@lemmy.ca · 27 points · 9 months ago

        Yes, except for the fact that the flip side of those is that software, almost by definition, is automating away jobs in other industries.

        So when it gets easier / cheaper to write software, other industries will spend an increasing amount on it to replace their workers. That’s one of the reasons the software industry has continued to grow, even though it’s gotten easier to write.

        • Donkter@lemmy.world · 4 points · 9 months ago

          Sure, but also, almost by definition, using tech to replace workers in other industries will reduce the total number of workers needed for that job, since you presumably made the tech to make the job easier or faster. My post was talking about the tech industry just because that was the topic, but as you mention, tech definitely replaces jobs in all sectors.

          • MajorHavoc@programming.dev · 8 points · 9 months ago

            almost by definition, using tech to replace workers in other industries will reduce the total amount of workers needed for that job

            The data on this is actually uncertain. Installing ATMs to replace bank tellers should have been a slam dunk, but it didn’t really cut into total bank teller employment.

            https://www.aei.org/economics/what-atms-bank-tellers-rise-robots-and-jobs/

            Don’t get me wrong, the ATM was the first step in a long chain of improvements that still ought to soon make bank tellers obsolete, and the Department of Labor predicts 15% lower demand next year.

            But even this relatively one-for-one swap of machines for people has taken half a century, so far.

            • Donkter@lemmy.world · 5 points · 9 months ago

              That goes back to the point I was making earlier. A bank teller is still hired for the same wage and the same hours, but I can almost guarantee that, because of the ATM, they spend significantly less of their work day “working”, since the ATM was designed to do a significant portion of their job. There is certainly an excuse to keep them around all day (some unavoidable tasks that only a human can do come up at random times), but the ATM has replaced many of the working hours bank tellers used to have, even if the job didn’t go away.

      • MajorHavoc@programming.dev · 12 points · 9 months ago

        tech employees do far less work now than they did 10-20 years ago

        Agreed!

        Of course, if we had truly understood the situation 10-20 years ago, we could have admitted that they were primarily being paid to know how to get the thing* to work, and not actually for the hours they spent typing in new code. Hence the rise of “Infrastructure Engineer” and “DevOps Specialist” as titles.

        *I omitted the technical term, for brevity. But to be clear, by ‘thing’, I mean what professionals typically call the “damned fucking piece of shit webserver, and this fucking bullshit framework”.

    • Rodeo@lemmy.ca · 25 points · 9 months ago

      No, my salary didn’t go down, relative to inflation.

      I’m calling bullshit on that one.

      Everybody’s salary except executives’ has gone down relative to inflation, going all the way back to the 80s.

      • shastaxc@lemm.ee · 17 points · 9 months ago

        Not mine. Every year if I don’t get a “cost of living” increase that meets or exceeds inflation, I go complain about it to my boss who then negotiates with HR on my behalf and I get a bigger raise. I’m not gonna let inflation kill my salary, and my boss is not gonna risk me leaving for another company. I do wish they would just give it to me up front and stop making me ask each year. We all know what the outcome is gonna be.

          • shastaxc@lemm.ee · 3 points · 9 months ago

            I’m not saying that the average wage in the country has not fallen against inflation. Data indicates that it has. But what I’m saying is that in the tech industry, if you provide good value to your company and the managers have half a brain, you should be able to negotiate annual raises that AT LEAST match inflation. If your company won’t, consider moving to a new company.

            I know this is a privilege that most workers do not have, but this thread is about jobs in tech, where this is a more common case. It’s also one of the reasons why there aren’t more unions.

            • frezik@midwest.social · 5 points · 9 months ago

              I’m not saying that the average wage in the country has not fallen against inflation. Data indicates that it has.

              It actually hasn’t; the data has shifted since this talking point was created. There’s still other issues at work, though; the argument needs to be reframed around productivity.

              See: https://midwest.social/comment/6656948

            • rab@lemmy.ca · 2 points · 9 months ago

              This comment makes me want to move to the US. In Canada, what you said is so unrealistic.

              • CanadaPlus@futurology.today · 2 points · 9 months ago

                In every country but the US, really. Someday, big tech companies will realise that a person in any other Western country can code just as well for half the price, but for now they won’t even consider it cause 'Murica.

                • frezik@midwest.social · 3 points · 9 months ago

                  They do. They’re looking mostly to Eastern Europe. India is the classic place to look, but the quality of the tech education there is mixed (at best). I’ve worked with a lot of competent people from Romania.

      • frezik@midwest.social · 7 points · 9 months ago

        This got passed around as a common fact in the wake of the 2008 financial crisis. Wages from the early 70s through 2010 or so were flat (not negative, but flat) due to inflation. Things have shifted since then.

        https://fred.stlouisfed.org/series/LES1252881600Q

        Note that the graph shows median wage; it isn’t as affected by a few high earners as average wage would be. The 2010s were a period of relatively low inflation and wages had a chance to catch up a bit.

        What is true is that productivity has leaped massively since the 70s, but median wages have only crept up somewhat. The argument needs to shift to be around how the working class was screwed out of their share of productivity improvements. That’s not likely to change until we have more unions and overall something closer to Socialism.

      • MajorHavoc@programming.dev · 2 points · 9 months ago

        Two mitigating factors for me:

        1. For many years my skill set expanded faster than inflation ate away at my pay. I’ve been in a high-demand specialty (Cybersecurity) for a while.
        2. I’m now a manager, which does come with extra pay. Perhaps more importantly, it puts me in a position to throw my weight around to get my team and myself better raises.

    • Quadhammer@lemmy.world · 16 points · 9 months ago

      When AI is good enough to replace all of IT we all better hold onto our butts because we’re all going to fucking die

  • Badabinski@kbin.social · 113 points · 9 months ago

    lol, I’d love to see the fucking ruin of the world we’d live in if current LLMs replaced senior developers. Maybe it’ll happen some day, but in the meantime it’s job security! I get to fix all of the bugfuck crazy issues generated by my juniors using Copilot and ChatGPT.

    • SakuraCosmos@programming.dev · 31 points · 9 months ago

      One of my uni lecturers does the whole “You are out of a job” thing. He’s a smart guy, but he’s barely written a line of code in his life. This comes up frequently, and every time I ask him: “Get ChatGPT to write FizzBuzz in x86 ASM.” Without fail, it crashes when trying to build. This technology is very advanced, but I find people get it to do the simplest tasks and then expect it to solve the most complex ones.
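      For reference, the task itself is trivial in any high-level language; this Python sketch is the whole program. The failure mode he keeps demonstrating only shows up when you ask for the same logic in x86 assembly.

```python
def fizzbuzz(n: int) -> str:
    """Classic FizzBuzz for a single number."""
    if n % 15 == 0:
        return "FizzBuzz"
    if n % 3 == 0:
        return "Fizz"
    if n % 5 == 0:
        return "Buzz"
    return str(n)

# Print the first 15 values, covering every branch.
for i in range(1, 16):
    print(fizzbuzz(i))
```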

      • evranch@lemmy.ca · 19 points · 9 months ago

        I tried using AI tools to do some cleanup and refactoring of some legacy embedded C code and was curious if it could do any optimization or knew any clever algorithms.

        It’s pretty good at figuring out the function of the code and adding comments, it did some decent refactoring of some sections to make them more readable.

        It has no clue about how to work in a resource constrained environment or about the main concepts that separate embedded from everything else. Namely that it has to be able to run “forever”, operate in realtime on a constant flow of sensor data, and that nobody else is taking care of your memory management.

        It even explained to me that we could do input filtering by using big arrays to do simple averaging on a device with only 1kB RAM, or use a long long for a never-reset accumulator without worrying about what will happen because “it will be years before it overflows”.

        AI buddy, some of these units have run for decades without a power cycle. If lazy coders start dumping AI output into embedded systems the whole world is going to get a lot more glitchy.
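        For anyone curious, both mistakes are easy to sketch. This is an illustrative Python sketch, not the real firmware: the O(1)-memory exponential filter an embedded dev reaches for instead of a big averaging array, plus a quick check of how fast a never-reset accumulator actually wraps.

```python
def ema_update(state: float, sample: float, alpha: float = 0.1) -> float:
    """Exponential moving average: smooths a sensor stream in O(1) memory,
    instead of buffering hundreds of samples on a 1 kB device."""
    return state + alpha * (sample - state)

def overflow_horizon_years(bits: int, increments_per_second: float) -> float:
    """Years until a never-reset unsigned accumulator of the given width wraps."""
    return (2 ** bits) / increments_per_second / (3600 * 24 * 365)

# A 32-bit counter bumped 1000 times a second wraps in about 50 days,
# which matters a lot on units that run for decades without a power cycle.
print(overflow_horizon_years(32, 1000))
```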

        • wewbull@feddit.uk · 14 points · 9 months ago

          This is how AI is a threat to humanity. Not because it will choose to act against us, but because people will trust what it says without question and base huge decisions on faulty information.

          • evranch@lemmy.ca · 8 points · 9 months ago

            A million tiny decisions can be just as damaging. In my limited experience with several different local and cloud models you have to review basically all output as it can confidently introduce small errors. Often code will compile and run, but it has small errors that can cause output to drift, or the aforementioned long-run overflow type errors.

            Those are the errors that junior or lazy coders will never notice and walk away from, causing hard to diagnose failure down the road. And the code “looks fine” so reviewers would need to really go over it with a fine toothed comb, which only happens in critical industries.

            I will only use AI to write comments and documentation blocks and to get jumping off points for algorithms I don’t keep in my head. (“Write a function to sort this array”) It’s better than stack exchange for that IMO.

          • someacnt_@lemmy.world · 1 point · 9 months ago

            Maybe the real “AI nuking the world” scenario is one caused by faulty information the AI hallucinated into existence.

    • Ignotum@lemmy.world · 27 points · 9 months ago

      I was helping someone with their programming homework; every time Copilot suggested anything, he just blindly added it, and every time I had to ask him, “And why do you need those lines? What do they do?”, and he could never answer…

      Sometimes those lines made sense; other times they were completely irrelevant to the problem. But he just added the suggestions on reflex without even reading them.

      • MajorHavoc@programming.dev · 8 points · 9 months ago

        And when “web frameworks mean we don’t need web developers anymore”, and when “COBOL is basically plain English, so anyone can code, so we don’t need specialists anymore”.

      • wewbull@feddit.uk · 5 points · 9 months ago

        Millions did. It’s just that after a while the advantages stopped being convincing and the trend reversed. If the same thing happens here, expect to go jobless for a while until you’re needed again.

    • Clent@lemmy.world · 19 points · 9 months ago

      And those juniors don’t realize they’ve set themselves up to be forever-juniors since they aren’t learning how to do the basics themselves.

    • MagicShel@programming.dev · 16 points · 9 months ago

      I had to pull aside a developer to inform him that he “would be” violating our national security by pasting code online to an AI and that there were potentially repercussions far beyond his job.

      He’s a lot slower now, but the code is better.

  • gencha@lemm.ee · 94 points · 9 months ago

    1. People vastly overestimate the abilities of AI.
    2. Developers vastly overestimate their own abilities.
    3. There are people on any level of seniority that would be perfectly replaced by a noise generator.
    • JasonDJ@lemmy.zip · 14 points · 9 months ago

      There are people on any level of seniority that would be perfectly replaced by a noise generator.

      I’m fairly certain that this is what happened to my CISO.

    • taanegl@lemmy.world · 13 points · 9 months ago

      The company had avoided certain destruction after firing the previous CEO and putting a new one in its place. The new CEO had managed to bring a newfound calm to the company and its ranks, and brought an air of meditative discipline to board room meetings.

      Some said it was crazy, but making the LectoFan EVO the new CEO was the best decision the company board had ever made.

    • rab@lemmy.ca · 5 points · 9 months ago

      Overestimate now, but I think that AI is going to be insane within like 5 years, given current investment trends

      • Ethan@programming.dev · 4 points · 9 months ago

        I find it very hard to believe that AI will ever get to the point of being able to solve novel problems without a fundamental change to the nature of “AI”. LLMs are powerful, but ultimately they (and every other kind of “AI”) are advanced pattern matching systems. Pattern matching is not capable of solving problems that haven’t been solved before.

      • frezik@midwest.social · 3 points · 9 months ago

        If they stick around, maybe. If it falls into a rut, then investors are going to pull out and we’ll be in another AI winter.

      • gencha@lemm.ee · 2 points · 9 months ago

        Replace “AI” with “metaverse” or “Bitcoin”. Same bullshit

  • crossmr@kbin.social · 80 points · 9 months ago

    Being a programmer is a lot like being a tradesperson. A tradesperson has a lot of flexibility in what they can do. They can work for a company, work freelance, or start their own business.

    Programming gives you the same flexibility, the most important bit being that you can do it for yourself.

    AI is going to struggle with larger, complex tasks for a long time to come. While you can go to it and say “write me a script to convert a PNG to a JPG”, you can’t go to it and say “write me a suite of tools to support business X” or “make me a fun and creative game”. A good programmer isn’t going to be out of work for a long time.

    • xmunk@sh.itjust.works · 49 points · 9 months ago

      Most of the work software developers do is comprehending the problem, formulating a solution that addresses the problem, and doing it in a maintainable, performant, and security conscious manner.

      I think AI can write a killer isEven() method, but I think it’s shit at everything I listed… it’s extremely shit at being security conscious. Any dev can tell you that it’s easier to write code and confirm it’s following best security practices than it is to review someone else’s code and confirm the same… I think AI actively makes it harder to have confidence in security.

      The first real part of my job I think AI will help with is performance tuning. We’re not there yet but I think we’re not unimaginably far from being able to give an AI a working but slow function and have a computer spin up a million randomized test inputs and outputs… then start scrambling the algorithm in a plethora of ways and testing the performance while confirming that the test cases pass.

      Then again, you’ll need to confirm the algorithm is still secure - but I think the realm of performance is the first place we’d see a tool that I’d demand a license for.
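      A rough sketch of the first half of that idea (function names and case counts are invented): generate randomized inputs, confirm a candidate rewrite agrees with a trusted-but-slow reference, and only then bother timing it.

```python
import random
import timeit

def naive_sort(xs):
    """Trusted but slow reference: O(n^2) insertion sort."""
    out = []
    for x in xs:
        i = 0
        while i < len(out) and out[i] < x:
            i += 1
        out.insert(i, x)
    return out

def candidate_sort(xs):
    """The 'scrambled'/optimized rewrite we want to validate."""
    return sorted(xs)

def validate(candidate, reference, n_cases=1000):
    """Randomized equivalence check: the candidate must match the
    reference on every generated input before timing matters."""
    for _ in range(n_cases):
        xs = [random.randint(-1000, 1000) for _ in range(random.randint(0, 50))]
        if candidate(list(xs)) != reference(list(xs)):
            return False
    return True

assert validate(candidate_sort, naive_sort, n_cases=200)
# Only after behavior matches do we compare performance:
elapsed = timeit.timeit(lambda: candidate_sort(list(range(100, 0, -1))), number=200)
```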

      • ForgotAboutDre@lemmy.world · 4 points · 9 months ago

        All the AI does is match the request to solutions it was trained on.

        It’s just Stack Overflow in your IDE. It has a little more flexibility in answering and isn’t as corrupted by SEO results as googling the equivalent answer. It’s not informed and thinking.

        The optimisation problems you are talking about are the process used to make AI models in the first place. I think you want an AI to configure optimisation routines for you rather than build the test cases and variables yourself. Or you want some system that implements all the individual components better, but an AI that can optimise the entire thing isn’t coming soon. It would need to be trained on very similar software, in which case you should just use that better software.

    • someacnt_@lemmy.world · 6 points · 9 months ago

      Basically every pro-AI argument seems to go “it will achieve AGI”. So funny that lots of people buy that, forgetting how hard general intelligence is.

  • tatterdemalion@programming.dev · 85 points · 9 months ago

    It literally cannot come up with novel solutions because its goal is to regurgitate the most likely response to a question based on training data from the internet. Considering that the internet is often trash and getting trashier, I think LLMs will only get worse over time.

    • space@lemmy.dbzer0.com · 52 points · 9 months ago

      AI has poisoned the well it was fed from. The only solution to get a good AI moving forward is to train it using curated data. That is going to be a lot of work.

      On the other hand, this might be a business opportunity. Selling curated data to companies that want to make AIs.

      • tatterdemalion@programming.dev · 11 points · 9 months ago

        I could see large companies paying to train the LLM on their own IP even just to maintain some level of consistency, but it obviously wouldn’t be as valuable as hiring the talent that sets the bar and generates patent-worthy inventions.

        • MagicShel@programming.dev · 4 points · 9 months ago

          You can fine-tune a model on specific data today. OpenAI offers that right on their website, and big companies are already taking advantage. It doesn’t take a whole new LLM, and the cost is a pittance in comparison.
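          For context, that fine-tuning path takes training examples as JSONL, one chat per line, in roughly this shape (the contents here are invented):

```json
{"messages": [{"role": "system", "content": "You answer using our internal style guide."}, {"role": "user", "content": "How do I reset a build agent?"}, {"role": "assistant", "content": "Run the reset playbook, then re-register the agent."}]}
```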

    • cybersandwich@lemmy.world · 49 points · 9 months ago

      I said this a while ago but you know how we have “pre-atomic” steel? We are going to have pre-LLM data sets.

      • Obi@sopuli.xyz · 18 points · 9 months ago

        Low-background steel, also known as pre-war steel, is any steel produced prior to the detonation of the first nuclear bombs in the 1940s and 1950s. Typically sourced from ships (either as part of regular scrapping or shipwrecks) and other steel artifacts of this era, it is often used for modern particle detectors because more modern steel is contaminated with traces of nuclear fallout.[1][2]

        Very interesting, today I learned.

      • DudeDudenson@lemmings.world · 16 points · 9 months ago

        That’s the reason why ChatGPT 3.5 is still great for anything before its cutoff date: it’s not constantly being updated with new garbage.

    • ArrogantAnalyst@feddit.de · 28 points · 9 months ago

      Also the more the internet is swept with AI generated content, the more future datasets will be trained on old AI output rather than on new human input.

    • test113@lemmy.world · 6 points · 9 months ago
      Hi, I don’t want to say too much, but after being invited to some closed AI talks by one of the biggest chip machine manufacturers (if you know the name, you know they don’t mess around), I can tell you AI is, in certain regards, a very powerful tool that will shape some, if not all, industries by proxy. They described it as “the internet” in the sense that it will influence everybody’s life sooner or later, and you can either keep your finger on the pulse or get left behind. But they distinguished between the “AI” that’s floating around in the public sector and actual purpose-trained AI that’s not meant for public usage. Sidenote: they are also convinced the average user of an LLM is using it the “wrong” way. LLMs are only a starting point.

      Also, it’s concerning: I’m pretty sure the big boys have already taken over the AI market, so I do not trust that the benefits will go to all of us rather than only to a select group (of shareholders).

      • mob@sopuli.xyz · 33 points · 9 months ago

        Yeah you definitely went to a marketing thing and got marketed to

        • DudeDudenson@lemmings.world · 11 points · 9 months ago

          Like when they claim your smart thermostat is now “AI powered” despite the fact it’s the same exact product it was 2 years ago

        • test113@lemmy.world
          link
          fedilink
          English
          arrow-up
          1
          arrow-down
          3
          ·
          9 months ago

          Again, none of the people at this talk have anything to do with selling a product or pushing an agenda or whatever you think. There is no press, there is no marketing, there is no product; it was basically a meetup of private equity firms discussing the implementation and impact of purpose-trained AI in diverse fields, which affects the business structure of the big single-family office behemoths. Think of it as an industry summit for the private equity sector on the future of AI and how some (mainly big non-public SFOs) plan to implement it.

          Sometimes people just meet to discuss strategy; no one at these talks is interested in selling or buying anything. They are essentially top management and/or members of large single-family offices and other private equity firms. They are not public companies and have no interest in marketing to the public.

          It’s weird how you guys react; not everything is a conspiracy or a marketing thing. It’s pretty normal in private equity to have these closed talks about global phenomena and how to deal with them.

          These talks are mostly to keep the industry informed. I get that you don’t like it when the big SFOs hold a meeting to discuss their future plans on a certain topic, but it’s pretty normal for the elite to coordinate some investments. It’s essentially the offices of the big billionaire families putting their heads together on a topic that might influence their business structure. It’s in no way a marketing strategy; on the contrary, it would be viewed negatively in the public eye that big finance is already coordinating to implement AI into its strategy.

          But feelings don’t change facts. My point is: if the actual non-public big players are taking AI seriously, then so should you.

          • mob@sopuli.xyz
            link
            fedilink
            arrow-up
            5
            ·
            edit-2
            9 months ago

            It’s not a conspiracy… You are obviously not involved in actual ML/AI, but in another sector. You aren’t speaking in any technical terms.

            A lot of us are involved in the technical aspect and understand what is being said by management.

            • test113@lemmy.world
              link
              fedilink
              English
              arrow-up
              1
              arrow-down
              1
              ·
              9 months ago

              I never argued that I was in IT/Tech; I deal with investments and PE. I have nothing to do with IT or tech. My point is we, in the PE/FO sector, are going to invest in AI businesses in 24/25, not only in the “B2C market” but mainly in the B2B market and for internal applications. Whether you believe it or not, it’s gonna happen anyway.

      • wewbull@feddit.uk
        link
        fedilink
        English
        arrow-up
        23
        ·
        9 months ago

        So Nvidia (or Intel or AMD) told you that you need AI to stay competitive. Not only that, but you need a bespoke solution, not the toy version out on the net that everyone can get access to.

        Strangely enough, they have some wonderful products coming to market which would be just what you need to build a large training network capable of ingesting all your company data. They’d be happy to help you on this project.

        All they had to do to get you to drop your guard was invite you by name to a “closed talk”.

        • test113@lemmy.world
          link
          fedilink
          English
          arrow-up
          1
          arrow-down
          2
          ·
          9 months ago

          Haha, lol, what’s happening? Why do you hate me? I’m just sharing an experience, an opinion.

          • it’s not NVIDIA or AMD or any chip manufacturer, or someone who has a product to sell to you. Most of them are not even publicly traded but are organized in family office structures. They don’t care about the B2C market at all; they are essentially private equity firms. You guys interpret anything to fit your screwed-up vision of this world. They don’t even have a product to sell to you or me; it was a closed talk with top industry leaders and their managers where they discussed their view of AI and how they will implement purpose-trained AI into manufacturing, etc. It has nothing to do with selling to the public.

          I have already said too much - just let me tell you if you think LLMs are the pinnacle of AI, you are very mistaken, and depending on your position in the market, you need to take AI into account. You can only dismiss AI if you have a position/job with no real responsibility.

          So weird how you guys think everything is to sell you something or a conspiracy - this was a closed talk to discuss how the leaders in certain industries will adapt to the coming changes. They give zero cares about the B2C market, aka you as an individual.

          Again, none of the people at this talk have anything to do with selling a product or pushing an agenda or whatever you think. There is no press, there is no marketing - it was basically a meetup of private equity firms that discussed the implementation and impact of purpose-trained AI in diverse fields, which affects the business structure of the big single-family office behemoths.

      • Buttons@programming.dev
        link
        fedilink
        English
        arrow-up
        3
        ·
        9 months ago

        As long as AI isn’t outlawed or “regulated” in some stupid way, open-source AI models will stay competitive. People are interested in AI, and working on it is exciting and doesn’t require a lot of code or other bullshit; this is the type of thing the open-source community will work on.

  • kibiz0r@midwest.social
    link
    fedilink
    English
    arrow-up
    53
    arrow-down
    1
    ·
    9 months ago

    Who do they think will be using the AI?

    AI threatens to harm a lot about programming, but not the existence/necessity of programmers.

    Particularly, AI may starve the development of open source libraries. Which, ironically, will probably increase the need for employed programmers as companies accrue giant piles of shoddy in-house code that needs maintaining.

    • whoisearth@lemmy.ca
      link
      fedilink
      arrow-up
      36
      ·
      9 months ago

      I can’t wait for my future coworkers who will be coding with AI without actually understanding the fundamentals of the language they’re coding in. It’s gonna get scary.

      • Patches@sh.itjust.works
        link
        fedilink
        arrow-up
        24
        ·
        9 months ago

        I guarantee you have coworkers right now coding without understanding the fundamentals of the language they’re coding in. Reusing code you don’t understand is the same whether you stole it from Stack Overflow or from ChatGPT-9.

        • whoisearth@lemmy.ca
          link
          fedilink
          arrow-up
          11
          ·
          9 months ago

          The code on SO is rarely specific to your use case, IMHO. Any code I’ve gotten from there has had to be reworked to fit into what I’m doing. Plus, I can’t post some stuff on SO for legal reasons, but I can on an internal ChatGPT portal.

          Trust me, it’s gonna get a lot worse.

          Matter of fact, I look forward to the security breaches from developers pasting company code into ChatGPT for help, lol. We already had that issue with idiots posting company code to public GitHub.

      • Aa!@lemmy.world
        link
        fedilink
        arrow-up
        8
        arrow-down
        1
        ·
        9 months ago

        Imagine programming a computer without understanding the machine code that tells the CPU what to do

      • blindsight@beehaw.org
        link
        fedilink
        arrow-up
        3
        ·
        9 months ago

        I feel attacked.

        j/k. I’m happy in the education sector. The code I write won’t be seen by anybody but me.

      • kibiz0r@midwest.social
        link
        fedilink
        English
        arrow-up
        17
        ·
        9 months ago

        The amount of code I’ve seen copy-pasted from StackOverflow to do things like “group an array by key XYZ”, “dispatch requests in parallel with limit”, etc. when the dev should’ve known there were libs to help with these common tasks makes me think those devs will just use Copilot instead of SO, and do it way more often.
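
        For what it’s worth, the “group an array by key” snippet in question is usually a one-liner reduce like the sketch below (a hypothetical illustration; library helpers such as lodash’s `groupBy` cover the same ground):

        ```typescript
        // Hand-rolled "group by key" of the kind routinely copy-pasted from
        // Stack Overflow. Maintained libraries (e.g. lodash's groupBy)
        // already provide this, tested and edge-case-hardened.
        function groupBy<T>(items: T[], keyOf: (item: T) => string): Record<string, T[]> {
          return items.reduce<Record<string, T[]>>((groups, item) => {
            const key = keyOf(item);
            (groups[key] ??= []).push(item); // create the bucket on first sight
            return groups;
          }, {});
        }
        ```

        Nothing is wrong with the snippet itself; the point is that it keeps getting re-pasted instead of pulled from a maintained library.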

        • VoterFrog@kbin.social
          link
          fedilink
          arrow-up
          4
          ·
          9 months ago

          I think that undersells most of the compelling open source libraries though. The one line or one function open source libraries could be starved, I guess. But entire frameworks are open source. We’re not at the point yet where AI can develop software on that scale.

          • kibiz0r@midwest.social
            link
            fedilink
            English
            arrow-up
            2
            ·
            9 months ago

            I agree wholeheartedly, and I think I failed to drive my point all the way home because I was typing on my phone.

            I’m not worried that libs like left-pad will disappear. My comment that many devs will copy-paste stuff for “group by key” instead of bringing in e.g. lodash was meant to illustrate that devs often fail to find FOSS implementations even when the problem has an unambiguously correct solution with no transitive dependencies.

            Frameworks are, of course, the higher-value part of FOSS. But they also require some buy-in, so it’s hard to knock devs for not using them when they could’ve, because sometimes there are completely valid reasons for going without.

            But here’s the connection: Frameworks are made of many individual features, but they have some unifying abstractions that are shared across these features. If you treat every problem the way you treat “group by key”, and just copy-paste the SO answer for “How do I cache the result of a GET?” over and over again, you may end up with a decent approximation of those individual features, but you’ll lack any unifying abstraction.

            Doing that manually, you’ll quickly find it to be so painful that you can’t help but find a framework to help you (assuming it’s not too late to stop painting yourself into a corner). With AI helping you do this? You could probably get much, much farther in your hideous hoard of ad-hoc solutions without feeling the pain that makes you seek out a framework.
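
            To make that concrete, the copy-pasted “cache the result of a GET” answer tends to look like the sketch below (names hypothetical). Each paste works fine in isolation, but ten scattered copies leave you with no shared story for invalidation, TTLs, or request dedup, which is exactly the unifying abstraction a framework would give you.

            ```typescript
            // The ad-hoc pattern: memoize an async fetcher by key. Every module
            // that pastes this ends up with its own private cache and no shared
            // abstraction for invalidation or expiry.
            function memoizeAsync<T>(
              fn: (key: string) => Promise<T>
            ): (key: string) => Promise<T> {
              const cache = new Map<string, Promise<T>>();
              return (key) => {
                let pending = cache.get(key);
                if (!pending) {
                  pending = fn(key);       // first call: kick off the real request
                  cache.set(key, pending); // later calls reuse the same promise
                }
                return pending;
              };
            }
            ```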

    • r00ty@kbin.life
      link
      fedilink
      arrow-up
      7
      ·
      9 months ago

      I think there will be (and there already has been) significant downsizing over the next few years as businesses leverage AI so the same work can be done by fewer people, paid less.

      But the job can’t go away completely yet. It needs supervision by someone who can see the bullshit it often spits out and correct it.

      But, if I’m honest, software development seems to be the target when I think designers and technical writers should be equally scared. Well, that is if businesses work out that AI isn’t just ChatGPT. A GPT or other LLM could be trained on a company’s specific designs and documentation, and then yes, designers and technical writers could be scaled right back too.

      Developers are the target because that’s what they see chatgpt doing.

      In real terms a lot of the back office jobs and skilled writing and development jobs are on the line here.

      • MagicShel@programming.dev
        link
        fedilink
        arrow-up
        16
        ·
        edit-2
        9 months ago

        The work can’t be done by someone paid less. The work can be done by highly skilled, experienced developers with fewer junior resources. The real death comes 60 years later when there are no more developers because there is no viable path to becoming a senior.

        Technical writers you may be correct about, because translating text is one of the primary use cases for AI.

        • r00ty@kbin.life
          link
          fedilink
          arrow-up
          2
          ·
          9 months ago

          Here’s the thing. Pay for work isn’t based on skill alone. It’s scarcity of a given demographic (skill makes up just part of that).

          If the number of people in software development worldwide is cut, then scarcity at all levels will drop, and I reckon that will reduce pay.

          I think our pay will start to diminish.

      • kibiz0r@midwest.social
        link
        fedilink
        English
        arrow-up
        6
        ·
        edit-2
        9 months ago

        My pessimistic take is that everyone in society will get recast as the “human feedback” component of whichever flavor of ML takes over their domain.

        8 hours a day of doing your domain’s equivalent of captchas.

        • r00ty@kbin.life
          link
          fedilink
          arrow-up
          1
          ·
          9 months ago

          That’s a worst case. I think that, at the moment at least, GPT-type AI isn’t good enough to be more than a tool.

          But yeah, with some improvements we’ll end up being quality control for automated systems.

    • wewbull@feddit.uk
      link
      fedilink
      English
      arrow-up
      7
      ·
      9 months ago

      Who do they think will be using the AI?

      Well that’ll be junior developers, until they get hauled over the coals for producing highly repetitive code rather than refactoring out common themes.

  • AwkwardLookMonkeyPuppet@lemmy.world
    link
    fedilink
    arrow-up
    35
    arrow-down
    1
    ·
    9 months ago

    I have no other skills that would pay anywhere close to what this career pays. I’d need to go back to school and become a surgeon or something. I don’t think they let people become surgeons at 50 years old, and I don’t have the energy for an internship and residency. I’m just hanging on and hoping that it doesn’t all vanish in the next few years. I’m also spending time learning how to leverage AI, since I think that’ll put me a step ahead. Good luck to all of us, we’re going to need it!

  • NigelFrobisher@aussie.zone
    link
    fedilink
    arrow-up
    33
    arrow-down
    2
    ·
    9 months ago

    Any gains from LLMs now would barely offset the complexity bloat introduced in enterprise applications in the last decade alone. And that’s not even taking into account the sins of the past that are only hidden behind the topsoil layer of cargo cult architecture.

    • sheogorath@lemmy.world
      link
      fedilink
      arrow-up
      6
      ·
      9 months ago

      After the report that code written with Copilot’s assistance is actually shittier than code written manually, I’m feeling safe until the next breakthrough in AI development. Meanwhile, I’m saving up gold for the eventuality.

  • Jimmyeatsausage@lemmy.world
    link
    fedilink
    arrow-up
    27
    ·
    9 months ago

    I’ll start worrying about artificial intelligence when customers can generate requirements specific enough for actual intelligence to decipher.

    Kinda hard to build a prompt when they don’t even really know what they want until they’ve seen what they asked for.

  • FluffyPotato@lemm.ee
    link
    fedilink
    arrow-up
    22
    ·
    9 months ago

    Yeah, I tried to use AI for my work; it seems to have zero clue about the software I asked about, but pretends it does. I think I’m safe.

  • Semi-Hemi-Demigod@kbin.social
    link
    fedilink
    arrow-up
    21
    ·
    9 months ago

    I feel pretty secure in my job, because in the future I’ll be the one talking to the customers so the AI doesn’t have to, instead of the engineers.

    • renzev@lemmy.world
      link
      fedilink
      English
      arrow-up
      9
      ·
      9 months ago

      Typical conversation between a non-programmer and a programmer about AI:

      Won’t AI put you out of your job?

      It probably won’t

      Well, can’t AI write code much faster and more efficiently than humans?

      How would it know what code to write?

      I guess you would need to provide it with a description of the app that you want it to make?

      So you’re telling me that in the future, there will be machines that can generate computer code based entirely on a description of the required functionality?

      I guess so?

      Those machines are called “compilers”, and “a description of the required functionality” is called “a program”. You’re describing programming.

  • Sprokes@lemmy.world
    link
    fedilink
    arrow-up
    20
    arrow-down
    1
    ·
    9 months ago

    Didn’t ChatGPT become very bad recently? It used to give code that actually worked, but now it gets things wrong and doesn’t follow context. It gives code, but when you ask it to improve it by giving more context, it ignores the previous answer and gives wrong code.

    It even sometimes says it doesn’t have the answer to questions that it answered a few months ago.

    • Evotech@lemmy.world
      link
      fedilink
      arrow-up
      14
      arrow-down
      3
      ·
      edit-2
      9 months ago

      Last time I asked a niche API question, it showed me how to formulate the question so I could post it on their GitHub issues…

      • Clent@lemmy.world
        link
        fedilink
        arrow-up
        21
        arrow-down
        1
        ·
        9 months ago

        How is that a niche API question? That’s a public API that gets scraped up.

        It’s also a terrible way to ask the question; it’s how a clueless newb asks. Anyone hoping to help needs to at least know: what are you attempting to use the endpoint for, and what results are you receiving vs. expecting?

      • RedstoneValley@sh.itjust.works
        link
        fedilink
        arrow-up
        11
        ·
        9 months ago

        That looks like advice on how NOT to ask for technical support on a public forum.

        1. Be generic and vague. Omit as many details as possible; they will only distract from the problem at hand.
        2. Remember to include your private API key, to share it with the world.
    • anus@lemmy.world
      link
      fedilink
      arrow-up
      10
      ·
      9 months ago

      The latest update from OpenAI calls this “laziness” and discusses a coming fix