agnos.is Forums

Sam Altman Says If Jobs Get Wiped Out, Maybe They Weren’t Even “Real Work” to Start With

Technology · 125 Posts · 78 Posters
  • N [email protected]

    I came across this article in another Lemmy community that dislikes AI. I'm reposting instead of cross posting so that we could have a conversation about how "work" might be changing with advancements in technology.

    The headline is clickbaity: Altman was referring to how farmers who lived decades ago might see the work "you and I do today" (Altman included) as not looking like real work.

    The fact is that most of us work at many levels of abstraction from human survival. Very few of us are farming, building shelters, protecting our families from wildlife, or doing the back-breaking labor jobs that humans were forced to do generations ago.

    In my first job, which was IT support, the concept was not lost on me that all day long I pushed buttons to make computers beep in more friendly ways. There was no physical result to see, no produce to harvest, no pile of wood being transitioned from a natural to a chopped state, nothing tangible to step back and enjoy at the end of the day.

    Bankers, fashion designers, artists, video game testers, software developers and countless other professions experience something quite similar. Yet, all of these jobs do in some way add value to the human experience.

    As humanity's core needs have been met with technology requiring fewer human inputs, our focus has been able to shift to creating value in less tangible, but perhaps not less meaningful ways. This has created a more dynamic and rich life experience than any of those previous farming generations could have imagined. So while it doesn't seem like the work those farmers were accustomed to, humanity has been able to shift its attention to other types of work for the benefit of many.

    I postulate that AI - as we know it now - is merely another technological tool that will allow new layers of abstraction. At one time bookkeepers had to write in books; now software automatically encodes accounting transactions as they're made. At one time software developers might spend days setting up the framework of a new project; now an LLM can do the bulk of the work in minutes.

    These days we have fewer bookkeepers - most companies don't need armies of clerks anymore. But now we have more data analysts who work to understand the information and make important decisions. In the future we may need fewer software coders, and in turn, there will be many more software projects that seek to solve new problems in new ways.

    How do I know this? I think history shows us that innovations in technology always bring new problems to be solved. There is an endless reservoir of challenges to be worked on that previous generations didn't have time to think about. We are going to free minds from tasks that can be automated, and many of those minds will move on to the next level of abstraction.

    At the end of the day, I suspect we humans are biologically wired with a deep desire to produce rewarding and meaningful work, and the results of our abstracted work are often hard to see and touch. Perhaps this is why I enjoy mowing my lawn so much, no matter how advanced robotic lawn-mowing machines become.

    [email protected]
    wrote last edited by
    #78

    Sam Altman is a huckster, not a technologist. As such, I don't really care what he says about technology. His purpose has always been to transfer as much money as possible from investors into his own pocket before the bubble bursts. Anything else is incidental.

    I am not entirely writing off LLMs, but very little of the discussion about them has been rational. They do some things fairly well and a lot of things quite poorly. It would be nice if we could just focus on the former.

    • N [email protected]

      I came across this article in another Lemmy community that dislikes AI. I'm reposting instead of cross posting so that we could have a conversation about how "work" might be changing with advancements in technology.

      The headline is clickbaity because Altman was referring to how farmers who lived decades ago might perceive that the work "you and I do today" (including Altman himself), doesn't look like work.

      The fact is that most of us work far abstracted from human survival by many levels. Very few of us are farming, building shelters, protecting our families from wildlife, or doing the back breaking labor jobs that humans were forced to do generations ago.

      In my first job, which was IT support, the concept was not lost on me that all day long I pushed buttons to make computers beep in more friendly ways. There was no physical result to see, no produce to harvest, no pile of wood being transitioned from a natural to a chopped state, nothing tangible to step back and enjoy at the end of the day.

      Bankers, fashion designers, artists, video game testers, software developers and countless other professions experience something quite similar. Yet, all of these jobs do in some way add value to the human experience.

      As humanity's core needs have been met with technology requiring fewer human inputs, our focus has been able to shift to creating value in less tangible, but perhaps not less meaningful ways. This has created a more dynamic and rich life experience than any of those previous farming generations could have imagined. So while it doesn't seem like the work those farmers were accustomed to, humanity has been able to shift its attention to other types of work for the benefit of many.

      I postulate that AI - as we know it now - is merely another technological tool that will allow new layers of abstraction. At one time bookkeepers had to write in books, now software automatically encodes accounting transactions as they're made. At one time software developers might spend days setting up the framework of a new project, and now an LLM can do the bulk of the work in minutes.

      These days we have fewer bookkeepers - most companies don't need armies of clerks anymore. But now we have more data analysts who work to understand the information and make important decisions. In the future we may need fewer software coders, and in turn, there will be many more software projects that seek to solve new problems in new ways.

      How do I know this? I think history shows us that innovations in technology always bring new problems to be solved. There is an endless reservoir of challenges to be worked on that previous generations didn't have time to think about. We are going to free minds from tasks that can be automated, and many of those minds will move on to the next level of abstraction.

      At the end of the day, I suspect we humans are biologically wired with a deep desire to output rewarding and meaningful work, and much of the results of our abstracted work is hard to see and touch. Perhaps this is why I enjoy mowing my lawn so much, no matter how advanced robotic lawn mowing machines become.

      [email protected]
      wrote last edited by [email protected]
      #79

      The guy's name is too perfect.

      Altman. Alternative man.

      Just not a good alternative.

      After his extremely creepy interview with Tucker Carlson about the whistleblower who died, I know he is not right in the head.

      • N [email protected]

        I came across this article in another Lemmy community that dislikes AI. I'm reposting instead of cross posting so that we could have a conversation about how "work" might be changing with advancements in technology.

        The headline is clickbaity because Altman was referring to how farmers who lived decades ago might perceive that the work "you and I do today" (including Altman himself), doesn't look like work.

        The fact is that most of us work far abstracted from human survival by many levels. Very few of us are farming, building shelters, protecting our families from wildlife, or doing the back breaking labor jobs that humans were forced to do generations ago.

        In my first job, which was IT support, the concept was not lost on me that all day long I pushed buttons to make computers beep in more friendly ways. There was no physical result to see, no produce to harvest, no pile of wood being transitioned from a natural to a chopped state, nothing tangible to step back and enjoy at the end of the day.

        Bankers, fashion designers, artists, video game testers, software developers and countless other professions experience something quite similar. Yet, all of these jobs do in some way add value to the human experience.

        As humanity's core needs have been met with technology requiring fewer human inputs, our focus has been able to shift to creating value in less tangible, but perhaps not less meaningful ways. This has created a more dynamic and rich life experience than any of those previous farming generations could have imagined. So while it doesn't seem like the work those farmers were accustomed to, humanity has been able to shift its attention to other types of work for the benefit of many.

        I postulate that AI - as we know it now - is merely another technological tool that will allow new layers of abstraction. At one time bookkeepers had to write in books, now software automatically encodes accounting transactions as they're made. At one time software developers might spend days setting up the framework of a new project, and now an LLM can do the bulk of the work in minutes.

        These days we have fewer bookkeepers - most companies don't need armies of clerks anymore. But now we have more data analysts who work to understand the information and make important decisions. In the future we may need fewer software coders, and in turn, there will be many more software projects that seek to solve new problems in new ways.

        How do I know this? I think history shows us that innovations in technology always bring new problems to be solved. There is an endless reservoir of challenges to be worked on that previous generations didn't have time to think about. We are going to free minds from tasks that can be automated, and many of those minds will move on to the next level of abstraction.

        At the end of the day, I suspect we humans are biologically wired with a deep desire to output rewarding and meaningful work, and much of the results of our abstracted work is hard to see and touch. Perhaps this is why I enjoy mowing my lawn so much, no matter how advanced robotic lawn mowing machines become.

        [email protected]
        wrote last edited by
        #80

        Is this where they get rid of the telephone sanitizers and middle managers?

        • dojan@pawb.socialD [email protected]

          At one time software developers might spend days setting up the framework of a new project, and now an LLM can do the bulk of the work in minutes.

          I'd not put an LLM in charge of developing a framework that is meant to be used in any sort of production environment. If we're talking about them setting up the skeleton of a project, then templates have already been around for decades at this point. You also don't really set up new projects all that often.

          [email protected]
          wrote last edited by
          #81

          Most of what LLMs present as solutions has been around for decades; that's how they learned it, from the source material they were trained on.

          So far, AI hasn't surprised me with anything clever or new, mostly I'm just reminding it to follow directions, and often I'm pointing out better design patterns than what it implements on the first go around.

          Above all: you don't trust what an LLM spits out any more than you'd trust a $50/hr "consultant" from the local high school computer club with business-critical software. You test it; if you have the ability, you review it at the source level, line by line. But there ARE plenty of businesses out there running "at risk" with sketchier software developers than the local computer club, so OF COURSE they are going to trust AI-generated code further than they should.

          Get the popcorn, there will be some entertaining stories about that over the coming year.

          • C [email protected]

            I tried to demo an agentic AI in JetBrains to a coworker, just as a "hey, look at this neat thing that can make changes on its own." As the example, I told it to convert a constructor in C# to a primary constructor.

            So it "thought" and made the change, "thought" again and reverted the change, "thought" once again and made the change again, then it "thought" for a 4th time and reverted the changes again. I stopped it there and just shook my head.

            [email protected]
            wrote last edited by
            #82

            I had similar experiences a few months back, like 6-8. Since Anthropic's Sonnet 4.0, things have changed significantly. 4.5 is even a bit better. Competing models have been similarly improving.

            • N [email protected]

              I came across this article in another Lemmy community that dislikes AI. I'm reposting instead of cross posting so that we could have a conversation about how "work" might be changing with advancements in technology.

              The headline is clickbaity because Altman was referring to how farmers who lived decades ago might perceive that the work "you and I do today" (including Altman himself), doesn't look like work.

              The fact is that most of us work far abstracted from human survival by many levels. Very few of us are farming, building shelters, protecting our families from wildlife, or doing the back breaking labor jobs that humans were forced to do generations ago.

              In my first job, which was IT support, the concept was not lost on me that all day long I pushed buttons to make computers beep in more friendly ways. There was no physical result to see, no produce to harvest, no pile of wood being transitioned from a natural to a chopped state, nothing tangible to step back and enjoy at the end of the day.

              Bankers, fashion designers, artists, video game testers, software developers and countless other professions experience something quite similar. Yet, all of these jobs do in some way add value to the human experience.

              As humanity's core needs have been met with technology requiring fewer human inputs, our focus has been able to shift to creating value in less tangible, but perhaps not less meaningful ways. This has created a more dynamic and rich life experience than any of those previous farming generations could have imagined. So while it doesn't seem like the work those farmers were accustomed to, humanity has been able to shift its attention to other types of work for the benefit of many.

              I postulate that AI - as we know it now - is merely another technological tool that will allow new layers of abstraction. At one time bookkeepers had to write in books, now software automatically encodes accounting transactions as they're made. At one time software developers might spend days setting up the framework of a new project, and now an LLM can do the bulk of the work in minutes.

              These days we have fewer bookkeepers - most companies don't need armies of clerks anymore. But now we have more data analysts who work to understand the information and make important decisions. In the future we may need fewer software coders, and in turn, there will be many more software projects that seek to solve new problems in new ways.

              How do I know this? I think history shows us that innovations in technology always bring new problems to be solved. There is an endless reservoir of challenges to be worked on that previous generations didn't have time to think about. We are going to free minds from tasks that can be automated, and many of those minds will move on to the next level of abstraction.

              At the end of the day, I suspect we humans are biologically wired with a deep desire to output rewarding and meaningful work, and much of the results of our abstracted work is hard to see and touch. Perhaps this is why I enjoy mowing my lawn so much, no matter how advanced robotic lawn mowing machines become.

              [email protected]
              wrote last edited by
              #83

              Sam, I say this with all my heart…

              Fuck you very kindly. I’m pretty sure what you do is not “a real job” and should be replaced by AI.

              • kescusay@lemmy.worldK [email protected]

                Well, if I'm not, then neither is an LLM.

                But for most projects built with modern tooling, the documentation is fine, and they mostly have simple CLIs for scaffolding a new application.

                [email protected]
                wrote last edited by
                #84

                I mean, if you use the code base you're working in as context, it'll probably learn the code base faster than you will. I'm not saying that's a good strategy, though; I'd never personally do that.

                • mp3@lemmy.caM [email protected]

                  CEO isn't an actual job either; it's just the 21st century's titre de noblesse (title of nobility).

                  [email protected]
                  wrote last edited by [email protected]
                  #85

                  The CEO of a public company is just the board's little monkey puppet, there to take the blame for the board's failures.

                  • andrewrgross@slrpnk.netA [email protected]

                    100%.

                    Peter Frase deconstructed this a decade ago in an article (and subsequent book), "Four Futures".

                    It's really not complicated. Saying 'the rich want to make us all obsolete and then kill us off ' sounds paranoid and reactionary, but if you actually study these dynamics critically that's a pretty good distillation of what they'd like to do, and they're not really concealing it.

                    [email protected]
                    wrote last edited by [email protected]
                    #86

                    They won’t succeed because once poor folks get desperate enough after the bread and circuses run out… things will get a little troublesome for everyone.

                    • dojan@pawb.socialD [email protected]

                      At one time software developers might spend days setting up the framework of a new project, and now an LLM can do the bulk of the work in minutes.

                      I'd not put an LLM in charge of developing a framework that is meant to be used in any sort of production environment. If we're talking about them setting up the skeleton of a project, then templates have already been around for decades at this point. You also don't really set up new projects all that often.

                      [email protected]
                      wrote last edited by
                      #87

                      This is my take with it too. They seem to be good at creating "high fidelity" mock-ups, and creating a basic framework for something, but try to even get them to change a background color or something and they just lie to you.

                      They're basically a good tool for stubbing stuff out for a web application...which, it's insane that we had to jump through all of these hoops and spend unknown billions in order to get that. At this point, I would assume that we have a rapid application development equivalent for web apps...but maybe not.

                      All of the "frameworks" involved in front-end application delivery certainly don't seem to provide any benefit of speeding up development cycles. Front-end development seems worse today than when I used to be a full-time full stack engineer (and I had fucking IE6 to contend with at the time).

                      • N [email protected]

                        I came across this article in another Lemmy community that dislikes AI. I'm reposting instead of cross posting so that we could have a conversation about how "work" might be changing with advancements in technology.

                        The headline is clickbaity because Altman was referring to how farmers who lived decades ago might perceive that the work "you and I do today" (including Altman himself), doesn't look like work.

                        The fact is that most of us work far abstracted from human survival by many levels. Very few of us are farming, building shelters, protecting our families from wildlife, or doing the back breaking labor jobs that humans were forced to do generations ago.

                        In my first job, which was IT support, the concept was not lost on me that all day long I pushed buttons to make computers beep in more friendly ways. There was no physical result to see, no produce to harvest, no pile of wood being transitioned from a natural to a chopped state, nothing tangible to step back and enjoy at the end of the day.

                        Bankers, fashion designers, artists, video game testers, software developers and countless other professions experience something quite similar. Yet, all of these jobs do in some way add value to the human experience.

                        As humanity's core needs have been met with technology requiring fewer human inputs, our focus has been able to shift to creating value in less tangible, but perhaps not less meaningful ways. This has created a more dynamic and rich life experience than any of those previous farming generations could have imagined. So while it doesn't seem like the work those farmers were accustomed to, humanity has been able to shift its attention to other types of work for the benefit of many.

                        I postulate that AI - as we know it now - is merely another technological tool that will allow new layers of abstraction. At one time bookkeepers had to write in books, now software automatically encodes accounting transactions as they're made. At one time software developers might spend days setting up the framework of a new project, and now an LLM can do the bulk of the work in minutes.

                        These days we have fewer bookkeepers - most companies don't need armies of clerks anymore. But now we have more data analysts who work to understand the information and make important decisions. In the future we may need fewer software coders, and in turn, there will be many more software projects that seek to solve new problems in new ways.

                        How do I know this? I think history shows us that innovations in technology always bring new problems to be solved. There is an endless reservoir of challenges to be worked on that previous generations didn't have time to think about. We are going to free minds from tasks that can be automated, and many of those minds will move on to the next level of abstraction.

                        At the end of the day, I suspect we humans are biologically wired with a deep desire to output rewarding and meaningful work, and much of the results of our abstracted work is hard to see and touch. Perhaps this is why I enjoy mowing my lawn so much, no matter how advanced robotic lawn mowing machines become.

                        [email protected]
                        wrote last edited by [email protected]
                        #88

                        We could create jobs by opening a guillotine factory

                        • P [email protected]

                          It comes down to the "hallucination rate" which is a very fuzzy metric, but it works pretty well - at a hallucination rate of 5% (95% successful responses) AI is just about on par with human workers - but faster for complex tasks, and slower for simple answers.

                          I have no idea what you're doing, but based on my own experience, your error/hallucination rate is like 1/10th of what I'd expect.

                          I've been using an AI assistant for the better part of a year, and I'd laugh at the idea that they're right even 60% of the time without CONSTANTLY reinforcing fucking BASIC directives or telling it to provide sources for every method it suggests. Like, I can't even keep the damned thing reliably in the language framework I'm working on without it falling back to the raw vendor CLI in project conversations. I'm correcting the exact same mistakes week after week because the thing is braindead and doesn't understand that you cannot use reserved keywords for your variable names. It just makes up parameters to core functions based on the question I ask it, regardless of documentation, until I call out its bullshit and it gets super conciliatory and then actually double-checks its own work instead of authoritatively lying to me.

                          You're not wrong that AI makes human style mistakes, but a human can learn, or at least generally doesn't have to be taught the same fucking lesson at least once a week for a year (or gets fired well before then). AI is artificial, but there absolutely isn't any intelligence behind it, it's just a stochastic parrot that somehow comes to plausible answers that the algorithm expects that you want to hear.

                          [email protected]
                          wrote last edited by
                          #89

                          You’re not wrong that AI makes human style mistakes, but a human can learn, or at least generally doesn’t have to be taught the same fucking lesson at least once a week for a year (or gets fired well before then).

                          This is the point nobody seems to get. Especially people that haven't worked with the technology.

                          It just does not have the ability to learn in any meaningful way. A human can pick up a new technique and master simple ones in a couple of hours. AI just keeps falling back on its training data no matter how many times you tell it to stop; it has no other option. It would need to be re-trained with better material in order to consistently do what you want it to do, but nobody is really re-training these things...they're using the "foundational" models and at most "fine-tuning" them...and fine-tuning only provides a quickly punctured facade...it eventually falls back to the bulk of its training material.

                          • P [email protected]

                            Fuck, I barely let AI make functions in my code because half the time the fuckin idiot can't even guess the correct method name and parameters when it can pull up the goddamned help page like I can or even Google the basic syntax.

                            [email protected]
                            wrote last edited by
                            #90

                            A year ago AI answers were only successfully compiling for me about 60% of the time. Now they're up over 80%, and I'm no longer in the loop when they screw up, they get it right on the first try 80% of the time, then 96% of the time by the 2nd try, 99% by the third try, 99.84% of the time by the 4th try, and the beauty is: they retry for themselves until they get something that actually compiles.

                            Now we can talk about successful implementation of larger feature sets....
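Those percentages line up with treating each retry as an independent ~80% shot, which a few lines make explicit. This is an idealization (real attempts share context and aren't independent), and "99% by the third try" actually comes out to 99.2%:

```python
# Chance that at least one of `attempts` tries succeeds, assuming each try
# independently succeeds with probability p_single.
def cumulative_success(p_single: float, attempts: int) -> float:
    return 1 - (1 - p_single) ** attempts

for k in range(1, 5):
    print(f"try {k}: {cumulative_success(0.80, k) * 100:.2f}%")
# → try 1: 80.00%   try 2: 96.00%   try 3: 99.20%   try 4: 99.84%
```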

                            • N [email protected]

                              I came across this article in another Lemmy community that dislikes AI. I'm reposting instead of cross posting so that we could have a conversation about how "work" might be changing with advancements in technology.

                              The headline is clickbaity because Altman was referring to how farmers who lived decades ago might perceive that the work "you and I do today" (including Altman himself), doesn't look like work.

                              The fact is that most of us work far abstracted from human survival by many levels. Very few of us are farming, building shelters, protecting our families from wildlife, or doing the back breaking labor jobs that humans were forced to do generations ago.

                              In my first job, which was IT support, the concept was not lost on me that all day long I pushed buttons to make computers beep in more friendly ways. There was no physical result to see, no produce to harvest, no pile of wood being transitioned from a natural to a chopped state, nothing tangible to step back and enjoy at the end of the day.

                              Bankers, fashion designers, artists, video game testers, software developers and countless other professions experience something quite similar. Yet, all of these jobs do in some way add value to the human experience.

                              As humanity's core needs have been met with technology requiring fewer human inputs, our focus has been able to shift to creating value in less tangible, but perhaps not less meaningful ways. This has created a more dynamic and rich life experience than any of those previous farming generations could have imagined. So while it doesn't seem like the work those farmers were accustomed to, humanity has been able to shift its attention to other types of work for the benefit of many.

                              I postulate that AI - as we know it now - is merely another technological tool that will allow new layers of abstraction. At one time bookkeepers had to write in books, now software automatically encodes accounting transactions as they're made. At one time software developers might spend days setting up the framework of a new project, and now an LLM can do the bulk of the work in minutes.

                              These days we have fewer bookkeepers - most companies don't need armies of clerks anymore. But now we have more data analysts who work to understand the information and make important decisions. In the future we may need fewer software coders, and in turn, there will be many more software projects that seek to solve new problems in new ways.

                              How do I know this? I think history shows us that innovations in technology always bring new problems to be solved. There is an endless reservoir of challenges to be worked on that previous generations didn't have time to think about. We are going to free minds from tasks that can be automated, and many of those minds will move on to the next level of abstraction.

                              At the end of the day, I suspect we humans are biologically wired with a deep desire to output rewarding and meaningful work, and much of the results of our abstracted work is hard to see and touch. Perhaps this is why I enjoy mowing my lawn so much, no matter how advanced robotic lawn mowing machines become.

                              [email protected]
                              wrote last edited by [email protected]
                              #91

                              To be fair, a lot of jobs in capitalist societies are indeed pointless. Some of them even actively do nothing but subtract value from society.

                              That said, people still need to make a living and his piece of shit artificial insanity is only making it more difficult. How about stop starving people to death and propose solutions to the problem?

                              • N [email protected]

                                I came across this article in another Lemmy community that dislikes AI. I'm reposting instead of cross posting so that we could have a conversation about how "work" might be changing with advancements in technology.

                                The headline is clickbaity because Altman was referring to how farmers who lived decades ago might perceive that the work "you and I do today" (including Altman himself), doesn't look like work.

                                The fact is that most of us work far abstracted from human survival by many levels. Very few of us are farming, building shelters, protecting our families from wildlife, or doing the back breaking labor jobs that humans were forced to do generations ago.

                                In my first job, which was IT support, the concept was not lost on me that all day long I pushed buttons to make computers beep in more friendly ways. There was no physical result to see, no produce to harvest, no pile of wood being transitioned from a natural to a chopped state, nothing tangible to step back and enjoy at the end of the day.

                                Bankers, fashion designers, artists, video game testers, software developers and countless other professions experience something quite similar. Yet, all of these jobs do in some way add value to the human experience.

                                As humanity's core needs have been met with technology requiring fewer human inputs, our focus has been able to shift to creating value in less tangible, but perhaps not less meaningful ways. This has created a more dynamic and rich life experience than any of those previous farming generations could have imagined. So while it doesn't seem like the work those farmers were accustomed to, humanity has been able to shift its attention to other types of work for the benefit of many.

                                I postulate that AI - as we know it now - is merely another technological tool that will allow new layers of abstraction. At one time bookkeepers had to write in books, now software automatically encodes accounting transactions as they're made. At one time software developers might spend days setting up the framework of a new project, and now an LLM can do the bulk of the work in minutes.

                                These days we have fewer bookkeepers - most companies don't need armies of clerks anymore. But now we have more data analysts who work to understand the information and make important decisions. In the future we may need fewer software coders, and in turn, there will be many more software projects that seek to solve new problems in new ways.

                                How do I know this? I think history shows us that innovations in technology always bring new problems to be solved. There is an endless reservoir of challenges to be worked on that previous generations didn't have time to think about. We are going to free minds from tasks that can be automated, and many of those minds will move on to the next level of abstraction.

                                At the end of the day, I suspect we humans are biologically wired with a deep desire to output rewarding and meaningful work, and much of the results of our abstracted work is hard to see and touch. Perhaps this is why I enjoy mowing my lawn so much, no matter how advanced robotic lawn mowing machines become.

                                [email protected]
                                wrote last edited by
                                #92

                                Can't AI replace Sam Altman?

                                • N [email protected]

                                  I came across this article in another Lemmy community that dislikes AI. I'm reposting instead of cross posting so that we could have a conversation about how "work" might be changing with advancements in technology.

                                  The headline is clickbaity because Altman was referring to how farmers who lived decades ago might perceive that the work "you and I do today" (including Altman himself), doesn't look like work.

                                  The fact is that most of us work far abstracted from human survival by many levels. Very few of us are farming, building shelters, protecting our families from wildlife, or doing the back breaking labor jobs that humans were forced to do generations ago.

                                  In my first job, which was IT support, the concept was not lost on me that all day long I pushed buttons to make computers beep in more friendly ways. There was no physical result to see, no produce to harvest, no pile of wood being transitioned from a natural to a chopped state, nothing tangible to step back and enjoy at the end of the day.

                                  Bankers, fashion designers, artists, video game testers, software developers and countless other professions experience something quite similar. Yet, all of these jobs do in some way add value to the human experience.

                                  As humanity's core needs have been met with technology requiring fewer human inputs, our focus has been able to shift to creating value in less tangible, but perhaps not less meaningful ways. This has created a more dynamic and rich life experience than any of those previous farming generations could have imagined. So while it doesn't seem like the work those farmers were accustomed to, humanity has been able to shift its attention to other types of work for the benefit of many.

                                  I postulate that AI - as we know it now - is merely another technological tool that will allow new layers of abstraction. At one time bookkeepers had to write in books, now software automatically encodes accounting transactions as they're made. At one time software developers might spend days setting up the framework of a new project, and now an LLM can do the bulk of the work in minutes.

                                  These days we have fewer bookkeepers - most companies don't need armies of clerks anymore. But now we have more data analysts who work to understand the information and make important decisions. In the future we may need fewer software coders, and in turn, there will be many more software projects that seek to solve new problems in new ways.

                                  How do I know this? I think history shows us that innovations in technology always bring new problems to be solved. There is an endless reservoir of challenges to be worked on that previous generations didn't have time to think about. We are going to free minds from tasks that can be automated, and many of those minds will move on to the next level of abstraction.

                                  At the end of the day, I suspect we humans are biologically wired with a deep desire to output rewarding and meaningful work, and much of the results of our abstracted work is hard to see and touch. Perhaps this is why I enjoy mowing my lawn so much, no matter how advanced robotic lawn mowing machines become.

                                  [email protected]
                                  wrote last edited by
                                  #93

What do we need the mega rich for anyway? They aren't creative, and at this point they're easily replaced with AI.


                                    [email protected]
                                    wrote last edited by
                                    #94

                                    why capitalist societies specifically?


                                      [email protected]
                                      wrote last edited by
                                      #95

They may seem pointless to those outside the organization, but as long as someone is willing to pay for them, someone considers them valuable.

No one is "starving to death," but you would have people just barely scraping by.

                                      • G [email protected]

I mean, if you use the code base you're working in as context, it'll probably learn the code base faster than you will. I'm not saying that's a good strategy, though; I'd never personally do that.

                                        [email protected]
                                        wrote last edited by
                                        #96

The thing is, it really won't. The context window isn't large enough, especially for a decently sized application, and that seems to be a fundamental limitation. Make the context window too large, and the LLM gets massively off track, because there's too much in it competing for its attention.

And LLMs don't remember anything. The next time you interact with one and put the whole codebase into its context window again, it won't know what it did before, even if the last session was ten minutes ago. That's why they so frequently create bloat.
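A quick back-of-the-envelope illustration of the size problem, using the common rough heuristic of ~4 characters per token. The file counts and window size below are made-up example numbers, not measurements of any particular model or codebase:

```python
def estimated_tokens(num_chars: int, chars_per_token: float = 4.0) -> int:
    """Rough token estimate using the ~4-characters-per-token rule of thumb."""
    return int(num_chars / chars_per_token)

# A modest 300-file codebase averaging 6 KB per file:
codebase_chars = 300 * 6_000              # ~1.8 million characters
tokens_needed = estimated_tokens(codebase_chars)

context_window = 128_000                  # an example large context window
print(tokens_needed)                      # ~450,000 tokens
print(tokens_needed > context_window)     # doesn't come close to fitting
```

Even a mid-sized codebase overshoots a large context window by several times, which is why "just paste the whole repo in" stops working well past toy projects.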

                                        • S [email protected]

                                          They may seem pointless to those outside of the organization. As long as someone is willing to pay them then someone considers they have value.

                                          No one is "starving to death" but you'd have people just barely scraping by.

                                          [email protected]
                                          wrote last edited by
                                          #97

With many bureaucracies there's plenty of practically valueless work going on.

Because some executive wants to brag about having over a hundred people under them. Because some process requires a sort of document that hasn't been used in decades, and no one has the time to validate what does or doesn't matter anymore. Because of a lot of little nonsense reasons where the path of least resistance is to keep plugging away. Because if you're 99 percent sure something is a waste of time and you optimize it away, there's a 1% chance you'll catch hell for a mistake, and almost no chance you'll get real recognition for the efficiency boost if it pans out.
