Sam Altman Says If Jobs Gets Wiped Out, Maybe They Weren’t Even “Real Work” to Start With
-
I came across this article in another Lemmy community that dislikes AI. I'm reposting instead of cross posting so that we could have a conversation about how "work" might be changing with advancements in technology.
The headline is clickbaity because Altman was referring to how farmers of decades past might perceive the work "you and I do today" (Altman included) as not really looking like work.
The fact is that most of us work many levels of abstraction removed from basic human survival. Very few of us are farming, building shelters, protecting our families from wildlife, or doing the back-breaking labor that humans were forced to do generations ago.
In my first job, IT support, it was not lost on me that all day long I pushed buttons to make computers beep in friendlier ways. There was no physical result to see, no produce to harvest, no pile of wood transitioned from a natural to a chopped state, nothing tangible to step back and enjoy at the end of the day.
Bankers, fashion designers, artists, video game testers, software developers and countless other professions experience something quite similar. Yet, all of these jobs do in some way add value to the human experience.
As humanity's core needs have been met with technology requiring fewer human inputs, our focus has been able to shift to creating value in less tangible, but perhaps not less meaningful ways. This has created a more dynamic and rich life experience than any of those previous farming generations could have imagined. So while it doesn't seem like the work those farmers were accustomed to, humanity has been able to shift its attention to other types of work for the benefit of many.
I postulate that AI - as we know it now - is merely another technological tool that will allow new layers of abstraction. At one time bookkeepers had to write in books, now software automatically encodes accounting transactions as they're made. At one time software developers might spend days setting up the framework of a new project, and now an LLM can do the bulk of the work in minutes.
These days we have fewer bookkeepers - most companies don't need armies of clerks anymore. But now we have more data analysts who work to understand the information and make important decisions. In the future we may need fewer software coders, and in turn, there will be many more software projects that seek to solve new problems in new ways.
How do I know this? I think history shows us that innovations in technology always bring new problems to be solved. There is an endless reservoir of challenges to be worked on that previous generations didn't have time to think about. We are going to free minds from tasks that can be automated, and many of those minds will move on to the next level of abstraction.
At the end of the day, I suspect we humans are biologically wired with a deep desire to produce rewarding and meaningful work, and many of the results of our abstracted work are hard to see and touch. Perhaps this is why I enjoy mowing my lawn so much, no matter how advanced robotic lawn mowers become.
Well, I'm afraid, given what's happening in the world and the fact that the AI bubble will eventually burst, nothing good awaits us in the future except a parody of Blade Runner.
-
says the guy who never did real work in his life
People worked to survive, like an engine needs oil to run. When our civilization collapses, people will accept reality.
-
...and still they are throwing money at him, as fast as they can.
Mistake of the century.
Or maybe the rich, realizing that collapse was imminent, poured money into the hope of replacing humans with AI, and of disposing of or controlling them with AI's help.
-
From the article:
“The thing about that farmer,” Altman said, is not only that they wouldn’t believe you, but “they very likely would look at what you do and I do and say, ‘that’s not real work.'”
I think he pretty much agrees with you.
You drive a tractor up and down a field; is that really any more work than what the rest of us do?
-
I feel like he's really onto something about real work, but he's missing the point of society. The purpose of our economy is to employ everyone, thus minimizing the negative societal effects of supporting unemployed people, and enabling people to improve their lives. If you optimize a society to produce more GDP by firing people, you're subtracting value, not adding it.
I think you are a step further down in the a/b problem tree.
The purpose of society is that everyone can have a safe, stable and good life. In our current setup this requires that most people are employed. But that's not a given.
Think of a hypothetical society where AI/robots do all the work. There would be no need to employ everyone to do work to support unemployed people.
We are slowly moving in that direction, but the problem is that our capitalist society isn't fit for that setup. In our capitalist setup, removing the need for work means making people unemployed, who then "need to be supported", while the rich who own the robots/AI benefit without putting in any work at all.
-
I am starting to dislike Altman spam more than Elmo spam.
Regarding the philosophical points, there is some truth to the arguments, but one thing is absolutely certain (you need zero knowledge of "AI" services to know it): you can't trust Americans in such matters.
-
If you can't build a complete, functional AI, you shouldn't be releasing it to the public in the first place.
Pushing AI without looking at the negatives, just to ship a "better feature", doesn't work.
-
Well, I'm afraid, given what's happening in the world and the fact that the AI bubble will eventually burst, nothing good awaits us in the future except a parody of Blade Runner.
This.
It will be the baby of Idiocracy and Blade Runner.
All the horrible dehumanising parts, without any of the gritty aesthetics, and every character is some kind of sadistic Elmer Fudd.
-
Dumpster-fire companies are the ones he's targeting because they're the most likely to look for quick and cheap ways to fix the symptoms of their problems, and the most likely to want to replace their employees with automation.
-
At one time software developers might spend days setting up the framework of a new project, and now an LLM can do the bulk of the work in minutes.
No and no. Have you ever coded anything?
Have you ever built anything with your hands that mattered?
-
Talking psychology, please stop calling it AI. This raises unrealistic expectations. They are Large Language Models.
-
Yeah, I have never spent "days" setting anything up. Anyone who can't do it without spending "days" struggling with it is not reading the documentation.
You guys are getting documentation?
-
Talking psychology, please stop calling it AI. This raises unrealistic expectations. They are Large Language Models.
Raising unrealistic expectations is what companies like OpenAI are all about
-
Have you ever built anything with your hands that mattered?
Yes. How is it relevant to modern SWE practices?
-
Have you ever built anything with your hands that mattered?
I know this was aimed at someone else. But my response is "Every day." What is your follow-up question?
-
Talking psychology, please stop calling it AI. This raises unrealistic expectations. They are Large Language Models.
In computer science, machine learning and LLMs are part of AI. Before that, other algorithms were considered part of AI. You may disagree, probably because of all the hype around LLMs, but they are AI.
-
In computer science, machine learning and LLMs are part of AI. Before that, other algorithms were considered part of AI. You may disagree, probably because of all the hype around LLMs, but they are AI.
You missed the psychology part?
-
It will be the baby of Idiocracy and Blade Runner.
I agree.
-
At one time software developers might spend days setting up the framework of a new project, and now an LLM can do the bulk of the work in minutes.
I'd not put an LLM in charge of developing a framework that is meant to be used in any sort of production environment. If we're talking about them setting up the skeleton of a project, then templates have already been around for decades at this point. You also don't really set up new projects all that often.
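On the templates point: project scaffolding has been automated for decades without LLMs. Here's a minimal sketch of the idea in Python; the file list and the `myapp` name are made up for illustration, not taken from any real template tool.

```python
from pathlib import Path

# Hypothetical skeleton for a new Python project; the files and
# their contents are illustrative, not from any particular tool.
SKELETON = {
    "pyproject.toml": '[project]\nname = "myapp"\nversion = "0.1.0"\n',
    "src/myapp/__init__.py": "",
    "tests/test_smoke.py": "def test_smoke():\n    assert True\n",
    "README.md": "# myapp\n",
}

def scaffold(root: str) -> list[str]:
    """Write the skeleton files under `root` and return the created paths."""
    created = []
    for rel, content in SKELETON.items():
        path = Path(root) / rel
        path.parent.mkdir(parents=True, exist_ok=True)  # create nested dirs
        path.write_text(content)
        created.append(str(path))
    return created
```

Real-world tools like `cookiecutter` or `cargo new` do essentially this with richer templating, which is why "minutes of setup" predates LLMs.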
-
In computer science, machine learning and LLMs are part of AI. Before that, other algorithms were considered part of AI. You may disagree, probably because of all the hype around LLMs, but they are AI.
Granting them AI status, we should recognize that they "gained their abilities" by training on the rando junk that people post on the internet.
I have been working with AI for computer programming, semi-seriously for 3 months and pretty intensively for the last two weeks. I have also been working with humans for computer programming for 35 years.

AI's "failings" are people's failings. They don't follow directions reliably, and if you don't manage them they'll go down rabbit holes of little to no value. With management, working with AI is like an accelerated experience with an average person, so the need for management becomes even more intense: where you might let a person work independently for a week and then see what needs correcting, you really need to stay on top of AI's "thought process" on more of a 15-30 minute basis.

It comes down to the "hallucination rate", which is a very fuzzy metric but works pretty well: at a hallucination rate of 5% (95% successful responses), AI is just about on par with human workers, but faster for complex tasks and slower for simple answers.
Interestingly, for the past two weeks, I have been having some success with applying human management systems to AI: controlled documents, tiered requirements-specification-details documents, etc.
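A quick back-of-the-envelope on that 5% figure: if you assume failures on dependent steps are independent (a simplifying assumption, and the rates here are the parent comment's estimates, not measurements), reliability compounds quickly, which is one argument for those 15-30 minute check-ins.

```python
def chain_success(per_step_rate: float, steps: int) -> float:
    """Probability that all `steps` dependent steps succeed,
    assuming a fixed, independent per-step success rate."""
    return per_step_rate ** steps

# At 95% per-step success, a task with 14 dependent steps
# completes cleanly only about half the time.
print(f"{chain_success(0.95, 14):.2f}")  # about 0.49
```

The longer an agent runs unreviewed, the more steps it chains, so the odds of a clean run fall off exponentially, even when each individual step looks reliable.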