Sam Altman Says If Jobs Get Wiped Out, Maybe They Weren’t Even “Real Work” to Start With
-
This guy needs to find Luigi.
that's a smart comment to make.
-
I've been thinking a lot about this since ChatGPT dropped, and I agree with Sam here despite the article trying to rage-bait people. We simply shouldn't protect the job market from the point of view of identity or status. We should keep an open mind about what jobs and work culture could look like in the future.
Unfortunately this issue is impossible to discuss without conflating it with general economics and wealth imbalance, so we'll never have an adult discussion here. We can actually have both - review/kill/create new jobs and work cultures, and address wealth imbalance - but not in some single silver-bullet solution.
this issue is impossible to discuss without conflating it with general economics and wealth imbalance
It's not conflating, the two issues are inextricably linked.
General economics and wealth imbalance can be addressed with or without the chaos of AI disrupting the job market. The problem is: chaos acts to drive wealth imbalance faster, so any change like AI in the jobs market is just shaking things up and letting more people fall through the cracks faster.
-
Thou shalt not make a machine in the likeness of a human mind.
-- The Orange Catholic Bible
Also, that pompous chucklefuck can go fuck himself. There are people who can barely feed themselves on less than a couple of dollars per day.
read the next sentence after that.
-
I've worked for big corporations that employ a lot of people. Every job has a metric showing how much money every single task they do creates. Believe me, they would never pay you if your tasks didn't generate more money than it costs to pay you to do them.
This is part of the reason I don't work for big corporations... yuck.
-
Every job has a metric showing how much money every single task they do creates.
Management accountants would love to do this. In practice you can only do this for low-level, commoditised roles.
-
It's arranging all that with emails, phone calls with a fake voice, etc. Maybe with some useful idiots doing the pimping and such in case it's unable to do that itself.

Whoa whoa whoa... no one said let AI be the administrative assistant.
-
I came across this article in another Lemmy community that dislikes AI. I'm reposting instead of cross-posting so that we can have a conversation about how "work" might be changing with advancements in technology.
The headline is clickbaity because Altman was referring to how farmers who lived decades ago might perceive the work "you and I do today" (including Altman himself) as not looking like real work.
The fact is that most of us work abstracted from human survival by many levels. Very few of us are farming, building shelters, protecting our families from wildlife, or doing the back-breaking labor that humans were forced to do generations ago.
In my first job, which was IT support, it was not lost on me that all day long I pushed buttons to make computers beep in friendlier ways. There was no physical result to see: no produce to harvest, no pile of wood transitioned from a natural to a chopped state, nothing tangible to step back and enjoy at the end of the day.
Bankers, fashion designers, artists, video game testers, software developers and countless other professions experience something quite similar. Yet, all of these jobs do in some way add value to the human experience.
As humanity's core needs have been met with technology requiring fewer human inputs, our focus has been able to shift to creating value in less tangible, but perhaps not less meaningful ways. This has created a more dynamic and rich life experience than any of those previous farming generations could have imagined. So while it doesn't seem like the work those farmers were accustomed to, humanity has been able to shift its attention to other types of work for the benefit of many.
I postulate that AI - as we know it now - is merely another technological tool that will allow new layers of abstraction. At one time bookkeepers had to write in books; now software automatically encodes accounting transactions as they're made. At one time software developers might spend days setting up the framework of a new project; now an LLM can do the bulk of that work in minutes.
These days we have fewer bookkeepers - most companies don't need armies of clerks anymore. But now we have more data analysts who work to understand the information and make important decisions. In the future we may need fewer software coders, and in turn, there will be many more software projects that seek to solve new problems in new ways.
How do I know this? I think history shows us that innovations in technology always bring new problems to be solved. There is an endless reservoir of challenges to be worked on that previous generations didn't have time to think about. We are going to free minds from tasks that can be automated, and many of those minds will move on to the next level of abstraction.
At the end of the day, I suspect we humans are biologically wired with a deep desire to produce rewarding and meaningful work, yet much of the result of our abstracted work is hard to see and touch. Perhaps this is why I enjoy mowing my lawn so much, no matter how advanced robotic lawn mowers become.
It’s funny: years ago, a single developer “killing it” on Steam was almost unheard of. It happened, but such successes were few and far between.
Now, with the advent of powerful engines like Unreal 5 and the latest iterations of Unity, practically anyone outside the Arctic Circle can pick one up and make a game.
Is tech like that taking jobs away from the game industry? Yes, very much so. But since those programs aren’t technically “AI,” they get a pass. Never mind that they use LLMs to streamline the process; they’re fine because they make games we enjoy playing.
But that’s missing the point. For every job that the tech behind some “schedule 1” or “megabonk” replaced, it enabled ten more people to play and benefit from the final product. Those games absolutely used AI in development, work that once would’ve gone to human hands.
Technology always reduces jobs in some markets and creates new ones in others.
It’s the natural way of things.
-
Why do people still listen to this grifter piece of shit? I really don't get it.
-
If your argument attacks my credibility, that's fine; you don't know me. We can find cases where developers use the technology and cases where they refuse.
Do you have anything substantive to add to the discussion about whether LLMs are anything more than just a tool that allows workers to further abstract their work, advancing every profession they touch toward any of: better / faster / cheaper / easier?
I've got something to add: in every practical application, AI has increased liabilities and created vastly inferior products, so it's not more than just a tool that allows workers to further abstract, because it is less than that. This is in addition to the fact that AI companies can't turn a profit. So it's not better, not faster, not cheaper, but it is certainly easier (to do a shit job).
-
This was a great comment on the article. You have true expression in your words, my friend. It was a joy to read.
-
this issue is impossible to discuss without conflating it with general economics and wealth imbalance
It's not conflating, the two issues are inextricably linked.
General economics and wealth imbalance can be addressed with or without the chaos of AI disrupting the job market. The problem is: chaos acts to drive wealth imbalance faster, so any change like AI in the jobs market is just shaking things up and letting more people fall through the cracks faster.
Everything is "linked" - your point is moot.
-
Everything is "linked" - your point is moot.
The real thing most people are trying to hold onto is stability, because chaos benefits the powerful. AI is just the latest agent of chaos, from their perspectives.
-
The real thing most people are trying to hold onto is stability, because chaos benefits the powerful. AI is just the latest agent of chaos, from their perspectives.
Weird take.