Sam Altman Says If Jobs Get Wiped Out, Maybe They Weren’t Even “Real Work” to Start With
-
There's a book, Bullshit Jobs, that explores this phenomenon. Freakonomics also did an episode about the book, which I found interesting.
Bullshit Jobs: A Theory is a 2018 book by anthropologist David Graeber that postulates the existence of meaningless jobs and analyzes their societal harm. He contends that over half of societal work is pointless and becomes psychologically destructive when paired with a work ethic that associates work with self-worth.
Management, CEOs, C-suites, and admins are among them.
-
Sam Altman is a huckster, not a technologist. As such, I don't really care what he says about technology. His purpose has always been to transfer as much money as possible from investors into his own pocket before the bubble bursts. Anything else is incidental.
I am not entirely writing off LLMs, but very little of the discussion about them has been rational. They do some things fairly well and a lot of things quite poorly. It would be nice if we could just focus on the former.
He's probably afraid it's going to burst too fast and he'll be left holding the bag; that's why Gates, Musk, Microsoft, and Google are trying to stem the bleeding.
-
The problem is that the capitalist investor class, by and large, determines what work will be done, what kinds of jobs there will be, and who will work those jobs. They are becoming increasingly out of touch with reality as their wealth and power grow, and they seem to be trying to mold the world into something along the lines of what Curtis Yarvin advocates, which most people would consider very dystopian.
This discussion is also ignoring the fact that currently, 95% of AI projects fail, and studies show that LLM use hurts the productivity of programmers. But yeah, there will almost surely be breakthroughs in the future that will produce more useful AI tech; nobody knows what the timeline for that is though.
But isn't the investment still driven by consumption in the end? They invest in what makes money, but in the end things people are willing to spend money on make money.
-
Then that software engineer that was replaced by AI becomes Sam's personal chef to kill him
Honestly (as a software engineer), we should have less of a privileged attitude towards being replaced. In the end, that's what software engineers have been doing for years regarding other jobs.
-
I came across this article in another Lemmy community that dislikes AI. I'm reposting instead of cross posting so that we could have a conversation about how "work" might be changing with advancements in technology.
The headline is clickbaity because Altman was referring to how farmers who lived decades ago might perceive that the work "you and I do today" (including Altman's own) doesn't look like work.
The fact is that most of us work far abstracted from human survival by many levels. Very few of us are farming, building shelters, protecting our families from wildlife, or doing the back-breaking labor jobs that humans were forced to do generations ago.
In my first job, which was IT support, it was not lost on me that all day long I pushed buttons to make computers beep in friendlier ways. There was no physical result to see, no produce to harvest, no pile of wood transitioned from a natural to a chopped state, nothing tangible to step back and enjoy at the end of the day.
Bankers, fashion designers, artists, video game testers, software developers and countless other professions experience something quite similar. Yet, all of these jobs do in some way add value to the human experience.
As humanity's core needs have been met with technology requiring fewer human inputs, our focus has been able to shift to creating value in less tangible, but perhaps not less meaningful ways. This has created a more dynamic and rich life experience than any of those previous farming generations could have imagined. So while it doesn't seem like the work those farmers were accustomed to, humanity has been able to shift its attention to other types of work for the benefit of many.
I postulate that AI - as we know it now - is merely another technological tool that will allow new layers of abstraction. At one time bookkeepers had to write in books, now software automatically encodes accounting transactions as they're made. At one time software developers might spend days setting up the framework of a new project, and now an LLM can do the bulk of the work in minutes.
These days we have fewer bookkeepers - most companies don't need armies of clerks anymore. But now we have more data analysts who work to understand the information and make important decisions. In the future we may need fewer software coders, and in turn, there will be many more software projects that seek to solve new problems in new ways.
How do I know this? I think history shows us that innovations in technology always bring new problems to be solved. There is an endless reservoir of challenges to be worked on that previous generations didn't have time to think about. We are going to free minds from tasks that can be automated, and many of those minds will move on to the next level of abstraction.
At the end of the day, I suspect we humans are biologically wired with a deep desire to output rewarding and meaningful work, and much of the results of our abstracted work is hard to see and touch. Perhaps this is why I enjoy mowing my lawn so much, no matter how advanced robotic lawn mowing machines become.
Cool. Let AI be the CEO.
-
I've been thinking a lot about this since ChatGPT dropped, and I agree with Sam here despite the article trying to rage-bait people. We simply shouldn't protect the job market from the point of view of identity or status. We should keep an open mind about what jobs and work culture could look like in the future.
Unfortunately this issue is impossible to discuss without conflating it with general economics and wealth imbalance so we'll never have an adult discussion here. We can actually have both - review/kill/create new jobs and work cultures and address wealth imbalance but not in some single silver bullet solution.
-
They invest in things they think they will be able to sell later for a higher price. Expected consumption is sometimes part of their calculations, but they are increasingly out of touch with reality (see blockchain, the metaverse, Tesla, etc.). Sometimes they knowingly take a loss to gain power over the masses (Twitter, The Washington Post). They are also powerful enough to induce consumption (bribing governments for contracts, laws, bailouts, and regulations that ensure their investments will be fruitful). They are powerful enough to heavily influence which politicians get elected, choosing whom they want to bribe. They are powerful enough to force the businesses they are invested in to buy from and sell to each other. And the largest, most profitable companies produce nearly nothing; they use their near-monopoly positions to extract rent (i.e., enshittification/technofeudalism).
-
You'd think so, but unfortunately not. Venture capital is completely illogical, designed around boom-or-bust "moonshot" ideas that are supposed to completely change everything. So this money isn't driven by actual consumption, but rather by speculation.
I can't really speak to other forms of investment, but I suspect it doesn't get a whole lot better. The economy has become far too financialised, with a fiat currency that is completely separate from actual intrinsic value. That's why a watch can cost more than a family home, which isn't true consumption - just this weird concept of "wealth".
-
"Real job" is often an anti-intellectualist code word for hard manual labor.
-
During the industrial revolution, machines gave unskilled labour something to do. Now machines will take it away. Simple.
-
Executive positions are probably the easiest to replace with AI.
- The AI will listen to the employees.
- It will try to be helpful by providing context and perspective based on information the employee might not have.
- It will accept being told it is wrong and update its advice.
- It will leave the employee to get the job done, trusting that the employee will get back to it if they need more help.
-
If OpenAI gets wiped out, maybe it wasn’t even a “real company” to start with
-
The AI won’t have a twitter account to go on racist rants.
-
The AI won’t end up on the Epstein list.
-
The AI won’t drunkenly send nudes to an intern.
-
This guy needs to find Luigi.
-
Don’t be silly. How’s an AI going to fly politicians to secret islands and make backroom deals?
-
Don't executives spend their day talking to AI and doing whatever they say?
-
Jobs like air traffic controllers for example?
-
There are actually a lot of bullshit jobs out there: things that could be automated (without AI), or companies that feel the need to go on a hiring binge to appear like they're growing. A lot of useless busywork. The C-suites especially could be replaced or eliminated first, though.

-
The solution to the public misusing technical terms isn't to change the technical terms, but to educate the public. All of the following fall under AI:
- pathing algorithms of computer opponents, but probably not the decisions that computer opponents make (i.e. who to attack; that's usually based on manually specified logic)
- the speech-to-text your phone used before Gemini (or whatever it's called now) on Android; Gemini is also AI, just a different type of AI
- home camera systems that can detect people vs animals, and sometimes classify those animals by species
- DDoS protection systems and load balancers for websites, which probably use some type of AI
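The camera example above is a classification task, and the classic "learn from data" approach behind it can be shown with a toy nearest-neighbor classifier. Everything here is made up for illustration (the feature vectors and labels are invented, not from any real camera system), but it's the same family of method: no rules about what a "person" looks like are written down, the system just compares new inputs to labeled examples.

```python
import math

# Toy "training" data: hand-made 2-D feature vectors
# (say, height and speed) with labels -- purely illustrative.
examples = [
    ((1.7, 1.4), "person"),
    ((1.8, 1.5), "person"),
    ((0.3, 4.0), "animal"),
    ((0.5, 3.5), "animal"),
]

def classify(features, k=3):
    """k-nearest-neighbor vote: label a new input by the
    majority label among its k closest training examples."""
    nearest = sorted(examples, key=lambda e: math.dist(features, e[0]))
    votes = [label for _, label in nearest[:k]]
    return max(set(votes), key=votes.count)

print(classify((1.6, 1.3)))  # person
```

Real detectors use neural networks over image pixels rather than two hand-picked numbers, but the principle — fit to labeled data, then generalize — is the same, and it's squarely "AI" in the CS sense.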
AI is a broad field, and you probably interact with non-LLM variants every day, whether you notice or not. Here's a Wikipedia article that goes through a lot of it. LLMs/GPT are merely one small subfield in the larger field of AI.
I don't understand how people went from calling the computer player in their game "AI" (or, even older, "CPU"), which nobody mistook for actual intelligence, to believing AI means something is sentient. Maybe it's because LLMs are more convincing since they do a much better job with language, idk, but it's the same category of thing under the hood. ChatGPT isn't "thinking"; when it claims to "think," it turns a prompt into a set of things to "think" about (it generates and answers related prompts), then uses that set in its context to provide an answer. It's not thinking as people do, it's following a set of statistically motivated steps to generate a relevant answer. It's a lot more complex than that Warcraft 2 bot you played against as a kid, but it's still following steps a human designed, along with some statistical methods to adapt to things the developer didn't anticipate.
The problem with AI in a "popular context" is that it has been a forever-moving target. Old mechanical adding machines were better at correctly summing columns of numbers than humans, and at the time they were considered a limited sort of artificial intelligence. It continues all along the spectrum. Five years ago, image classifiers that could watch video feeds 24/7 and accurately identify what happens in the feed with better-than-human accuracy (accounting for human lack of attention, coffee breaks, distracting phone calls, etc.) were amazing feats of AI; now they're "just image classifiers," much as AlphaZero "just plays games."
-
The problem with AI in a "popular context" is that it has been a forever-moving target. Old mechanical adding machines were better at correctly summing columns of numbers than humans, and at the time they were considered a limited sort of artificial intelligence. It continues all along the spectrum. Five years ago, image classifiers that could watch video feeds 24/7 and accurately identify what happens in the feed with better-than-human accuracy (accounting for human lack of attention, coffee breaks, distracting phone calls, etc.) were amazing feats of AI; now they're "just image classifiers," much as AlphaZero "just plays games."
The first was never "AI" in a CS context, and the second has always and will always be "AI" in a CS context. The definition has been pretty consistent since at least Alan Turing, if not earlier.
I don't know how to square that circle. To me it's pretty simple: a solution or approach is AI if it simulates (or creates) intelligence, and an intelligent system is one that uses data from (learns from) its environment to achieve its goals. Anything from an A* pathing algorithm to actual general AI is "AI," yet people assume the most sophisticated end of the spectrum.
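The A* pathing algorithm mentioned above is a good concrete anchor for the low end of that spectrum: it's textbook "AI" (heuristic search) yet obviously not sentient. A minimal sketch on a small grid (the grid and coordinates are made up for illustration):

```python
import heapq

def astar(grid, start, goal):
    """A* on a 4-connected grid (0 = free, 1 = wall), using
    Manhattan distance as the admissible heuristic."""
    def h(p):
        return abs(p[0] - goal[0]) + abs(p[1] - goal[1])

    rows, cols = len(grid), len(grid[0])
    open_heap = [(h(start), 0, start, [start])]  # (f, g, node, path)
    best_g = {start: 0}
    while open_heap:
        _, g, node, path = heapq.heappop(open_heap)
        if node == goal:
            return path
        for dr, dc in ((1, 0), (-1, 0), (0, 1), (0, -1)):
            r, c = node[0] + dr, node[1] + dc
            if 0 <= r < rows and 0 <= c < cols and grid[r][c] == 0:
                ng = g + 1  # uniform step cost
                if ng < best_g.get((r, c), float("inf")):
                    best_g[(r, c)] = ng
                    heapq.heappush(
                        open_heap,
                        (ng + h((r, c)), ng, (r, c), path + [(r, c)]),
                    )
    return None  # no path exists

grid = [[0, 0, 0],
        [1, 1, 0],
        [0, 0, 0]]
path = astar(grid, (0, 0), (2, 0))  # routes around the wall
```

This is the kind of thing driving the computer opponent in a 90s strategy game: it pursues a goal using information about its environment, which satisfies the definition above, and nobody would call it sentient.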