Majority of AI Researchers Say Tech Industry Is Pouring Billions Into a Dead End
-
trust me bro, we're almost there, we just need another data center and a few billion more, it's coming, I promise, we are testing incredible things internally, can't wait to show you!
We are having massive exponential increases in output with all sorts of innovations; every few weeks another big step forward happens
-
Me and my 5.000 closest friends don't like that the website and their 1.300 partners all need my data.
-
As an experienced software dev I'm convinced my software quality has improved by using AI. More time for thinking and less time for execution means I can make more iterations of the design and don't have to skip as many nice-to-haves or unit tests on account of limited time. It's not like I don't go through every code line multiple times anyway, I don't just blindly accept code. As a bonus I can ask the AI to review the code and produce documentation. By the time I'm done there's little left of what was originally generated.
As an experienced software dev I'm convinced my software quality has improved by using AI.
Then your software quality was extremely shit before. It's still shit, but an improvement. So, yay "AI", I guess?
-
You skipped possibility 3, which is actively happening:
Advancements in tech enable us to produce results at a much, much cheaper cost
Which is happening with diffusion-style LLMs that simultaneously cost less to train, cost less to run, and also produce both faster and better-quality outputs.
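To make the speed part of that claim concrete, here's a toy Python sketch (my own illustration, with a made-up `dummy_model`, not any real system's API) of why parallel denoising finishes in far fewer forward passes than generating one token at a time:

```python
import random

# Toy illustration, not a real system: a masked-diffusion decoder
# refines every position in parallel, so it finishes in far fewer
# forward passes than one-token-at-a-time autoregressive decoding.

MASK = "<mask>"
VOCAB = ["the", "cat", "sat", "on", "a", "mat"]

def dummy_model(seq):
    # Pretend denoiser: a (token guess, confidence) pair per position.
    return [(random.choice(VOCAB), random.random()) for _ in seq]

def diffusion_decode(model, length=32, steps=8):
    seq, passes = [MASK] * length, 0
    for _ in range(steps):
        preds = model(seq)                  # ONE pass covers all positions
        passes += 1
        masked = [i for i, t in enumerate(seq) if t == MASK]
        masked.sort(key=lambda i: preds[i][1], reverse=True)
        for i in masked[: max(1, length // steps)]:
            seq[i] = preds[i][0]            # commit most-confident guesses
    return seq, passes

def autoregressive_decode(model, length=32):
    seq, passes = [], 0
    while len(seq) < length:
        preds = model(seq + [MASK])         # one pass per emitted token
        passes += 1
        seq.append(preds[-1][0])
    return seq, passes

print("diffusion passes:", diffusion_decode(dummy_model)[1])            # 8
print("autoregressive passes:", autoregressive_decode(dummy_model)[1])  # 32
```

The numbers are illustrative, but that ratio of model calls is the basic reason diffusion-style decoding can be faster.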
That's a big part people forget about AI: it's a feedback loop of improvement as soon as you can start using AI to develop AI
And we are past that mark now, most developers have easy access to AI as a tool to improve their performance, and AI is made by... software developers
So you get this loop where as we make better and better AIs, we get better and better at making AIs with the AIs...
It's incredibly likely the new diffusion AI systems were built with AI assisting in the process, enabling them to make a whole new tech innovation much faster and easier.
We are now in the uptick of the singularity, and have been for about a year.
Same goes for hardware; it's very likely now that Nvidia has AI incorporated into its production process, using it for micro-optimizations in its architectures and designs.
And then those same optimized gpus turn around and get used to train and run even better AIs...
In 5-10 years we will look back on 2024 as the start of a very wild ride.
Remember we are just now in the "computers that take up entire warehouses" step of the tech.
Remember that in the 80s, a "computer" cost a fortune, took tonnes of resources and multiple people to run, took up an entire room, was slow as hell, and could only do basic stuff.
But now, 40 years later, they fit in our pockets and are (not hyperbole) billions of times faster.
I think by 2035 we will be looking at AI as something mass-produced for consumers to just have in their homes: you'll go to Best Buy and compare different AI boxes to pick which one you're gonna get for your home.
We are still at the stage of people in the 80s looking at computers and pondering "why would someone even need to use this, why would someone put one in their house, let alone their pocket"
I remember having this optimism around tech in my late twenties.
-
TIL
-
I mean it's pretty clear they're desperate to cut human workers out of the picture so they don't have to pay employees that need things like emotional support, food, and sleep.
They want a workslave that never demands better conditions, that's it. That's the play. Period.
And the tragedy of the whole situation is that they can't win because if every worker is replaced by an algorithm or a robot then who's going to buy your products? Nobody has money because nobody has a job. And so the economy will shift to producing war machines that fight each other for territory to build more war machine factories until you can't expand anymore for one reason or another. Then the entire system will collapse like the Roman Empire and we start from scratch.
-
Me and my 5.000 closest friends don't like that the website and their 1.300 partners all need my data.
Why so many sig figs for 5 and 1.3 though?
-
Thing is, same as with GHz, you have to push it as far as you can until the gains get too small. You do that, then you move on to the next optimization. Like AI has done and is doing now with test-time compute, token quality, and other areas.
To be fair, GHz did go up. Granted, it's not why modern processors are faster and more efficient.
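For anyone curious what "optimizing test-time compute" means above, here's a minimal best-of-N sampling sketch; `generate` and `score` are hypothetical stand-ins for a model call and a verifier/reward model, not a real API:

```python
import random

# Minimal best-of-N sketch: spend extra compute at inference by
# drawing several candidate answers and keeping the best-scoring one.

def generate(prompt: str) -> str:
    return f"candidate-{random.randint(0, 9)} for {prompt!r}"

def score(answer: str) -> float:
    return random.random()              # a real scorer would be a verifier model

def best_of_n(prompt: str, n: int = 8) -> str:
    candidates = [generate(prompt) for _ in range(n)]
    return max(candidates, key=score)   # more samples -> better odds

print(best_of_n("What is 17 * 23?"))
```

The idea is just that more inference-time compute (more samples, plus a scorer) can buy better answers without touching the model itself.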
-
It's because customers don't want it or care for it; it's only the corporations themselves that are obsessed with it
-
The actual survey result:
Asked whether "scaling up" current AI approaches could lead to achieving artificial general intelligence (AGI), or a general purpose AI that matches or surpasses human cognition, an overwhelming 76 percent of respondents said it was "unlikely" or "very unlikely" to succeed.
So they're not saying the entire industry is a dead end, or even that the newest phase is. They're just saying they don't think this current technology will make AGI when scaled. I think most people agree, including the investors pouring billions into this. They aren't betting this will turn into AGI; they're betting that they have some application for the current AI. Are some of those applications dead ends? Most definitely. Are some of them revolutionary? Maybe.
This would be like asking a researcher in the '90s whether, if we scaled up the bandwidth and computing power of the average internet user, we would see a vastly connected media-sharing network; they'd probably say no. It took more than a decade of software, cultural, and societal development to discover the applications for the internet.
-
Meanwhile a huge chunk of the software industry is now heavily using this "dead end" technology
I work in a pretty massive tech company (think the type that frequently acquires other smaller ones and absorbs them)
Everyone I know here is using it. A lot.
However, my company also has tonnes of dedicated sessions and paid time to instruct its employees on how to use it well, how to get good value out of it, and the pitfalls it can have
So yeah turns out if you teach your employees how to use a tool, they start using it.
I'd say LLMs have made me about 3x as efficient or so at my job.
I think the human in the loop currently needs to know what the LLM produced or checked, but they'll get better.
-
Why so many sig figs for 5 and 1.3 though?
Some parts of the world (mostly Europe, I think) use dots instead of commas for displaying thousands. For example, 5.000 is 5,000 and 1.300 is 1,300
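A quick demo of the difference, using Python only because it makes the swap easy to see (the `locale` module can format this properly, but which locales work depends on what the OS has installed):

```python
# Python's default grouping uses commas; a simple swap shows the
# European dot style from the comment above.
for n in (5000, 1300):
    us = f"{n:,}"               # '5,000' / '1,300'
    eu = us.replace(",", ".")   # '5.000' / '1.300'
    print(us, "->", eu)
```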
-
Wait til you realize that's just what art literally is...
You're confusing AI art with actual art, like rendered illustrations and paintings
-
The cope on this site is so bad sometimes. AI is already revolutionary.
-
Don't be an ass and realize that AI is a great tool for a lot of people. Why is that so hard to comprehend?
What's hard for you to comprehend about my comment?
-
As an experienced software dev I'm convinced my software quality has improved by using AI. More time for thinking and less time for execution means I can make more iterations of the design and don't have to skip as many nice-to-haves or unit tests on account of limited time. It's not like I don't go through every code line multiple times anyway, I don't just blindly accept code. As a bonus I can ask the AI to review the code and produce documentation. By the time I'm done there's little left of what was originally generated.
As an experienced software dev, I know better than to waste my time writing boilerplate that can be vomited up by an LLM, since somebody else has already written it and I should just use that instead.
-
The cope on this site is so bad sometimes. AI is already revolutionary.
That may be true technologically. But if the economics don't add up, it's a bubble.
-
Some parts of the world (mostly Europe, I think) use dots instead of commas for displaying thousands. For example, 5.000 is 5,000 and 1.300 is 1,300
I knew the context, was just being cheesy.
-
There are some nice things I have done with AI tools, but I do have to wonder if the amount of money poured into it justifies the result.
-
The actual survey result:
Asked whether "scaling up" current AI approaches could lead to achieving artificial general intelligence (AGI), or a general purpose AI that matches or surpasses human cognition, an overwhelming 76 percent of respondents said it was "unlikely" or "very unlikely" to succeed.
So they're not saying the entire industry is a dead end, or even that the newest phase is. They're just saying they don't think this current technology will make AGI when scaled. I think most people agree, including the investors pouring billions into this. They aren't betting this will turn into AGI; they're betting that they have some application for the current AI. Are some of those applications dead ends? Most definitely. Are some of them revolutionary? Maybe.
This would be like asking a researcher in the '90s whether, if we scaled up the bandwidth and computing power of the average internet user, we would see a vastly connected media-sharing network; they'd probably say no. It took more than a decade of software, cultural, and societal development to discover the applications for the internet.
I agree that it's editorialized compared to the very neutral way the survey puts it. That said, I think you also have to take into account how AI has been marketed by the industry.
They have been claiming AGI is right around the corner pretty much since ChatGPT first came to market. It's often implied (e.g. you'll be able to replace workers with this) or they are more vague on the timeline (e.g. OpenAI saying they believe their research will eventually lead to AGI).
With that context I think it's fair to editorialize this as a dead end, because even with billions of dollars being poured into it, they won't be able to deliver AGI on the timeline they are promising.