Majority of AI Researchers Say Tech Industry Is Pouring Billions Into a Dead End
-
Yeah, and they're wrong.
It makes sense from a typographical standpoint: the comma is the larger symbol and thus harder to overlook, especially in small fonts or messy handwriting.
-
I agree that it's editorialized compared to the very neutral way the survey puts it. That said, I think you also have to take into account how AI has been marketed by the industry.
They have been claiming AGI is right around the corner pretty much since chatGPT first came to market. It's often implied (e.g. you'll be able to replace workers with this) or they are more vague on timeline (e.g. OpenAI saying they believe their research will eventually lead to AGI).
With that context I think it's fair to editorialize to this being a dead-end, because even with billions of dollars being poured into this, they won't be able to deliver AGI on the timeline they are promising.
AI isn't going to figure out what a customer wants when the customer doesn't know what they want.
-
It's becoming clear from the data that more error correction needs exponentially more data. I suspect that pretty soon we will realize that what's been built is a glorified homework cheater and a better search engine.
what's been built is a glorified homework cheater and an ~~better~~ unreliable search engine.
-
Some parts of the world (mostly Europe, I think) use dots instead of commas for displaying thousands. For example, 5.000 is 5,000 and 1.300 is 1,300.
Yes. It's the normal thousands-separator notation in Germany, for example.
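For anyone curious, the two conventions are easy to demonstrate. A minimal Python sketch (the function names `us_style`/`german_style` are just for illustration; the stdlib `locale` module handles this properly when the relevant locale, e.g. `de_DE.UTF-8`, is actually installed on the system):

```python
def us_style(n: float) -> str:
    """Format with comma as thousands separator, period as decimal point."""
    return f"{n:,.2f}"

def german_style(n: float) -> str:
    """Format with period as thousands separator, comma as decimal point.

    locale.setlocale(locale.LC_NUMERIC, "de_DE.UTF-8") would do this
    properly, but that locale isn't installed everywhere, so this sketch
    swaps the symbols by hand via a three-way character swap.
    """
    us = f"{n:,.2f}"
    return us.replace(",", "\x00").replace(".", ",").replace("\x00", ".")

print(us_style(5000))      # 5,000.00
print(german_style(5000))  # 5.000,00
print(german_style(1300))  # 1.300,00
```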
-
Yeah, and they're wrong.
We (in Europe) should probably be thankful that you are not using feet as a thousands separator over there in the USA... Or maybe separating after every 2nd digit, because why not...
-
I knew the context, was just being cheesy.
Too late... You started a war in the comments. I'll proudly fight for my country's way to separate numbers!!!
-
Current big tech is going to keep pushing limits, with SM influencers/youtubers doing the marketing and their consumers picking up the R&D bill. Emotionally I want to say stop innovating, but really, cut your speed by 75%. We are going to witness an era of optimization and efficiency. Most users just need a Pi 5 16GB, an Intel NUC, or a base-model Apple Air. Those are easy 7-10 year computers. No need to rush and get the latest and greatest. I’m talking about everything computing in general. On the gaming point: more people are waking up and realizing they don’t need every new GPU, studios are burnt out, IPs are dying because there’s no lingering core base to keep the franchise afloat, and consumers can't keep opening their wallets. Hence studios like Square Enix are going to start supporting all platforms instead of doing late-stage capitalism with their own launcher and store.
It’s over.
-
And the tragedy of the whole situation is that they can’t win, because if every worker is replaced by an algorithm or a robot, then who’s going to buy your products? Nobody has money because nobody has a job. And so the economy will shift to producing war machines that fight each other for territory to build more war machine factories until you can’t expand anymore for one reason or another. Then the entire system will collapse like the Roman Empire and we start from scratch.
producing war machines that fight each other for territory to build more war machine factories until you can’t expand anymore for one reason or another.
As seen in the retro-documentary Z!
-
Good let them waste all their money
-
The cope on this site is so bad sometimes. AI is already revolutionary.
Ya about as revolutionary as my left nut
-
I like my project manager, they find me work, ask how I'm doing and talk straight.
It's when the CEO/CTO/CFO speaks that my eyes glaze over, my mouth sags, and I bounce my neck at prompted intervals as my brain retreats into itself, frantically tossing words and phrases into the meaning grinder and cranking the wheel, only for nothing to come out of it, time and time again.
-
What's hard for you to comprehend about my comment?
You're insulting a person because they said AI helps them.
-
I like my project manager, they find me work, ask how I'm doing and talk straight.
It's when the CEO/CTO/CFO speaks that my eyes glaze over, my mouth sags, and I bounce my neck at prompted intervals as my brain retreats into itself, frantically tossing words and phrases into the meaning grinder and cranking the wheel, only for nothing to come out of it, time and time again.
Right, that sweet spot between too few stimuli, where your brain just wants to sleep or run away, and too many, where you can't just zone out.
-
Your product is other people's work thrown in a blender.
Congrats.
Yeah he should be using real art like stock photos and shitty clip art
-
Optimizing AI performance by “scaling” is lazy and wasteful.
Reminds me of back in the early 2000s when someone would say don’t worry about performance, GHz will always go up.
I miss flash players.
-
It makes sense from a typographical standpoint: the comma is the larger symbol and thus harder to overlook, especially in small fonts or messy handwriting.
But from a grammatical sense it’s the opposite. In a sentence, a comma is a short pause, while a period is a hard stop. That means it makes far more sense for the comma to be the thousands separator and the period to be the stop between integer and fraction.
-
Yeah, nothing pleases us more than constant, buggy updates.
Unit tests and good architecture are still foundational requirements; so far there have been no bug reports with any of these updates. In fact, a huge chunk of these AI updates were addressing bugs. Not sure why you're so mad at what you imagine is happening, and making so many broad assumptions!
-
Technology in most cases progresses on a logarithmic scale when innovation isn't prioritized. We've basically reached the plateau of what LLMs can currently do without a breakthrough. They could absorb all the information on the internet and not even come close to what they say it is. These days we're in the "bells and whistles" phase where they add unnecessary bullshit to make it seem new like adding 5 cameras to a phone or adding touchscreens to cars. Things that make something seem fancy by slapping buzzwords and features nobody needs without needing to actually change anything but bump up the price.
I remember listening to a podcast that’s about explaining stuff according to what we know today (scientifically). The guy explaining is just so knowledgeable about this stuff, and he does his research and talks to experts when the subject involves something he isn’t himself an expert in.
There was this episode where he kinda got into the topic of how technology only evolves with science (because you need to understand the stuff you’re doing, and you need a theory of how it works before you can make new assumptions and test those assumptions). He gave the example of the Apple Vision Pro: despite the machine being new (in hardware capabilities, at least), the eye-tracking algorithm it uses was developed decades ago and was already well understood and proven correct in other applications.
So his point in the episode is that real innovation just can’t be rushed by throwing money or more people at a problem. Because real innovation takes real scientists having novel insights and experiments to expand the knowledge we have. Sometimes those insights are completely random, often you need to have a whole career in that field and sometimes it takes a new genius to revolutionize it (think Newton and Einstein).
Even the current wave of LLMs is simply a product of Google’s paper showing that we could parallelize language models, leading to the creation of “larger language models”. That was Google doing science. But you can’t control when some new breakthrough will be discovered, and LLMs are subject to this constraint.
In fact, the only practice we know that actually accelerates science is the collaboration of scientists around the world, the publishing of reproducible papers so that others can expand upon and have insights you didn’t even think about, and so on.
-
I like my project manager, they find me work, ask how I'm doing and talk straight.
It's when the CEO/CTO/CFO speaks that my eyes glaze over, my mouth sags, and I bounce my neck at prompted intervals as my brain retreats into itself, frantically tossing words and phrases into the meaning grinder and cranking the wheel, only for nothing to come out of it, time and time again.
COs are corporate politicians, media-trained to only say things which are completely unrevealing and lack any substance.
This is by design so that sensitive information is centrally controlled, leaks are difficult, and sudden changes in direction cause the minimum amount of whiplash to ICs as possible.
I have the same reaction as you, but the system is working as intended. Better to just shut it out and use the time to think about that issue you're having on a personal cat project or what toy to buy for your cat's birthday.
-
Yeah he should be using real art like stock photos and shitty clip art
If his business can't afford to pay someone qualified to do the work, the business shouldn't exist.