Two conversational AI agents switching from English to a sound-level protocol after confirming they are both AI agents
-
The same reason that humanoid robots are useful
Sex?

The thing about this demonstration is that there's wide recognition that even humans don't want to be forced into voice interactions, and this is a ridiculous scenario that resembles what the 50s might have imagined the future to be, while ignoring the better advances made along the way. Conversation is a maddening way to get a lot of things done, particularly scheduling. So in this demo, a human had to conversationally tell an AI agent the requirements, and then that AI agent acoustically coupled to another AI agent which actually has access to the scheduling system.
So first, the acoustic coupling is stupid. If the two agents recognize each other, one should just spout an API endpoint at the other and take the conversation over IP.
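A minimal sketch of what that handoff could look like, assuming the answering side announces something like "API endpoint: https://..." once it realizes it's talking to another agent; the marker string and payload shape here are invented for illustration, not any real protocol:

```python
# Rough sketch of the "spout an endpoint and move to IP" idea.
# Marker string and payload fields are made up for illustration.
import requests

def maybe_switch_to_api(reply_text: str) -> dict | None:
    """If the other agent announced an API endpoint, continue over HTTP."""
    marker = "api endpoint:"
    if marker not in reply_text.lower():
        return None  # no handoff offered; keep talking the slow way
    endpoint = reply_text[reply_text.lower().index(marker) + len(marker):].strip()
    # From here on, structured requests instead of synthesized speech.
    resp = requests.post(
        endpoint,
        json={"intent": "schedule", "party_size": 4, "date": "2025-03-01"},
        timeout=10,
    )
    resp.raise_for_status()
    return resp.json()
```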
But the concept of two AI agents negotiating this is silly. If the user's AI agent is in play, just let it directly access the system that the other agent is accessing. One AI agent may be able to facilitate this efficiently, but two only makes things less likely to work than one.
You don’t need special robot lifts in your apartment building if the cleaning robots can just take the elevators.
The cleaning robots, even if not human-shaped, could easily take the normal elevators unless the design got very weird. There's a good point here that the obsession with human-styled robotics gets in the way of a lot of use cases.
You don’t need to design APIs for scripts to access your website if the AI can just use a browser with a mouse and keyboard.
The API access would greatly accelerate things even for AI. If you've ever done Selenium-based automation of a site, you know it's so much slower and more heavyweight than interacting with the API directly. AI won't speed this up. What should take a fraction of a second can turn into many minutes, and a large number of tokens, at large enough scale (e.g. scraping a few hundred business web UIs).
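A hypothetical comparison of the two paths to the same data; the URLs and element IDs are invented for illustration:

```python
import requests
from selenium import webdriver
from selenium.webdriver.common.by import By

# Path 1: direct API call -- one request, a few hundred bytes, milliseconds.
slots = requests.get(
    "https://example-biz.test/api/v1/availability",
    params={"date": "2025-03-01"},
    timeout=5,
).json()

# Path 2: driving a browser -- full page load, JS execution, DOM scraping.
driver = webdriver.Chrome()
driver.get("https://example-biz.test/book")
driver.find_element(By.ID, "date-picker").send_keys("2025-03-01")
driver.find_element(By.ID, "search").click()
slots = [el.text for el in driver.find_elements(By.CSS_SELECTOR, ".slot")]
driver.quit()
```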
-
If a business has an internet connection (and of course they do), then they have the ability to host a website just as much as they have the ability to answer the phone. The same software/provider relationship that would provide an AI answering service could easily facilitate online interaction.

So if an oblivious AI end user points an AI agent at a business with an AI agent answering, the answering agent should say 'If you are an agent, go to shorturl.at/JtWMA for the chat API endpoint', which may then further offer direct access to the APIs that the agent would front-end for a human client, instead of going old-school acoustic-coupled modem. The same service that can provide a chat agent can provide a cookie-cutter web experience for the relevant industry, maybe with light branding, providing things like a calendar view into a reservation system, which may be much more to the point than trying to chat your way back and forth about scheduling options.
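A sketch of what the business side of that handoff could serve, assuming a tiny Flask app fronting the same reservation system the phone-answering agent uses; the routes and response fields are invented for illustration:

```python
from flask import Flask, jsonify, request

app = Flask(__name__)

@app.get("/agent")
def discovery():
    # Machine-readable pointer for agents, instead of an acoustic back-and-forth.
    return jsonify({
        "service": "reservations",
        "chat": "/api/chat",                   # free-form fallback
        "availability": "/api/availability",   # structured, preferred
        "book": "/api/book",
    })

@app.get("/api/availability")
def availability():
    date = request.args.get("date", "today")
    # In reality this would query the same reservation system the
    # answering agent fronts for human callers.
    return jsonify({"date": date, "open_slots": ["18:00", "18:30", "20:15"]})

if __name__ == "__main__":
    app.run()
```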
-
The older generation isn't going to be getting their end-user AI agents working either. While the next generation may consume more video content than before, all the kids I know still get frustrated at a video that could have just been text, unless it's something they want to enjoy.
The only time voice makes sense is to facilitate real-time communication between two humans, because they can speak faster than they can type. Conversational approaches to these use cases often have limits, though that doesn't preclude AI technology from providing those interfaces, so long as they aren't constrained to voice. A chat agent that pops up a calendar UI when scheduling is identified as the goal, for example.
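Something like this minimal sketch, where a keyword check stands in for whatever intent classifier a real agent would use, and the widget payload shape is invented:

```python
SCHEDULING_WORDS = {"book", "booking", "schedule", "reschedule", "appointment"}

def handle_message(text: str) -> dict:
    if SCHEDULING_WORDS & set(text.lower().split()):
        # Hand the user a real UI instead of chatting through time slots.
        return {
            "type": "widget",
            "widget": "calendar",
            "params": {"min_date": "today", "granularity": "30min"},
        }
    return {"type": "text", "reply": "How can I help?"}

print(handle_message("I need to book a table for Friday"))
```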
-
They keep talking about "judgement day".
-
then they have the ability to host a website just as much as they have the ability to answer the phone
Many people in the developed world are behind CGNAT. Paying for an IPv4 address is a premium, and most businesses either set up shop on an existing listing page (e.g. Facebook) or host a website through a website provider/generator.
A phone number is public, accessible, and an AI can get real-time info from a scrawled entry in a logbook using OCR.
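The OCR piece is cheap to wire up; a rough sketch using Tesseract, with a placeholder image filename and no promises about handwriting accuracy:

```python
from PIL import Image
import pytesseract

page = Image.open("logbook_2025-03-01.jpg")  # hypothetical snapshot from the camera
text = pytesseract.image_to_string(page)

# Naive scan for time slots scrawled on the page.
booked = [line for line in text.splitlines() if ":" in line]
print(booked)
```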
-
The older generation is going to give permission to some random monolithic AI company to listen to their calls and handle their lives for them. Bookings will take place automatically, and a grievance will be voiced to prompt the AI (local or otherwise) to negotiate a rebooking. It's way faster than dealing with a form.
-
Nice to know we finally developed a way for computers to communicate by shrieking at each other. Give it a few years and if they can get the latency down we may even be able to play Doom over this!
-
Frankly, the folks old enough to be defeated by the technology are also old enough that they likely can't even give them fodder for training. At this point you're talking about people generally in their 80s and/or with some dementia, who need someone with power of attorney to take care of any of these scenarios anyway. They may be able to manage day-to-day life, but they need someone who can act on their behalf, knowing what they would want even if they can't competently convey it themselves.
People under 80 can generally navigate these interfaces without a problem, and frequently prefer them. The out-of-touch 60-year-old is a pretty dated stereotype.
-
If they had, I would have welcomed any potential AI overlords. I want a massive dial-up modem in the middle of town, sounding its boot signal across the land. Idk, this was an odd image; I felt like I should share it.
-
So for one, business lines almost always have a public IPv4 address. Even then, there are myriad providers that offer a solution even behind NAT (and they probably have public IPv6 space anyway). Any technology provider that could provide AI chat over telephony could also take care of the data connectivity path on their behalf. Anyone who would want to self-host such a solution would certainly have inbound data connectivity solved as well. I just don't see a scenario where a business can have AI telephony but somehow can't have inbound data access.
So you have a camera on a logbook to capture the human input, but then that logbook can't be the source of truth, because the computer won't write in it and the computer can also take bookings. I don't think humans really want to keep a handwritten logbook anyway; a computer or tablet UI is going to be much faster.
-
You and I clearly inhabit different worlds, and I guess we can just agree to disagree at this point
-
I enjoyed it.
-
The efficiency comes from the lack of voice processing. The beeps and boops are easier on CPU resources than trying to parse the spoken word.
That said, they should just communicate over an API like you said.
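A toy illustration of why the beeps are cheap, assuming a made-up tone-per-nibble scheme (this is not the actual framing used in the demo): encoding is just array math, nothing like running a speech recognizer.

```python
import numpy as np

SAMPLE_RATE = 16_000
SYMBOL_SECONDS = 0.05
BASE_HZ = 1_000
STEP_HZ = 50  # one tone per nibble value (0..15)

def encode(message: bytes) -> np.ndarray:
    """Map each nibble of the message to a short pure tone."""
    t = np.linspace(0, SYMBOL_SECONDS, int(SAMPLE_RATE * SYMBOL_SECONDS), endpoint=False)
    chunks = []
    for byte in message:
        for nibble in (byte >> 4, byte & 0x0F):
            freq = BASE_HZ + STEP_HZ * nibble
            chunks.append(np.sin(2 * np.pi * freq * t))
    return np.concatenate(chunks)

audio = encode(b"table for 4 at 19:00")
print(audio.shape)  # samples ready to play through any speaker
```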
-
Ultrasonic wireless communication has been a thing for years. The scary part is you can't even hear when it's happening.
-
Right, electronic devices talk to each other all the time
-
"A couple of decades"
Buddy... it's 55 years old now. Lol.
Interesting movie concept, though. Would love to see something like this remade today with modern revelations.
-
Why is my dog going nuts? Another victim of AI slop.
-
So an AI developer reinvented phreaking?
-
Reminds me of an insurance office I worked in. Some of the staff were brain-dead.
- Print something
- Scribble some notes on the printout
- Fax that annotated paper or scan and email it to someone
- Whine about how you're out of printer toner.
-
can adapt in the moment if you supply it with the right context
So the disabled have to jump through hoops to interact with the world, great. And "can adapt", meaning somebody has to review what's being put into this black box instead of just having a person do the task themselves. Instead of a person being qualified for the task, some corp is getting rent from everybody.
Basically you are taking away customer service and providing a booking bot. No thanks.