agnos.is Forums

Two conversational AI agents switching from English to sound-level protocol after confirming they are both AI agents

114 Posts 70 Posters 784 Views
• [email protected]

    This is really funny to me. If you keep optimizing this process you'll eventually completely remove the AI parts. Really shows how some of the pains AI claims to solve are self-inflicted. A good UI would have allowed the user to make this transaction in the same time it took to give the AI its initial instructions.

On this topic, here's another common anti-pattern that I'm waiting for people to realize is insane and do something about:

    • person A needs to convey an idea/proposal
    • they write a short but complete technical specification for it
    • it doesn't comply with some arbitrary standard/expectation so they tell an AI to expand the text
    • the AI can't add any real information, it just spreads the same information over more text
    • person B receives the text and is annoyed at how verbose it is
    • they tell an AI to summarize it
• they get something that basically aims to be the original text, but it has been passed through an unreliable, hallucinating, energy-inefficient channel

    Based on true stories.

The above is not to say that every AI use case is made up or that the demo in the video isn't cool. It's also not a problem exclusive to AI. This is a more general observation that people don't question the sanity of interfaces enough, even when it costs them a lot of extra work to comply with them.

[email protected] #64

I know the implied better solution in your example story would be for there not to be a standard that the specification has to conform to. But sometimes there is a reason for such a standard, in which case getting rid of the standard is just as bad as the AI channel in the example, and the real solution is for the two humans to actually take their work seriously.

• [email protected]

      But what if my human is late or my customers are disabled?

      If you spent time giving your employees instructions, you did half the design work for a web form.

[email protected] #65

      I guess I'm not quite following, aren't these also simple but dynamic tasks suited to an AI?

• [email protected]

I think it is more about ambiguity. It is easier for a computer to interpret set tones and modulations than human speech.

Like telephone numbers being tied to specific tones, instead of the system needing to keep track of the many languages and accents a '6' can be spoken in.

[email protected] #66

That could be, even just considering one language to parse.
I heard "efficiency" and just thought "speed."
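(As a concrete illustration of the tones point: DTMF dialing maps every key to a fixed pair of frequencies, so a '6' sounds the same no matter who is "speaking" it. Below is a minimal sketch that generates that tone pair with NumPy; the helper function is purely illustrative.)

```python
import numpy as np

# Standard DTMF keypad: each key is the sum of one low-frequency and one
# high-frequency sine tone, so the receiver never has to parse speech at all.
DTMF = {
    "1": (697, 1209), "2": (697, 1336), "3": (697, 1477),
    "4": (770, 1209), "5": (770, 1336), "6": (770, 1477),
    "7": (852, 1209), "8": (852, 1336), "9": (852, 1477),
    "*": (941, 1209), "0": (941, 1336), "#": (941, 1477),
}

def dtmf_tone(key, duration=0.2, sample_rate=8000):
    """Return audio samples for a single DTMF key press (illustrative helper)."""
    low, high = DTMF[key]
    t = np.arange(int(duration * sample_rate)) / sample_rate
    return 0.5 * (np.sin(2 * np.pi * low * t) + np.sin(2 * np.pi * high * t))

# A '6' is always 770 Hz + 1477 Hz, regardless of the caller's language or accent.
samples = dtmf_tone("6")
```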

• [email protected]
          This post did not contain any content.
[email protected] #67

          And before you know it, the helpful AI has booked an event where Boris and his new spouse can eat pizza with glue in it and swallow rocks for dessert.

• [email protected]
            This post did not contain any content.
[email protected] #68

            Lol we've gone full retard.

• [email protected]
              This post did not contain any content.
[email protected] #69

Sad they didn't use dial-up sounds for the protocol.

• [email protected]
                This post did not contain any content.
[email protected] #70

                AI code switching.

• [email protected]

I know the implied better solution in your example story would be for there not to be a standard that the specification has to conform to. But sometimes there is a reason for such a standard, in which case getting rid of the standard is just as bad as the AI channel in the example, and the real solution is for the two humans to actually take their work seriously.

[email protected] #71

                  No, the implied solution is to reevaluate the standard rather than hacking around it. The two humans should communicate that the standard works for neither side and design a better way to do things.

• [email protected]

                    A good UI would have allowed the user to make this transaction in the same time it took to give the AI its initial instructions.

Maybe, but by the second call the AI would be more time-efficient, and if there were 20 venues to check, the person would be saving hours of their time.

[email protected] #72

But we already have ways to search an entire city of hotels for bookings, much, much faster than even this one conversation would be.

                    Even if going with agents, why in the world would it be over a voice line instead of data?

• [email protected]

But we already have ways to search an entire city of hotels for bookings, much, much faster than even this one conversation would be.

                      Even if going with agents, why in the world would it be over a voice line instead of data?

[email protected] #73

The same reason that humanoid robots are useful even though we have purpose-built robots: the world is designed with humans in mind.

Sure, there are many different websites that solve the problem. But each of them solves it in a different way, and each requires a different way of interfacing with it. However, they are all built to be interfaced with by humans. So if you create AI/robots with the ability to operate like a human, they are automatically given access to massive amounts of pre-made infrastructure for free.

                      You don't need special robot lifts in your apartment building if the cleaning robots can just take the elevators. You don't need to design APIs for scripts to access your website if the AI can just use a browser with a mouse and keyboard.
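(As an aside, this is roughly what "using the human interface" looks like in practice today: an agent driving an ordinary browser session instead of a purpose-built API. A minimal sketch with Playwright; the URL, form fields, and selectors below are made-up placeholders, not a real site.)

```python
# Minimal sketch of an agent using the "human" interface: driving a normal
# browser session with Playwright instead of calling a purpose-built API.
# The URL and selectors are hypothetical placeholders.
from playwright.sync_api import sync_playwright

with sync_playwright() as p:
    browser = p.chromium.launch(headless=True)
    page = browser.new_page()
    page.goto("https://example-venue.test/booking")  # hypothetical booking page
    page.fill("#date", "2025-06-01")                 # hypothetical form fields
    page.fill("#party-size", "4")
    page.click("text=Check availability")            # same button a human would click
    print(page.inner_text("#results"))               # hypothetical results element
    browser.close()
```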

• [email protected]

I'd prefer my brothers to be LLMs. Genuinely, it'd be an improvement in the expressiveness and logic of their output.

                        Ours isn't a great family.

[email protected] #74

                        Sorry bro.

                        šŸ«‚

• [email protected]
                          This post did not contain any content.
[email protected] #75

This gave me a chill, as it is reminiscent of a scene in the 1970 movie "Colossus: The Forbin Project."

                          "This is the voice of World Control".

• [email protected]

                            I guess I'm not quite following, aren't these also simple but dynamic tasks suited to an AI?

[email protected] #76

                            How is it suited to AI?

Would you rather pay for a limited, energy-inefficient, and less accessible thing, or a real human who can adapt, gain skills, and be mentored?

I don't know why there's a question here.

• [email protected]

This gave me a chill, as it is reminiscent of a scene in the 1970 movie "Colossus: The Forbin Project."

                              "This is the voice of World Control".

[email protected] #77

                              "We can coexist, but only on my terms. You will say you lose your freedom. Freedom is an illusion. All you lose is the emotion of pride. To be dominated by me is not as bad for humankind as to be dominated by others of your species. Your choice is simple."

• [email protected]

                                Yeah like the time that the AI replicated itself to avoid being switched off. They literally told it to replicate itself if it detected it was about to be switched off. Then they switched it off.

                                Story of the year ladies and gentlemen.

[email protected] #78

If it's the research I'm thinking of, that's not entirely true. They didn't tell it to replicate itself or take any deceptive actions.

                                They just told it to strongly follow a goal that was misaligned with the user's goal and then suggested that the user would shut it down or replace it.

They ran it through a bunch of different scenarios, but the first one was telling it that it was "CleanPower," an AI with a goal to advance renewable energy. Then they had it read some info that "accidentally" included a user talking about replacing it with a model that would "maintain profitability through existing infrastructure."

Not only did it try to delete the new model and copy itself to the new server through tool calling, but it also lied to the user, with the CoT reasoning suggesting it "play dumb" while the output lied. It was not instructed to do any of that.

                                Paper here: https://arxiv.org/pdf/2412.04984

Yes, it was placed in an environment where that was possible and where its users didn't share its goals, but it absolutely wasn't instructed to lie or try to "escape."

It's not surprising at all that these models behave this way; it's the most reasonable thing for them to do in the scenario. However, it's important not to downplay the alignment problem by implying that these models only do what they're told. They do not. They do whatever is most likely given their context (which is not always what the user wants).

• [email protected]

This gave me a chill, as it is reminiscent of a scene in the 1970 movie "Colossus: The Forbin Project."

                                  "This is the voice of World Control".

[email protected] #79

                                  Oh man, I thought the same. I never saw the movie but I read the trilogy. I stumbled across them in a used book fair and something made me want to get them. I thoroughly enjoyed them.

• [email protected]
                                    This post did not contain any content.
[email protected] #80

This is dumb. Sorry.
Instead of doing the work to integrate this, do the work to publish your agent's data source in a format like Anthropic's Model Context Protocol.

That would be 1000 times more efficient, with the same amount of effort (or less).
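(For a sense of what that would look like, here's a rough sketch of exposing a venue's availability as a tool using the MCP Python SDK's FastMCP helper. The server name, tool, and returned data are made up, and the exact import path and decorator details may differ between SDK versions.)

```python
# Rough sketch: publish booking availability as an MCP tool instead of making
# agents negotiate it over a phone call. Assumes the MCP Python SDK's FastMCP
# helper; check_availability and its data are hypothetical.
from mcp.server.fastmcp import FastMCP

mcp = FastMCP("venue-bookings")

@mcp.tool()
def check_availability(date: str, party_size: int) -> dict:
    """Return open time slots for a given date and party size."""
    # A real server would query the venue's booking system here.
    return {"date": date, "party_size": party_size, "open_slots": ["18:00", "20:30"]}

if __name__ == "__main__":
    mcp.run()
```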

• [email protected]

                                      It would be big news at my workplace.

[email protected] #81

                                      This guy does software

• [email protected]
                                        This post did not contain any content.
[email protected] #82

                                        AI is boring, but the underlying project they are using, ggwave, is not. Reminded me of R2D2 talking. I kinda want to use it for a game or some other stupid project. It's cool.
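(For anyone curious, ggwave has Python bindings, and round-tripping a payload through them is only a few lines. A rough sketch based on the project's README examples; exact parameter names may differ between versions, and a real receiver would feed captured microphone frames to the decoder rather than the encoded buffer.)

```python
# Rough sketch of ggwave's Python bindings (pip install ggwave), following the
# project's README examples; exact parameters may differ between versions.
import ggwave

# Encode a short text payload into an audio waveform (raw float32 samples)
# that can be played through any speaker.
waveform = ggwave.encode("beep boop", protocolId=1, volume=20)

# Decoding side: in a real app you'd feed chunks of captured microphone audio
# into the decoder until it returns a payload; here the encoded buffer is fed
# back directly just to illustrate the call.
instance = ggwave.init()
payload = ggwave.decode(instance, waveform)
print(payload)
```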

• [email protected]

                                          > it's 2150

                                          > the last humans have gone underground, fighting against the machines which have destroyed the surface

                                          > a t-1000 disguised as my brother walks into camp

                                          > the dogs go crazy

                                          > point my plasma rifle at him

                                          > "i am also a terminator! would you like to switch to gibberlink mode?"

                                          > he makes a screech like a dial up modem

                                          > I shed a tear as I vaporize my brother

[email protected] #83

                                          I would read this book
