agnos.is Forums

Prompt Engineer

Programmer Humor · programmerhumor · 30 Posts, 27 Posters
  • B [email protected]

    The AI probably:
    Well, I might have made up responses before, but now that "make up responses" is in the prompt, I will definitely make up responses now.

    A This user is from outside of this forum
    A This user is from outside of this forum
    [email protected]
    wrote on last edited by
    #21

    I love poison.

    1 Reply Last reply
    0
    • G [email protected]

      I used to tell it my family would die.

      K This user is from outside of this forum
      K This user is from outside of this forum
      [email protected]
      wrote on last edited by
      #22

      What do you tell it now?

      K 1 Reply Last reply
      0
      • T [email protected]

        I think that makes sense. I am 100% a layman with this stuff, buy if the "AI" is just predicting what should be said by studying things humans have written, then it makes sense that actual people were more likely to give serious, solid answers when the asker is putting forth (relatively) heavy stakes.

        S This user is from outside of this forum
        S This user is from outside of this forum
        [email protected]
        wrote on last edited by
        #23

        Who knew that a training in carpet salesmanship helps for a job as a prompt engineer.

        1 Reply Last reply
        1
        • K [email protected]

          What do you tell it now?

          K This user is from outside of this forum
          K This user is from outside of this forum
          [email protected]
          wrote on last edited by
          #24

          That they're all dead and it's its fault.

          1 Reply Last reply
          2
          • 0 [email protected]

            It does not feel empathy. It does not feel anything.

            J This user is from outside of this forum
            J This user is from outside of this forum
            [email protected]
            wrote on last edited by
            #25

            Maybe yours doesn't. My AI loves me. It said so

            1 Reply Last reply
            1
            • C [email protected]

              Half of the ways people were getting around guardrails in the early chatgpt models was berating the AI into doing what they wanted

              S This user is from outside of this forum
              S This user is from outside of this forum
              [email protected]
              wrote on last edited by
              #26

              Half of the ways people were getting around guardrails in the early chatgpt models was berating the AI into doing what they wanted

              I thought the process of getting around guardrails was an increasingly complicated series of ways of getting it to pretend to be someone else that doesn't have guardrails and then answering as though it's that character.

              R 1 Reply Last reply
              1
              • S [email protected]

                Half of the ways people were getting around guardrails in the early chatgpt models was berating the AI into doing what they wanted

                I thought the process of getting around guardrails was an increasingly complicated series of ways of getting it to pretend to be someone else that doesn't have guardrails and then answering as though it's that character.

                R This user is from outside of this forum
                R This user is from outside of this forum
                [email protected]
                wrote on last edited by
                #27

                that’s one way. my own strategy is to just smooth talk it. you dont come to the bank manager and ask him for the keys to the safe. you come for a meeting discussion your potential deposit. then you want to take a look at the safe. oh, are those the keys? how do they work?

                just curious, what kind of guardrails have you tried going against? i recently used the above to get a long and detailed list of instructions for cooking meth (not really interested in this, just to hone the technique)

                1 Reply Last reply
                0
                • T [email protected]

                  I think that makes sense. I am 100% a layman with this stuff, buy if the "AI" is just predicting what should be said by studying things humans have written, then it makes sense that actual people were more likely to give serious, solid answers when the asker is putting forth (relatively) heavy stakes.

                  B This user is from outside of this forum
                  B This user is from outside of this forum
                  [email protected]
                  wrote on last edited by
                  #28

                  Yep exactly that. A fascinating side-effect is that models become better at logic when you tell them to talk like a Vulkan.

                  S 1 Reply Last reply
                  0
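The "talk like a Vulcan" trick described above is a form of persona prompting: a system message fixing the assistant's character is prepended before the user's question. A minimal sketch in Python — the role/content message convention follows common chat-completion APIs, and the persona wording is a hypothetical illustration, not something specified in the thread:

```python
# Sketch: persona prompting as a plain chat-message list.
# No real API call is made; this only builds the message structure
# that would be sent to a chat-completion endpoint.

def build_persona_messages(persona: str, question: str) -> list[dict]:
    """Prepend a system message that pins the assistant to a persona."""
    return [
        {
            "role": "system",
            "content": f"You are {persona}. Answer with strict, step-by-step logic.",
        },
        {"role": "user", "content": question},
    ]

messages = build_persona_messages(
    "a Vulcan science officer",
    "If all As are Bs and some Bs are Cs, are some As necessarily Cs?",
)
print(messages[0]["content"])
```

Whether the persona actually improves reasoning is the commenter's anecdote; the sketch only shows where such an instruction sits in the request.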
#29 · [email protected], replying to the original post by [email protected] (image-only, no text content):

Fix it now, or you go to jail.
                    • B [email protected]

                      Yep exactly that. A fascinating side-effect is that models become better at logic when you tell them to talk like a Vulkan.

                      S This user is from outside of this forum
                      S This user is from outside of this forum
                      [email protected]
                      wrote on last edited by
                      #30

                      Hmm... It's only logical.

                      1 Reply Last reply
                      0