agnos.is Forums

Lemmy be like

Lemmy Shitpost
419 Posts 150 Posters 1 Views
This topic has been deleted. Only users with topic management privileges can see it.
  • [email protected]

    Legitimately useful applications, like in the medical field, are actually brought up as examples of the "right kind" of use case for this technology.

    Over and over again.

    It's kind of annoying, because both the haters of commercial LLMs in All The Things and the defenders of the same will bring up these exact same use cases as examples of good AI use.

    [email protected]
    wrote last edited by
    #45

    May I ask for a link?
    Never saw that in the communities I consult. Never.
    Or at least not above 5 downvotes.

    • Z [email protected]

      Does this count? https://sopuli.xyz/post/1138547

      [email protected]
      wrote last edited by
      #46

      ah, mid 2023, the honeymoon times

      • T [email protected]

        May I ask for a link?
        Never saw that in the communities I consult. Never.
        Or at least not above 5 downvotes.

        [email protected]
        wrote last edited by
        #47

        I'll keep an eye out but I don't have votes visible, so can only really tell sentiment from comments.

        Aside, but I highly recommend hiding vote counts. They're even more pointless here than they were on Reddit. They're meaningless noise on the frontend.

        • D [email protected]

          Good lord stop comparing LLMs to airplanes in your replies. This is why you think "AI bad" is an unserious statement.

          [email protected]
          wrote last edited by [email protected]
          #48

          I used that comparison a total of two times (and might use it more); how about refuting my argument instead of getting mad at me for using a good comparison twice.

          Airplanes emit SHITLOADS of carbon into the atmosphere, and they have directly caused the deaths of tens of thousands of people. Airplanes are heavily used in war and to spy on people. Airplanes are literally used to spray pesticides and other chemicals into the air, etc.
          And mostly only the rich can use them.

          Just like with AI, there are many reasons airplanes are bad, that doesn't mean we should get rid of them.

            • [email protected]
            wrote last edited by
            #49

            Yeah. I hate the naming of it too. It's not AI in the sense that science fiction imagined it. History repeats itself in the name of marketing. I'm still very annoyed with the marketers who destroyed the term "hover board".

              • [email protected]

              Give me one real-world use that is worth the downsides.

              As a dev I can already tell you it's not coding or anything around code. Projects get spammed with low-quality, nonsensical bug reports; AI-generated code rarely works and doesn't integrate well (on top of pushing all the work onto the reviewer, which is already the hardest part of coding); and AI-written documentation is riddled with errors and isn't legible.

              And even if AI were remotely good at something, it would still be the equivalent of a microwave trying to replace the entire restaurant kitchen.

              [email protected]
              wrote last edited by [email protected]
              #50

              I can run a small LLM locally which I can talk to using voice to turn certain lights on and off, set reminders for me, play music etc.

              There are MANY examples of LLMs being useful. They have their drawbacks, just like any big technology, but saying they have no uses that are worth it is ridiculous.
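              For the curious, the glue around a setup like that can be tiny. Below is a minimal Python sketch of the command-routing layer only; the device names, the `Action` shape, and the fall-through-to-LLM behavior are all hypothetical, not any particular project's API:

```python
from dataclasses import dataclass
from typing import Optional

@dataclass
class Action:
    kind: str        # "light", "reminder", or "music"
    target: str      # device name or free-form payload
    on: bool = True  # only meaningful for kind == "light"

def parse_command(text: str) -> Optional[Action]:
    """Map a transcribed voice command to a home action.

    In the setup described, the transcription would come from local
    speech-to-text, and anything this matcher can't handle would be
    handed to the local LLM; the keywords here just show the shape
    of the glue code.
    """
    t = text.lower()
    if "light" in t:
        room = "living room" if "living" in t else "bedroom"
        return Action("light", target=room, on="on" in t.split())
    if "remind" in t:
        return Action("reminder", target=t.split("remind me to ", 1)[-1])
    if "play" in t:
        return Action("music", target=t.split("play ", 1)[-1])
    return None  # unrecognized: fall through to the LLM
```

              The point of the sketch: most of the "assistant" is ordinary deterministic code, and the model only has to handle the fuzzy leftovers.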

              • M [email protected]

                Yes. AI can be used for spam, job cuts, and creepy surveillance, no argument there, but pretending it’s nothing more than a corporate scam machine is just lazy cynicism. This same “automatic BS” is helping discover life-saving drugs, diagnosing cancers earlier than some doctors, giving deaf people real-time conversations through instant transcription, translating entire languages on the fly, mapping wildfire and flood zones so first responders know exactly where to go, accelerating scientific breakthroughs from climate modeling to space exploration, and cutting out the kind of tedious grunt work that wastes millions of human hours a day. The problem isn’t that AI exists, it’s that a lot of powerful people use it selfishly and irresponsibly. Blaming the tech instead of demanding better governance is like blaming the printing press for bad propaganda.

                [email protected]
                wrote last edited by
                #51

                This same “automatic BS” is helping discover life-saving drugs, diagnosing cancers earlier than some doctors

                Not the same kind of AI. At all. Generative AI vendors love this motte-and-bailey.

                • M [email protected]

                  I personally think of AI as a tool, what matters is how you use it. I like to think of it like a hammer. You could use a hammer to build a house, or you could smash someone's skull in with it. But no one's putting the hammer in jail.

                  [email protected]
                  wrote last edited by [email protected]
                  #52

                  “Guns don’t kill people, people kill people”

                  Edit:

                  Controversial reply, apparently, but this is literally part of the script to a Philosophy Tube video (relevant part is 8:40 - 20:10)

                  We sometimes think that technology is essentially neutral. It can have good or bad effects, and it might be really important who controls it. But a tool, many people like to think, is just a tool. "Guns don't kill people, people do." But some philosophers have argued that technology can have values built into it that we may not realise.

                  ...

                  The philosopher Don Ihde says tech can open or close possibilities. It's not just about its function or who controls it. He says technology can provide a framework for action.

                  ...

                  Martin Heidegger was a student of Husserl's, and he wrote about the ways that we experience the world when we use a piece of technology. His most famous example was a hammer. He said when you use one you don't even think about the hammer. You focus on the nail. The hammer almost disappears in your experience. And you just focus on the task that needs to be performed.

                  Another example might be a keyboard. Once you get proficient at typing, you almost stop experiencing the keyboard. Instead, your primary experience is just of the words that you're typing on the screen. It's only when it breaks or it doesn't do what we want it to do, that it really becomes visible as a piece of technology. The rest of the time it's just the medium through which we experience the world.

                  Heidegger talks about technology withdrawing from our attention. Others say that technology becomes transparent. We don't experience it. We experience the world through it. Heidegger says that technology comes with its own way of seeing.

                  ...

                  Now some of you are looking at me like "Bull sh*t. A person using a hammer is just a person using a hammer!" But there might actually be some evidence from neurology to support this.

                  If you give a monkey a rake that it has to use to reach a piece of food, then the neurons in its brain that fire when there's a visual stimulus near its hand start firing when there's a stimulus near the end of the rake, too! The monkey's brain extends its sense of the monkey body to include the tool!

                  And now here's the final step. The philosopher Bruno Latour says that when this happens, when the technology becomes transparent enough to get incorporated into our sense of self and our experience of the world, a new compound entity is formed.

                  A person using a hammer is actually a new subject with its own way of seeing - 'hammerman.' That's how technology provides a framework for action and being. Rake + monkey = rakemonkey. Makeup + girl is makeupgirl, and makeupgirl experiences the world differently, has a different kind of subjectivity because the tech lends us its way of seeing.

                  You think guns don't kill people, people do? Well, gun + man creates a new entity with new possibilities for experience and action - gunman!

                  So if we're onto something here with this idea that tech can withdraw from our attention and in so doing create new subjects with new ways of seeing, then it makes sense to ask when a new piece of technology comes along, what kind of people will this turn us into.

                  I thought that we were pretty solidly past the idea that anything is “just a tool” after seeing Twitler scramble Grok’s innards to advance his personal politics.

                  Like, if you still had any lingering belief that AI is “like a hammer”, that really should’ve extinguished it.

                  But I guess some people see that as an aberrant misuse of AI, and not an indication that all AI has an agenda baked into it, even if it’s more subtle.

                  • [email protected]

                    Peak misunderstanding between AI and LLM

                    [email protected]
                    wrote last edited by
                    #53

                    Why the hell are you being downvoted? You are completely right.

                    People will look back at this and at "hover boards" and think, "were they stupid!?"

                    Mislabeling a product isn't great marketing; it's false advertising.

                    • M [email protected]

                      I personally think of AI as a tool, what matters is how you use it. I like to think of it like a hammer. You could use a hammer to build a house, or you could smash someone's skull in with it. But no one's putting the hammer in jail.

                      [email protected]
                      wrote last edited by [email protected]
                      #54

                      Seriously, the AI hate gets old fast. Like you said, it's a tool, so hey, get over it, people.

                      • K [email protected]

                        “Guns don’t kill people, people kill people”

                        […]

                        But I guess some people see that as an aberrant misuse of AI, and not an indication that all AI has an agenda baked into it, even if it’s more subtle.

                        [email protected]
                        wrote last edited by
                        #55

                        My skull-crushing hammer that is made to crush skulls and nothing else doesn't crush skulls, people crush skulls.
                        In fact, if more people had skull-crushing hammers in their homes, I'm sure that would lead to a reduction in the number of skull-crushings. The only thing that can stop a bad guy with a skull-crushing hammer is a good guy with a skull-crushing hammer.

                        • K [email protected]

                          “Guns don’t kill people, people kill people”

                          […]

                          But I guess some people see that as an aberrant misuse of AI, and not an indication that all AI has an agenda baked into it, even if it’s more subtle.

                          [email protected]
                          wrote last edited by
                          #56

                          Guns don’t kill people. People with guns kill people.

                          Ftfy

                          • F [email protected]

                            Why the hell are you being downvoted? You are completely right.

                            People will look back at this and at "hover boards" and think, "were they stupid!?"

                            Mislabeling a product isn't great marketing; it's false advertising.

                            [email protected]
                            wrote last edited by
                            #57

                            IDK LMAO. That's what I really hate about Reddit/Lemmy: the voting system. People downvote but don't say where they think I'm wrong. I mean, at least argue; state your (supposedly harmless) opinion out loud. I even added a disclaimer that I don't promote LLMs and such. I don't really care either; I stand with correctness and do what I can to correct what is wrong. I totally agree with @[email protected] tho.

                            • E [email protected]
                              This post did not contain any content.
                              [email protected]
                              wrote last edited by
                              #58

                              But like... Good.

                              • K [email protected]

                                Guns don’t kill people. People with guns kill people.

                                Ftfy

                                [email protected]
                                wrote last edited by
                                #59

                                Hey, that level of pedantry is my job

                                • M [email protected]

                                  I personally think of AI as a tool, what matters is how you use it. I like to think of it like a hammer. You could use a hammer to build a house, or you could smash someone's skull in with it. But no one's putting the hammer in jail.

                                  [email protected]
                                  wrote last edited by
                                  #60

                                  Yeah, except it's a tool that most people don't know how to use but everyone can use, leading to environmental harm, a rapid loss of media literacy, and a huge increase in wealth inequality due to turmoil in the job market.

                                  So... It's not a good tool for the average layperson to be using.

                                  • F [email protected]

                                    Why the hell are you being downvoted? You are completely right.

                                     People will look back at this and at "hover boards" and think, "were they stupid!?"

                                     Mislabeling a product isn't great marketing; it's false advertising.

                                    [email protected]
                                    wrote last edited by
                                    #61

                                     AI is an umbrella term that covers many things. We've been calling simple pathfinding algorithms in video games "AI" for two decades; LLMs are AIs.
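                                     As a concrete illustration of that older usage, the "AI" steering a game enemy toward the player is often nothing more than a shortest-path search. A toy breadth-first version (a sketch, not any particular engine's implementation):

```python
from collections import deque

def find_path(grid, start, goal):
    """Breadth-first search over a grid of 0 (open) / 1 (wall).

    Returns the list of (row, col) steps from start to goal, or None
    if the goal is unreachable. Shortest-path search like this is the
    whole of what many games ship as enemy 'AI'.
    """
    rows, cols = len(grid), len(grid[0])
    prev = {start: None}          # visited set doubling as parent links
    queue = deque([start])
    while queue:
        cell = queue.popleft()
        if cell == goal:          # walk parent links back to the start
            path = []
            while cell is not None:
                path.append(cell)
                cell = prev[cell]
            return path[::-1]
        r, c = cell
        for step in ((r + 1, c), (r - 1, c), (r, c + 1), (r, c - 1)):
            nr, nc = step
            if 0 <= nr < rows and 0 <= nc < cols and grid[nr][nc] == 0 and step not in prev:
                prev[step] = cell
                queue.append(step)
    return None
```

                                     No learning, no model, just a queue and a grid; yet for twenty years this has been sold on the box as "enemy AI".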

                                    • E [email protected]

                                       One could have said many of the same things about a lot of new technologies.

                                      The Internet,
                                      Nuclear,
                                      Rockets,
                                      Airplanes etc.

                                      Any new disruptive technology comes with drawbacks and can be used for evil.

                                      But that doesn't mean it's all bad, or that it doesn't have its uses.

                                      [email protected]
                                      wrote last edited by
                                      #62

                                       Of those, only the internet was turned loose on an unsuspecting public, and even then people had decades of the faucet slowly being opened to prepare.

                                       Can you imagine if, after WW2, Wernher von Braun came to the USA and then just, like, gave every man, woman, and child a rocket, with no training? Good and evil wouldn't even come into it; it'd be chaos and destruction.

                                      Imagine if every household got a nuclear reactor to power it, but none of the people in the household got any training in how to care for it.

                                      It's not a matter of good and evil, it's a matter of harm.

                                      • M [email protected]

                                        Yes. AI can be used for spam, job cuts, and creepy surveillance, no argument there, but pretending it’s nothing more than a corporate scam machine is just lazy cynicism. This same “automatic BS” is helping discover life-saving drugs, diagnosing cancers earlier than some doctors, giving deaf people real-time conversations through instant transcription, translating entire languages on the fly, mapping wildfire and flood zones so first responders know exactly where to go, accelerating scientific breakthroughs from climate modeling to space exploration, and cutting out the kind of tedious grunt work that wastes millions of human hours a day. The problem isn’t that AI exists, it’s that a lot of powerful people use it selfishly and irresponsibly. Blaming the tech instead of demanding better governance is like blaming the printing press for bad propaganda.

                                        [email protected]
                                        wrote last edited by
                                        #63

                                         Aren't those different types of AI?

                                         I don't think anyone hating AI is referring to the code that makes enemies move, or that sorts things into categories.

                                        • E [email protected]

                                          I can run a small LLM locally which I can talk to using voice to turn certain lights on and off, set reminders for me, play music etc.

                                           There are MANY examples of LLMs being useful. They have their drawbacks, just like any big technology, but saying they have no uses that are worth it is ridiculous.

                                          [email protected]
                                          wrote last edited by [email protected]
                                          #64

                                          That's like saying "asbestos has some good uses, so we should just give every household a big pile of it without any training or PPE"

                                          Or "we know leaded gas harms people, but we think it has some good uses so we're going to let everyone access it for basically free until someone eventually figures out what those uses might be"

                                          It doesn't matter that it has some good uses and that later we went "oops, maybe let's only give it to experts to use". The harm has already been done by eager supporters, intentional or not.
