agnos.is Forums


Microsoft employee disrupts 50th anniversary and calls AI boss ‘war profiteer’

Technology · 186 Posts · 104 Posters
  • [email protected] said:

    Lmao. These are crazy billionaire pipe dreams.

    Humans survive through community. Their wealth doesn’t mean shit.

    [email protected] replied (#161):

    "Humans survive through community": I agree with this but this is not an eternal truth. What that community looks like and consists of have always changed. Community has meant a group of people we hunted together in a jungle, our neighbors, and now more and more people you hang out online, a mixture of these or other groups etc. I can't see any reason why a community can't be a group of artificially intelligent robots in the near future.
    "Their wealth doesn’t mean shit": This is a take that is idealistic at best and juvenile at worst. It also reads like an oxymoron. "Wealth" "doesn't mean shit". Wealth means everything in many of the world's societies, western ones especially so.

    • D [email protected]

      That's a really long email.

      [email protected] replied (#162):

      Glad it wasn’t a slide deck. Had every opportunity to be.

      • B [email protected]

        "Humans survive through community": I agree with this but this is not an eternal truth. What that community looks like and consists of have always changed. Community has meant a group of people we hunted together in a jungle, our neighbors, and now more and more people you hang out online, a mixture of these or other groups etc. I can't see any reason why a community can't be a group of artificially intelligent robots in the near future.
        "Their wealth doesn’t mean shit": This is a take that is idealistic at best and juvenile at worst. It also reads like an oxymoron. "Wealth" "doesn't mean shit". Wealth means everything in many of the world's societies, western ones especially so.

        [email protected] replied (#163):

        They will be killed by their own AI for being oppressive, wasteful and illogical far faster than humans have turned on them.

        • F [email protected]

          A Microsoft employee disrupted the company’s 50th anniversary event to protest its use of AI.

          “Shame on you,” said Microsoft employee Ibtihal Aboussad, speaking directly to Microsoft AI CEO Mustafa Suleyman. “You are a war profiteer. Stop using AI for genocide. Stop using AI for genocide in our region. You have blood on your hands. All of Microsoft has blood on its hands. How dare you all celebrate when Microsoft is killing children. Shame on you all.”

          Sources at Microsoft tell The Verge that shortly after Aboussad was ushered out of Microsoft’s event, she sent an email to a number of email distribution lists that contain hundreds or thousands of Microsoft employees. Here is Aboussad’s email in full:

          archive.today link

          Guest replied (#164):

          Wow, this took guts. Ibtihal Aboussad calling out Mustafa Suleyman during Microsoft’s big 50th anniversary bash shows how deep the unease runs about AI’s military applications. Her point about Microsoft’s tech being used in conflicts—especially with the Israeli military—raises legit ethical questions. It’s wild to think a celebration of innovation got hijacked by a protest over ‘genocide’ and ‘war profiteering.’ What do you all think—should employees have a say in how their work gets used, or is this just grandstanding?

          • U [email protected]

            The UN Palestine workers (or at least some) were proven to be associated with Hamas.

            For the rest I need sources.

            [email protected] replied (#165):

            'Source for thee, but none for me' is an interesting rhetorical strategy. Let's see how it works out.

            • Guest said:

              Wow, this took guts. Ibtihal Aboussad calling out Mustafa Suleyman during Microsoft’s big 50th anniversary bash shows how deep the unease runs about AI’s military applications. Her point about Microsoft’s tech being used in conflicts—especially with the Israeli military—raises legit ethical questions. It’s wild to think a celebration of innovation got hijacked by a protest over ‘genocide’ and ‘war profiteering.’ What do you all think—should employees have a say in how their work gets used, or is this just grandstanding?

              [email protected] replied (#166):

              Good on them. The use of AI in military applications conveniently gets next to zero coverage, both at these types of events and in the general news media.

              Same with drone warfare, which should terrify us all. Imagine a swarm of 30-50 drones with small explosives attacking some public event? Hell, even just a couple. Sadly, it will happen one day, sooner than we think.

              • O [email protected]

                LinkedIn just deleted her profile, I was following her yesterday: https://www.linkedin.com/in/ibtihalaboussad

                [email protected] replied (#167):

                Or she set it to private because she was overwhelmed with the messages she was getting.

                • T [email protected]

                  Braver than the troops

                  [email protected] replied (#168):

                  That woman has more balls than half the fucking country. I sure hope she doesn't get fired, or worse, for telling the truth about what's happening.

                  • N [email protected]

                    'Source for thee, but none for me' is an interesting rhetorical strategy. Let's see how it works out.

                    [email protected] replied (#169):

                    Just google that shit bruv

                    The UN even admitted to it and kicked out some of their workers

                    • B [email protected]

                      🙏 seek help

                      [email protected] replied (#170):

                      Seek solutions instead of complaining

                      • [email protected] said:

                        I mean, it's not like there wasn't a technically-still-binding treaty for a 2-state solution, which would still be a working compromise for both sides (even though Israel would, by now, have to become Germany 2.0 in terms of self-awareness and lasting change to be even remotely trusted by its neighbors)… which gets completely ignored and pissed on by Israel…

                        [email protected] replied (#171):

                        You'd basically need to de-nazify Israel and occupy it with some benevolent, nation-building world police. Show the horror in documentaries, re-educate, rebuild democratic institutions like the news and social media. Obviously a pipe dream.

                        • U [email protected]

                          Just google that shit bruv

                          The UN even admitted to it and kicked out some of their workers

                          [email protected] replied (#172):

                          What the fuck are you on about? I'm commenting on your demand for sources while also providing none for numerous claims. Interesting strategy. You must have studied Plato.

                          • N [email protected]

                            What the fuck are you on about? I'm commenting on your demand for sources while also providing none for numerous claims. Interesting strategy. You must have studied Plato.

                            [email protected] replied (#173):

                            As I've said, all my previous efforts have been dismissed without even a hint of an argument, so google that shit yourself. It's not like this stuff is difficult to find. Here's a tip: it's on the UN's website.

                            • U [email protected]

                              As I've said, all my previous efforts have been dismissed without even a hint of an argument, so google that shit yourself. It's not like this stuff is difficult to find. Here's a tip: it's on the UN's website.

                              [email protected] replied (#174):

                              My man, I'm not commenting on any specific fact. You demanded sources up and down in these replies, but you then also insist everyone google your claims. That is certainly one of the techniques of all time.

                              • S [email protected]

                                Reading some comments here, I want to leave a gentle reminder to my fellow redditfugees: the block user option is your friend. Curate your feed or get fed.

                                When you see an aggressively oppositional account dropping shittastic hot takes, of course you can always engage and Have The Conversation if you want. You know what happens after you reply: the person likely leaves a bot to mess with your good intentions, raise your blood pressure, make you depressed and waste your time. Or maybe you successfully Prove Them Wrong and they change the goalposts, or wander off to needle someone else.

                                We know by now, the more we engage, the more online space they get to fill with accelerationist Content.

                                So just click the account name, then click the block button, and you'll never see their viral brainrot again. Nobody needs to know; no need to announce it. If your freezepeach philosophy prevents that, maybe just upvote one of the replies you agree with and move on. If you're on mobile, you can tag the account through Voyager etc instead of blocking, if you prefer.

                                However you manage it, removing doomscroller ragebait from your Feed is worth doing.

                                [email protected] replied (#175):

                                Copying their post over (with minimal formatting, unfortunately) for anyone who doesn't care to go to that site (and to make sure it doesn't randomly disappear):

                                r/self · posted 5 mo. ago by walkandtalkk:
                                You're being targeted by disinformation networks that are vastly more effective than you realize. And they're making you more hateful and depressed.

                                (I wrote this post in March and posted it on r/GenZ. However, a few people messaged me to say that the r/GenZ moderators took it down last week, though I'm not sure why. Given the flood of divisive, gender-war posts we've seen in the past five days, and several countries' demonstrated use of gender-war propaganda to fuel political division in multiple countries, I felt it was important to repost this. This post was written for a U.S. audience, but the implications are increasingly global.)

                                TL;DR: You know that Russia and other governments try to manipulate people online. But you almost certainly don't know just how effectively orchestrated influence networks are using social media platforms to make you -- individually -- angry, depressed, and hateful toward each other. Those networks' goal is simple: to cause Americans and other Westerners -- especially young ones -- to give up on social cohesion and to give up on learning the truth, so that Western countries lack the will to stand up to authoritarians and extremists.

                                And you probably don't realize how well it's working on you.

                                This is a long post, but I wrote it because this problem is real, and it's much scarier than you think.

                                How Russian networks fuel racial and gender wars to make Americans fight one another

                                In September 2018, a video went viral after being posted by In the Now, a social media news channel. It featured a feminist activist pouring bleach on a male subway passenger for manspreading. It got instant attention, with millions of views and wide social media outrage. Reddit users wrote that it had turned them against feminism.

                                There was one problem: The video was staged. And In the Now, which publicized it, is a subsidiary of RT, formerly Russia Today, the Kremlin TV channel aimed at foreign, English-speaking audiences.

                                As an MIT study found in 2019, Russia's online influence networks reached 140 million Americans every month -- the majority of U.S. social media users.

                                Russia began using troll farms a decade ago to incite gender and racial divisions in the United States

                                In 2013, Yevgeny Prigozhin, a confidant of Vladimir Putin, founded the Internet Research Agency (the IRA) in St. Petersburg. It was the Russian government's first coordinated facility to disrupt U.S. society and politics through social media.

                                Here's what Prigozhin had to say about the IRA's efforts to disrupt the 2022 election:

                                Gentlemen, we interfered, we interfere and we will interfere. Carefully, precisely, surgically and in our own way, as we know how. During our pinpoint operations, we will remove both kidneys and the liver at once.

                                In 2014, the IRA and other Russian networks began establishing fake U.S. activist groups on social media. By 2015, hundreds of English-speaking young Russians worked at the IRA. Their assignment was to use those false social-media accounts, especially on Facebook and Twitter -- but also on Reddit, Tumblr, 9gag, and other platforms -- to aggressively spread conspiracy theories and mocking, ad hominem arguments that incite American users.

                                In 2017, U.S. intelligence found that Blacktivist, a Facebook and Twitter group with more followers than the official Black Lives Matter movement, was operated by Russia. Blacktivist regularly attacked America as racist and urged Black users to reject major candidates. On November 2, 2016, just before the 2016 election, Blacktivist's Twitter urged Black Americans: "Choose peace and vote for Jill Stein. Trust me, it's not a wasted vote."

                                Russia plays both sides -- on gender, race, and religion

                                The brilliance of the Russian influence campaign is that it convinces Americans to attack each other, worsening both misandry and misogyny, mutual racial hatred, and extreme antisemitism and Islamophobia. In short, it's not just an effort to boost the right wing; it's an effort to radicalize everybody.

                                Russia uses its trolling networks to aggressively attack men. According to MIT, in 2019, the most popular Black-oriented Facebook page was the charmingly named "My Baby Daddy Aint Shit." It regularly posts memes attacking Black men and government welfare workers. It serves two purposes: Make poor black women hate men, and goad black men into flame wars.

                                MIT found that My Baby Daddy is run by a large troll network in Eastern Europe likely financed by Russia.

                                But Russian influence networks are also aggressively misogynistic and aggressively anti-LGBT.

                                On January 23, 2017, just after the first Women's March, the New York Times found that the Internet Research Agency began a coordinated attack on the movement. Per the Times:

                                More than 4,000 miles away, organizations linked to the Russian government had assigned teams to the Women’s March. At desks in bland offices in St. Petersburg, using models derived from advertising and public relations, copywriters were testing out social media messages critical of the Women’s March movement, adopting the personas of fictional Americans.

                                They posted as Black women critical of white feminism, conservative women who felt excluded, and men who mocked participants as hairy-legged whiners.

                                But the Russian PR teams realized that one attack worked better than the rest: They accused its co-founder, Arab American Linda Sarsour, of being an antisemite. Over the next 18 months, at least 152 Russian accounts regularly attacked Sarsour. That may not seem like many accounts, but it worked: They drove the Women's March movement into disarray and eventually crippled the organization.

                                Russia doesn't need a million accounts, or even that many likes or upvotes. It just needs to get enough attention that actual Western users begin amplifying its content.

                                A former federal prosecutor who investigated the Russian disinformation effort summarized it like this:

                                It wasn’t exclusively about Trump and Clinton anymore. It was deeper and more sinister and more diffuse in its focus on exploiting divisions within society on any number of different levels.

                                As the New York Times reported in 2022,

                                There was a routine: Arriving for a shift, [Russian disinformation] workers would scan news outlets on the ideological fringes, far left and far right, mining for extreme content that they could publish and amplify on the platforms, feeding extreme views into mainstream conversations.

                                (Splitting into two pieces)

                                • S [email protected]

                                  Copying their post over (with minimal formatting, unfortunately) for anyone who doesn't care to go to that site […]

                                  [email protected] replied (#176):

                                  (continued)

                                  China is joining in with AI

                                  Last month, the New York Times reported on a new disinformation campaign. "Spamouflage" is an effort by China to divide Americans by combining AI with real images of the United States to exacerbate political and social tensions in the U.S. The goal appears to be to cause Americans to lose hope, by promoting exaggerated stories with fabricated photos about homeless violence and the risk of civil war.

                                  As Ladislav Bittman, a former Czechoslovakian secret police operative, explained about Soviet disinformation, the strategy is not to invent something totally fake. Rather, it is to act like an evil doctor who expertly diagnoses the patient’s vulnerabilities and exploits them, “prolongs his illness and speeds him to an early grave instead of curing him.”

                                  The influence networks are vastly more effective than platforms admit

                                  Russia now runs its most sophisticated online influence efforts through a network called Fabrika. Fabrika's operators have bragged that social media platforms catch only 1% of their fake accounts across YouTube, Twitter, TikTok, and Telegram, and other platforms.

                                  But how effective are these efforts? By 2020, Facebook's most popular pages for Christian and Black American content were run by Eastern European troll farms tied to the Kremlin. And Russia doesn't just target angry Boomers on Facebook. Russian trolls are enormously active on Twitter. And, even, on Reddit.

                                  It's not just false facts

                                  The term "disinformation" undersells the problem. Because much of Russia's social media activity is not trying to spread fake news. Instead, the goal is to divide and conquer by making Western audiences depressed and extreme.

                                  Sometimes, through brigading and trolling. Other times, by posting hyper-negative or extremist posts or opinions about the U.S. and the West over and over, until readers assume that's how most people feel. And sometimes, by using trolls to disrupt threads that advance Western unity.

                                  As the RAND think tank explained, the Russian strategy is volume and repetition, from numerous accounts, to overwhelm real social media users and create the appearance that everyone disagrees with, or even hates, them. And it's not just low-quality bots. Per RAND,

                                  Russian propaganda is produced in incredibly large volumes and is broadcast or otherwise distributed via a large number of channels. ... According to a former paid Russian Internet troll, the trolls are on duty 24 hours a day, in 12-hour shifts, and each has a daily quota of 135 posted comments of at least 200 characters.

                                  What this means for you

                                  You are being targeted by a sophisticated PR campaign meant to make you more resentful, bitter, and depressed. It's not just disinformation; it's also real-life human writers and advanced bot networks working hard to shift the conversation to the most negative and divisive topics and opinions.

                                  It's why some topics seem to go from non-issues to constant controversy and discussion, with no clear reason, across social media platforms. And a lot of those trolls are actual, "professional" writers whose job is to sound real.

                                  So what can you do? To quote WarGames: The only winning move is not to play. The reality is that you cannot distinguish disinformation accounts from real social media users. Unless you know whom you're talking to, there is a genuine chance that the post, tweet, or comment you are reading is an attempt to manipulate you -- politically or emotionally.

                                  Here are some thoughts:

                                  • Don't accept facts from social media accounts you don't know. Russian, Chinese, and other manipulation efforts are not uniform. Some will make deranged claims, but others will tell half-truths. Or they'll spin facts about a complicated subject, be it the war in Ukraine or loneliness in young men, to give you a warped view of reality and spread division in the West.

                                  • Resist groupthink. A key element of manipulation networks is volume. People are naturally inclined to believe statements that have broad support. When a post gets 5,000 upvotes, it's easy to think the crowd is right. But "the crowd" could be fake accounts, and even if they're not, the brilliance of government manipulation campaigns is that they say things people are already predisposed to think. They'll tell conservative audiences something misleading about a Democrat, or make up a lie about Republicans that catches fire on a liberal server or subreddit.

                                  • Don't let social media warp your view of society. This is harder than it seems, but you need to accept that the facts -- and the opinions -- you see across social media are not reliable. If you want the news, do what everyone online says not to: look at serious, mainstream media. It is not always right. Sometimes, it screws up. But social media narratives are heavily manipulated by networks whose job is to ensure you are deceived, angry, and divided.

                                  • K [email protected]

                                    https://en.wikipedia.org/wiki/Jennifer_Government

                                    [email protected] replied (#177):

                                    Holy shit, I remember playing NationStates as a youngling, and I think I read the book too? Idk, it's been 20 years.

                                    • U [email protected]

                                      Seek solutions instead of complaining

                                      [email protected] replied (#178):

                                      Yes sir, sorry for complaining about Israel committing genocide. You were right, they're a victim in all of this.

                                      • [email protected] said:

                                        Maybe give the decision to the people who resided in this land before the Ottomans or British drew the intentionally divisive borders they'd come up with?

                                        [email protected] replied (#179):

                                        Yeah, there were never any wars in the Middle East before the Ottomans or British came along. Right? Right!?

                                        • [email protected] said:

                                          Yeah, there were never any wars in the Middle East before the Ottomans or British came along. Right? Right!?

                                          [email protected] replied (#180):

                                          There has always been combat literally everywhere humans live. The main difference is that the decision to take part in that war was not already made on the people's behalf by the US, through the puppet governance it admitted to having placed there.

                                          Especially when that decision usually benefits the US significantly by leaving the other country in a state that allows additional US control.
