'Doomsday Clock' moves closer to midnight amid threats of climate change, nuclear war, pandemics, AI
-
[email protected] replied to [email protected] last edited by
This was started nearly a century ago, a couple of years after the end of WW2, by scientists in response to the creation of the atomic bomb.
The point is mostly to say "hey, we have the technology to blow up the world, and things do not seem to be going well". They publish an annual report explaining their reasoning:
In setting the Clock one second closer to midnight, we send a stark signal: Because the world is already perilously close to the precipice, a move of even a single second should be taken as an indication of extreme danger and an unmistakable warning that every second of delay in reversing course increases the probability of global disaster.
Essentially- we are closer than ever to a global war between nuclear powers.
In regard to nuclear risk, the war in Ukraine, now in its third year, looms over the world; the conflict could become nuclear at any moment because of a rash decision or through accident or miscalculation. Conflict in the Middle East threatens to spiral out of control into a wider war without warning. The countries that possess nuclear weapons are increasing the size and role of their arsenals, investing hundreds of billions of dollars in weapons that can destroy civilization. The nuclear arms control process is collapsing, and high-level contacts among nuclear powers are totally inadequate given the danger at hand.
Now someone may say "Closer than ever?? What about the Cuban Missile Crisis?"
The thing is, we have been developing newer, "less dangerous" nuclear weapons: tactical bombs that won't leave the traditional nuclear fallout. This creates a sort of itchy-trigger-finger syndrome. After the Cold War, the US and Russia created nuclear arms control treaties between them. These are collapsing, and both countries are complicit in that.
If anybody wants to read more https://thebulletin.org/doomsday-clock/2025-statement/
But, tl;dr:
The world is in a chaotic period. Fascism seems to be taking hold again, the economy is on the edge of collapse, and war remains an ever-present threat. Any war between the great powers (US, China, Russia) would certainly mean nuclear disaster.
The point is that we are vulnerable right now. Any push could shove us tumbling down the hill. Diplomatic crisis, another pandemic, economic crash, a regional war, etc. Any of those could be the straw that breaks the camel's back.
I have a lot of respect for the Bulletin of the Atomic Scientists. We need organizations like this to remind people of the danger we are currently in. We become desensitized by the constant barrage of "historic news", but I believe history will look back on this period much as we look back on the decade before WW2.
-
[email protected] replied to [email protected] last edited by
At least
-
[email protected] replied to [email protected] last edited by
Unpopular opinion: I believe the only way to save humanity is through AI. Humans aren't going to fix things.
-
[email protected] replied to [email protected] last edited by
And nobody is at the wheel...just kids playing HoI
-
[email protected] replied to [email protected] last edited by
I hope this doesn’t come out the wrong way, but I’m curious what AI would be able to do to solve these issues? There are a lot of ways I could see it being used to make plans or ideas, but ultimately wouldn’t people need to trust AI and give it power over our decisions?
Even if AI weren’t plagued with human biases, it’s hard to imagine people agreeing to trust it. People barely trust each other, and we’d have to trust those who program AI not to manipulate it in their own favor.
-
[email protected] replied to [email protected] last edited by
I mean if the Stargate Project creates Skynet, that's one way to the kind of salvation folks preach about.
-
[email protected] replied to [email protected] last edited by
AI is just a reflection of humanity.
-
[email protected] replied to [email protected] last edited by
Yeah, tbh, it seems like it should be a lot closer to midnight. If things are this bad and we're still over a minute, what's it going to take?
-
[email protected] replied to [email protected] last edited by
We don't have AI.
-
[email protected] replied to [email protected] last edited by
If (or when) we achieve the technological singularity (we aren't even close; current "AI" is just marketing), it would be able to lay down a plan to fix anything without making mistakes, and it would predict the consequences of actions, ours or its own, in detail (some things are harder, like a volcano erupting).
Handing over power wouldn't even be necessary; it could simply take it. The only way to stop it would be to cut the electricity, I guess.
But none of that applies to what the current marketing term "AI" describes. We don't have AI.
-
[email protected] replied to [email protected] last edited by
Oh I never knew that!
-
[email protected] replied to [email protected] last edited by
Perhaps even thrice.
-
[email protected] replied to [email protected] last edited by
Go to 12 already
-
[email protected] replied to [email protected] last edited by
Yeah. To be clear, I don't know if they called it a scale change, but "we need to start considering more, weaker crises because there aren't big ones anymore" was the gist.