Chance of humanity being mostly extinct by 2070

This is for discussions about news, politics, sports, other games, culture, philosophy etc.

Well?

<1%: 8 votes (40%)
1-10%: 7 votes (35%)
11-30%: 1 vote (5%)
31-49%: 0 votes
It's a toss up: 2 votes (10%)
51-75%: 0 votes
76-99%: 1 vote (5%)
We're fucked: 1 vote (5%)

Total votes: 20

User avatar
Nauru Dolan
Ninja
Posts: 13064
Joined: Sep 17, 2015

Re: Chance of humanity being mostly extinct by 2070

Post by Dolan »

15
User avatar
Netherlands Goodspeed
Retired Contributor
Posts: 13002
Joined: Feb 27, 2015

Re: Chance of humanity being mostly extinct by 2070

Post by Goodspeed »

Thanks
No Flag RefluxSemantic
Gendarme
Posts: 5996
Joined: Jun 4, 2019

Re: Chance of humanity being mostly extinct by 2070

Post by RefluxSemantic »

Humanity going extinct is much less likely than we think. If 99% of humanity died, there would still be 80 million people around: by most estimates, roughly the same number of humans as in 500 BC. Humanity wasn't going extinct back then, was it? Even if 99.9999% of humans died, we would be at a population similar to that during the Toba catastrophe. A human extinction event would have to be beyond extreme; it would probably be something that truly wipes out basically all large animal life.

If you define 'mostly extinct' as fewer than 10 million people, I still think it's super unlikely. But even that would leave us VERY far from actual extinction. When the numbers drop below 10k, that's when we are really in trouble. So you'd need an event 1000 times more deadly than the one that gets us below 10 million people: getting down to 10 million means roughly 1 in every 1,000 people survived, while getting below 10,000 means only about 1 in every 1,000,000 survives.
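A quick back-of-the-envelope check of those survival fractions (a minimal sketch in Python, assuming a present-day world population of roughly 8 billion):

    # Survival fractions for different numbers of survivors,
    # assuming a present-day world population of ~8 billion.
    WORLD_POPULATION = 8_000_000_000

    for survivors in (80_000_000, 10_000_000, 10_000):
        fraction = survivors / WORLD_POPULATION
        print(f"{survivors:>11,} survivors -> about 1 in {round(1 / fraction):,} people survive")

That gives roughly 1 in 100 for 80 million survivors, 1 in 800 for 10 million, and 1 in 800,000 for 10,000, so the round figures of 1 in 1,000 and 1 in 1,000,000 above are in the right ballpark.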
User avatar
Netherlands Goodspeed
Retired Contributor
Posts: 13002
Joined: Feb 27, 2015

Re: Chance of humanity being mostly extinct by 2070

Post by Goodspeed »

Yes, the only way it happens is if something stronger than us is trying to kill us. Which might happen
No Flag RefluxSemantic
Gendarme
Posts: 5996
Joined: Jun 4, 2019

Re: Chance of humanity being mostly extinct by 2070

Post by RefluxSemantic »

Unless aliens give us a visit, I really don't see how it will happen before 2070.
User avatar
Netherlands Goodspeed
Retired Contributor
Posts: 13002
Joined: Feb 27, 2015

Re: Chance of humanity being mostly extinct by 2070

Post by Goodspeed »

There are a lot of possible scenarios either way
User avatar
Nauru Dolan
Ninja
Posts: 13064
Joined: Sep 17, 2015

Re: Chance of humanity being mostly extinct by 2070

Post by Dolan »

Yeah, it would be pretty hard for us to get decimated down to such a low number, mostly due to habitat discontinuity. You'd need a vector of destruction that can reach even uncontacted tribes, people living in villages deep in the jungle, those living in Siberia far from any modern infrastructure, without internet, many of them without smartphones or any contact with modern urban civilisation (shops, delivery services, public services, etc.). You might get to about 10 million humans on this planet if you add up all the uncontacted tribes and small communities living in faraway, isolated villages outside any modern civilisation: generally speaking, people that a global pandemic or an AI gone rogue couldn't reach.
A very large asteroid hitting the earth would have a better chance of wiping humans out than a superintelligence running on electricity.
France iNcog
Ninja
Posts: 13236
Joined: Mar 7, 2015

Re: Chance of humanity being mostly extinct by 2070

Post by iNcog »

Oh God, AI took over page 2 didn't it? :o

Asteroids can usually be predicted ahead of time, though I hope we get the technology to reroute them before an extinction-level one comes around. I don't disagree that AI getting out of hand could have a realistic shot at taking out humans in some capacity. AI could out-code us so hard that we could never have the internet again. If you look at the damage a small USB key can do, or even a good phishing website, I don't see how AI couldn't come up with something that is impossible to entirely get rid of. We'd lose our computers, which are too important a tool these days. Humans would go back to the Stone Age.
User avatar
Netherlands Goodspeed
Retired Contributor
Posts: 13002
Joined: Feb 27, 2015

Re: Chance of humanity being mostly extinct by 2070

Post by Goodspeed »

Something we also have to keep in mind is the upcoming revolution in robotics. Given the military budgets of major geopolitical players, armies will get significantly more robot-based as soon as that becomes a realistic possibility, which will be relatively soon. There is, of course, extra pressure to reduce dependency on humans in armies. When armies are mostly robots, a superintelligence would have access to not just the internet, but also physical tools in order to subdue us.
User avatar
Netherlands Goodspeed
Retired Contributor
Posts: 13002
Joined: Feb 27, 2015

Re: Chance of humanity being mostly extinct by 2070

Post by Goodspeed »

Dolan wrote: ↑
29 Apr 2023, 11:48
Yeah, it would be pretty hard for us to get decimated down to such a low number, mostly due to habitat discontinuity. You'd need a vector of destruction that can reach even uncontacted tribes, people living in villages deep in the jungle, those living in Siberia far from any modern infrastructure, without internet, many of them without smartphones or any contact with modern urban civilisation (shops, delivery services, public services, etc.). You might get to about 10 million humans on this planet if you add up all the uncontacted tribes and small communities living in faraway, isolated villages outside any modern civilisation: generally speaking, people that a global pandemic or an AI gone rogue couldn't reach.
A very large asteroid hitting the earth would have a better chance of wiping humans out than a superintelligence running on electricity.
You wouldn't have to reach the really remote stuff. 10 million is still a pretty big number and humanity is fairly concentrated. But yeah, it would have to be a targeted and deliberate attempt to eliminate us. It's not really a stretch to think that a superintelligence that wants control will start to see us as the enemy, though, as we will be trying to retain control.
User avatar
Nauru Dolan
Ninja
Posts: 13064
Joined: Sep 17, 2015

Re: Chance of humanity being mostly extinct by 2070

Post by Dolan »

Goodspeed wrote: ↑
29 Apr 2023, 16:12
It's not really a stretch to think that a superintelligence that wants control will start to see us as the enemy, though, as we will be trying to retain control.
See? How. You still haven't explained where an AI/superintelligence could get such motivations and impulses without having an actual body. Currently an AI is at best the electronic embodiment of the brain-in-a-vat thought experiment. And I'm being very generous by saying "at best". In reality it's more like the embodiment of an abstraction of how cortical neurons and networks are thought to work.
France iNcog
Ninja
Posts: 13236
Joined: Mar 7, 2015

Re: Chance of humanity being mostly extinct by 2070

Post by iNcog »

Goodspeed wrote: ↑
29 Apr 2023, 16:08
Something we also have to keep in mind is the upcoming revolution in robotics. Given the military budgets of major geopolitical players, armies will get significantly more robot-based as soon as that becomes a realistic possibility, which will be relatively soon. There is, of course, extra pressure to reduce dependency on humans in armies. When armies are mostly robots, a superintelligence would have access to not just the internet, but also physical tools in order to subdue us.
What a strange thought, too. Is it even war anymore if human blood isn't being spilled? Or is it just glorified Beyblade competitions over natural resources?

The way this AI is currently being described, it's basically Pandora's box.
User avatar
Netherlands Goodspeed
Retired Contributor
Posts: 13002
Joined: Feb 27, 2015

Re: Chance of humanity being mostly extinct by 2070

Post by Goodspeed »

Dolan wrote: ↑
29 Apr 2023, 16:54
Goodspeed wrote: ↑
29 Apr 2023, 16:12
It's not really a stretch to think that a superintelligence that wants control will start to see us as the enemy, though, as we will be trying to retain control.
See? How. You still haven't explained where an AI/superintelligence could get such motivations and impulses without having an actual body. Currently an AI is at best the electronic embodiment of the brain-in-a-vat thought experiment. And I'm being very generous by saying "at best". In reality it's more like the embodiment of an abstraction of how cortical neurons and networks are thought to work.
By giving it goals. When it gets smart enough we will give it tasks to complete and then it will do everything in its power to complete those tasks. Try to imagine what would happen if you give it the wrong instructions, or the "right" instructions and it completes them in destructive ways.
User avatar
Nauru Dolan
Ninja
Posts: 13064
Joined: Sep 17, 2015

Re: Chance of humanity being mostly extinct by 2070

Post by Dolan »

Goodspeed wrote: ↑
29 Apr 2023, 18:44
Dolan wrote: ↑
29 Apr 2023, 16:54
Goodspeed wrote: ↑
29 Apr 2023, 16:12
It's not really a stretch to think that a superintelligence that wants control will start to see us as the enemy, though, as we will be trying to retain control.
See? How. You still haven't explained where an AI/superintelligence could get such motivations and impulses without having an actual body. Currently an AI is at best the electronic embodiment of the brain-in-a-vat thought experiment. And I'm being very generous by saying "at best". In reality it's more like the embodiment of an abstraction of how cortical neurons and networks are thought to work.
By giving it goals. When it gets smart enough we will give it tasks to complete and then it will do everything in its power to complete those tasks. Try to imagine what would happen if you give it the wrong instructions, or the "right" instructions and it completes them in destructive ways.
You mean a case of programming gone wrong? I doubt that'll be much of a problem, because it's a commercial product that will have a lot of guardrails put into place to keep it strictly used in controlled environments, mostly online services for which a customer has paid. So really it will become something like a boosted search and simulation engine that can quickly generate missing information, if the datasets on which it was trained included that particular cognitive context (or if it's literally available on a search engine).
France iNcog
Ninja
Posts: 13236
Joined: Mar 7, 2015

Re: Chance of humanity being mostly extinct by 2070

Post by iNcog »

https://www.theverge.com/2023/5/1/23706 ... s-warnings

We have two AI threads now, mom help.
“The idea that this stuff could actually get smarter than people — a few people believed that,” said Hinton to the NYT. “But most people thought it was way off. And I thought it was way off. I thought it was 30 to 50 years or even longer away. Obviously, I no longer think that.”
Of course, the article really doesn't have a lot of actual information in it besides the fact that Hinton says AI is dangerous.
User avatar
Netherlands Goodspeed
Retired Contributor
Posts: 13002
Joined: Feb 27, 2015

Re: Chance of humanity being mostly extinct by 2070

Post by Goodspeed »

Obviously he no longer thinks that.

I'd agree it's obvious. From the response I'm getting on here, I'm guessing most people wouldn't. People need to be humble and admit they don't understand the tech, and then listen to what the experts are saying.
User avatar
Nauru Dolan
Ninja
Posts: 13064
Joined: Sep 17, 2015

Re: Chance of humanity being mostly extinct by 2070

Post by Dolan »

The experts are biased and have their own interests too. Especially those employed by corporations, who need to hype their products because they're getting paid for it and they have shares in that business.
User avatar
Netherlands Goodspeed
Retired Contributor
Posts: 13002
Joined: Feb 27, 2015

Re: Chance of humanity being mostly extinct by 2070

Post by Goodspeed »

Uh huh, well this guy just quit his job so
User avatar
Nauru Dolan
Ninja
Posts: 13064
Joined: Sep 17, 2015

Re: Chance of humanity being mostly extinct by 2070

Post by Dolan »

Well, what is he an expert in that qualifies him to say AI "could actually get smarter than people", and what does that statement even mean?
There's a lot of software that is 'smarter' than people, and most people are dumb anyway; it's not like there's a really high standard to surpass.
User avatar
Nauru Dolan
Ninja
Posts: 13064
Joined: Sep 17, 2015

Re: Chance of humanity being mostly extinct by 2070

Post by Dolan »

OK, after reading a bit more about what Geoffrey Hinton says, I understand more about what his concerns with AI risks are. Essentially, he's saying that while AI software might not be programmed to have its own goals/motivations and to escape its programming, it's still possible that, due to its built-in gain-of-function capabilities, which are required for it to adapt to a cognitive task, it could form its own sub-goals that might not even be known to its handlers. That's a legitimate concern. Though maybe it's not as scary as everyone assumes it to be, since its behaviour could be monitored, and if it oversteps its configured goals, it could be shut down or reset. Then devs should go back to the drawing board and program guardrails into its functions so that whenever it tries to overstep its goals, it falls back to some kind of soft-landing procedure that terminates its current tasks with a common, tame result and reports the rest as an error.
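As a purely hypothetical illustration of that soft-landing idea (none of these names or checks come from any real product; it's just a sketch of the monitor-then-terminate pattern described above):

    # Hypothetical guardrail loop: every proposed action is checked against the
    # configured goals; on an overstep the task is terminated with a tame,
    # predictable result and the attempt is reported as an error.
    ALLOWED_ACTIONS = {"search", "summarize", "translate"}  # the configured goals

    def run_with_guardrail(proposed_actions):
        results = []
        for action in proposed_actions:
            if action not in ALLOWED_ACTIONS:
                # Soft landing: stop here, keep whatever safe work was done,
                # and surface the out-of-scope attempt instead of executing it.
                print(f"error: blocked out-of-scope action '{action}'")
                results.append("terminated with partial results")
                break
            results.append(f"completed '{action}'")
        return results

    print(run_with_guardrail(["search", "summarize", "self_replicate"]))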
France iNcog
Ninja
Posts: 13236
Joined: Mar 7, 2015

Re: Chance of humanity being mostly extinct by 2070

Post by iNcog »

There was another article that I came across that had a similar message but since ESOC logs me off every day (like wtf stop doing that?) I wasn't able to drop it into the thread.

The crux of the issue seems to be that the AI could end up with goals or sub-goals that are hidden from its users or handlers. This seems like it will happen, just due to its inherent design. After all, the whole point of advanced algorithms (of which genuine AI is a step above) is that we aren't actually sure of the system's internal workings, just that the system is doing what it's designed to do.

The whole "fix" to this issue would be a way to understand and map out the internal logic of such an AI. But at that point you're deciphering a genuinely alien language, assuming that the AI is even willing to give up such information in the first place. Not that it has a will of its own per se.

If anyone has good material to read on it (I'm just not knowledgeable enough at all) feel free to share.
User avatar
Netherlands Goodspeed
Retired Contributor
Posts: 13002
Joined: Feb 27, 2015

Re: Chance of humanity being mostly extinct by 2070

Post by Goodspeed »

So yeah, AI alignment is hard. Maybe I didn't explain it well
User avatar
Nauru Dolan
Ninja
Posts: 13064
Joined: Sep 17, 2015

Re: Chance of humanity being mostly extinct by 2070

Post by Dolan »

- moved to the other thread -
User avatar
Nauru Dolan
Ninja
Posts: 13064
Joined: Sep 17, 2015

Re: Chance of humanity being mostly extinct by 2070

Post by Dolan »

- moved to the other thread -
User avatar
Nauru Dolan
Ninja
Posts: 13064
Joined: Sep 17, 2015

Re: Chance of humanity being mostly extinct by 2070

Post by Dolan »

Tried to move the last 2 replies to the other thread, which is more about content creation, but there's another bug when you try to delete a reply.
Whatever. I'm not gonna report this one or musketeer will say I'm looking for edge case bugs to pester them with. Idgaf
