Chance of humanity being mostly extinct by 2070
-
- Gendarme
- Posts: 5996
- Joined: Jun 4, 2019
Re: Chance of humanity being mostly extinct by 2070
Humanity going extinct is much less likely than we think. If 99% of humanity died, there would still be 80 million people around, roughly the same number of humans as lived in 500 BC, by most estimates. Humanity wasn't going extinct back then, was it? Even if 99.9999% of humans died, we would be at a level similar to humanity during the Toba catastrophe. A human extinction event would have to be beyond extreme; it would probably be something that truly wipes out basically all large animal life.
If you define it as <10 million people, I still think it's super unlikely. But even that would leave us VERY far from extinct. When the numbers drop below 10k, that's when we are really in trouble. So you'd need an event that is 1000 times more deadly than the one that gets us to <10 million people: the event that gets us to 10 million means roughly 1 in every 1k people survived, while getting below 10k means only about 1 in every 1000k survived.
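The survivor arithmetic above can be sketched quickly. This assumes a current population of roughly 8 billion, which the post doesn't state explicitly:

```python
population = 8_000_000_000  # assumed current world population

# Event that leaves about 10 million people alive
survivors_10m = 10_000_000
print(population / survivors_10m)  # -> 800.0, i.e. roughly 1 in 1k survives

# Event that leaves about 10 thousand people alive
survivors_10k = 10_000
print(population / survivors_10k)  # -> 800000.0, i.e. roughly 1 in 1000k survives

# The second event leaves 1000x fewer survivors than the first
print(survivors_10m / survivors_10k)  # -> 1000.0
```

So the "1 in 1k" and "1 in 1000k" figures in the post are round approximations of 1 in 800 and 1 in 800,000.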
Re: Chance of humanity being mostly extinct by 2070
Yes, the only way it happens is if something stronger than us is trying to kill us. Which might happen.
Re: Chance of humanity being mostly extinct by 2070
Unless aliens give us a visit, I really don't see how it will happen before 2070.
Re: Chance of humanity being mostly extinct by 2070
There are a lot of possible scenarios either way.
Re: Chance of humanity being mostly extinct by 2070
Yeah, it would be pretty hard to decimate us down to such a low number, mostly due to habitat discontinuity. You'd need a vector of destruction that can reach even uncontacted tribes, people living in villages in the jungle, those living in Siberia far from any modern infrastructure, without internet, many of them without smartphones or any contact with modern urban civilisation (shops, delivery services, public services, etc.). You might get to about 10 million humans on this planet if you add up all the uncontacted tribes and small communities living in faraway, isolated villages outside any modern civilisation: generally speaking, people that a global pandemic or an AI gone rogue couldn't reach.
A very large asteroid hitting the earth would be more powerful than a superintelligence running on electricity when it comes to having a good chance at wiping humans out.
Re: Chance of humanity being mostly extinct by 2070
Oh God, AI took over page 2, didn't it?
Asteroids can usually be predicted ahead of time, though I hope we get the technology to reroute them before an extinction-level one comes around. I don't disagree that AI getting out of hand could have a realistic shot at taking out humans in some capacity. AI could out-code us so hard that we could never have the internet again. Given the damage that a small USB key, or even a good phishing website, can do, I don't see how AI couldn't come up with something that is impossible to entirely get rid of. We'd lose our computers, which are too important a tool these days. Humans would go back to the stone age.
Re: Chance of humanity being mostly extinct by 2070
Something we also have to keep in mind is the upcoming revolution in robotics. Given the military budgets of major geopolitical players, armies will get significantly more robot-based as soon as that becomes a realistic possibility, which will be relatively soon. There is, of course, extra pressure to reduce dependency on humans in armies. When armies are mostly robots, a superintelligence would have access to not just the internet, but also physical tools in order to subdue us.
Re: Chance of humanity being mostly extinct by 2070
Dolan wrote: ↑29 Apr 2023, 11:48
Yeah, we are pretty hard to get decimated down to such a low number, mostly due to habitat discontinuity. […]

You wouldn't have to reach the really remote stuff. 10 million is still a pretty big number and humanity is fairly concentrated. But yeah, it would have to be a targeted and deliberate attempt to eliminate us. It's not really a stretch to think that a superintelligence that wants control will start to see us as the enemy, though, as we will be trying to retain control.
Re: Chance of humanity being mostly extinct by 2070
See? How? You still haven't explained where an AI/superintelligence could get such motivations and impulses without having an actual body. Currently an AI is at best the electronic embodiment of the brain-in-a-vat thought experiment. And I'm being very generous by saying "at best". In reality it's more like the embodiment of an abstraction of how cortical neurons and networks are thought to work.
Re: Chance of humanity being mostly extinct by 2070
Goodspeed wrote: ↑29 Apr 2023, 16:08
Something we also have to keep in mind is the upcoming revolution in robotics. […]

What a strange thought, too. Is it even war anymore if human blood isn't being spilled? Or is it just glorified Beyblade competitions over natural resources?
The way this AI is currently being described, it's basically Pandora's box.
Re: Chance of humanity being mostly extinct by 2070
Dolan wrote: ↑29 Apr 2023, 16:54
See? How. You still haven't explained where could an AI/superintelligence get such motivations and impulses without having an actual body. […]

By giving it goals. When it gets smart enough, we will give it tasks to complete, and then it will do everything in its power to complete those tasks. Try to imagine what would happen if you give it the wrong instructions, or the "right" instructions and it completes them in destructive ways.
Re: Chance of humanity being mostly extinct by 2070
Goodspeed wrote: ↑29 Apr 2023, 18:44
By giving it goals. When it gets smart enough we will give it tasks to complete and then it will do everything in its power to complete those tasks. […]

You mean a case of programming gone wrong? I doubt that'll be much of a problem, because it's a commercial product that will have a lot of guardrails put in place to keep it strictly in controlled environments, mostly online services a customer has paid for. So really it will become something like a boosted search and simulation engine that can quickly generate missing information, provided the datasets it was trained on included that particular cognitive context (or the information is literally available on a search engine).
Re: Chance of humanity being mostly extinct by 2070
https://www.theverge.com/2023/5/1/23706 ... s-warnings
We have two AI threads now, mom help.
Of course, the article really doesn't have a lot of actual information in it besides the fact that Hinton says AI is dangerous.

"The idea that this stuff could actually get smarter than people — a few people believed that," said Hinton to the NYT. "But most people thought it was way off. And I thought it was way off. I thought it was 30 to 50 years or even longer away. Obviously, I no longer think that."
Re: Chance of humanity being mostly extinct by 2070
Obviously he no longer thinks that.
I'd agree it's obvious. From the response I'm getting on here, I'm guessing most people wouldn't. People need to be humble and admit they don't understand the tech, and then listen to what the experts are saying.
Re: Chance of humanity being mostly extinct by 2070
The experts are biased and have their own interests too. Especially those employed by corporations, who need to hype their products because they're getting paid for it and they have shares in that business.
Re: Chance of humanity being mostly extinct by 2070
Uh huh, well this guy just quit his job so
Re: Chance of humanity being mostly extinct by 2070
Well, what is he an expert in that qualifies him to say AI "could actually get smarter than people", and what does this statement even mean?
There's a lot of software that is 'smarter' than people, and most people are dumb anyway; it's not like there's a really high standard to surpass.
Re: Chance of humanity being mostly extinct by 2070
OK, after reading a bit more about what Geoffrey Hinton says, I understand more about what his concerns with AI risks are. Essentially, he's saying that while AI software might not be programmed to have its own goals/motivations or to escape its programming, it's still possible that, due to its built-in gain-of-function capabilities, which are required for it to adapt to a cognitive task, it could form its own sub-goals that might not even be known to its handlers. That's a legitimate concern.

Though maybe it's not as scary as everyone assumes, since its behaviour could be monitored, and if it oversteps its configured goals, it could be shut down or reset. Then devs should go back to the drawing board and program guardrails into its functions, so that whenever it tries to overstep its goals, it follows some kind of soft-landing procedure that terminates its current tasks with some common, tame result and reports the rest as an error.
Re: Chance of humanity being mostly extinct by 2070
There was another article that I came across that had a similar message but since ESOC logs me off every day (like wtf stop doing that?) I wasn't able to drop it into the thread.
The crux of the issue seems to be that the AI could acquire goals or sub-goals that are hidden from its users or handlers. This seems likely to happen just due to its inherent design. After all, the whole point of advanced algorithms (of which genuine AI is a step above) is that we aren't actually sure of the system's internal workings, just that the system does what it's designed to do.
The whole "fix" to this issue would be a way to understand and map out the internal logic of such an AI. But at that point you're deciphering a genuinely alien language, assuming the AI is even willing to give up such information in the first place. Not that it has a will of its own, per se.
If anyone has good material to read on it (I'm just not knowledgeable enough at all) feel free to share.
Re: Chance of humanity being mostly extinct by 2070
So yeah, AI alignment is hard. Maybe I didn't explain it well
Re: Chance of humanity being mostly extinct by 2070
- moved to the other thread -
Re: Chance of humanity being mostly extinct by 2070
- moved to the other thread -
Re: Chance of humanity being mostly extinct by 2070
Tried to move the last 2 replies to the other thread, which is more about content creation, but there's another bug when you try to delete a reply.
Whatever. I'm not gonna report this one or musketeer will say I'm looking for edge case bugs to pester them with. Idgaf