A hammer is a lever. A wedge is a lever. A pulley is a lever. Each one increases a person's ability to do work, and more advanced machines allow us to do even more work. Computers and other advanced machines continue this process, which is why we call them useful.
“Give me a place to stand, and a lever long enough, and I will move the world.” (Archimedes)
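Archimedes' boast is just the law of the lever: effort times effort-arm equals load times load-arm. A minimal sketch, with illustrative numbers chosen for this example:

```python
# Law of the lever: effort * effort_arm = load * load_arm.
# A small force applied far from the fulcrum balances a large load close to it.

def force_to_lift(load: float, load_arm: float, effort_arm: float) -> float:
    """Effort force needed to balance a load, assuming an ideal rigid lever."""
    return load * load_arm / effort_arm

# A 1000 N load, 0.5 m from the fulcrum, lifted by an effort applied 5 m away:
print(force_to_lift(1000.0, 0.5, 5.0))  # -> 100.0 (N)
```

Ten times the arm length buys ten times the force; the trade-off is that the effort end must move ten times as far.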
With the increased risk that comes with increased leverage, the question is whether the increased safety from technology offsets it enough, and whether exponentially growing power is exponentially more dangerous.
Various simulation models suggest that if all nuclear weapons were launched today, the resulting nuclear winter could kill off 99% of the global population. We have never seen numbers like that, but financial markets paint a similar picture. Look at the volume of derivatives in today's markets, leveraged trades, the massive growth in options trading, and even the relatively benign fractional-reserve banking system we all use. When things go up, leverage multiplies the returns; when things go down, losses are multiplied in multiples far higher than would be possible without such leverage.
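The symmetry of that multiplication is easy to see in a toy calculation. A minimal sketch, using made-up numbers and ignoring borrowing costs, margin calls, and every other real-world detail:

```python
# Illustrative only: how leverage amplifies both gains and losses.

def leveraged_return(asset_return: float, leverage: float) -> float:
    """Return on equity for a position at the given leverage,
    ignoring borrowing costs for simplicity."""
    return asset_return * leverage

equity = 100.0   # starting capital
lev = 10.0       # 10x leveraged position

up = leveraged_return(0.05, lev)     # asset gains 5%  -> +50% on equity
down = leveraged_return(-0.11, lev)  # asset drops 11% -> -110%: wiped out

print(f"+5% move:  equity -> {equity * (1 + up):.2f}")
print(f"-11% move: equity -> {equity * (1 + down):.2f}")
```

At 10x leverage, a modest 11% dip does not just dent the position; it destroys more than all of the original capital, which is the asymmetry the paragraph above is pointing at.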
All of the Top Risks to Humanity Come from Advanced Technology
Several organizations attempt to define existential risks to humanity, meaning risks so great that they threaten to wipe out most or all humans. These include the Global Challenges Foundation, the Future of Humanity Institute, and the Centre for the Study of Existential Risk, and their top concerns are essentially the same. Notice the top five on this list.
- Artificial intelligence: machines smarter than us could replace us, intentionally or unintentionally.
- Synthetic biology: biowarfare or bioterrorism become much riskier when the ability and cost of engineering viruses and pathogens decrease.
- Extreme climate change: caused by humans, of course
- Nanotechnology: microscopic robots
- Nuclear war
- Major asteroid impact
- Global pandemic
- Super-volcano
- Ecological collapse
- Global system collapse
- Bad global governance
More important is the realization that several of these risks can act in concert. For example, a bad government builds a totalitarian state using AI, robots, and nanotech, and then manufactures bioweapons. As for natural events like an asteroid impact or a major volcanic eruption, I am less concerned: they have a low probability of occurring, a lower probability of causing total destruction, and far less likelihood of creating a permanent dystopia.
Note that current technology can make it easier to create a totalitarian state. As Zygmunt Bauman noted in “Modernity and the Holocaust,” killing on an efficient, industrial scale would not have been possible without advanced technology and processes (need quote).
German industry and government bureaucracy perfected a process that killed off 60% of the Jewish population, and had the Germans continued to win, the genocide would likely have been finished. Most importantly, this event and others, like the Holodomor and the nuclear bombings of Hiroshima and Nagasaki, show that killing off the entire population of Earth is far more feasible than it has ever been.
With AI improving itself, it is common sense that it will become capable of planning and implementing the best, or worst, plans in the history of humanity, at a scale we cannot understand.
With godlike AI capabilities in the hands of evil people, I am genuinely concerned about the ability of one person, or a handful of people, to destroy much of the world. Keeping that leverage away from the bad guys is the real challenge. It has never been done, so I doubt the problem will ever be solved.