Sunday, July 6, 2025
Book Reviewed: More Everything Forever: AI Overlords, Space Empires, and Silicon Valley's Crusade to Control the Fate of Humanity by Adam Becker
The billionaires, and their last product, AI, may determine the evolution of humans
This book reads more like the writings of a false diviner warning us about an impending boom that will bring a catastrophic end to humanity, which may then be replaced by intelligent machines: AI with no ethics or empathy. After all, what is virtue as humans define it? Are we too afraid to die? The author notes that many of the billionaires engaged in these futuristic AI technologies, including Musk, Bezos, Altman, Kurzweil, Thiel, and Andreessen, harbor desires for immortality.
Recently, a number of books have been written on futurology in an AI-dominated world. This is a product of the influence of tech billionaires and venture capitalists seeking to shape the future of humanity without oversight or ethical constraints. Longtermism (prioritizing far-future outcomes over present suffering) is used to justify risky ventures such as space colonization and AI creation. Almost all of these books read like the words of false prophets sounding the alarm. Here the author discusses how ethics and existential-risk reasoning are leveraged to legitimize funneling vast resources into speculative futures. Elon Musk's Martian ambitions, and his vision of establishing a self-sustaining city on Mars, are controversial on grounds of technical feasibility, funding, ethics, and underlying motivations. Not long ago, Musk suggested that “if we nuke Mars we can terraform it.” It is extremely arrogant for a human being to make such an irresponsible suggestion.
Recently, Eliezer Yudkowsky, a decision theorist at the Machine Intelligence Research Institute in Berkeley, CA, wrote a Time op-ed responding to an open letter that called for all AI labs to immediately pause, for at least six months, the training of powerful AI systems. He observed that a nuclear war could erupt from even a slight misalignment of Artificial General Intelligence (AGI), and many errors besides nuclear Armageddon could arise from AGI. As more Islamic countries build nuclear weapons, the likelihood grows that they will use them for the flimsiest of reasons, such as blasphemy, the Ummah, or the destruction of Israel. The Russian foreign minister has already said that Iran could buy nuclear warheads from Russia, and more Arab countries will join the frenzy of building their own nuclear warheads.
During a discussion on immortality, the author seems to suggest that there is an urgent need to “grab as much low-entropy matter and energy as possible” to overcome the fear of death. This will never happen, because the idea runs up against the second law of thermodynamics: the total entropy of the universe can only increase, so the stock of usable low-entropy matter and energy is finite and can only be spent, not replenished. AI has to do everything within the constraints of the laws of physics in four-dimensional spacetime. However, AI could redefine immortality and offer indirect paths such as digital immortality (mind uploading and emulation), biological life extension with AI-driven therapies, cloning and memory transfer, and cryonics with future AI revival. All of these are currently speculative.
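For readers who want that constraint in symbols, a standard textbook statement of the second law (my own illustrative restatement, not taken from the book) is:

    % For an isolated system, such as the universe as a whole,
    % total entropy never decreases; low-entropy resources can only be consumed.
    \Delta S_{\text{total}} = \Delta S_{\text{system}} + \Delta S_{\text{surroundings}} \ge 0

In other words, any scheme for hoarding low-entropy matter and energy is drawing down a fixed budget, not escaping it.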