
Natural Awakenings Richmond

Avoiding the Singularity



The singularity is a hypothetical future point at which technological growth becomes uncontrollable and irreversible. It was long thought to be decades away, giving humanity time to plan, but Ben Goertzel, Ph.D., CEO of SingularityNET and a leading artificial-intelligence scientist, predicts that the singularity is fewer than 10 years away.


Goertzel believes that the advent of artificial general intelligence (AGI) is just around the corner, citing the progress made by large language models such as Meta’s Llama 2 and OpenAI’s GPT-4. These systems have increased global enthusiasm for AGI, drawing more resources, money and human energy into its development. AGI could create or modify its own algorithms, essentially teaching itself, something that currently available artificial intelligence cannot do.


Despite the numerous benefits AGI could bring, some people are concerned about the risks the technology poses. Detractors worry that AGI could become more intelligent than humans, leading to drastic, unforeseeable changes in civilization. While it is difficult to predict exactly when AGI will become a reality, it is important to weigh the ethical implications of the technology now, ensuring that its development aligns with human values and does not lead to unintended negative consequences.