I first heard about the "technological singularity" from a friend living in Singapore. The essence of the idea is what it would mean if ordinary human intelligence were matched or perhaps overtaken by AI.
Since the early days of ReadyAI, I have been thinking more about this concept. The idea that human history is approaching a "singularity" - that we will someday be overtaken by artificial intelligence, by cognitively enhanced biological intelligence, or by a mixture of both - has moved from science fiction books and Hollywood movies into serious debate in our communities. Since first being introduced to the idea, I have read a few books on the topic that I found truly fascinating. Some authors and theorists predict that if the field of AI continues to expand at its current rate, the singularity could arrive by the middle of this very century; nonetheless, I don't see it happening that quickly.

In a brief but insightful book published in the MIT Press Essential Knowledge Series (I highly recommend reading other titles in the series), The Technological Singularity, Murray Shanahan offers an introduction to the idea and examines the consequences of such a shift. The book is very informative because the author's aim is not to make a prediction but rather to explore a variety of scenarios. Whether we believe the singularity is imminent or distant, likely or unlikely, apocalypse or utopia, the very idea raises critical philosophical and practical questions, forcing all of us to think seriously about what we want as a species.

The book does a great job describing technological advances in AI, covering approaches that are biologically inspired as well as those engineered from scratch. Once human-level AI - theoretically possible, but very challenging to accomplish - has been achieved, Shanahan explains, the shift to superintelligent AI could, in fact, be very fast. The book also carefully considers what the existence of superintelligent machines could mean for matters like personhood, rights, identity, and responsibility.
Just think about it: some superhuman AI agents might be created to benefit us, humankind; some might go rogue. (If you find this topic interesting, take a look at the book Life 3.0 by Max Tegmark.) The concept of the singularity presents both an existential threat to humanity and an existential opportunity for humanity collectively to transcend its limitations. The book is a great introduction that makes it clear we need to think about and debate both alternatives if we want to bring about the better outcome together.
Author: Roozbeh, born in Tehran, Iran (March 1984). April 2024.