Today in pulp I ask the question: given the increasing power and popularity of AI systems, should we now fear The Singularity - a super-powered artificial intelligence that will evolve to rule over us forever more?
I'm going to say no for this one, and here's why...
The Technological Singularity is a hypothetical point in the future when technological growth becomes uncontrollable and irreversible, resulting in unfathomable changes to human civilization!
In short, an upgradable intelligent agent would enter a "runaway reaction" of self-improvement cycles: an intelligence explosion would occur resulting in a powerful superintelligence that would qualitatively far surpass all human intelligence.
There's a tradition in the UK whereby every TV show that might be watched by kids issues a Christmas compendium hardback book.
These were normally knocked out under licence by publishing companies that a) never watched the shows, and b) didn't really like kids.
For example: the Doctor Who Christmas annual, whose artwork has long been a source of puzzlement to children. "Who's that weirdo on the cover?" kids would cry every 25th December. "Has he regenerated into the Child Catcher?" They'd then proceed to draw a nob on all the Daleks.