On Sat, Jun 25, 2016 at 4:32 AM, Tim <xxxxxx@little-possums.net> wrote:
So virtually overnight, there's a billion copies of some AI in robot
bodies, of unknown internal thought processes and motivations, and
probably thousands of copies embodied as super-AIs.  So the society
ends up with a huge monoculture of poorly understood beings, who can
easily copy themselves into commodity hardware, and some who can think
and learn just as well and very much faster than any human.

That's not an automatic recipe for disaster, but it's fertile ground
for one.

<snip>
 
, but the
assumption that strong AI can inhabit cheap hardware as soon as it is
developed does seem to be a common theme across a lot of science
fiction, both optimistic and horribly grim.


Sometimes both at once . . . in fact, you've just given capsule summaries of Robopocalypse and its sequel, Robogenesis:

https://en.wikipedia.org/wiki/Robopocalypse


--
Richard Aiken

"Never insult anyone by accident."  Robert A. Heinlein
"I studied the Koran a great deal. I came away from that study with the conviction there have been few religions in the world as deadly to men as Muhammed." Alexis de Tocqueville (1843)
"We know a little about a lot of things; just enough to make us dangerous." Dean Winchester
"It has been my experience that a gun doesn't care who pulls its trigger." Newton Knight (as portrayed by Matthew McConaughey), to a scoffing Confederate tax collector facing the weapons held by Knight's young children and wife.