Appropriate Caution But Overlooks Key Variables

Doc Huston
1 min read · Sep 7, 2016

While everything you say is appropriate, what is not included is as bad as or worse than the cautions you raise.

First, governments’ military and intelligence services around the world do not play by these rules. “First and fastest” is what rules, driven by fear of an AGI or ASI able to dominate all competitors. Second, once a machine-learning system can comprehend the Internet, its ability to outmaneuver humans becomes a real concern.

Said differently, any situation that leads to an AGI intelligence explosion can be just as bad as a flawed, wrong, or poor design.

Also, I suggest Barrat’s Our Final Invention, which is more readable than Bostrom.

My Medium publication, A Passion to Evolve, has a number of articles related to these issues, such as Why You Should Fear Artificial Intelligence-AI; Only 6 Possible Outcomes in Next 20 Years [4 are Bad]; and a longish but detailed discussion of the big-picture evolutionary context and risks, Macroscopic Evolutionary Paradigm.

Doc Huston
