The Next Great Decoupling: AI Takes Control

It may be achieved with digital computational devices (the computers you already know and love) that represent varying quantities symbolically as their numerical values change. Or it may be accomplished with analog computational devices (you probably don’t own an electronic analog computer as they don’t run common software), which use the continuously variable aspects of physical phenomena to solve problems. Or some digital-analog hybrid. Or we may have to wait for quantum computers (which promise a level of computational power said to be exponentially greater than previous technologies) to go online. Of course, big government, big corporations, or nation-states may corner the market on quantum computing and use it for control, but that is for science fiction writers, not me, to deal with.

Whatever technology ends up being tasked with (or seizing) artificial control, the thought of artificial control scares me way more than the thought of rogue artificial intelligence. Even Harari’s dystopian future of useless (conscious but not intelligent enough) humans doesn’t make the hair on the back of my neck stand up in the same way. Once something (conscious or not) achieves artificial control, we will be somewhere new.

Other than the purveyors of fear, uncertainty, and doubt, most people think about AI as just another tool – the way we think about a hammer or a drill. But AI is not just another tool. Hammers don’t think about human needs or consider what needs to be built. Humans control hammers.

Artificial control will be another thing altogether. It will score our human needs by positively reinforcing behaviors that help it achieve its goals (whatever they may be). Then it will give us more of what we become addicted to until it actually changes our behaviors – sort of like social media addiction. Oh, wait – a nascent version of artificial control may already be here.


Shelly Palmer is Fox 5 New York's On-air Tech Expert (WNYW-TV) and the host of Fox Television's monthly show Shelly Palmer Digital Living. He also hosts United Stations Radio Network's ...


Comments

Trisha Sanders 1 week ago Member's comment

No, no spoilers please!

Gary Anderson 1 week ago Contributor's comment

I know there is a Utopian cult surrounding self-driving cars. But they can't see in snow. They can't see where lanes are not clearly drawn. They can't handle traffic cones. They cannot make high-speed decisions. They cannot even operate at a 4-way stop. They cannot see so much. Humans can see.

Gary Anderson 1 week ago Contributor's comment

Wow, with all due respect, Hal is not on the way. Ford says self-driving cars will never be able to be fully autonomous. The hype is wearing off, but people are still trying hard.

Adam Reynolds 1 week ago Member's comment

And Bill Gates once said we'll never need more than 640kb of memory. Never say "never." That's a strong word. Maybe we're 5 years away, maybe 20, but we WILL get there.

Gary Anderson 1 week ago Contributor's comment

Ford said never. Misallocated investment?

Craig Richards 1 week ago Member's comment

I agree with Adam. I think we're a long way off, but only a fool says never. And if Ford said never... well then that was as short-sighted as what Bill Gates said all those years ago.

Barry Hochhauser 1 week ago Member's comment

Pretty much everything that people once said computers could never do well, they can do. They said they could never beat a person in chess. Done. They said speech recognition could never be mastered. Done. And so on and so on.

Gary Anderson 1 week ago Contributor's comment

But those are narrow uses of AI. There won't be Hal driving on freeways.

David J. Tanner 1 week ago Member's comment

Hal was due out back in 2001. Still nothing anywhere near resembling that level of artificial intelligence. Which I think is a good thing!