
It’s a whole new world based on Artificial Intelligence (AI)

By William Lama, Ph.D.

Life in PV has changed. The runners and bikers going by my house, usually a chatty crowd, are instead spread out single file, roughly a car length apart. Communication is limited to an occasional shout of “car back!” Walking my Westies up the street, I shout out to my neighbor, who is FaceTiming with her grand-baby-boy Brooks. My daughter is doing the same with her two grand-baby-girls, Kyra and Rubylynne, in Bend, Oregon. They’re social distancing using the most fabulous social media app ever invented. It’s a whole new world based on Artificial Intelligence (AI), one that’s even working to defeat the coronavirus.

Drug development typically takes a decade with a price tag of $2 billion and failure rates over 90%. “We can substantially accelerate this process using AI and make it much cheaper, faster, and more likely to succeed,” says Alex Zhavoronkov, CEO of Insilico Medicine. (“Five Companies Using AI to Fight Coronavirus,” IEEE Spectrum, March 20, 2020).

To appreciate the wonders of AI, consider this thought experiment borrowed from Rodney Brooks, “The Seven Deadly Sins of AI Predictions,” MIT Technology Review, 2017. Think of Isaac Newton, who among other things explained the law of gravity. Now imagine that Newton time travels to an apple grove on the Apple campus in Cupertino, California. An iPhone 11 falls out of a tree into Newton’s hands. Being a helpful chap, you show him how to turn it on so that the screen is glowing and full of icons. Now play a movie of an English country scene, and then some church music he would have heard. Show him a Web page of his masterpiece, Principia, and teach him how to pinch to zoom in on details. Show him how the magical device can take photos and movies and record sound, and how it counts steps as he carries it. Demonstrate how it can be used to do computations at incredible speed. Best of all, tell Newton that he can use the iPhone to talk to people anywhere in the world, and FaceTime with his great-great-great-…-grandkids. To Newton it would seem like magic.

But it’s all based on artificial intelligence. According to Wikipedia, AI describes computer applications that mimic “cognitive” functions associated with the human mind, such as learning and problem-solving. Applications generally classified as AI include speech understanding (“Hey Siri”), competing in games such as chess and Go, driverless cars, drug development, and intelligent routing. Newton would get a kick out of Siri, the iPhone’s voice recognition feature and voice-controlled minion.

And there is so much more in the world of “tech.” Just think of all the buzzwords: 5G, the cloud, Moore’s law, neural nets, brain computing, quantum entanglement, to mention a few. So, while sheltering in place I researched a few of those high-tech things, from the perspective of a guy who used a (non-digital) slide rule through his school years.

In one of the more amazing AI developments, everything we use is becoming connected. You will recognize some of the AI hubs in this “Internet of Things” (refrigerators, home lighting, and someday, robots).

Google Images, Internet of Things

I feel sure that Newton would be an “early adopter.” On the other hand, some very smart people tell us to be very worried about AI. Stephen Hawking told Wired, “I fear that AI may replace humans altogether. If people design computer viruses, someone will design AI that replicates itself. This will be a new form of life that will outperform humans.” Edward Clint says not to worry. “We are the shapers of whatever AI we wish to make. This means we can expect them to be more like our favorite domesticated species, the dog. We bred dogs to serve us and we will make AI to serve us, too.” (Edward Clint, “Irrational AI-nxiety,” Quillette 12/14/17)

On the other hand, there is a real concern about the effect of AI on our brains. “I have felt the powerful effects of modern technology on my own brain. I recall as a young adult, prior to having a laptop or smart-phone, visiting the library and experiencing the intense wonder and serenity produced by the books that surrounded me. I never experienced the anxiety or loss of focus at the library that I do today when I skim through massive amounts of information on the Internet. Are AI applications causing us to lose our concentration, attention, and our ability for linear, deep, and critical thought?” (Anna Marieta Moise, “Rational AI-nxiety: A Counter-Argument,” Quillette, 12/21/17) It’s well known that too much screen time can be harmful, especially to the developing brain. So it’s wise to use tech prudently.

On the technical side, AI is hungry for data, massive amounts of data. For example, Facebook’s facial recognition system “DeepFace” was trained on four million images uploaded by Facebook users. Training at that scale requires massive computing power. Is the tech industry up to the task?

For 50-plus years the digital electronics industry has experienced an exponential increase in performance at fixed cost and power. Computer chips are stuffed full of transistors, the on/off switches that represent the 0/1 of digital bits. Gordon Moore’s law describes the doubling of the number of transistors on a chip each processor generation, roughly every two years. This is not a physical law but a reflection of the fact that the minimum feature size in the photolithography process shrinks by about 30 percent each processor generation, doubling the transistor density and halving the cost per transistor. This figure shows Moore’s law performance from the introduction of the first Intel microprocessor in 1971 until 2010. The transistor count reached 40 billion in an AMD chip in 2019.

Google images, Moore’s Law
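For readers who like to check the arithmetic, here is a short Python sketch using two data points: the first Intel microprocessor (the 4004, with roughly 2,300 transistors, in 1971) and the 40-billion-transistor AMD chip of 2019. The implied doubling time comes out to almost exactly two years:

```python
import math

# Back-of-the-envelope check of Moore's law from two data points:
# the Intel 4004 (~2,300 transistors, 1971) and the ~40-billion-transistor
# AMD chip of 2019.
t0, year0 = 2_300, 1971
t1, year1 = 40_000_000_000, 2019

doublings = math.log2(t1 / t0)                     # how many times the count doubled
years_per_doubling = (year1 - year0) / doublings

print(f"doublings: {doublings:.1f}")                    # ~24.1
print(f"years per doubling: {years_per_doubling:.2f}")  # ~2.0, as Moore predicted
```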

However, silicon lithography techniques are beginning to approach their limits, and Moore’s law has run into a rock in the tech road. Arthur Rock’s law predicts that the cost to build a semiconductor chip fabrication plant doubles every four years, taking some of the shine off Moore’s law. So what’s a geek to do? Fortunately, there are many technology developments with the potential to extend Moore’s law.

Cloud computing relies on sharing a pool of remote servers hosted on the Internet to store, manage, and process data, rather than deploying local or personal hardware and software. The cloud is enabling the development of the “Internet of Things” and other data-intensive applications. In-memory computing systems store data in RAM, spread across a cluster of computers, and process it in parallel. The result is data processing more than 100 times faster than disk-based solutions. Typical uses include fraud detection, “blockchain” technology (which allows digital information to be distributed but not copied), and applications involving geospatial processing for transportation.
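As a toy illustration of the in-memory, parallel idea (a minimal sketch, not any particular product), here is a Python snippet that holds a dataset in RAM, splits it across worker processes, and sums the chunks in parallel:

```python
from multiprocessing import Pool

def chunk_sum(chunk):
    # Each worker sums its own in-memory slice of the data.
    return sum(chunk)

if __name__ == "__main__":
    data = list(range(10_000_000))      # the whole dataset, held in RAM
    n_workers = 4
    size = len(data) // n_workers
    chunks = [data[i * size:(i + 1) * size] for i in range(n_workers)]

    with Pool(n_workers) as pool:
        partial_sums = pool.map(chunk_sum, chunks)  # chunks processed in parallel

    print(sum(partial_sums))    # same answer as sum(data), no disk involved
```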

Quantum computers employ subatomic quantum bits (qubits) that exist in a continuum of levels between the “zero” and “one” of digital bits. Qubits can represent both 0 and 1 at the same time; that’s superposition. Each qubit influences the other qubits around it, working together to arrive at a solution; that’s entanglement. Superposition and entanglement are what give quantum computers the ability to process so much more information so much faster. A quantum computer with as few as 72 qubits could solve problems that would be impossible for a digital computer (that’s “quantum supremacy”). However, quantum computers are really terrible at the basic functions digital computers excel at, such as reading, writing, and arithmetic. And there is no such thing as a quantum hard drive. Thus the quantum computer is expected to be an accelerator for certain types of problems rather than a really fast, really powerful general-purpose computer. Quantum computers are also complex beasts. Here are the guts of the IBM research quantum computer. Doesn’t it take you back to the days when computers filled a room?

Google images, IBM quantum computer

The glowing metal is real gold, used for its excellent conductivity. The quantum chips are placed at the bottom, under the lowest circular plate. And this whole giant thing is enclosed within a cryogenic refrigerator at a temperature as low as 1 mK (about −273 °C, or −460 °F).
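Superposition and entanglement may sound mystical, but simulating them classically takes only a few lines of linear algebra. Here is a minimal NumPy sketch of a one-qubit superposition and a two-qubit entangled (Bell) pair; it runs on an ordinary computer, of course, not quantum hardware:

```python
import numpy as np

# A qubit is a 2-component complex state vector; gates are matrices.
zero = np.array([1, 0], dtype=complex)                       # the digital "0"
H = np.array([[1, 1], [1, -1]], dtype=complex) / np.sqrt(2)  # Hadamard gate

# Superposition: one qubit becomes an equal mix of 0 and 1.
plus = H @ zero
print(np.abs(plus) ** 2)        # [0.5 0.5] -> 50/50 odds of measuring 0 or 1

# Entanglement: a CNOT gate on (H|0>) and |0> yields a Bell pair.
CNOT = np.array([[1, 0, 0, 0],
                 [0, 1, 0, 0],
                 [0, 0, 0, 1],
                 [0, 0, 1, 0]], dtype=complex)
bell = CNOT @ np.kron(plus, zero)
print(np.abs(bell) ** 2)        # [0.5 0 0 0.5]: only 00 and 11 occur,
                                # so measuring one qubit fixes the other
```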

Neuromorphic computers (NC) mimic the physical structure and signal-processing techniques of mammalian brains by arraying processor and memory units in neural networks. One application of NC is autonomous vehicles, where the AI system must incorporate the expertise that humans develop as experienced drivers: the ball that children are playing with could roll into the street and one of the kids may chase it; one needs to be wary of an aggressive driver in the next lane. Decision making in real-life scenarios depends on the correct perception and understanding of the environment to predict future events, thereby enabling the correct, instantaneous course of action. Intel’s neuromorphic processor, Pohoiki Beach, contains 8 million neurons in 64 chips, delivering 1,000 times the performance and 10,000 times the efficiency of conventional CPUs on such workloads. Its 8 million neurons have been characterized as roughly equivalent to the brainpower of a rodent. The human brain, on the other hand, is estimated to have close to 100 billion neurons.

Google images, Intel Pohoiki Beach
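The basic unit these chips wire together in silicon is the artificial neuron: a weighted sum of inputs that either “fires” or doesn’t. Here is a minimal Python sketch, with made-up numbers purely for illustration:

```python
import numpy as np

def neuron(inputs, weights, bias):
    # Weighted sum of inputs plus a bias, passed through a threshold:
    # the neuron either "fires" (1) or stays quiet (0).
    return 1 if np.dot(inputs, weights) + bias > 0 else 0

inputs = np.array([0.9, 0.1, 0.4])     # e.g., sensor readings (illustrative)
weights = np.array([0.7, -0.2, 0.5])   # learned connection strengths
print(neuron(inputs, weights, bias=-0.5))   # prints 1: this neuron fires
```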

I seem to have strayed a bit from my theme. So it’s time to resume social distancing in Palos Verdes. My favorite social media app is Bridge Base Online. Linda and I (in separate rooms) play with people all over the world, with no fear of catching or spreading the virus. We actually met playing bridge online when Linda lived in Kansas, but that’s another story for another time.

Dr. William Lama earned his PhD in physics from the University of Rochester. He taught physics in college and worked at Xerox as a principal scientist and engineering manager. Upon retiring, he joined the PVIC docents; served on the board of the RPV Council of Home Owners Associations; served as a PV Library trustee for eight years; served on the PV school district Measure M oversight committee; and was president of the Malaga Cove Homeowner's Association. He writes about science, technology, and politics, mostly for his friends.

email: wlama@outlook.com