Don't Fear the Singularity, Embrace the Multiplicity

HBO's Westworld both stokes fear of the singularity and points to some multiplicity

Have you heard Stephen Hawking and Elon Musk raising concerns about AI and the singularity? These are fears that others have voiced for many decades and that have filled science-fiction stories for even longer. The singularity is the term given to the point at which machine intelligence surpasses our own.

That point will arrive, though no prediction of when has so far been correct. A more reasonable approach seems to me to be what some have called the "multiplicity." That is a way of viewing what is coming as a time of humans working more closely with machines, rather than humans versus machines.

An article in Wired quotes UC Berkeley roboticist Ken Goldberg as saying that the multiplicity is "something that's happening right now, and it's the idea of humans and machines working together."

I know all the automotive buzz is about driverless cars, but today in my car, algorithms are guiding me to my destination, reminding me to stay in my lane, and gently applying the brakes and steering when I am less attentive than I should be. My new car seems to be constantly flashing and beeping about something. I fear that the more it does, the more it distracts me from driving. Okay, maybe it's not that bad.

It is one thing to put your learning into the virtual hands of algorithms, but I am already entrusting a bit of my safety in the car to them.

The multiplicity concept is not that new. A talk at Davos in 2015 pointed out that though there were then more than a million robots working in factories around the world, we still don't have them in our homes.

Hans Moravec pointed out three decades ago that "Tasks that are hard for humans, like precision spot welding, are easy for robots, while tasks that are easy for humans, like clearing the dinner table, are very hard for robots."

The hospital robots that deliver drugs and linens to nurses, and the ones in warehouses rolling 24/7 through the aisles scanning inventory or pulling out items for orders, haven't necessarily surpassed humans in intelligence. But they are willing to work all day and night without breaks or pay. Do all robots replace humans? Much research says no; they are more likely to enhance human workers or change what humans will do.

But the fear of the singularity remains.

Amazon's fulfillment centers use around 100,000 robots to bring products to people who are still better at packing them for shipping. Those clever robots still have trouble with simple human tasks like picking up things with their end effectors (hands).

The word multiplicity actually makes me think of a comedy film with Michael Keaton. In that Multiplicity, an overly busy human is able to clone himself multiple times in order to get done all the things he wants to do and still have time to live a life with his family.  

An update of that 1996 film would probably change cloning to robots. 

And that has really been the ultimate goal of AI and robotics: to empower humans, not replace them. But the job-killing robot scenario is a tough one to dispel, and you can find examples of jobs that have disappeared because of automation. San Francisco is supposedly considering a tax on robots that replace human workers.

Long before robots, automation threatened and replaced some human labor. The transition to common robot and AI use in our lives will likely be more gradual.

Yes, Westworld is scary, both in how the robots interact with humans, and in how the humans treat the robots.

When the singularity does arrive, make sure you know how to power down that robot.


ELIZA and Chatbots

The first time I encountered a chatterbot, it was ELIZA on the Tandy/Radio Shack computers in the first computer lab at the junior high school where I taught in the 1970s.

ELIZA is an early natural language processing program that came into being in the mid-1960s at the MIT Artificial Intelligence Laboratory. The original was by Joseph Weizenbaum, but there are many variations on it.

This was very early artificial intelligence. ELIZA is still out there, and I have seen a little spike in interest because she was featured in an episode of the TV show Young Sheldon. In the episode, "A Computer, a Plastic Pony, and a Case of Beer," which may still be available online, Sheldon and his family become quite enamored of ELIZA, though the precocious Sheldon quickly realizes it is a very limited program.

ELIZA was created to demonstrate how superficial human-to-computer communication was at that time, but that didn't mean that when it was put on personal computers, humans didn't find it engaging. Sure, kids had fun trying to trick it or cursing at it, but after a while you gave up when it started repeating responses.

In all the various forms I have seen it, the program still uses pattern matching and substitution. She (as people often personified ELIZA) gives canned responses based on a keyword you input. If you say "Hello," she has a ready response. If you say "friend," she has several ways to respond depending on what other words you used. Early users felt they were talking to "someone" who understood their input.
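That keyword-and-canned-response idea is simple enough to sketch in a few lines. The rules and responses below are illustrative inventions, not Weizenbaum's original script, but the pattern-matching-and-substitution mechanism is the same:

```python
# A minimal sketch of ELIZA-style pattern matching and substitution.
# The keywords and responses here are made up for illustration; the
# original MAD-Slip script was far larger and more nuanced.
import random
import re

RULES = [
    (r"\bhello\b", ["How do you do. Please state your problem."]),
    (r"\bfriend\b", ["Why do you bring up friends?",
                     "Do your friends worry you?"]),
    # Captured groups get substituted back into the reply ("reflection").
    (r"\bi am (.*)", ["How long have you been {0}?",
                      "Why do you think you are {0}?"]),
]
DEFAULT = ["Please go on.", "Tell me more."]

def respond(text):
    text = text.lower()
    for pattern, responses in RULES:
        match = re.search(pattern, text)
        if match:
            # Fill the canned response with whatever the user said.
            return random.choice(responses).format(*match.groups())
    return random.choice(DEFAULT)

print(respond("Hello"))     # the one canned greeting
print(respond("I am sad"))  # reflects "sad" back at the user
```

Because the program only ever looks for keywords, a few off-script questions quickly expose it, which is exactly what Sheldon figures out.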

ELIZA was one of the first chatterbots (later clipped to "chatbot") and an early candidate for the Turing Test. That test of a machine's ability to exhibit intelligent behavior equivalent to, or indistinguishable from, that of a human is not one ELIZA can pass by today's standards. ELIZA fails very quickly if you ask her a few complex questions.

The program is limited by the scripts in its code. The more responses you give her, the more variety there will be in her answers. ELIZA was originally written in MAD-Slip, but modern versions are often in JavaScript or other languages. Many variations on the original scripts were made as amateur coders played around with the fairly simple code.

One variation, called DOCTOR, was made to be a crude Rogerian psychotherapist who likes to "reflect" by turning questions back at the patient. This was the version my middle school students found fascinating, and my little programming club decided to hack the code and make their own versions.

Are chatbots useful to educators? They have their uses, though I don't find most of those applications to be things that will change education in ways I want to see it change. I would like to see them used for things like e-learning support and language learning.

If you want to look back at an early effort, you can try a somewhat updated version of ELIZA that I used in class at my NJIT website. See what ELIZA's advice for you turns out to be.


Wizards Unite in Augmented Reality

The Wizarding World of Harry Potter: This Way To Hogwarts

Remember all the coverage in summer 2016 around Pokémon Go? It was a big success for Niantic Labs: a great pairing of game design and a location-based augmented reality mobile experience with intellectual property that had a solid fan base. But not much has happened in the popular AR space since then.

I am not going out on a limb to predict that the big AR title for 2018 will probably be Harry Potter: Wizards Unite, an AR title being co-developed by Niantic and Warner Bros. Interactive's Portkey Games.

Harry Potter has a bigger fan base than the original Pokémon, and author J.K. Rowling has kept a close watch on the quality of things based on her Wizarding World. Using mobile phones and AR for a scavenger hunt in our real Muggle world, and using that phone to cast spells and find objects, fantastic beasts, and characters from the book series, is very likely to give Niantic another hit.

Some people touted Pokémon Go for getting kids outside as they wandered neighborhoods, parks and other places. Some people complained that these kids were tramping around their property. 

This gaming use of AR with kids (and some older kids) is certainly wonderful preparation for more serious uses: marketing and shopping experiences, as well as virtual tours in museums and other applications.

Niantic raised $30 million in funding for Pokémon Go. This time, they have $200 million from an investor funding round for Wizards Unite. That kind of money, along with a few Aberto and Alohomora spells, should go a long way toward opening the AR money door.

Immersive Learning Spaces

CAEE Immersive Classroom Concept

Immersive learning spaces will make use of augmented and virtual reality (AR and VR), but most attention on those technologies is around consumer use, especially gaming. What will be the other markets? Is education one of them?

Microsoft has been pushing its HoloLens AR headset as an enterprise product, so far mostly in industrial applications. Ford, for example, is using HoloLens headsets to improve its design process, allowing modifications of both its clay models and real cars to be viewed and modified on the fly, without having to re-sculpt or rebuild anything. ThyssenKrupp has been equipping service technicians with HoloLens headsets that show the faults they're trying to diagnose. Engineers can remotely annotate the physical infrastructure technicians are seeing and guide maintenance and repairs.

A recent EDUCAUSE article predicts that in another decade, "immersive technology will become nearly ubiquitous and virtually unnoticeable, embodied in our eyeglasses and other wearable devices. But before we get there, we have the exciting opportunity to build our understanding of pedagogical frameworks, design new physical and virtual learning spaces, and create transformative learning experiences with immersive technologies."       

VR and AR are found in some makerspaces in libraries and media centers, but thinking more creatively about their use in the design of learning spaces is still at an early stage.

Innovative spaces include both formal and informal opportunities for learning. Some of this requires physical spaces, but it also includes simple design choices such as offering a swivel chair for 360-degree viewing.

For education, pricing is an important factor for adoption and VR headset pricing is slowly but surely approaching costs that will make them more attractive for schools.

VR and AR: Transforming Learning and Scholarship in the Humanities and Social Sciences

Virtual Reality Devices – Where They Are Now and Where They’re Going

VR and AR: Driving a Revolution in Medical Education & Patient Care

AR and VR in STEM: The New Frontiers in Science

Humans Learning About Machines Learning

Deep learning might sound like what happens when we get really serious about a subject and go deeper into it. But it is not about the human brain; it is about machine learning. Also known as deep structured learning or hierarchical learning, it is part of the broader study of machine learning methods. It is about machines getting smarter on their own as they complete tasks.

The theories do look to biological nervous systems as models: neural coding attempts to define a relationship between various stimuli and the associated neuronal responses in the brain. The terms used are many. Deep learning architectures, deep neural networks, deep belief networks, and recurrent neural networks are all labels used in computer vision, speech recognition, natural language processing, audio recognition, social network filtering, machine translation, bioinformatics, and drug design. In each case, a machine is producing results once expected only from human experts.
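The core idea of a machine "getting smarter on its own" can be shown with a toy example. The tiny model below is nowhere near a deep network (it has just two adjustable numbers, and the data is invented for illustration), but the mechanism, gradient descent reducing the model's own error on examples, is the same one that trains deep neural networks:

```python
# Toy machine learning: the program is never told the rule behind the
# data; it adjusts its parameters w and b to shrink its own error.
# The "unknown rule" here is y = 2x + 1, invented for illustration.

data = [(x, 2 * x + 1) for x in range(-5, 6)]  # example input/output pairs

w, b = 0.0, 0.0   # initial guesses for the model y = w*x + b
lr = 0.01         # learning rate: how big each correction step is

for epoch in range(2000):
    for x, y in data:
        pred = w * x + b
        err = pred - y
        w -= lr * err * x   # gradient of squared error with respect to w
        b -= lr * err       # gradient with respect to b

print(round(w, 2), round(b, 2))  # ends up close to 2 and 1
```

A deep network replaces these two parameters with millions arranged in layers, which is what lets it learn speech, vision, and the other tasks listed above, but the learning loop is recognizably the same.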

Google's DeepMind artificial intelligence software has gotten a fair amount of press coverage. It has taught itself many things, including how to walk, jump, and run. In the press, it is best known for defeating the world's best players of the Chinese strategy game Go, but deep learning is more serious than that.

You can take a free, 3-month course on Deep Learning offered through Udacity, taught by Vincent Vanhoucke, the technical lead in Google's Brain team.

Machine learning is a fast-growing and exciting field of study, and deep learning is at its "bleeding edge." This course is considered an "intermediate to advanced level course offered as part of the Machine Learning Engineer Nanodegree program. It assumes you have taken a first course in machine learning, and that you are at least familiar with supervised learning methods."