Machine Learning :: Human Learning

AI - “artificial intelligence” - was introduced as a term at a conference at Dartmouth College in 1956. Back then it was a theory, but in the past few decades it has become something well beyond theoretical.

The role of AI in education is still more theory than practice.

A goal in AI is to get machines to learn. I hesitate to say "think," but that is certainly a goal too. I am currently reading The Innovators: How a Group of Hackers, Geniuses, and Geeks Created the Digital Revolution, and that history includes a lot of discussion of people trying to get machines to do more than just compute (calculate) - to learn from their experiences without requiring a human to program those changes. The classic example is the chess-playing computer that gets better every time it wins or loses. Is that "learning?"

But has it had an impact on how you teach or how your students learn?

It may have been a mistake in the early days of AI and computers that we viewed the machine as being like the human brain. It is - and it isn't.

But neuroscientists are now finding that machine learning can also tell them more about human learning. An article on opencolleges.edu.au points to several interesting insights from machine and human learning research that may play a role in AI in education.

One thing that became clear is that humans learn to deal with the physical environment more easily than machines do. After a child has started walking or opened a few doors or drawers or climbed a few stairs, she learns how to do it. Show her a different door, drawer, or a spiral staircase and it doesn't make much of a difference. A robot equipped with some AI has a much steeper learning curve for these simple things. It also has a poor sense of its "body." Just watch any videos online of humanoid robots trying to do those things and you'll see how difficult it is for a machine.

Then again, it takes a lot longer for humans to learn how to drive a car on a highway safely. And even when it is learned, our attention, or lack thereof, is a huge problem. AI in vehicles is learning how to drive fairly rapidly, and its attention is superior to human attention. Currently, a human is still the fallback when something goes wrong in most cases, but that will certainly change in a decade or two. I learned to parallel park a car many years ago and I am still lousy at doing it. A car can do it better than me.

Although computers can do tasks they are programmed to do without any learning curve, for AI to work they need to learn by doing - much like humans. The article points out that AI systems that traced letters with robotic arms had an easier time recognizing diverse styles of handwriting and letters than visual-only systems. 

AI means a machine gets better at a task the more it does it, and it can also apply that learning to similar but not identical situations. You can program a computer to play notes, and even a series of notes as a song, but getting it to compose real music requires AI.
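
Here is a toy sketch of that "gets better the more it does it" idea - my own illustration, not anything from the article. The little Python program below repeatedly tries one of three moves, keeps a running estimate of how well each one pays off, and gradually favors the best one. The moves and payoffs are made up; the point is only that performance improves with repetition.

```python
# Toy example: a program that improves at a task through repeated trials.
import random

random.seed(1)
true_payoffs = {"a": 0.2, "b": 0.5, "c": 0.8}   # hidden from the learner
estimates = {move: 0.0 for move in true_payoffs}
counts = {move: 0 for move in true_payoffs}

def choose(epsilon=0.1):
    """Mostly exploit the best-known move, occasionally explore."""
    if random.random() < epsilon:
        return random.choice(list(true_payoffs))
    return max(estimates, key=estimates.get)

for trial in range(1, 2001):
    move = choose()
    reward = 1 if random.random() < true_payoffs[move] else 0
    counts[move] += 1
    # update the running average payoff for the chosen move
    estimates[move] += (reward - estimates[move]) / counts[move]

print(estimates)   # the estimate for "c" ends up highest, so "c" gets chosen most
```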

Humans also learn from shared experiences. A lot of the learning in a classroom comes from interactions between the teacher and students and student to student. This makes me feel pretty confident in the continued need for teachers in the learning process.

One day, I am sure that machines will communicate with each other and learn from each other. This may be part of the reason that some tech and learning luminaries like Elon Musk have fears about AI.

I would prefer my smart or autonomous vehicle to "talk" to other vehicles on the roads nearby and share information on traffic, obstructions, and the vehicles nearby that still have those quirky human drivers.

AI built into learning systems, such as an online course, could guide the learning path and even anticipate problems and offer corrections to avoid them. Is that an AI "teacher" or the often-promoted "guide on the side?"
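
What might that path-guiding look like in practice? Here is a hypothetical sketch - the module names, thresholds, and rules are all invented for illustration - of a system that looks at a student's recent quiz scores and decides whether to advance them, offer extra practice before trouble arrives, or send them back to review.

```python
# Hypothetical sketch of guided-learning-path logic (invented, illustrative only).
from statistics import mean

def next_step(scores, current_module):
    """Pick the student's next activity from their recent quiz scores (0-100)."""
    if not scores:
        return f"start {current_module}"
    recent = mean(scores[-3:])          # look at the last few quizzes
    trending_down = len(scores) >= 2 and scores[-1] < scores[-2]
    if recent < 60:
        return f"review prerequisites for {current_module}"
    if trending_down:
        return f"offer a practice set before continuing {current_module}"
    return f"advance to the module after {current_module}"

print(next_step([85, 78, 70], "Module 3"))   # scores slipping -> extra practice
print(next_step([55, 58, 52], "Module 3"))   # struggling -> review
print(next_step([80, 88, 92], "Module 3"))   # doing well -> advance
```

A real system would draw on far richer signals (time on task, prior history, predictive models), but the shape of the decision - watch the learner, anticipate the problem, offer the correction - is the same.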

This year on the TV show Humans, one of the human couples goes for marriage counseling with a "synth" (robot). She may be a forerunner of a synth teacher.

The counselor (back to us) can read the husband's body language and knows he does not like talking to a synth marriage counselor.

Virtual Reality Education and Flying Cars


The Holodeck

People love to use the prediction that we would all be using flying cars by the 21st century as an example of a future technology that never happened. Remember how virtual reality and augmented reality were going to change everything? So far, they haven't.

Last summer, Pokemon Go was huge and even though many people would dismiss it as a silly game, it was AR and seemed like it might change gaming and who knows what else. The promise, or perhaps more accurately the potential, of VR in education is also a popular topic. 

We know that the Internet enabled students to access materials from other institutions and to travel to distant places for their research. Virtual reality may one day change the ways in which we teach and learn. That has me thinking about "virtual reality education" - something I imagine to be unbound by physical spaces like classrooms or campuses, and unbound by time. That sounds like online learning, but it would go beyond online learning.

Remember the "holodeck?" Originally, it was a set from the television series Star Trek where the crew could engage with different virtual reality environments. It came back into my view with Janet Murray's book Hamlet on the Holodeck: The Future of Narrative in Cyberspace. She considered whether the computer might provide the basis for an expressive narrative form, in the way that print technology supported the development of the novel and, in the 20th century, film technology supported the development of movies.

And remember virtual worlds like Second Life and Active Worlds? I knew a number of educators and schools that made a real commitment to their use in education. I don't know of any of them that are still using virtual worlds.

I'm hopeful that VR, AR, or some version of a holodeck or virtual world will some day enhance education, but so far, I'm still operating in Reality Reality.

Swarming Artificial Intelligence

Unanimous A.I. (http://unanimous.ai) has developed what they call Artificial Swarm Intelligence: the UNU swarming platform and the Swarm Insight on-demand intelligence service. The UNU platform allows online groups to form real-time “human swarms” in order to tap collective knowledge, wisdom, and intuition. Distributed online groups can answer questions and make predictions in real time.

We have been talking about the wisdom of crowds for more than a decade. In 2004, The Wisdom of Crowds (with its very long subtitle "Why the Many Are Smarter Than the Few and How Collective Wisdom Shapes Business, Economies, Societies and Nations") popularized the idea. The term collective intelligence (CI) is also used for the shared or group intelligence that emerges from the collaboration and collective efforts of many individuals. This also leads to consensus decision making. CI appears in sociobiology and political science, as well as in crowdsourcing applications.

You'll also hear the term collective IQ, which is a measure of collective intelligence. We shouldn't be overly proud of our use of this since collective intelligence has also been attributed to bacteria and animals.

Unanimous A.I. builds on this idea of using the crowd's collective opinion rather than that of a single expert. They use this amplified intelligence to generate decisions, predictions, estimations, and forecasts, which can be seen in events such as the Kentucky Derby, the Oscars, the Stanley Cup, presidential elections, and the World Series. They call the technology Swarm AI™, using algorithms and interfaces modeled after swarms in nature.
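
To see why a crowd's aggregated opinion can beat a single expert, here is a toy simulation - my own illustration, not Unanimous A.I.'s Swarm AI algorithm - of the classic jellybean-jar guessing game: many noisy, independent guesses, once aggregated, usually land closer to the truth than a typical individual guess does.

```python
# Toy "wisdom of crowds" simulation: aggregate many noisy independent guesses.
import random
from statistics import median

random.seed(42)
true_value = 750          # e.g., the number of jellybeans in a jar
crowd = [random.gauss(true_value, 150) for _ in range(500)]   # 500 noisy guesses

crowd_estimate = median(crowd)
typical_individual_error = median(abs(g - true_value) for g in crowd)

print(f"crowd estimate: {crowd_estimate:.0f} (off by {abs(crowd_estimate - true_value):.0f})")
print(f"typical individual guess is off by {typical_individual_error:.0f}")
```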

Don't Fire the Humans Yet

AI is the thing everyone wants to use. Social media is in love with artificial intelligence. Of course, much as the cry went up when computers first appeared, some people say that "AI will take our jobs."

Facebook has almost 2 billion users. Those users post a lot of content. Mark Zuckerberg has made it clear that live video is a big part of the future of Facebook. But the company has come in for a lot of criticism for violent video posted this year, including murder and suicide.

How does Facebook (and other social media companies) decide what content violates its community standards? They are all desperately implementing and experimenting with AI, but they still rely mostly on humans.

Facebook announced recently that it is using an AI system designed to identify users contemplating suicide or self-harm. How? By using pattern recognition to determine if a post and its comments resemble previous posts identified as being about self-harm. Facebook is also including clearer options for reporting posts that appear to indicate self-harm. For now, it is people reporting posts and people deciding what is inappropriate.
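
To make "pattern recognition" a little more concrete, here is a rough sketch of the general approach - my own toy version, definitely not Facebook's system - that scores a new post by how similar its words are to posts human reviewers previously flagged, and queues it for human review when the similarity is high. The example posts and the threshold are invented.

```python
# Toy illustration of similarity-based flagging (not Facebook's actual system).
import math
from collections import Counter

def cosine_similarity(text_a, text_b):
    """Cosine similarity between two texts using simple word counts."""
    a, b = Counter(text_a.lower().split()), Counter(text_b.lower().split())
    shared = set(a) & set(b)
    dot = sum(a[w] * b[w] for w in shared)
    norm = math.sqrt(sum(v * v for v in a.values())) * math.sqrt(sum(v * v for v in b.values()))
    return dot / norm if norm else 0.0

# Posts that human reviewers previously identified as being about self-harm (invented).
previously_flagged = [
    "i can't do this anymore and want to hurt myself",
    "feeling like ending it all tonight",
]

def needs_human_review(post, threshold=0.3):
    """Queue the post for a human reviewer if it resembles earlier flagged posts."""
    return max(cosine_similarity(post, flagged) for flagged in previously_flagged) >= threshold

print(needs_human_review("i want to hurt myself and can't go on"))  # True
print(needs_human_review("great game last night, what a finish"))   # False
```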

AI-based image-recognition tools are assisting human moderators now. Can the 54,000 potential cases of sexually related extortion and revenge porn reportedly posted each month be found and deleted by AI? Not yet.

Did you see the film Hidden Figures? In the early 1960s, the mathematicians working at NASA were called "computers" - people who did computations. But those human computers also saw the entry into NASA of IBM mainframes that were better computers. They realized they would need to become the humans who could program those electronic computers if they wanted to keep working. Take note, Facebook and other companies - and anyone who wants to work for those companies: AI requires human intelligence.

After two murders were broadcast live on Facebook in April, Mark Zuckerberg announced that the company would add 3,000 employees to the 4,500 who already work on its community team reviewing reported videos. Live video is growing rapidly online, and Facebook Live lets the company's 1.9 billion monthly users broadcast video. Lawmakers in Germany and the UK have also been pressuring social networks to do a better job of removing illegal hate speech and to clamp down on fake news. The 3,000 new workers will monitor all Facebook content, not just live videos. The team will operate around the world and will most likely consist of virtual contract employees.

Just last week, Facebook's "leaked" guidelines for dealing with these types of situations became public; hopefully they can make a big difference in preventing suicide and other life-threatening situations.

Autonomous Vehicles and Autonomous Learning


One of the newer categories on this blog is for VR, AR and AI. They were not topics of much concern in education when I started writing here in 2006. They are topics of interest now. 

The same may be true of autonomous vehicles, and it is definitely true of what I'm calling autonomous learning.

You are more likely to hear news about "autonomous vehicles" rather than "driverless cars" these days. They are pretty much interchangeable, but the former doesn't sound as scary. In the way that "global warming" was replaced with "climate change," the newer terms are not only better public relations but also more accurate.

An autonomous vehicle (AKA driverless, auto, self-driving, robotic) is one that is capable of sensing its environment and navigating without human input. Many such vehicles are being developed, but as of this writing vehicles on public roads are not yet fully autonomous.

Many of the experimental cars and trucks you might see on the road (or, more likely, on the news) have a human along for the ride and ready to take over if needed. Initially we all heard about this future where you would get in a car, tell it your destination and sit back and relax. It was a taxicab without a driver. But more and more we are hearing about the autonomous vehicle with no human in it that might be delivering packages to locations. (No word on how they are unloaded. I guess you meet the vehicle at the curb.)

I was talking to a friend who has no involvement in education about an online course I was teaching and how MOOCs are being used. He said, "So, it's like an autonomous vehicle."

My first response was "No, it's not," but when I gave the idea a few moments, I saw his point.

You set up a good online course. It has AI elements and guided learning, predictive analytics and all the other tools. The student enters and goes along on their own. Autonomously. Teacherless.

Some archived MOOCs are already somewhat like this - though probably minus the AI and guidance systems.

I call this autonomous learning. If you search on that term today you are more likely to find articles about learner autonomy. This refers to a student's ability to set appropriate learning goals and take charge of his or her own learning. However, autonomous learners are dependent upon teachers to create and maintain learning environments that support the development of learner autonomy.

My friend and I took the vehicle:learner comparison further. The mixed or hybrid car will probably be with us for a few more decades. By hybrid I mean not only its fuel but also its driver-assist features. Part of the redundancy there includes the passenger as backup driver - a guide on the side. The car can park itself, but you might need to help in some situations.

Hybrid or blended courses are also going to continue to be around for a while. Like the vehicles, the fully automated course will be the experimental exception for a decade or two. But those kids in the college Class of 2037 have a very good chance of taking autonomous classes.

I will feel safe on the road with autonomous vehicles when ALL the vehicles are autonomous. Throw a few human drivers in there and the reliability drops. Do I feel the same about autonomous learning? Too early to say.


Exploring Virtual and Augmented Reality in Learning


Virtual reality, like rock n’ roll, is not something that can be described well. It must be experienced in order to be fully appreciated and understood.

Interestingly, it has been catching on among educators.

Since 2013, Emory Craig, Director of eLearning at the College of New Rochelle, and Maya Georgieva, Co-Founder and Chief Innovation Officer of Digital Bodies, have been presenting workshops on the topic. They’re working with developers, researchers and educators who are embracing the immersive learning technology, which seems to be on the cusp of widespread use...as well as being on the receiving end of a lot of hype.

Around the time Craig and Georgieva began exploring this emergent medium, the arrival of Google Glass seemed to have ushered in greater popularity. Georgieva was one of the educators to experiment with Google Glass. People suddenly had a wearable example of what could be tapped to create augmented reality (AR) or virtual reality (VR) experiences. The much-heralded yet now all-but-defunct product left its mark, as several key technological developments have sprung up to satisfy a new market.

One key development also came from the Internet giant: Google Cardboard. An accessible solution that was ‘easy to get into the hands of educators,’ Georgieva noted, it has helped to generate interest in the use of VR in the learning environment. With only a smartphone app and the inexpensive piece of cardboard, students can be transported to other worlds...




continue reading... "Outside the Boundaries: Exploring Virtual and Augmented Reality in Learning" by Kristi DePaul