Event-Based Internet

Event-based Internet is going to be something you will hear more about this year. Though I had heard the term used, the first real application of it that I experienced was a game. But don't think this is all about fun and games. Look online and you will find examples of event-based Internet biosurveillance and event-based Internet robot teleoperation systems and other very sophisticated uses, especially connected to the Internet of Things (IoT).

What did more than a million people do this past Sunday night at 9pm ET? They tuned in on their phones to HQ Trivia, a live game show.

For a few generations that have become used to time-shifting their viewing, this real-time game is a switch. 

The HQ app has had early issues scaling to those big numbers: game delays, video lag and times when the game just had to be rebooted. But it already has at least one imitator, called "The Q," which looks almost identical in design, and imitation is supposed to be a form of flattery.

This 12-question trivia quiz has money prizes. Usually, the prize is $2,000, but sometimes it jumps to $10K or $20K. But since multiple players usually survive all 12 questions and split the winnings, the prizes are often less than $25 each.

Still, I see the show's potential. (Is it actually a "show"?) The business model? Sponsors, commercial breaks, and product placement in the questions, answers and banter between questions.

The bigger trend here is that this is a return to TV "appointment viewing."  Advertisers like that and it only really occurs these days with sports, some news and award shows. (HQ pulled in its first audience of more than a million Sunday during the Golden Globe Awards, so...) 

And is there some education connection in all this?  Event-based Internet, like its TV equivalent, is engaging. Could it bring back "The Disconnected" learner?  

I found a NASA report on "Lessons Learned from Real-Time, Event-Based Internet Science Communications."  This report is focused on sharing science activities in real-time in order to involve and engage students and the public about science.

Event-based distributed systems are being used in areas such as enterprise management, information dissemination, finance, environmental monitoring and geo-spatial systems.
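At the core of all these systems is the same publish/subscribe idea: producers emit named events, and any number of subscribers react when a matching event occurs. Here is a minimal sketch of that pattern; the names (`EventBus`, `subscribe`, `publish`) are illustrative, not from any particular library.

```python
# Minimal publish/subscribe event bus: the mechanism behind event-based systems.
from collections import defaultdict
from typing import Any, Callable

class EventBus:
    def __init__(self) -> None:
        # Map each event name to the list of handlers listening for it.
        self._handlers: dict[str, list] = defaultdict(list)

    def subscribe(self, event_name: str, handler: Callable[[Any], None]) -> None:
        self._handlers[event_name].append(handler)

    def publish(self, event_name: str, payload: Any) -> None:
        # Fan the event out to every subscriber, in subscription order.
        for handler in self._handlers[event_name]:
            handler(payload)

# Example: one "question_opened" event reaches every player at once,
# which is exactly what a live trivia game needs.
bus = EventBus()
log: list = []
bus.subscribe("question_opened", lambda q: log.append(f"player A sees: {q}"))
bus.subscribe("question_opened", lambda q: log.append(f"player B sees: {q}"))
bus.publish("question_opened", "Q1: What is edge computing?")
```

The design choice that matters is decoupling: publishers never know who is listening, so subscribers can be added or removed without touching the producer.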

Education has been "event-based" for hundreds of years. But learners have been time-shifting learning via distance education and especially via online learning for only a few decades. Event-based learning sounds a bit like hybrid or blended learning. But one difference is that learners are probably not going to tune in and be engaged with just a live lecture. Will it take a real event and maybe even gamification to get live learning? 

In all my years teaching online, I have never been able to get all of a course's students to attend a "live" session, whether because of time zone differences, work schedules, or perhaps content that just wasn't compelling enough.

What will "Event-based Learning" look like?

Edge Computing

I learned about edge computing a few years ago. It is a method of getting the most from data in a computing system by performing the data processing at the "edge" of the network. The edge is near the source of the data, not at a distance. By doing this, you reduce the communications bandwidth needed between sensors and a central datacenter. The analytics and knowledge generation are right at or near the source of the data.
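The bandwidth argument can be shown in a few lines: rather than streaming every raw sensor reading to a central datacenter, an edge node summarizes locally and uploads only the summary. This is a hedged sketch; the function name and the summary fields are hypothetical.

```python
# Edge-side aggregation: reduce many raw readings to one small message.
def summarize_at_edge(readings: list) -> dict:
    """Summarize raw sensor readings locally instead of shipping them all upstream."""
    return {
        "count": len(readings),
        "min": min(readings),
        "max": max(readings),
        "mean": sum(readings) / len(readings),
    }

# 1,000 raw temperature readings become a single 4-field summary,
# cutting what crosses the network by orders of magnitude.
raw = [20.0 + (i % 10) * 0.1 for i in range(1000)]
summary = summarize_at_edge(raw)
```

The same idea scales up: the more sensors per site, the more the network saves by sending analytics results rather than raw data.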

The cloud, laptops, smartphones, tablets and sensors may be new things but the idea of decentralizing data processing is not. Remember the days of the mainframe computer?

The mainframe is/was a centralized approach to computing. All computing resources are at one location. That approach made sense once upon a time when computing resources were very expensive - and big. The first mainframe in 1943 weighed five tons and was 51 feet long. Mainframes allowed for centralized administration and optimized data storage on disc.

Access to the mainframe came via "dumb" terminals or thin clients that had no processing power. These terminals couldn't do any data processing, so all the data went to, was stored in, and was crunched at the centralized mainframe.

Much has changed. Yes, a mainframe approach is still used by businesses like credit card companies and airlines to send and display data via fairly dumb terminals. And it is costly. And slower. And when the centralized system goes down, all the clients go down. You have probably been in some location that couldn't process your order or access your data because "our computers are down."

It turned out that you could even save money by setting up a decentralized, or “distributed,” client-server network. Processing is distributed between servers that provide a service and clients that request it. The client-server model needed PCs that could process data and perform calculations on their own in order for applications to be decentralized.

Google car

Google Co-Founder Sergey Brin shows U.S. Secretary of State John Kerry the computers inside one of
Google's self-driving cars - a data center on wheels. June 23, 2016. [State Department photo/ Public Domain]

Add faster bandwidth and the cloud and a host of other technologies (wireless sensor networks, mobile data acquisition, mobile signature analysis, cooperative distributed peer-to-peer ad hoc networking and processing) and you can compute at the edge.  Terms like local cloud/fog computing and grid/mesh computing, dew computing, mobile edge computing, cloudlets, distributed data storage and retrieval, autonomic self-healing networks, remote cloud services, augmented reality and more that I haven't encountered yet have all come into being.

Recently, I heard a podcast on "Smart Elevators & Self-Driving Cars Need More Computing Power" that got me thinking about the millions of objects (Internet of Things) connecting to the Internet now. Vehicles, elevators, hospital equipment, factory machines, appliances and a fast-growing list of things are making companies like Microsoft and GE put more computing resources at the edge of the network. 

Computer architecture has been designed for people, not things. In 2017, there were about 8 billion devices connected to the net; that number is expected to reach 20 billion in 2020. Do you want the sensors in your car that are analyzing traffic and environmental data to send it to some centralized resource, or to process it in your car? Milliseconds matter in avoiding a crash. You need the processing to be done on the edge. Cars are "data centers on wheels."

Remember the early days of the space program? All the computing power was on Earth. You have no doubt heard the comparison that the iPhone in your pocket has hundreds or even thousands of times the computing power of those early spacecraft. That was dangerous, but it was the only option. Now, much of the computing power is at the edge - even if the vehicle is also at the edge of our solar system. And things that are not as far off as outer space - like a remote oil pump - also need to compute at the edge rather than connect at a distance to processing power.

Plan to spend more time in the future at the edge.

Monetizing Your Privacy


Data is money. People are using your data to make money. What if you could sell, rather than give away, your private data? Is it possible that some day your data might be more valuable than the thing that is supplying your data?

John Ellis deals with big data and how it may change business models. He was Ford Motor Company’s global technologist and head of the Ford Developer Program, so cars are the starting place for the book, but beyond transportation, insurance, telecommunications, government and home building are all addressed. His book, The Zero Dollar Car: How the Revolution in Big Data will Change Your Life, is not as much about protecting our data as users, as it is about taking ownership of it. In essence, he is suggesting that users may be able to "sell" their data to companies (including data collectors such as Google) in exchange for free or reduced cost services or things.

I'm not convinced this will lead to a free/zero-dollar car, but the idea is interesting. You are already allowing companies to use your data when you use a browser, shop at a website, or use GPS on your phone or in a car device. The growth of the Internet of Things (IoT) means that your home thermostat, refrigerator, television and other devices are also supplying your personal data to companies. And many companies (Google, Apple and Amazon are prime examples) use your data to make money. Of course, this is also why Google can offer you free tools and services like Gmail, Docs, etc.

Ellis talks about a car that pays for itself with your use and data, but the book could also be the Zero Dollar House or maybe an apartment. Big technology companies already profit from the sale of this kind of information. Shouldn't we have that option?

Duly noted: the data we supply also helps us. Your GPS or maps program uses your route and speed to calculate traffic patterns and reroute or notify you. The health data that your Apple watch or fitness band uploads can help you be healthier, and in aggregate it can help the general population too.

I remember years ago when Google began to predict flu outbreaks in geographic areas based on searches for flu-related terms. If all the cars on the road were Net-enabled and someone was monitoring their ambient temperature readings and use of windshield wipers, what could be done with that data? What does an ambient temperature of 28 degrees F and heavy wiper use by cars in Buffalo, New York indicate? A snowstorm. Thousands or millions of roaming weather stations. And that data would be very useful to weather services and to companies (like airlines and shipping companies) that rely on weather data - and are willing to pay for it.
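A toy version of that inference is just a rule over two sensor values. The thresholds and labels below are invented for illustration; a real service would aggregate readings from many cars before drawing a conclusion.

```python
# Infer local weather from a car's ambient temperature and wiper activity.
def infer_weather(temp_f: float, wiper_level: str) -> str:
    precipitating = wiper_level in ("medium", "heavy")
    if precipitating and temp_f <= 32:
        return "snow"      # freezing plus precipitation
    if precipitating:
        return "rain"      # above freezing plus precipitation
    return "clear"

# Cars in Buffalo at 28 F with heavy wiper use look like a snowstorm.
buffalo = infer_weather(28.0, "heavy")
```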

Am I saying that you should give up your privacy for money or services? No, but you should have that option - and the option to keep all your data private.

Machine Learning :: Human Learning

AI - “artificial intelligence” - was introduced at a science conference at Dartmouth College in 1956. Back then it was a theory, but in the past few decades it has become far more practice than theory.

The role of AI in education is still more theory than practice.

A goal in AI is to get machines to learn. I hesitate to say "think," but that is certainly a goal too. I am currently reading The Innovators: How a Group of Hackers, Geniuses, and Geeks Created the Digital Revolution, and in that history there is a lot of discussion of people trying to get machines to do more than just compute (calculate): to learn from their experiences without requiring a human to program those changes. The classic example is the chess-playing computer that gets better every time it wins or loses. Is that "learning?"
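Whatever we call it, the chess example boils down to a simple loop: weight choices by past outcomes so the program prefers moves that have won before. This is a bare-bones sketch of that idea, not a real chess engine; all names here are made up.

```python
# Learning from experience: prefer moves with the best observed win rate.
class MovePicker:
    def __init__(self, moves: list) -> None:
        # Start every move with one win in two games so nothing is ruled out.
        self.record = {m: {"wins": 1, "games": 2} for m in moves}

    def best_move(self) -> str:
        # Pick the move with the highest win rate so far.
        return max(self.record,
                   key=lambda m: self.record[m]["wins"] / self.record[m]["games"])

    def learn(self, move: str, won: bool) -> None:
        # Update the record after each game; no human reprogramming required.
        self.record[move]["games"] += 1
        if won:
            self.record[move]["wins"] += 1

picker = MovePicker(["e4", "d4", "c4"])
for _ in range(10):            # ten games where e4 keeps winning
    picker.learn("e4", won=True)
picker.learn("d4", won=False)  # one loss with d4
```

After those games the program "prefers" e4, purely from its own experience, which is the behavior the early AI researchers were after.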

But has it had an impact on how you teach or how your students learn?

It may have been a mistake in the early days of AI and computers that we viewed the machine as being like the human brain. It is - and it isn't.

But neuroscientists are now finding that they can also discover more about human learning as a result of machine learning. An article on opencolleges.edu.au points to several interesting insights from the machine and human learning research that may play a role in AI in education.

One thing that became clear is that the physical environment is something humans learn more easily than machines. After a child has started walking, or opened a few doors or drawers, or climbed a few stairs, she learns how to do it. Show her a different door, drawer, or a spiral staircase and it doesn't make much of a difference. A robot equipped with some AI has a much steeper learning curve for these simple things. It also has a poor sense of its "body." Just watch any video online of humanoid robots trying to do these things and you'll see how difficult it is for a machine.


Then again, it takes a lot longer for humans to learn how to drive a car on a highway safely. And even when it is learned, our attention, or lack thereof, is a huge problem. AI in vehicles is learning how to drive fairly rapidly, and its attention is superior to human attention. Currently, the fallback is still a human driver, and human error is the cause in most cases, but that will certainly change in a decade or two. I learned to parallel park a car many years ago and I am still lousy at doing it. A car can do it better than me.

Although computers can do tasks they are programmed to do without any learning curve, for AI to work they need to learn by doing - much like humans. The article points out that AI systems that traced letters with robotic arms had an easier time recognizing diverse styles of handwriting and letters than visual-only systems. 

AI means a machine gets better at a task the more it does it, and it can also apply that learning to similar but not identical situations. You can program a computer to play notes and play a series of notes as a song, but getting it to compose real music requires AI.

Humans also learn from shared experiences. A lot of the learning in a classroom comes from interactions between the teacher and students and student to student. This makes me feel pretty confident in the continued need for teachers in the learning process.

One day, I am sure that machines will communicate with each other and learn from each other. This may be part of the reason that some tech and learning luminaries like Elon Musk have fears about AI.

I would prefer my smart or autonomous vehicle to "talk" to the other vehicles on the roads nearby and share information on traffic, obstructions and the vehicles that still have those quirky human drivers.

AI built into learning systems, such as an online course, could guide the learning path and even anticipate problems and offer corrections to avoid them. Is that an AI "teacher" or the often-promoted "guide on the side?"
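One hedged sketch of what that guidance might look like: pick each learner's next lesson from a quiz score, routing them to a review before they hit a wall. The course structure, lesson names, and mastery cutoff below are all hypothetical.

```python
# Adaptive learning path: advance on mastery, branch to review otherwise.
LESSONS = ["intro", "practice", "advanced"]
REMEDIATION = {"intro": "intro-review", "practice": "practice-review"}

def next_step(current: str, quiz_score: float, mastery_cutoff: float = 0.7) -> str:
    """Choose the learner's next lesson from their current quiz performance."""
    if quiz_score < mastery_cutoff and current in REMEDIATION:
        return REMEDIATION[current]      # anticipate trouble; offer a correction
    i = LESSONS.index(current)
    return LESSONS[i + 1] if i + 1 < len(LESSONS) else "done"
```

A real system would use far richer signals than one score, but the "guide on the side" role is the same: observe, predict, and reroute.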

This year on the TV show Humans, one of the human couples goes for marriage counseling with a "synth" (robot). She may be a forerunner of a synth teacher.

Humans TV
The counselor (back to us) can read the husband's body language and knows he does not like talking to a synth marriage counselor.


Is Education Ready to Connect to the Internet of Things?


I first encountered the term "Internet of Things" (IoT) in 2013. It is the idea that "things" (physical devices) would be connected in their own network(s). The talk was that things in your home, office and vehicles would be wirelessly connected because they were embedded with electronics, software, sensors, actuators, and network connectivity. Things would talk to things. Things would collect and exchange data.

Some of the early predictions seemed rather silly. Taking a tagged carton of milk out of the refrigerator and not putting it back would tell my food ordering device (such as an Amazon Echo) that I was out of milk. My empty Bluetooth coffee mug would tell the Keurig coffeemaker to make me another cup.

But the "smart home" - something that pre-dates the Internet - where the HVAC knew I was almost home and adjusted the temperature off the economical setting to my comfort zone and maybe put on the front light and started dinner, was rather appealing.
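That smart-home behavior reduces to simple rules over sensor inputs: here, distance from home drives the thermostat, lights, and dinner. All names and thresholds in this sketch are made up for illustration.

```python
# Toy smart-home rule: react when the owner's phone reports they are almost home.
def home_automation(distance_miles: float) -> dict:
    almost_home = distance_miles <= 2.0
    return {
        "thermostat_f": 70 if almost_home else 62,   # comfort vs. economy setting
        "front_light": almost_home,
        "start_dinner": almost_home,
    }

state = home_automation(1.5)   # under two miles out: warm the house, light the porch
```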

In 2014, the EDUCAUSE Learning Initiative (ELI) published its “7 Things You Should Know About the Internet of Things.” The Internet of Things (and its annoying abbreviation, IoT) sounded rather ominous as I imagined these things proliferating across our social and physical landscapes. The ELI report said “the IoT has its roots in industrial production, where machine-to-machine communication enabled the manufacture of complex items, but it is now expanding in the commercial realm, where small monitoring devices allow such things as ovens, cars, garage doors, and the human heartbeat to be checked from a computing device.”

Some of the discussions have also been about considerations of values, ethics and ideology, especially if you consider the sharing of the data gathered. 

As your watch gathers data about your activity, food intake and heart rate, it has valuable data about your health. I do this on my Fitbit with its app. Perhaps you share that with an online service (as with the Apple watch & Apple itself) in order to get further feedback information about your health and fitness and even recommendations about things to do to improve it. If you want a really complete analysis, you are asked (hopefully) to share your medications, health history etc. Now, what if that is shared with your medical insurer and your employer?

Might we end up with a Minority Report of predictive analytics that tell the insurance company and your employer whether or not you are a risk?

Okay, I made a leap there, but not a huge one. 

This summer, EDUCAUSE published a few articles on IoT concerning higher education and the collaboration required for the IoT to work. I don't see education at any level making significant use of the IoT right now, though colleges are certainly gathering more and more data about students. That data might be used to improve admissions. Perhaps your LMS gathers data about student activity and inactivity and can use it to predict which students need academic interventions.

It's more of an academic challenge to find things that can be used currently.

History Lesson: Way back in 1988, Mark Weiser talked about computers embedded into everyday objects and called this third wave "ubiquitous computing." Pre-Internet, this was the idea of many computers, not just the one on your desk, for each person. Add a decade and in 1999, Kevin Ashton posited a fourth wave, which he called the Internet of Things.

Connection was the key to both ideas. It took another decade until cheaper and smaller processors and chipsets, growing coverage of broadband networks, Bluetooth and smartphones made some of the promises of IoT seem reasonable. 

Almost any thing could be connected to the Internet. We would have guessed at computers of all sizes, cars and appliances. I don't think things such as light bulbs would have been on anyone's list.

Some forecasters predict 20 billion devices will be connected by 2020; others put the number closer to 40-100+ billion connected devices by that time.

And what will educators do with this?