Farewell to Curbs and Other Unexpected Uses of Technology

Devin Liddell is Chief Futurist at Teague, a firm that specializes in design for transportation. He thinks about how technology and design will change mobility. An article on Geekwire.com that I saw via Amber MacArthur's newsletter discussed a few of those changes.

The one that surprised me the most was about curbs - that quite old and established way to separate the street from the sidewalk. In 19th-century cities, curbs helped keep walkers from stepping in the manure left by horse-drawn carriages. But in the 21st century, Liddell says, “The curb as a fixed, rigid piece of infrastructure isn’t going to work.” He thinks there is a role for design in creating a more dynamic understanding of curbs. Nuanced signage, for example, could change curb spaces from no parking to emergency-only to pay-by-the-hour parking.

A suburban curbside may not be an issue, but in cities and at airports, curbs are problem areas.

Liddell references Coord, the urban-planning spinout of Google/Alphabet’s Sidewalk Labs. Can you believe they have Open Curb Data that maps the use of city curbs? The company describes itself this way: "Coord makes it easy to analyze, share, and collect curb data. Curbside management now includes better compliance, safety, and efficiency for communities of all sizes."

Curb data? Really?

EdTech 1994

In 1994, I was teaching at a suburban middle school. The first computer I had in my classroom was an Apple IIe (sometimes stylized as ][e) with its 128K floppy-disk versions of word processing, database, and spreadsheet programs (bundled as AppleWorks). It worked well, and I am still amazed at what it could do without a hard drive and with those floppy disks. I used it to create lesson plans and handouts with my dot-matrix Apple printer, and students in my homeroom loved to play on it the many MECC games that we received as a subscription, like Oregon Trail and Odell Lake.

The Apple ][e wasn't the first computer students had access to in school. Our first computer class and lab were built using Radio Shack Tandy TRS-80 computers. I didn't have one in my classroom, but I learned to write programs using BASIC and made a vocabulary flashcard game using the vocabulary lists I was having students study in my English classes. It was a very basic BASIC game, but students loved it because it was personalized to their school life.
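For a sense of what that little drill did, here is a minimal sketch of the same idea in modern Python (the original was in BASIC). The word list and the exact-match grading are my own made-up stand-ins for illustration, not the program from the 1980s.

```python
# A hypothetical vocabulary flashcard drill, sketched in Python.
import random

# Stand-in word list; the real game used the class's weekly vocabulary lists.
words = {
    "ubiquitous": "present everywhere",
    "candor": "honesty and directness",
    "laconic": "using very few words",
}

items = list(words.items())
random.shuffle(items)  # ask the words in a different order each run

score = 0
for word, definition in items:
    answer = input(f"Define '{word}': ").strip().lower()
    if answer == definition:  # exact match keeps this sketch simple
        print("Correct!")
        score += 1
    else:
        print(f"Not quite. It means: {definition}")

print(f"You got {score} of {len(items)}.")
```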

But videogame consoles were also entering students' suburban homes, and my TRS-80 and Apple floppy-disk games soon seemed crude or quaint to kids who had better systems at home.

The next computers in school, including the one in my classroom, were IBMs or IBM clones that arrived in the 1994-95 school year. They were running Windows NT 3.5, a Microsoft operating system released on September 21, 1994. It was not a user-friendly operating system, and students didn't like it or really use it with me.

It wouldn't be until the summer of 1995 and the next academic year that Windows 95 would be released. That much more friendly and consumer-oriented operating system made a significant change in computer use. The biggest changes were its graphical user interface (GUI) and its simplified "plug-and-play" features. There were also major changes to the core components of the operating system, such as a 32-bit preemptive multitasking architecture.

The Today Show’s Katie Couric and Bryant Gumbel didn't have a clue about the Internet in January 1994. It is amusing to hear them ponder what the heck that @ symbol means. Gumbel isn't even sure how you pronounce it, and Katie suggests “about.” No one wants to have to say “dot” when they read “.com”.

They have many questions: Do you write to it like mail but it travels like a phone call? Is it just colleges that have it? Gumbel bemoans that anyone can send him email - somehow forgetting that anyone could send him snail mail too.

It would be only ten years later that Google would make its IPO, and only a few years after that 1994 broadcast before the Internet went from obscurity to mainstream.

You might think that after using the Apple IIe in school we would have "upgraded" to Apple's Macintosh computer, which was introduced in 1984 with a memorable Orwellian-themed commercial (see below). The original Macintosh is usually credited as being the first mass-market personal computer to feature a graphical user interface, a built-in screen, and a mouse. More obscure was the Sinclair QL, which actually beat the Mac to market by a month but never captured a market. Apple sold the Macintosh alongside the Apple II family of computers for almost ten years, until the Apple II line was discontinued in 1993.

Of course, there was other "technology" in classrooms at that time. For example, VHS videotapes were wiping out 16mm films and projectors, and recording video was big. I was teaching a freshman "film and video" course in those days. But it was the personal computer and then the Internet that really changed educational technology in the mid-1990s.

The Web and the Internet

We don't hear the term "World Wide Web" (WWW or www), or the shortened "Web," used to mean the Internet (a contraction of "interconnected network") as a whole much anymore. We do hear a lot about websites, web content, and other usages of the term.

For a time - and maybe still today - people have seen the Internet and the World Wide Web as the same thing. They are definitely closely linked, but are different systems.

The Internet is the enormous network of computers that are all connected together, and the Web is the collection of webpages found on this network. Web resources are identified by a Uniform Resource Locator (URL), such as https://www.serendipity35.net.
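As an illustration of what a URL actually encodes, here is a minimal sketch using Python's standard-library urllib.parse. Only the domain comes from the example above; the "/about" path is a made-up addition for the sake of the example.

```python
from urllib.parse import urlparse

# Break a URL into the pieces that identify a web resource.
parts = urlparse("https://www.serendipity35.net/about")  # "/about" is hypothetical

print(parts.scheme)  # 'https' -> the protocol a browser uses to fetch the resource
print(parts.netloc)  # 'www.serendipity35.net' -> the host machine on the Internet
print(parts.path)    # '/about' -> which page (resource) on that host
```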

The first registered .com domain name on the Internet (1985) is http://symbolics.com, now home to the Big Internet Museum. In 1985, there were six registered domains; by 1992, there were about 15,000. After that, the numbers boomed, and by 2010 there were 84 million separate domains. Today, there are more than 330 million.

Tim Berners-Lee invented the World Wide Web in 1989 while employed at CERN and wrote the first web browser, called WorldWideWeb, in 1990. He then recruited Nicola Pellow to write the Line Mode Browser, which displayed web pages on dumb terminals. That browser was released outside CERN to other research institutions in 1991.

1993 saw the release of Mosaic, credited as "the world's first popular browser" for its innovation of a graphical interface. Browsers certainly made the Internet boom of the 1990s happen. Marc Andreessen led the Mosaic team but left to start his own company, Netscape, which released the Netscape Navigator browser in 1994. Navigator then overtook Mosaic (on which it was based) as the most popular browser.

The World Wide Web is the way almost all of us interact with the Internet, though it is possible to access and use the Internet without a browser, and that is how many research institutions use it.

I have assigned students the forerunners of the Internet and the Web as a research topic. Berners-Lee is the start of the Web, but we can find people and concepts that precede it.

Considering the concept of the web, one person you will find goes back to 1934. Belgian lawyer and librarian Paul Otlet came to the idea that the physical wires and radio waves then connecting the world, the high tech of the day, could be used for more than just entertainment.

His concept was of a “mechanical, collective brain.” My students who chose Otlet saw connections from his work to today's work on web infrastructures such as the Semantic Web and browsers. Some consider Otlet to be the father of information science.


This Business of Predicting: EdTech in 2019

As the year comes to an end, you see many end-of-year summary articles and also a glut of predictions for the upcoming year. I'm not in the business of predicting what will happen in education and technology, but I do read those predictions - with several grains of salt.

“A good science fiction story should be able to predict not the automobile but the traffic jam,” wrote sci-fi author Frederik Pohl.

Many of the education and technology predictions I see predict things rather than the impact those things will have. Here are some that I am seeing repeated, so that you don't have to read them all, but can still have an informed conversation at the holiday party at work or that first department meeting in January.

Look at what the folks at higheredexperts.com are planning for 2019 just in the area of higher ed analytics.

Is "augmented analytics" coming to your school? This uses machine learning (a form of artificial intelligence) to augment how we develop, consume and share data. 

And IT analyst firm Gartner is known for its top trends reports. For 2019, one that made the list is "immersive user experience." This concept concerns what happens when human capabilities mix with augmented and virtual realities. What interests me is the impact: how that changes the ways we perceive the real and digital worlds.

We are still at the early stages of using this outside schools (which are almost always behind the world in general). You can point to devices like the Amazon Alexa being used in homes to turn on music, lights, and appliances, or to tell us a joke. This is entry-level usage. But vocal interaction is an important change. A few years ago it was touch-screen interactions. A few decades before that it was the mouse, and before that the keyboard. A Gartner video points at companies using remote assistance for applications such as an engineer working with someone in a remote factory to get a piece of equipment back online.

Will faculty be able to do augmented analytics using an immersive user experience? Imagine you can talk to the LMS you use to teach your course and ask, using a natural-language interface, "Which students in this new semester are most likely to have problems with writing assignments?" The system scans the appropriate data sets, examines different what-if scenarios, and generates insights. Yes, predictive analytics is already here, but it will be changing.
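To make that idea concrete, here is a minimal sketch, in Python with scikit-learn, of the kind of predictive model such a system might run behind the scenes. No LMS necessarily works this way out of the box, and every feature name and number below is hypothetical.

```python
# A hypothetical "students likely to struggle with writing" predictor.
import numpy as np
from sklearn.linear_model import LogisticRegression

# Hypothetical features from past semesters:
# [logins_per_week, pages_read, first_draft_score]
X_past = np.array([
    [5, 40, 78],
    [1,  5, 52],
    [4, 35, 81],
    [0,  2, 45],
    [3, 20, 65],
    [6, 50, 90],
])
# 1 = had problems with writing assignments, 0 = did not
y_past = np.array([0, 1, 0, 1, 1, 0])

model = LogisticRegression().fit(X_past, y_past)

# New-semester students, described with the same hypothetical features.
X_new = np.array([
    [2, 10, 58],
    [5, 45, 85],
])
risk = model.predict_proba(X_new)[:, 1]  # estimated probability of problems
for student, p in zip(["Student A", "Student B"], risk):
    print(f"{student}: estimated risk {p:.0%}")
```

The natural-language part of the scenario would simply be a front end that translates the instructor's question into a query like this one against the course data.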

But are IT trends also educational technology trends? There is some crossover.

Perhaps a more important trend for us to watch as educators next year is changing our thinking from individual devices (and the already too many user interfaces we encounter) to a multimodal and multichannel experience.

Multimodal connects us to many of the edge devices around us: your phone, your car, your appliances, your watch, the thermostat, your video doorbell, the gate in a parking lot, and the devices you encounter at work or in stores.

Multichannel mixes your human senses with computer senses. This is when both are monitoring things in your environment that you already recognize, like heat and humidity, but also things we can't sense, like Bluetooth, RF, or radar. This ambient experience means the environment will become the computer.

One broad category is "autonomous things," some of which are already around us and using AI. There are autonomous vehicles: you hear a lot about autonomous cars and trucks, but you may be more likely to encounter an autonomous drone. Will learning become autonomous? That won't be happening in 2019.

AI-driven development is its own trend. Automated testing tools and model generation are here, and AI-driven automated code generation is coming.

Of course, there is more - from things I have never heard of (digital twins) to things that I keep hearing are coming (edge computing) and things that have come and seem to already have mixed reviews (blockchain).

EducationDive.com has its own four edtech predictions for colleges: 

Digital credentials gain credibility - I hope that's true, but I don't see that happening in 2019.  

Data governance grows - that should be true if their survey has accurately found that 35% of responding institutions say they don't even have a data governance policy, a common set of rules for collecting, accessing, and managing data.

Finding the ROI for AI and VR may be what is necessary to overcome the cost barrier to full-scale implementation of virtual and augmented reality. AI has made more inroads in education than VR. An example is Georgia State University's Pounce chatbot.

Their fourth prediction is institutions learning how to use the blockchain. The potential is definitely there, but implementation is challenging. 

Predictions. I wrote elsewhere about Isaac Newton's 1704 prediction of the end of the world. He wasn't the first or the last to predict the end. Most have been proven wrong. Newton - certainly a well-respected scientist - set the end at or after 2060, but not before that. So we have at least 41 years to go.

Using some strange mathematical calculations and the Bible's Book of Revelation, this mathematician, astronomer, and physicist came to believe that his really important work would be deciphering ancient scriptures.

I'm predicting that Newton was wrong on this prediction. He shouldn't feel too bad, though, because I guesstimate that the majority of predictions are wrong. But just in case you believe Isaac, you can visualize the end in this document from 1486.