The Web and the Internet

We don't hear the term "World Wide Web" (WWW or www), or the shortened "Web," used to mean the Internet (a contraction of "interconnected network") as a whole much anymore. We do hear a lot about websites, web content and other usages of the term.

For a time - and maybe still today - people have seen the Internet and the World Wide Web as the same thing. They are definitely closely linked, but they are different systems.

The Internet is the enormous network of computers that are all connected together, and the Web is the collection of webpages found on that network. Web resources are identified by Uniform Resource Locators (URLs), such as https://www.serendipity35.net.
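The named parts of a URL are easy to see with Python's standard urllib.parse module. A minimal sketch, using the address above:

```python
from urllib.parse import urlparse

# Break a URL into its named parts: the scheme (protocol) and
# the network location (the domain name).
parts = urlparse("https://www.serendipity35.net")
print(parts.scheme)   # https
print(parts.netloc)   # www.serendipity35.net
```

The scheme tells the browser how to talk to the server, and the netloc is the registered domain name discussed below.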

The first registered .com domain name on the Internet (1985) is http://symbolics.com - now home to the Big Internet Museum. In 1985, there were 6 registered domains; by 1992, there were about 15,000. After that, the web boomed, and by 2010 there were 84 million separate domains. Today, there are more than 330 million.

Tim Berners-Lee invented the World Wide Web in 1989 while employed at CERN and wrote the first web browser, called WorldWideWeb, in 1990. He then recruited Nicola Pellow to write the Line Mode Browser, which displayed web pages on dumb terminals. That browser was released outside CERN to other research institutions in 1991.

1993 saw the release of Mosaic, credited as "the world's first popular browser" thanks to its innovation of a graphical interface. Browsers certainly made the Internet boom of the 1990s happen. Marc Andreessen led the Mosaic team, but then started his own company, Netscape, which released the Netscape Navigator browser in 1994. Navigator overtook Mosaic (on which it was based) as the most popular browser.

The World Wide Web is the way almost all of us interact on the Internet, though it is possible to access and use the web without a browser, which is how many research institutions use it.

I have assigned students the forerunners of the Internet and the Web as a research topic. Berners-Lee is the start of the Web, but we can find people and concepts that precede it.

Considering the concept of the web, one thread goes back to 1934. Belgian lawyer and librarian Paul Otlet came to the idea that the physical wires and radio waves - then the high tech connecting the world - could be used for more than just entertainment.

His concept was of a “mechanical, collective brain.” My students who chose Otlet saw connections from his work to today's work on web infrastructure, such as the Semantic Web and browsers. Some consider Otlet to be the father of information science.

 

This Business of Predicting: EdTech in 2019

As the year comes to an end, you see many end-of-year summary articles and also a glut of predictions for the upcoming year. I'm not in the business of predicting what will happen in education and technology, but I do read those predictions - with several grains of salt.

“A good science fiction story should be able to predict not the automobile but the traffic jam,” wrote sci-fi author Frederik Pohl.

Many of the education and technology predictions I see predict things rather than the impact those things will have. Here are some that I am seeing repeated, so that you don't have to read them all but can still have an informed conversation at the holiday party at work or that first department meeting in January.

For a start, look at what the folks at higheredexperts.com are planning for 2019 just in the area of higher ed analytics.

Is "augmented analytics" coming to your school? This uses machine learning (a form of artificial intelligence) to augment how we develop, consume and share data. 

And IT analyst firm Gartner is known for their top trends reports. For 2019, one that made the list is "immersive user experience." This concept concerns what happens when human capabilities mix with augmented and virtual realities. Looking at the impact of how that changes the ways we perceive the real and digital world is what interests me.

We are still at the early stages of using this outside schools (which are almost always behind the world in general). You can point to devices like the Amazon Alexa being used in homes to turn on music, lights and appliances, or to tell us a joke. This is entry-level usage. But vocal interaction is an important change. A few years ago, the change was touch screen interactions. A few decades before that, it was the mouse, and before that, the keyboard. A Gartner video points at companies using remote assistance for applications such as an engineer working with someone in a remote factory to get a piece of equipment back online.

Will faculty be able to do augmented analytics using an immersive user experience? Imagine you could talk to the LMS you use to teach your course and ask, using a natural language interface, "Which students in this new semester are most likely to have problems with writing assignments?" The system scans the appropriate data sets, examines different what-if scenarios and generates insights. Yes, predictive analytics is already here, but it will be changing.
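Behind that natural-language question sits something much plainer: a score computed from student activity data. Here is a toy sketch of that kind of at-risk scoring; every feature name and weight below is invented for illustration, not taken from any real LMS.

```python
# Toy "at-risk" scorer, the kind of logic behind LMS predictive
# analytics. Features and weights are hypothetical examples.
def at_risk_score(missed_logins, late_submissions, avg_quiz_pct):
    # Higher score = more likely to struggle with assignments.
    score = 0.4 * missed_logins + 0.5 * late_submissions
    score += 0.1 * (100 - avg_quiz_pct) / 10
    return round(score, 2)

students = {
    "student_a": (1, 0, 92),   # engaged: few misses, strong quizzes
    "student_b": (6, 4, 61),   # several warning signs
}
flagged = {name for name, feats in students.items()
           if at_risk_score(*feats) > 2.0}
print(flagged)  # {'student_b'}
```

A real system would learn the weights from historical data rather than hard-coding them, but the shape of the answer - a ranked or flagged list of students - is the same.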

But are IT trends also educational technology trends? There is some crossover.

Perhaps a more important trend for educators to watch for next year is changing our thinking from individual devices (and the already too many user interfaces we encounter) to a multimodal and multichannel experience.

Multimodal connects us to many of the edge devices around us: your phone, your car, your appliances, your watch, the thermostat, your video doorbell, the gate in a parking lot and devices you encounter at work or in stores.

Multichannel mixes your human senses with computer senses. This is when both are monitoring things in your environment that you already recognize, like heat and humidity, but also things we don't sense, like Bluetooth, RF or radar. This ambient experience means the environment will become the computer.

One broad category is "autonomous things," some of which are already around us and using AI. There are autonomous vehicles: you hear a lot about autonomous cars and trucks, but you may be more likely to encounter an autonomous drone. Will learning become autonomous? That won't be happening in 2019.

AI-driven development is its own trend. Automated testing tools and model generation are here, and AI-driven automated code generation is coming.

Of course, there is more - from things I have never heard of (digital twins) to things that I keep hearing are coming (edge computing) and things that have come and seem to already have mixed reviews (blockchain).

EducationDive.com has its own four edtech predictions for colleges: 

Digital credentials gain credibility - I hope that's true, but I don't see that happening in 2019.  

Data governance grows - that should be true if their survey accurately found that 35% of responding institutions don't even have a data governance policy - a common set of rules for collecting, accessing and managing data.

Finding the ROI for AI and VR may be what is necessary to overcome the cost barrier to full-scale implementation of virtual and augmented reality. AI has made more inroads in education than VR. An example is Georgia State University's Pounce chatbot.

Their fourth prediction is institutions learning how to use the blockchain. The potential is definitely there, but implementation is challenging. 

Predictions. I wrote elsewhere about Isaac Newton's 1704 prediction of the end of the world. He's not the first or the last to predict the end, and most such predictions have been proven wrong. Newton - certainly a well-respected scientist - set the end at or after 2060, but not before that. So we have at least 41 years to go.

Using some strange mathematical calculations and the Bible's Book of Revelation, this mathematician, astronomer and physicist came to believe that his most important work would be deciphering ancient scriptures.

I'm predicting that Newton was wrong on this prediction. He shouldn't feel too bad, though, because I guesstimate that the majority of predictions are wrong. But just in case you believe Isaac, you can visualize the end in this document from 1486.

Being Secure on Chrome

The Chrome browser’s “not secure” warning is meant to help you understand when the connection to the site you're on isn’t secure. It is also a bit of a shaming motivation for the site's owner to improve the security of their site. But the process of moving a site to HTTPS is not really easy in some cases, especially for non-technical site owners.

Google announced the warning nearly two years ago, and since then there has been an increase in sites that are secured. They started by only marking pages without encryption that collect passwords and credit card info. Then they began showing the “not secure” warning in two additional situations: when people enter data on an HTTP page, and on all HTTP pages visited in Incognito mode.

Their goal is to make it so that the only markings you see in Chrome are when a site is not secure, and the default unmarked state is secure. They will start removing the “Secure” wording in September 2018, and in October 2018, they will start showing a red “not secure” warning when users enter data on HTTP pages.
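The rule Chrome ends up applying is simple enough to sketch in a few lines: the label depends only on whether the page was loaded over HTTP or HTTPS. A minimal sketch (the label strings are my paraphrase of the behavior described above, not Chrome's internal names):

```python
from urllib.parse import urlparse

# Mimic Chrome's address-bar labeling after the 2018 changes:
# plain HTTP pages get the "not secure" warning, while HTTPS
# pages are the unmarked default state.
def chrome_label(url):
    scheme = urlparse(url).scheme.lower()
    return "not secure" if scheme == "http" else "default (secure)"

print(chrome_label("http://symbolics.com"))           # not secure
print(chrome_label("https://www.serendipity35.net"))  # default (secure)
```

That one-scheme difference is the entire distinction the warning communicates; everything else (certificates, encryption) happens underneath the https:// scheme.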

Source: https://www.blog.google/products/chrome/milestone-chrome-security-marking-http-not-secure/

Have You Noticed a Lot of Updates to User Agreements Lately?

You probably have received word via email or in apps lately about changes to company privacy and security agreements. Many companies are updating their privacy policies to make them "more clear and transparent." Why the sudden interest?

That was what a friend asked me recently. He surmised that it had "something to do with all the Facebook issues." That is partially correct. Having Mark Zuckerberg testify to the U.S. Senate and then to the European Parliament certainly put a spotlight on these issues.

But what really pushed companies was the EU's General Data Protection Regulation (GDPR) which went into effect this week. Since most websites are global, even if they don't think of themselves as being global, most big companies decided to adopt the GDPR standards for everyone, including their U.S. clients.

What I am seeing (yes, I read the fine print) is that they have added more detail about the information they collect, how they process that data, and how you can control your data. They may have updates on how they use cookies, for example, or how you can change who else gets to see your data. Some of these options have been around for a while, but most users either didn't know about them or just didn't want to be bothered. For example, you have long been able to block all cookies or third-party cookies, or have them wiped when you close your browser. Did you ever change those settings?

These new changes seem to me to be a good and necessary next step. Add to the Facebook spotlight and GDPR the fact that Google's Chrome browser, in its July 2018 version 68 release, will mark all HTTP sites as “not secure.” Having the HTTPS ("S" for secure) in that URL will become important. If your site appears to users as NOT SECURE, you can expect people to click away from it.