I read an article by Margot Machol Bisnow, author of "Raising an Entrepreneur," who interviewed parents who raised highly successful people. She was curious about what skills they taught their kids at an early age. A simple takeaway from her research is that one skill they all agreed on was curiosity.
Curiosity can be defined as the desire to know something, but that is an oversimplification. Every teacher values curiosity of some kind. Sometimes teachers find that student curiosity can be overwhelming (or even annoying) when it doesn't match the path of a lesson. Questions off the topic at hand can hijack a lesson - or they can lead to interesting discussions.
So we might define curiosity as including trying to fix something, asking good questions, wanting to know how something works and wondering how it might be done differently or better.
From the article, here are 3 things parents did with their kids that should also be part of a classroom.
1. They encouraged their kids to fix things.
2. They instilled the confidence to tackle big, real-world problems.
3. They asked hard questions.
Who owns the rights to my face? I assumed it was me until I read an article that reminded me that when we create social media accounts, we pretty much agree to grant those platforms a free license to use our content as they wish.
In most cases, you hold the copyright to any content you upload to social media platforms. But when you created your account on Facebook, Twitter, Instagram, TikTok, or any other platform, you agreed to grant the platform a free license to use your content as it wishes. How can they use it? It depends, but did you read the user agreement or just click "continue"?
How would you feel if you saw one of your tweets used in a Twitter ad campaign? Violated? Angry? Excited? Feel as you wish, but don't expect any cut of the ad's revenue.
In that article, a woman named Abby saw a sponsored Instagram Story ad with a video of someone putting on lip balm. The person in the video was her. She watched herself apply the balm and smile at the camera, but she had never agreed to appear in a nationwide social campaign. How is this possible?
Usage rights dictate who owns an image or asset. They determine how and where it's allowed to appear, and for how long.
The author had worked in media and knew that employees are often "pressured" to appear in campaigns, even though it is not part of their full-time job and will likely go uncompensated.
In this case, she had been told to participate in a photoshoot demonstrating the product’s healing benefits. She recorded during her workday, was not paid, and believed the campaign would only run on the employer’s social media accounts for a few months. But the ad she saw appeared more than a year later. Her former employer had probably passed the content along to the skincare company, without her permission.
There's an old saying that if you're not paying for a product, then you are the product. Social media sites like Facebook and Instagram are completely free to use for the average consumer because advertisers pay for your attention (and sometimes your data). This is not a new model. In commercial TV broadcasting, you watch content for free because there are commercials. A more cynical explanation is that you pay for the privilege of having yourself sold. You are consumed. You are the product. They deliver you to the advertiser; the advertiser is their customer.
Think about that the next time you read - or choose not to read - the terms and conditions and agree with a click.
I saw an article online about "Web 4.0" recently and thought "I know there was a Web 2.0, but did we jump over Web 3.0?"
Back in the early days of this century (can we say that yet?), I was hearing about, speaking about, and writing about Web 2.0. It was known by a number of names: the Participative or Participatory Web, the Social Web, and my personal favorite moniker, the Read-Write Web.
The idea was that the early web, which we would now consider Web 1.0, was pretty much a one-way web. It was a passive web. We consumed content. The next phase of the Web was the shift to websites that emphasized user-generated content. These websites were easier to use and allowed participation, interaction, and interoperability. They were active.
The term Web 2.0 was coined by Darcy DiNucci in 1999 and later popularized by Tim O'Reilly and Dale Dougherty at the O'Reilly Media Web 2.0 Conference in late 2004. But this was a gradual change so you can't point to any one date as the start.
By 1999, there were approximately 3 million websites, but the majority of them were static, read-only sites. Change was coming, though. Blogs were one of the tools of Web 2.0, and they were a major change: you could get free webspace and be your own publisher.
Between 2000 and 2006, I started a wiki, started this blog, got into social media, and began podcasting. By 2006, there were approximately 85 million websites. Two of the big Web 2.0 sites that shaped many of the sites that followed were Wikipedia and Facebook. That year I wrote about "Web 2.0 Colleges."
By 2014, the Internet had more than one billion websites.
So, when did the Web go 3.0 and what is it? Some people have been calling Web 3.0 the semantic web. The venerable Tim Berners-Lee, the inventor of the World Wide Web, said that a Web 3.0 would be a “read-write-execute” web which would include semantic markup and web services.
Actually, Web 3.0 is the one we use today. It is a “semantic web” in the way search engines try to understand human language. It is “executable” in the way we run applications and services through mobile devices and the cloud.
The semantic markup means presenting data in a way that can be understood by software agents so that it can be “executed.” One definition of the semantic web I found is that it is a virtual environment in which information and data are connected and organized so that they can be processed automatically. This is a Web in which the machines read content and also interpret it.
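To make that idea a little more concrete, here is a minimal sketch (in Python, using only the standard library) of the kind of machine-readable description the semantic web depends on. The schema.org vocabulary is real, but the blog post it describes is a made-up example.

```python
import json

# A made-up blog post described with schema.org terms, in the JSON-LD style.
# A human reader sees a headline; a software agent sees typed, linked data
# that it can process automatically.
post = {
    "@context": "https://schema.org",
    "@type": "BlogPosting",
    "headline": "Moving from Web 2.0 to Web 3.0",   # hypothetical example post
    "author": {"@type": "Person", "name": "A. Blogger"},
    "datePublished": "2019-06-01",
    "about": ["semantic web", "Web 3.0"],
}

# Serialized, this could be embedded in a web page (for example, inside a
# <script type="application/ld+json"> tag) for search engines to read.
print(json.dumps(post, indent=2))

# The same structure can also be acted on by a program, not just displayed:
if post["@type"] == "BlogPosting":
    topics = ", ".join(post["about"])
    print(f'{post["author"]["name"]} wrote about {topics}')
```

That second step is the "machines interpret it" part: the page isn't just text to display, it is data a program can reason about.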
Some of this is happening now, but we're not fully there. That is also true of the Web using artificial intelligence. AI can work along with machine-readable content. Talking like this about a 3.0 or a 4.0 version of the web might excite you or it might scare you. Machines directly interacting with each other. Exciting or scary?
From what I read about Web 4.0, it seems to be a fully mobile web. The search engines we are used to using become virtual assistants. Sure, you're already talking to Alexa and Siri, but the experience is not fully functional. It's great that they are starting to understand natural language that is spoken and written. You can ask questions, but too often you get a list of possible websites where your answer might be, rather than the answer. It's coming.
Again, 4.0 can be exciting and scary. You ask your device to book you a room in New York City for next Friday, and since it already knows your preferences, your price range, your credit card information, and other data, it can do it without your additional help. Scary? But if you had a really good human assistant, he or she could also do this for you. Did that scare you? Yes, trust and privacy are concerns for Web 3 and 4.
Here is Tim Berners-Lee giving a TED talk in 2009 about the new "Open and Linked Web."
Take a look and decide how close we have come in a decade.
The one that surprised me the most was about curbs - that quite old and established way to separate the street from the sidewalk. In 19th-century cities, they helped keep walkers from stepping in manure from horse-drawn carriages. But in the 21st century, Liddell says, “The curb as a fixed, rigid piece of infrastructure isn’t going to work.” He thinks there is a role for design in creating a more dynamic understanding of curbs. Nuanced signage can change curb spaces from no parking to emergency-only to pay-by-the-hour parking.
A suburban curbside may not be an issue, but in cities and at airports, curbs are problem areas.
Liddell references Coord, the urban planning spinout of Google/Alphabet’s Sidewalk Labs. Can you believe they have Open Curb Data that maps the use of city curbs? As the company describes itself: "Coord makes it easy to analyze, share, and collect curb data. Curbside management now includes better compliance, safety, and efficiency for communities of all sizes."
In 1994, I was teaching at a suburban middle school. The first computer I had in my classroom was an Apple IIe (sometimes stylized as ][e) with its 128K of memory and floppy disk versions of a word processor, database, and spreadsheet (bundled as AppleWorks). It worked well, and I am still amazed at what it could do without a hard drive and with those floppy disks. I used it to create lesson plans and handouts with my dot-matrix Apple printer, and students in my homeroom loved to play on it the many MECC games that we received as a subscription, like Oregon Trail and Odell Lake.
The Apple ][e wasn't the first computer students had access to in school. Our first computer class and lab were built using Radio Shack Tandy TRS-80 computers. I didn't have one in my classroom, but I learned to write programs in BASIC and made a vocabulary flashcard game using the vocabulary lists I was having students study in my English classes. It was a very basic BASIC game, but students loved it because it was personalized to their school life.
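My original game was written in BASIC and is long gone, but the idea was simple enough that a rough modern sketch gives the flavor of it. This one is in Python, with a made-up word list standing in for my students' weekly vocabulary.

```python
import random

# A tiny vocabulary flashcard quiz, in the spirit of that old BASIC game.
# The word list here is a made-up stand-in for the weekly vocabulary lists.
WORDS = {
    "laconic": "using very few words",
    "candid": "truthful and straightforward",
    "prudent": "acting with care and thought for the future",
}

def quiz(words, rounds=3):
    rounds = min(rounds, len(words))
    score = 0
    for word in random.sample(list(words), k=rounds):
        answer = input(f"Define '{word}': ").strip().lower()
        # Forgiving check: any answer that appears in the definition counts.
        if answer and answer in words[word]:
            print("Right!")
            score += 1
        else:
            print(f"The definition is: {words[word]}")
    print(f"Score: {score}/{rounds}")

if __name__ == "__main__":
    quiz(WORDS)
```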
But videogame consoles were also entering their suburban homes, and my TRS-80 and Apple floppy disk games soon seemed crude or quaint to students who had better systems at home.
The next computers in school, and the one in my classroom in the 1994-95 school year, were IBMs or IBM clones. They were running Windows NT 3.5, an operating system developed by Microsoft that was released on September 21, 1994. It was not a user-friendly operating system, and students didn't like it or really use it with me.
It wouldn't be until the summer of 1995 and the next academic year that Windows 95 would be released. That much more friendly and consumer-oriented operating system made a significant change in computer use. The biggest changes were its graphical user interface (GUI) and its simplified "plug-and-play" features. There were also major changes made to the core components of the operating system, such as a 32-bit preemptive multitasking architecture.
The Today Show’s Katie Couric and Bryant Gumbel didn't have a clue about the Internet in January 1994. It is amusing to hear them ponder what the heck that @ symbol means. Gumbel isn't even sure how to pronounce it, and Katie suggests “about.” No one wants to have to say “dot” when they read “.com”.
They have many questions: Do you write to it like mail but it travels like a phone call? Is it just colleges that have it? Gumbel bemoans that anyone can send him email - somehow forgetting that anyone could send him snail mail too.
It would be only a few years before this Internet went from obscurity to mainstream, and only ten years later that Google would make its IPO.
You might think that after using the Apple IIe in school we would have "upgraded" to Apple's Macintosh computer, which was introduced in 1984 with a memorable Orwellian-themed commercial (see below). The original Macintosh is usually credited as being the first mass-market personal computer that featured a graphical user interface, a built-in screen, and a mouse. More obscure was the Sinclair QL, which actually beat the Mac to market by a month but didn't capture a market. Apple sold the Macintosh alongside the Apple II family of computers for almost ten years, until the Apple II line was discontinued in 1993.
Of course, there was other "technology" in classrooms at that time. For example, VHS videotapes were wiping out 16mm films and projectors, and recording video was big. I was teaching a freshman "film and video" course in those days. But it was the personal computer and then the Internet that really changed educational technology in the mid-1990s.