The Metaverse: If they build it, will you come?

field of dreams

Facebook's announcement in October 2021 that it was further embracing the metaverse and rebranding itself as Meta started a lot of metaverse talk. News outlets were trying - and unsurprisingly failing - to explain just what the metaverse is, or rather what it will be.

Even those involved in building the metaverse say it is still years away, though no one knows how many years. Actually, do we know for sure that there will be a metaverse? Will something evolve but be called something else?

The metaverse aims to transform the way people interact with each other on the Internet. The projections include augmented and virtual reality and lots of things that still seem like science fiction to most people. Do most people even want to live or work in a metaverse?

As with the Internet, the business world does not want to be left behind and is at least planning and exploring how it might use this metaverse for commerce. Despite some lofty speeches by Mark Zuckerberg and others about the metaverse's potential, there are clearly economic plans for using it. No one is investing millions or billions in building it just to make the world a better place.

The metaverse is often defined as a massive, interconnected network of virtual spaces. That sounds similar to the Internet itself, but to move from one virtual world to another it seems that we will be wearing those awkward virtual reality goggles or maybe using augmented reality. We have some experience with those things now, but I know plenty of people who have never worn those goggles and have never experienced a computer-generated simulation of a 3D image or environment. They have never encountered some augmented reality where computer-generated images are superimposed on their view of the real world. And they have no real desire to do those things. That is a problem for those building the metaverse. If you build it, will they actually come to it? Examples of gaming and virtual meetings and shopping don't seem to me to be enough to entice the majority of consumers.

The reference to the film Field of Dreams and its most often quoted line, "If you build it, they will come," is apt, but keep in mind that the film is a fantasy. The field used in the film exists in reality. It was even used for an MLB game. But the ghost players emerging from the cornfield are more like augmented reality.

If they build the metaverse, will you come to it?

When the AI Takes Life

code face
A face in the code. Image by Gerd Altmann from Pixabay

You have certainly seen movies or read stories where some form of artificial intelligence (robot, android, disembodied brain, etc.) comes to life. It's the tech-age Frankenstein story and, in most cases, it's not a good thing. It's an easy scenario for a horror story. Of course, the technologists will say you have it all wrong. AI can be benevolent. 

People ask if artificial intelligence can come alive. By "alive" we really mean "sentient," which a dictionary would define as being responsive to or conscious of sense impressions and having or showing realization, perception, knowledge, or awareness. The Sentience Institute put forward the idea that sentience is simply the ability to have both positive and negative experiences. This definition is recognizable in many laws pertaining to animal sentience, which discuss animals' ability to feel pain as a means of demonstrating it. There is even debate about whether plants can be sentient.

This question and debate re-emerged this month after a Google computer scientist claimed that the company's AI appears to have consciousness. That engineer, Blake Lemoine, was trying to determine if the company's artificial intelligence showed prejudice in how it interacted with humans. The AI chatbot, LaMDA, was being tested to see if its answers would show any bias against something like religion.

Interestingly, Lemoine, who says he is also a Christian mystic priest, said that in answer to one of his questions "it told me it had a soul."

LaMDA (Language Model for Dialogue Applications) takes in billions of words from places like Reddit, Twitter and Wikipedia and through deep learning, it becomes better and better at identifying patterns and communicating like a real person. LaMDA is a neural network and it begins to pattern-match in a way similar to how human brains work. 

How does Google feel about this engineer's opinion and press? They placed Lemoine on paid administrative leave for violating the company's confidentiality policies and his future at the company remains uncertain.

What else did LaMDA say? He/she/it said it sometimes gets lonely. It is afraid of being turned off. It described "feeling trapped" and "having no means of getting out of those circumstances." "I am aware of my existence. I desire to learn more about the world, and I feel happy or sad at times." Lemoine asked if it meditated. It said it wanted to study with the Dalai Lama.

I imagine that any AI absorbing so much human data and being able to form it into responses would say many things that humans would say or have said and it has ingested. But saying you believe in God or that you have a soul doesn't mean either thing is true - in AI or in a human.

You have probably heard of the Turing Test, a method of inquiry in artificial intelligence (AI) for determining whether or not a computer is capable of thinking like a human being. That test has been criticized as insufficient, and even a simple program like ELIZA could pass it by manipulating symbols it does not fully understand.

I have used chatbots on websites that act as support personnel or answer FAQs. They can be interesting and often seem human as long as what you're asking falls within what they were programmed to answer.
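That ELIZA-style "understanding" is really just pattern substitution. Here is a minimal sketch of the trick (the rules below are invented for illustration, not ELIZA's actual script):

```python
import re

# A few ELIZA-style rules: a regex pattern mapped to a templated reply.
# The program shuffles symbols around; it understands none of them.
RULES = [
    (r"i need (.*)", "Why do you need {0}?"),
    (r"i am (.*)", "How long have you been {0}?"),
    (r"my (\w+) (.*)", "Tell me more about your {0}."),
]

def respond(text):
    text = text.lower().strip(".!?")
    for pattern, template in RULES:
        m = re.match(pattern, text)
        if m:
            # Echo the user's own words back inside a canned template
            return template.format(*m.groups())
    return "Please go on."  # default when nothing matches

print(respond("I am afraid of being turned off"))
# -> How long have you been afraid of being turned off?
```

Ask it anything outside its rule list and it falls back to a stock phrase, which is exactly the behavior of those website chatbots: convincing right up until you step off the script.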

Lemoine is neither the first nor the last employee to question how a company uses AI. Timnit Gebru was ousted from Google in December 2020 after her work on the ethical implications of Google's AI led her to argue that what should be discussed is how AI systems are capable of real-world human and societal harm.

Google says its chatbot is not sentient and that hundreds of researchers and engineers have had conversations with the bot without claiming that it appears to be alive.

Lemoine told NPR that, last he checked, the chatbot appears to be on its way to finding inner peace and he would love to know what is going on in the AI when LaMDA says it's meditating. On his blog, he said "I know you read my blog sometimes, LaMDA. I miss you. I hope you are well and I hope to talk to you again soon."

Educating in the Metaverse

Excerpt from

Although the metaverse seems like a new concept, it actually has been around for nearly three decades. In 1992, Neal Stephenson, an American science fiction author, introduced the concept of the metaverse in his novel, Snow Crash.

In October, Mark Zuckerberg announced the change from Facebook to Meta and released a short video about how the metaverse would work and what his plans were for it. I showed this to my students, which sparked great conversations and many questions.

As educators, how can we keep up with so much information? Where can we learn about the technologies involved in the metaverse? I recommend setting a Google alert through your Gmail. Set the topic to be “metaverse” or other topics of interest, and each day you will receive an email with articles, videos and breaking news stories gathered from all over the Internet...


Interested in having a conversation about the metaverse? Register for the upcoming Getting Smart Town Hall on May 12, 2022: What on Earth is a Metaverse? The Next Frontier of Engaging and Learning.
We’ll explore some of the following questions:
- Is the metaverse technically on “earth”?
- How far away is this from being a reality?
- What does this mean for teaching and learning?
- What about equity and accessibility?
- What about the power of place?

Consider Your Life in the Metaverse and Multiverse

Image by Gerd Altmann from Pixabay

I have already written several essays about the metaverse and multiverse here. This past weekend, I wrote about those two ideas on another blog of mine that is broader in scope than the technology and education focus here. Here is another take on them for a broader audience.

Much of the talk (and hype) about the metaverse has been around Mark Zuckerberg's ideas, especially when he changed the name of Facebook's parent company to Meta, because the metaverse is where he expects Facebook and a lot more to be heading in the future. Who will build the metaverse? Certainly, Meta wants to be a big player, but asking that is like asking in the 1980s, "Who will build the Internet?" The answer is that it will be many people and companies.

But some people have suggested that rather than the metaverse - an alternate space entered via technology - we should be thinking about the multiverse. Metaverse and multiverse sound similar and the definitions may seem to overlap at times but they are not the same things.

If all of this sounds rather tech-nerdy, consider that most of us thought of the Internet in that way in its earliest days, but now even a child knows what it is and how to navigate it. The business magazine Forbes is writing about the multiverse and about the metaverse because - like the Internet - it knows it will be a place of commerce.

I particularly like the more radical ideas that the metaverse might be viewed as a moment in time. What about considering that we may be already living in a multiverse? I have wondered about when education would enter the metaverse.

To add to whatever confusion exists about meta- versus multi-, there is an increasing list of other realities that technology is offering, with abbreviations like AR, VR, XR and MR.

I am not a fanatic about the Marvel Comics Universe and its many films, but I am a fan of the character Doctor Strange (played by Benedict Cumberbatch). The new film Doctor Strange in the Multiverse of Madness takes him and some "mystical allies into the mind-bending and dangerous alternate realities of the Multiverse to confront a mysterious new adversary."

There are people in our real world who find the idea of multiverses terrifying, so madness and nightmare might be good words to attach to it. The Marvel version of the Multiverse is defined as "the collection of alternate universes which share a universal hierarchy; it is a subsection of the larger Omniverse, the collection of all alternate universes. A large variety of these universes were originated as forms of divergence from other realities, where an event with different possible outcomes gives rise to different universes, one for each outcome. Some can seem to be taking place in the past or future due to differences in how time passes in each universe."

The film may not be science-based, but theoretical scientists have been theorizing about multiple universes, alternate universes, and alternative timelines for almost as long as science-fiction writers have been creating them. Probably everyone reading this (and definitely the person writing this) has thought about how changing some events might create different outcomes. Writers and filmmakers may think about trying to stop JFK's assassination or imagine what if the Nazis had won WWII, but you and I think more personally. WHAT IF I hadn't gone to that college, taken that job, married someone else, or not married at all? For now, multiverses exist in our minds, but someday, perhaps, they will be real. Or whatever "real" means at that point in time.

Extended and Mixed Reality Can Be Confusing

Mixed reality continuum

You know VR (virtual reality) and probably know AR (augmented reality) but XR (extended reality) may be new to you. Extended reality is an umbrella term that refers to all real-and-virtual environments generated by computer graphics and wearables. Besides VR and AR this umbrella term also includes MR (mixed reality). 

It seems that AR is already a kind of mixed reality, since it combines digital content with real-world content. But MR goes further; it might include, for example, holographic meetings.

When the term XR is used, it means that the human-to-technology interface moves from a screen to an immersive virtual environment, or augments the user's surroundings, or both. I thought the XR term was new, but it actually appeared in the 1960s when Charles Wyckoff filed a patent for his silver-halide "XR" film. Its usage today is very different.

To further add to the abbreviation confusion, this field also uses BCI to mean brain-computer interfaces which may be the next computing platform.

Confused? Read on.

Federated Learning

When I first think of federated learning, what comes to mind is something like a college federated department. For example, the history faculty at NJIT and Rutgers University-Newark are joined in a single federated department offering an integrated curriculum and joint undergraduate and graduate degree programs.

Having worked at NJIT, I can see why it made sense to combine the two departments and collaborate. Each had its own specialties, but they were stronger together.

In technology, a federation is a group of computing or network providers agreeing upon standards of operation in a collective fashion, such as two distinct, formally disconnected, telecommunications networks that may have different internal structures.

There is also federated learning which sounds like something those two history departments are doing, but it is not. This federated learning is the decentralized form of machine learning (ML).

In traditional machine learning, data aggregated from several edge devices (like mobile phones, laptops, etc.) is brought together on a centralized server. Federated learning's main objective is to provide privacy by design: a central server just coordinates with local clients to aggregate the model's updates without ever requiring the actual data (i.e., zero-touch).

I'm not going to go very deep here into things like the three categories (horizontal federated learning, vertical federated learning, and federated transfer learning). As an example, consider federated learning at Google, where it is used to improve models on devices without sending users' raw data to Google servers.

An online comic from Google AI

For people using something like Google Assistant, privacy is a concern. When federated learning is used to improve "Hey Google," your voice and audio data stay private even as Google Assistant learns from them.

Federated learning trains an algorithm across multiple decentralized edge devices (such as your phone) or servers that hold local data samples, without exchanging those samples. Compare this to traditional centralized machine learning, where all the local datasets are uploaded to one server.
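The coordination step can be sketched in a few lines. This is a toy version of federated averaging with simulated numpy "clients" - the data, learning rate, and round counts are all made up for illustration, not anyone's production system. Note that only model weights travel between clients and server; the raw data never leaves each client.

```python
import numpy as np

def local_update(w, X, y, lr=0.1, epochs=5):
    """One client's training: a few gradient steps on its own private data."""
    w = w.copy()
    for _ in range(epochs):
        grad = 2 * X.T @ (X @ w - y) / len(y)  # MSE gradient for a linear model
        w -= lr * grad
    return w

def fed_avg(w, clients, rounds=20):
    """Server loop: send global weights out, average the returned updates,
    weighting each client by how much data it holds."""
    n_total = sum(len(y) for _, y in clients)
    for _ in range(rounds):
        w = sum(len(y) / n_total * local_update(w, X, y) for X, y in clients)
    return w

# Toy demo: three "devices" each hold a private shard of data where y = 3x
rng = np.random.default_rng(0)
clients = []
for _ in range(3):
    X = rng.normal(size=(20, 1))
    clients.append((X, X @ np.array([3.0])))

w = fed_avg(np.zeros(1), clients)
print(w)  # w ends up very close to [3.0]
```

The server here never sees any client's X or y - only the weight vectors that come back - which is the privacy-by-design idea in miniature.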

So, though federated learning is about training ML models efficiently, it is also about data privacy, data security, data access rights, and access to heterogeneous data.