You Are a Data Point

Does it disturb you to be thought of as a "data point" or "test subject"? A data point is a discrete unit of information, a single fact usually derived from a measurement or research. A person as a data point can be represented numerically or graphically. That sounds pretty cold. 
An article on chronicle.com about Western Governors University (WGU), a nonprofit, online-only institution that enrolls 80,000 students worldwide, talks about how it has enlarged its institutional-research office over the past few years and how its students are very much data points. Of course, students, as well as employees and customers, offer a valuable source of data for researchers.
In an educational setting, this data could be used to improve student outcomes and to make assessments that can lead to improvements in learning design and delivery.
One of the often-stated benefits of MOOCs has been the opportunity to use these very large courses to obtain data about how students learn online. Critics of this approach say that learning online in a class of 25 and in a class of several thousand are not comparable experiences. And can how students learn online be validly compared to how they learn in a face-to-face class? That has been argued for several decades.
WGU is also a competency-based institution. Its courses are designed around standardized measurements and goals. Whether or not that is a good thing for a student's education, it is certainly an approach that is great for researchers, who can hold certain variables constant while testing tools and interventions to see how they influence students.
No one likes to be thought of as just a number. It reminds me of sci-fi novels and media about the future like 1984 and Brave New World (or the cult favorite TV series, The Prisoner, illustration at top). But we are all very much considered as data points by advertisers and in many modern technologies, social networks and institutions.

Getting to the Level of Social Learning

brain

"The Cognitive Science Behind Learning" is an article that discusses the idea of viewing different levels as a way to approach the study of cognitive science and learning.

It views cognitive science as an "umbrella term to incorporate all levels of human behavior from neural to social, and it includes contributions from many disciplines including philosophy, anthropology, neuroscience, psychology, sociology and more."

It is widely accepted that the most basic level is the neural level. That means looking at learning as something that is about forming and strengthening the connections between neurons in our brains. There is not a lot that an educator can do about that level in the classroom.

As educators, we are more concerned about what we might see at the next two levels. The main one we always discuss is the cognitive level, where "learning and instruction is about designed action and guided reflection."

For me, the importance at this level is that human learning is very connected to pattern-matching and meaning-making. Though some of those abilities are built through rote repetition performed accurately, that is not the key to learning. Of course, it is the main thing we do in classrooms and it is generally what we assess in our grading. There are certainly courses that require learning a lot of information, but I agree with the author, Clark Quinn, that "Too often learning leaders make courses when the information doesn’t have to be in the head, it just needs to be on hand."

Quinn spends much less time on a higher level that I believe we should spend more time examining - social learning. Social learning theory is something I associate with Albert Bandura. It considers that learning is a cognitive process that takes place in a social context. It can occur purely through observation or direct instruction, even in the absence of motor reproduction or direct reinforcement.

In the early 1960s, Albert Bandura conducted what is known as the Bobo doll experiment. In the experiment, he had children observe an adult aggressively playing with toys, including a Bobo doll. The large blow-up Bobo doll looked like a clown, and the adult hit the doll, knocked it down and even jumped on it while yelling words like 'pow!' and 'kick him!' When the children were subsequently allowed to play with a variety of toys, including the Bobo doll, more than half of the children modeled the adult and engaged in the same aggressive behaviors with the Bobo doll. This modeling behavior became a cornerstone of what Bandura developed as social learning theory.

Some social learning occurs naturally in social classroom settings, and some of this theory has also been adopted by advertisers and marketers in social media settings. Social learning theory holds that learning is most effective when four processes occur: attention, retention, reproduction, and motivation. Let me end here by suggesting that every educator should consider how they use these processes, intentionally and unintentionally, in their teaching.


Our Attention Economy

eye

Money follows eyeballs. I saw that phrase on a slide in a conference presentation about marketing with social media.

Everyone wants your attention. Your children want your attention. Your spouse wants your attention. You want the attention of your students. Nothing new about that concept and there are plenty of ways to get someone's attention.

But it is a more recent way of thinking about attention to consider it as economics. I was listening to the audiobook of A Beautiful Mind recently. It's a book (and a good but highly romanticized film) about the mathematician John Nash. Nash received the Nobel Prize in Economics for his work on game theory as it was applied to economics. His ideas, presented in the 1950s, certainly must have seemed novel at the time, but 40 years later they seemed logical. That will probably be true of attention economics. There are already a good number of people writing about it.

Attention economics is an approach to the management of information that treats human attention as a scarce commodity. With attention as a commodity, you can apply economic theory to solve various information management problems.

Attention is a scarce commodity or resource because a person has only so much of it.

Not only in economics but also in education and other areas, the focused mental engagement that makes us attend to a particular item leads to our decision on whether or not to act. Do we buy the item advertised? Do we do what mommy said to do?

We are deep into the Information Age, and content is so abundant and immediately available that attention has become a limiting factor. There are so many channels and shows on the many versions of "television" competing for our attention that you may just decide not to watch at all. Or you may decide to "cut the cord" and disconnect from many of them to make the choices fewer.

Designers know that if it takes the user too long to locate something, you will lose their attention. On web pages, that attention lasts anywhere from a few seconds to less than a second. If they can't find what they were looking for, they will find it through another source.

The goal then becomes to design methods (filters, demographics, cookies, user testing etc.) to make the first content a viewer sees relevant. Google and Facebook want you to see ads that are relevant to YOU. That online vendor wants the products on that first page to be things you are most interested in buying. Everything - and everyone - wants to be appealing to everyone.
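The relevance filtering described above can be sketched in a few lines. This is a minimal, hypothetical illustration, assuming an invented catalog and a user-interest profile built from past behavior; it is not any real platform's ranking system.

```python
# Minimal sketch: rank content by a user-interest score so the most
# relevant items appear first. The catalog, tags, and interest weights
# are invented illustrations, not any real platform's data or API.

def relevance(item_tags, user_interests):
    """Score an item by summing the weights of its matching tags."""
    return sum(user_interests.get(tag, 0) for tag in item_tags)

catalog = [
    {"title": "Running shoes", "tags": ["sports", "footwear"]},
    {"title": "Espresso maker", "tags": ["kitchen", "coffee"]},
    {"title": "Trail guide", "tags": ["sports", "outdoors"]},
]

# Hypothetical weights, e.g. derived from past clicks or cookies.
user_interests = {"sports": 3, "outdoors": 2}

ranked = sorted(catalog,
                key=lambda item: relevance(item["tags"], user_interests),
                reverse=True)
print([item["title"] for item in ranked])
# → ['Trail guide', 'Running shoes', 'Espresso maker']
```

The design point is simply that the scoring function, not the catalog, decides what the viewer sees first; swapping in a different profile reorders the same content.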

In attention-based advertising, we measure the number of "eyeballs" by which content is seen.

"You can't please everyone." Really? Why not?

In the history section of the entry on "Attention Economy" on Wikipedia, it lists Herbert A. Simon as possibly being the first person to articulate the concept of attention economics. Simon wrote: "...in an information-rich world, the wealth of information means a dearth of something else: a scarcity of whatever it is that information consumes. What information consumes is rather obvious: it consumes the attention of its recipients. Hence a wealth of information creates a poverty of attention and a need to allocate that attention efficiently among the overabundance of information sources that might consume it" (Simon 1971, pp. 40–41).

Simon was talking about the idea of information overload as an economic concept, and that has led business strategists such as Thomas H. Davenport to use the term "attention economy" (Davenport & Beck 2001).

Where will this lead? On the outer edges are those who speculate that "attention transactions" will replace financial transactions as the focus of our economic system (Goldhaber 1997; Franck 1999).

Designers of websites, software, apps and any user interface already take into account attention, but information systems researchers have also adopted the idea. Will we see mechanism designs which build on the idea of creating property rights in attention?


The Information Literacy of Fake News

fake news

Pre- and post-election last fall, there were many stories in all types of media about "fake news." An article in The Chronicle asks "How Can Students Be Taught to Detect Fake News and Dubious Claims?" but I would say that non-students need even more education in this area. Of course, the real question is whether or not this is a teachable skill.

If you had asked me last January to define "fake news" I would have said it was a kind of satire or parody of mainstream journalism. The Onion online, or Saturday Night Live's news segment would fit that definition. Satire always has a bit of truth in it or it doesn't really work.

The Daily Show and Last Week Tonight with John Oliver and other shows and sites have blurred the line. They use real news and sometimes parody it, but sometimes they are closer to investigative journalism. They can edit together clips of a person's inconsistencies in views over the years and create a montage that shows someone who either has a terrible memory or is a liar. It may frighten some to hear it, but many young people and adults list shows like these as their main source for news.

The fake news that is really the focus of attention now comes from sources (almost exclusively online) that produce wholly fictionalized news stories. Those non-journalistic entities have a very powerful delivery system via social media like Facebook and Twitter.

A Stanford University report published last year concluded that many students could not detect fake or misleading information online. The researchers gave students from middle school to college tasks to see how well they could tell a native advertisement from a news article, identify a partisan website as biased, or separate a verified social-media account from an unauthenticated one.

A larger conclusion I see here is that faculty often assume that young people are fluent in or savvy about social media, in the same way that it is assumed that digital natives know how to use smartphones, websites, photos, video and other digital technology. That is a bad assumption or expectation.

I remember teaching lessons on determining the veracity of research sources both before there was an Internet and after. That has been a part of literacy education since the time when books became more common. I'm sure it was a teachable moment pre-print when a parent told a child to ignore gossip and stories from certain people or sources.

The Stanford researchers said that we need to teach "civic online reasoning" which is something that goes beyond its need in academic settings.

In whose purview is this teaching? English teachers? Librarians? I would say it would only be effective if, like writing in the disciplines, it is taught by all teachers with a concentration on how it occurs in their field.

The science instructor needs to teach how to determine when science is not science. An easy task? No. Look at teaching the truth of climate science or evolution. It is controversial even if the science seems clear.

Napoleon Bonaparte is credited with saying that "History is a set of lies agreed upon." If that is true, how do we teach the truth about history past and the history that is unfolding before our eyes?

But we can't just say it's impossible to teach or assume someone else will take care of it. Information literacy is still a critical, difficult and overlooked set of skills to teach.


LinkedIn's Economic Graph

I wrote earlier about LinkedIn Learning, a new effort by the company to market online training. I said then that I did not think this would displace higher education any more than MOOCs or online education. If successful, it will be disruptive and perhaps push higher education to adapt sooner.

LinkedIn’s vision is to build what it calls the Economic Graph. That graph will be created using profiles for every member of the work force, every company, and "every job and every skill required to obtain those jobs."

That concept reminded me immediately of Facebook's Social Graph. Facebook introduced the term in 2007 as a way to explain how the then new Facebook Platform would take advantage of the relationships between individuals to offer a richer online experience. The term is used in a broader sense now to refer to a social graph of all Internet users.

social graph



LinkedIn Learning is seen as a service that connects users, skills, companies, and jobs. LinkedIn acknowledges that even with about 9,000 courses on their Lynda.com platform, they don't have enough content to accomplish that yet.
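The graph idea behind this is straightforward to sketch. Below is a minimal, hypothetical illustration of linking members, skills, and jobs in one structure and then querying it for a skills gap; the names and edges are invented, and this is not LinkedIn's actual data model.

```python
# Minimal sketch of an "economic graph": members, skills, and jobs as
# nodes, with undirected edges between them. All nodes and edges here
# are invented examples.
from collections import defaultdict

edges = defaultdict(set)

def link(a, b):
    """Add an undirected edge between two nodes."""
    edges[a].add(b)
    edges[b].add(a)

# Members have skills; jobs require skills.
link("member:Ana", "skill:Python")
link("member:Ana", "skill:Statistics")
link("job:Data Analyst", "skill:Python")
link("job:Data Analyst", "skill:SQL")

def missing_skills(member, job):
    """Skills the job requires that the member does not yet have."""
    required = {n for n in edges[job] if n.startswith("skill:")}
    has = {n for n in edges[member] if n.startswith("skill:")}
    return required - has

print(missing_skills("member:Ana", "job:Data Analyst"))  # {'skill:SQL'}
```

A query like `missing_skills` is exactly the kind of gap that could then drive which training content gets created or recommended.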

They are not going to turn to colleges for more content. They want to use the Economic Graph to determine the skills that they need content to provide based on corporate or local needs. That is not really a model that colleges use to develop most new courses. 

But Lynda.com content is not "courses" as we think of a course in higher ed. The training is based on short video segments and short multiple-choice quizzes. Enterprise customers can create playlists of content modules to create something course-like.

One critic of LinkedIn Learning said that this was an effort to be a "Netflix of education." That doesn't sound so bad to me. Applying data science to provide "just in time" knowledge and skills is something we have heard in education, but it has never been used in any broad or truly effective way.

The goal is to deliver the right knowledge at the right time to the right person.

One connection for higher ed is that the company says it is launching a LinkedIn Economic Graph Challenge "to encourage researchers, academics, and data-driven thinkers to propose how they would use data from LinkedIn to generate insights that may ultimately lead to new economic opportunities."

Opportunities for whom? LinkedIn or the university?

This path is similar in some ways to instances of adaptive-learning software that responds to the needs of individual students. I do like that LinkedIn Learning also is looking to "create" skills in order to fulfill perceived needs. Is there a need for training in biometric computing? Then, create training for it.

You can try https://www.linkedin.com/learning/. When I went there, it knew that I was a university professor and showed me "trending" courses such as "How to Teach with Desire2Learn," "Social Media in the Classroom" and "How to Increase Learner Engagement." Surely, the more data I give them about my work and teaching, the more specific my recommendations will become.


Chasing the MUSE

ENIAC

DARPA has a program called MUSE (Mining and Understanding Software Enclaves) that is described as a "paradigm shift in the way we think about software." The first step is no less than for MUSE to suck up all of the world’s open-source software. That would be hundreds of billions of lines of code, which would then need to be organized in a database.

A reason to attempt this is that the 20 billion lines of code written each year include lots of duplication. MUSE will assemble a massive collection of chunks of code and tag them so that code can automatically be found and assembled. That means that someone who knows little about programming languages would be able to program.
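To see why the duplication matters here, consider a crude sketch of detecting duplicate code chunks by hashing a normalized form. The normalization below (renaming every identifier, collapsing whitespace) is deliberately simplistic and is my own illustration, not MUSE's actual method.

```python
# Minimal sketch: two code chunks that differ only in naming and layout
# collapse to the same fingerprint once normalized. This crude
# normalization is an invented illustration of duplicate detection.
import hashlib
import re

def fingerprint(code):
    """Hash a code chunk after renaming identifiers and collapsing spaces."""
    normalized = re.sub(r"[A-Za-z_]\w*", "ID", code)      # rename identifiers
    normalized = re.sub(r"\s+", " ", normalized).strip()  # collapse whitespace
    return hashlib.sha256(normalized.encode()).hexdigest()

chunk_a = "def add(a, b):\n    return a + b"
chunk_b = "def plus(x, y):\n    return x + y"

# Identical up to naming and layout, so they share one fingerprint.
print(fingerprint(chunk_a) == fingerprint(chunk_b))  # True
```

Indexing billions of lines by fingerprints like these is one plausible way a system could tag chunks so equivalent code can be found and reused.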

Might MUSE be a way to launch non-coding programming?

This can also fit in with President Obama’s BRAIN Initiative and it may contribute to the development of brain-inspired computers.

Cognitive technology is still emerging, but Irving Wladawsky-Berger, formerly of IBM and now at New York University, has said “We should definitely teach design. This is not coding, or even programming. It requires the ability to think about the problem, organize the approach, know how to use design tools.”