Education and the Gig Economy

I mentioned the Gig Economy to a colleague at a college last week and he said he had never heard of the term. I said that "gig" is a term I associate with musicians who move from job to job, gig to gig. Now, it is being applied to a labor market characterized by the prevalence of short-term contracts or freelance work as opposed to permanent jobs. "But it has nothing to do with education," he commented. That got me thinking. Is it affecting education?

A study by Intuit predicted that by 2020, 40 percent of American workers would be independent contractors. Most discussions of the gig economy focus on app-based services like Uber, Instacart, and TaskRabbit. Short-term, contract, and freelance work has long been part of the labor market, but the kind done by college graduates is said to have grown by more than 50% over the last decade.

Jeff Selingo referenced studies that contend that all the net job growth since the Great Recession has been in the gig or contract economy, and that 47% of college-age students did some sort of freelancing work last year, along with 43% of millennials.

My first thought about gig work in higher education is adjuncts. With colleges using more and more adjuncts (and fewer full-time faculty), many adjuncts put together gigs at several schools. If teaching is your only job, that means trying to get three or more classes per semester in fall, spring, and summer.

I pulled some books off the bookstore shelf this past weekend and looked at what is being written about the topic. The Future Workplace Experience, The Gig Economy, and Thriving in the Gig Economy are examples.

They talk about dealing with disruption in recruiting and engaging employees. A lot of the popular media focus is on the low end of the skill spectrum. Less attention is given to college grads and professionals who have chosen this independent employment route.

I found so many different stats on the size of this gig workforce that I hesitate to link to a source. One book says more than a third of Americans are working in the gig economy. That seems high judging by my own circle of friends and colleagues, but the figure includes short-term jobs, contract work, and freelance assignments.

I am now officially in retirement - or unretirement as I prefer to say. I have written elsewhere about unretirement and freelance work which is part of the gig economy. I take on teaching, web and instructional design gigs on a very selective basis. I choose things that interest me and allow me the freedom to work when I want to work and from where I want to work.  Sometimes the work comes from traditional places. I did a 6-month gig with a nearby community college that I had worked at full-time in the past. I have two new web clients for whom I am designing sites and e-commerce stores.

But let's return to what this might have to do with education. Higher education as preparation for a job has always been a topic of debate. "It's not job training," is what many professors would say. Employers have always played a large role in the training and professional development of their workers whether they have degrees or not.

In a gig economy, freelancers have to be self-directed in their learning. They need to decide what knowledge they're missing, where to acquire it, how to fit it into their day and how to pay for it. Free options (as in MOOCs and other online opportunities) are very appealing. Do schools that charge tuition and have traditional classes have any appeal to these people?

Certainly, driving for Uber doesn't require a degree, though having some business training in order to be self-employed would be beneficial. But my interest is more with "professional" freelancers. Take as an example someone with some college, a certification, or preferably a degree that allows them to promote themselves as an instructional designer or social media manager. I choose those two because I have done both as a freelancer, and I know that if I look right now on a jobs site such as Glassdoor I will find hundreds of opportunities in those two areas locally.

Businesses and colleges save resources in terms of benefits, office space, and training by employing these people. They can also contract for specific projects with experts who might be too high-priced to keep on staff.

For some freelancers I know, the gig economy appeals because it offers them more control over their work-life balance. They select jobs they're interested in, rather than entering the gig economy because they are unable to find employment and must pick up whatever temporary gigs they can land. The latter is often the case with adjunct faculty.

To someone mixing together short-term jobs, contract work, and freelance assignments, where would they go to find additional professional development?

Books like The Gig Economy - with its appealing subtitle offer of "The Complete Guide to Getting Better Work, Taking More Time Off, and Financing the Life You Want" - are more interested in real-world corporate examples (Airbnb, Lyft, Uber, Etsy, TaskRabbit, France's BlaBlaCar, China's Didi Kuaidi, and India's Ola) of crowd-based capitalism.

The freelancer may not be much concerned with emerging blockchain technologies, but she is certainly part of the changing future of work.

The future is always a land of questions: Will we live in a world of empowered entrepreneurs who enjoy professional flexibility and independence? Will these gig economy workers become disenfranchised laborers jumping from gig to gig, always looking for work and paying for their own health benefits? How will this affect labor unions, self-regulatory organizations, labor law, and a new generation of retirees who have a more limited social safety net? Are long-term careers at one or two companies a thing of the past?

Robin Chase, founder of Zipcar, the world’s largest car sharing company, said, “My father had one job in his life, I’ve had six in mine. My kids will have six at the same time.”

The one thing all observers seem to agree on is that the way we work is changing.

Jennifer Lachs writes on opencolleges.edu.au about that changing working world and the possible impact it may have on education. I hadn't thought of it as a gig economy job, but of course substitute teachers in K-12 education have long been employed on a freelance basis. The education and training industry is among the top five highest-demand industries for freelance workers due to the high level of specialization and the rise of virtual education.

I know of a dozen or so teachers who do online teaching and tutoring as a way to supplement their income. For decades, professors have done freelance writing and thesis editing and much of that has moved online. My wife and I are currently editing a dissertation via email and shared files along with the occasional phone conference.

The writing center I helped build at a community college has relied on online tutoring for student writing as a way to supplement the face-to-face tutoring. Online appealed to students, but it also offered additional work for some of our part-time tutors and others who added it to the gig list.

Are we preparing students for the gig economy once they graduate? No. 

A friend pointed me at "It’s a Project-Based World" which was a thought leadership campaign by Getting Smart to explore the economic realities of a project-based world. The purpose of the campaign: to promote equity and access to deeper learning outcomes for all students. There are blog posts, podcast interviews, publications, and infographics around the preparation of students, teachers and leaders for a project-based world. The focus there seems to be less on obtaining deeper knowledge, and more on teaching skills that students will need in the modern working world.

Finally, I think that the gig economy will have a greater impact on traditional education than traditional education will have on the gig economy. It accounts for much of the recent employment growth, but secondary and post-secondary schools don't prepare students for this type of work.

 

All Your Students Are Generation Z

When I started working at a university in 2000, there was a lot of talk about Millennials. That generation gets a lot less attention these days. I am not much of a fan of these generation generalizations, but that won't stop them from being topics of conversation. They are particularly of interest to marketers.

The generation that follows the Millennials consists of those born between 1995 and 2012. That makes them 5 to 22 years old. I don't know how we can generalize very much about that wide a range of people. But educators should take note, because they include kids from kindergarten through new college graduates and all those students in between.

The post-Millennial generation hasn't gotten a name that everyone agrees on. I hear them called Generation Z, Post-Millennials, iGeneration, Centennials and the Homeland Generation.

Although "iGeneration" might suggest that they are self-centered, the lowercase i references the Apple world of iPods, iPhones, iPads etc.

"Homeland" refers to the post-9/11 world they grew up in. September 11, 2001 was the last major event to occur for Millennials. Even the oldest members of Generation Z were quite young children when the 9/11 attacks occurred. They have no generational memory of a time the United States was not at war with the loosely defined forces of global terrorism.

I'll use Gen Z to label this demographic cohort after the Millennials.

Here are some of the characteristics I find that supposedly describe Gen Z. You'll notice that much of this comes from the fact that this generation has lived with the Internet from a young age. This is usually taken to mean that they are very comfortable (don't read that as knowledgeable) with technology and interacting on social media.

Besides living in an Internet age, they live in a post-9/11 age and grew up through the Great Recession, and so they carry a feeling of unsettledness and insecurity.

They get less sleep than earlier generations.

They are mobile phone users - not desktop, laptop or landline users.

They are wiser than earlier generations about protecting their online personalities and privacy, but they live in a world that also offers more threats.  For example, they are more likely to create “rinsta” and “finsta” Instagram personas. (Rinsta is a “real” account and finsta is a “fake” or “friends-only” profile.)

They are wiser to marketing and more resistant to advertising. Less than a quarter of them have a positive perception of online ads (Millward Brown). But, perhaps ironically, they trust YouTube stars, Instagram personalities, and other social media influencers, including when they make purchasing decisions.

Having grown up with more of it, they are generally more open to efforts to increase diversity and inclusion.

They’re easily bored with an average attention span of eight seconds (Sparks & Honey). Of course, the attention span of the average millennial is supposed to be 12 seconds. That makes them hard to engage, but they self-identify as wanting to be engaged.

They haven't had, and don't expect to have, summer jobs.

They are said to be slower at maturing than earlier generations. They postpone getting a driver's license. Many of them even postpone having sex.

Rather than a generation gap, like the one made famous in the 1960s, they are more likely to hang with their parents.

They are very open to sharing their opinions in many ways, from consumer reviews to other consumer behavior, and online they like collaborative communities and the exchange of ideas and opinions.

The Myth of Digital Natives


When I was fairly new to working in higher education, there was a lot of buzz about the students we were getting being "digital natives."  This was around 2001 and educator Marc Prensky had coined the term in an essay.

The claim was that these digital natives had a kind of innate facility with technology because they were born into it. This was also extended to them having increased abilities to do things like multitask.

Prensky took it further by saying that educators needed to change their ways to deal with this tech-savvy generation.

But new research (see below) indicates that this digital native theory is false. 

A digital native who is information-skilled simply because they never knew a world that was not digital doesn't exist. The same goes for the claim that students in this generation have any special ability to multitask. In fact, designing learning with these assumptions hurts rather than helps learning.

We were naive to think that someone could pick up digital skills intuitively. It is also a dangerous fallacy that risks leaving young people lacking skills that were assumed to be known and so were never taught or emphasized.

I was never a proponent of this digital natives - and digital immigrants - distinction because I viewed "tech-savvy" as a very superficial kind of knowledge. I found most students in this group to be users of technology, but using a computer or cellphone doesn't impart understanding.

In 2007, I wrote about earlier research that was pointing towards this idea being false. Now, it seems definitive that all of this generational separation is a fallacy. It turns out that none of us is good at multitasking. We do it out of necessity, but something always suffers. Some studies have shown that a driver using a cellphone is the equivalent of a drunk driver.

Millennials - the group often labeled as natives - don't necessarily use technology more often and are no better at using basic computer programs than older generations. Researchers also found that, besides educators, Millennials themselves have bought into the myth: twice as many of them self-identify as digitally proficient as would actually be assessed at that level.

The only aspect of all this that makes sense is that people born into a technology are less likely to hesitate to use it or to fear it. Clearly, a toddler today who is playing with a smartphone at age two and using apps will have no problem using it in other, more serious ways in kindergarten. And if these young people are better at using a totally new technology than a 70-year-old person is, I consider that more a matter of an aging brain than of a birth year.

Read More

blogs.discovermagazine.com/d-brief/2017/07/27/

ecdl.org/policy-publications/digital-native-fallacy

sciencedirect.com/science/

 

When Accepted Students Don't Show Up at College

I had a discussion with some colleagues after listening to an episode of NPR's Hidden Brain podcast about research that shows that between 10% and 40% of the kids who intend to go to college at the time of high school graduation don't actually show up in the fall.

I'm doing some consulting for a community college this summer and I asked if this seemed accurate for that school. It turned out that the previous week staff at the college had been asked to "cold call" students who registered for fall courses but were dropped for non-payment and never re-registered. The college's enrollment is down 10% and it is a big concern.

This phenomenon is sometimes called "summer melt."

It is puzzling why kids who made it through the admissions process, were accepted to a college of their choice, and applied for and received financial aid never showed up for classes.

At my urban community college, financial aid was the most common reason. They registered, but aid did not come through in time to pay the bill. The odd part - the "melt" - was that when their aid did come through, they didn't re-register.

Why? Some had lost interest or felt discouraged by the process. Some reevaluated going to college. Some were just lazy. A few staffers were able to walk students through re-enrolling over the phone, so part of the problem might be a lack of information and support from the college.

In the podcast, Lindsay Page, an education researcher now at the University of Pittsburgh who did research while at Harvard, said "The rate with which kids who are college-intending do not actually get to college in the fall is surprisingly high. In one sample that we looked at in the Boston area, we find that upwards of 20% of kids who at the time of high school graduation say that they're continuing on to college don't actually show up in the fall."

This nationwide loss of seemingly college-intending students is particularly evident for those from low-income backgrounds.

But research has also identified relatively low cost interventions that can have a significant impact on alleviating the summer melt phenomenon and increasing college enrollment rates.

Page's research at Harvard was published in the "SDP Summer Melt Handbook: A Guide to Investigating and Responding to Summer Melt." In the report, they use “summer melt” to refer to a "different, but related phenomenon: when seemingly college-intending students fail to enroll at all in the fall after high school graduation. 'College-intending' students are those who have completed key college-going steps, such as applying and being accepted to college and applying for financial aid if their families qualify. In other cases, they have concretely signaled their intention to enroll in college on a high school senior exit survey. We consider a student to have “melted” if, despite being college-intending, she or he fails to attend college the following fall."

Some of their interventions go back to students' high school days and records, such as senior exit surveys, and to surveying high school counselors. They also provide examples of summer task lists, both personalized for specific institutions and generic, and sample documents for proactive personal outreach, such as an initial outreach checklist, assessment meeting checklist, intake form, and counselor interaction logs.

Download the report and other resources at sdp.cepr.harvard.edu/summer-melt-handbook 

LISTEN to the Hidden Brain podcast on this topic  npr.org/2017/07/17/537740926/why-arent-students-showing-up-for-college

Learning and Working in the Age of Distraction

There is a lot of talk about distraction these days. The news is full of stories about the Trump administration and the President himself creating distractions to keep the public unfocused on issues they wish would go away (such as the Russia connections), and some people believe the President is too easily distracted by TV news and Twitter.

There are also news stories about the "distraction economy."  So many people are vying for your attention. The average person today is exposed to 1,700 marketing messages during a 24-hour period. Most of these distractions are on screens - TV, computers and phones.  Attention is the new currency of the digital economy.

Ironically, a few years ago I was reading about "second screens," behavioral targeting and social media marketing and that was being called the "attention economy." There is a battle for attention, and the enemy is distraction.

Google estimates that we spend 4.4 hours of our daily leisure time in front of screens. We are still using computers mostly for work/productivity and search. We use smartphones for connectivity and social interactions. Tablets are used more for entertainment. My wife and I are both guilty of "multi-screening." That means we are part of the 77% of consumers watching TV while on other devices. I am on my laptop writing and researching and she is on her tablet playing games and checking mail and messages. It is annoying. We know that.

Of course, the original land of distraction is the classroom. Students have always been distracted. Before the shiny object was a screen full of apps, passing notes was texting, and doodling in your notebook and the cute classmates sitting nearby were the social media. But I have seen four articles on The Chronicle website about "The Distracted Classroom" lately. Is distraction on the rise?

If you are a teacher or student, does your school or your own classroom have a policy on using laptops and phones? If yes, is it enforced? Anyone who has been in a grade 6 or higher classroom lately knows that if students have phones or laptops out in class for any reason, they are texting, surfing the web, or posting on social media.

Good teachers try to make classes as interactive as possible. We engage students in discussions, group work and active learning, but distractions are there.

Banning devices isn't a good solution. Things forbidden gain extra appeal.

A few books I have read discuss the ways in which distraction can interfere with learning. In The Distracted Mind: Ancient Brains in a High-Tech World, the authors - neuroscientist Adam Gazzaley and psychologist Larry D. Rosen - say that distraction occurs when we are pursuing a goal that really matters and something blocks our efforts to achieve it. They join other researchers who report that our brains aren't built for multitasking. That is a change from a few decades ago, when being able to multitask was considered a positive skill.

It seems that the current belief is that we don't really multitask. We switch rapidly between tasks. Any distractions and interruptions - including the technology-related ones - act as "interference" with our goal-setting abilities.

But is this a new problem or has our brain always worked this way? Is the problem really more about the number of possible distractions and not our "rewired" brains?

Nicholas Carr sounded an alarm in 2011 with The Shallows: What the Internet Is Doing to Our Brains, arguing that our growing exposure to online media means our brains have to make cognitive changes. The deeper intellectual processing of focused and critical thinking gets pushed aside in favor of faster processes like skimming and scanning.

Carr contends that the changes to the brain's "wiring" are real. Neural activity shifts from the hippocampus's deep thinking to the prefrontal cortex, where we engage in rapid, subconscious transactions. We substitute speed for accuracy and prioritize impulsive decision-making over deliberate judgment.

In the book Why Don't Students Like School?: A Cognitive Scientist Answers Questions About How the Mind Works and What It Means for the Classroom, cognitive scientist Daniel Willingham asks questions such as "Why do students remember everything that's on television and forget everything I say?" and "Why is it so hard for students to understand abstract ideas?" and gives some science and suggestions as answers. But these are difficult questions, and simple answers are incomplete answers in many cases.

Some teachers decide to use the tech that is the distraction to gain attention. I have tried using a free polling service (Poll Everywhere) that allows students to respond or vote using their laptops or phones. You insert questions into your presentation software, and that allows you to track, analyze, and discuss the responses in real time. The problem for me is that all of that needs to be pre-planned and is awkward to do on the fly, and I am very spontaneous in class with my questioning. Still, the idea of using the tech in class rather than banning it is something I generally accept. But that can't be done 100% of the time, so distracted use of the tech is still going to occur.

And the final book on my distraction shelf is The Filter Bubble. The book looks at how personalization - being in our own bubble - hurts the Internet as an open platform for the spread of ideas. The filter bubble puts us in an isolated, echoing world. The author, Eli Pariser, subtitles the book "How the New Personalized Web Is Changing What We Read and How We Think." Pariser coined the term "filter bubble." The term is another one that has come up in the news in talking about the rise of Donald Trump and the news bubble that we tend to live in, paying attention to a personalized feed of the news we agree with and filtering out the rest.

Perhaps creating a filter bubble is our way of coping with the attention economy and a way to try to curate what information we have to deal with every day.

Then again, there were a number of things I could have done the past hour instead of writing this piece. I could have done work that I actually get paid to do. I could have done some work around my house. But I wrote this. Why? 

Information overload and non-stop media are hurting my (and our) discipline for focus and self-control.

Michael Goldhaber defined the attention economy in this more economic way: "a system that revolves primarily around paying, receiving and seeking what is most intrinsically limited and not replaceable by anything else, namely the attention of other human beings.” In order for that economy to be profitable, we must be distracted. Our attention needs to be drawn away from the competition.

As a secondary school teacher for several decades, I saw the rise of ADHD. That was occurring before the Internet, and lack of attention, impulsivity, and boredom were all symptoms. It worsened after the Internet became widespread, but it was there before the Internet and all the personal digital devices.

Back in 1971,  Herbert A. Simon observed that “what information consumes is rather obvious: it consumes the attention of its recipients. Hence a wealth of information creates a poverty of attention, and a need to allocate that attention efficiently among the overabundance of information sources that might consume it.”

We are collectively wiser than ever before. We have the wisdom of the world in a handheld computer connected to almost everything. But it is so difficult to filter out the distractions and garbage that we don't have a lot of success translating information into knowledge. People used to say that finding out something on the Internet was like taking a sip from a fire hose. Search filtering has helped that, but so far the only filters for our individual brains are self-created and often inadequate.

 

What Is a Modern Learning Experience?


Jane Hart, whom I have been following online for many years, is the Director of the Centre for Modern Workplace Learning, which she set up to help organizations and learning professionals modernize their approaches to workplace learning. Reading her online Modern Workplace Learning Magazine has alerted me to trends outside academia and outside the United States.

She recently posted an article titled "Designing, delivering and managing modern learning experiences" and that made me consider how I would define "modern learning." It would include school experiences for some of us, but for most people today it is more likely an experience that occurs in the workplace and on our own. That itself seems like a big shift from the past. Or is it?

If someone in 1917 had wanted to become a journalist, he could have gone to college, but he could also have gotten a job without a degree - if he could show he was a good writer. He could have done some freelance writing, with or without pay, to get some experience and samples. Move ahead 50 years to 1967, and the path was more likely to be a school of journalism. What about today?

As Jane points out, the modern learning experience path for the workplace probably includes using: 

  • Google and YouTube to solve their own learning and performance problems
  • social networks like Twitter and LinkedIn to build their own professional network (aka personal learning network)
  • messaging apps on their smartphones to connect with colleagues and groups
  • Twitter to participate in conference backchannels and live chats
  • online courses (or MOOCs) on platforms like Coursera, edX and FutureLearn

The modern learning experience is on demand and continuous, not intermittent, and takes place in minutes rather than hours. It occurs on mobile devices more than on desktop computers.

Jane Hart believes it is also more social, with more interacting with people, and that it is more of a personally designed experience. I don't know if that is true for educational learning. Is it true for the workplace on this side of the pond? Does the individual design the learning, rather than having an experience designed by someone "in charge"?

Modernizing classroom learning has often been about making learning more autonomous (self-directed, self-organized, and self-managed), but that model does not fit easily with the one used in classrooms for the past few hundred years.