The Soft Skills of AI

Communication is a rising soft skill

AI, especially its subset generative AI, seems to be changing everything, including the workplace. As machines become adept at tasks once considered uniquely human, what does this mean for the workforce, and which worker skills will become more important? For some jobs, AI will simply be complementary, but the prevailing belief is that about half of all jobs will be significantly disrupted by AI.

I have never been a fan of the terms "hard skills" and "soft skills," since the labels make the "soft" skills seem less important. Still, some historically "hard" skills will drop in importance as hiring credentials.

An article featured some soft skills that will be important in an AI world.

SOCIAL INTERACTION SKILLS, such as listening to others in meetings or collaborating with teammates under pressure, will remain in the human domain. A working paper from the National Bureau of Economic Research showed that almost all job growth since 1980 has been in jobs that are social-skill intensive, while jobs that require minimal social interaction have been in decline.

CREATIVITY, especially in using AI. One study found that knowledge workers who used GPT-4 completed 12.2% more tasks, finished them 25.1% faster, and produced 40% higher-quality work than those who did not use AI. That's astonishing data, especially the increase in quality. Human workers who leverage AI and who demonstrate a combination of strong creativity and critical thinking skills will fare the best.

CRITICAL THINKING SKILLS. I don't think that critical thinking has ever been off the skills list for jobs. It must be applied to evaluating AI responses, since (as you may have discovered already) not all responses will be valid, unbiased, factual, or error-free. AI can generate vast amounts of data, analyses, and potential solutions at unprecedented speed, but the veracity and applicability of generative AI's responses are not guaranteed. Thinking critically remains a uniquely human skill.

CURIOSITY is that innate drive to explore, understand, and seek information about the world around us. AI is not curious unless it is told to be or programmed to seek out information. Curious workers ask questions, probe into things, challenge assumptions, and delve deeper.

Yes, the rise of AI will fundamentally alter the nature of the skills deemed crucial in the workplace. While some hard skills and jobs will disappear, some soft skills will remain human-only and therefore become more important - perhaps "harder" - than ever.

Telling Students to Use AI


2023 was certainly a year for AI. In education, some teachers avoided it and some embraced it, perhaps reluctantly at first. Educators reacted in part to AI that can write essays: some schools, teachers, school districts, colleges, and departments have tried to ban it. Of course, that is impossible, just as it was impossible to ban the use of Wikipedia or, going back to the previous century, the use of a word processor, a calculator in a math class, or the Internet to copy and paste information.

What happened when an entire class of college students was told to use ChatGPT to write their essays?

Chris Howell, an adjunct assistant professor of religious studies at Elon University, noticed more and more suspiciously chatbot-esque prose popping up in student papers. So rather than trying to police the tech, he embraced it. He assigned students to generate an essay entirely with ChatGPT and then critique it themselves.

When I first caught students attempting to use ChatGPT to write their essays, it felt like an inevitability. My initial reaction was frustration and irritation—not to mention gloom and doom about the slow collapse of higher education—and I suspect most educators feel the same way. But as I thought about how to respond, I realized there could be a teaching opportunity. Many of these essays used sources incorrectly, either quoting from books that did not exist or misrepresenting those that did. When students were starting to use ChatGPT, they seemed to have no idea that it could be wrong.

I decided to have each student in my religion studies class at Elon University use ChatGPT to generate an essay based on a prompt I gave them and then “grade” it. I had anticipated that many of the essays would have errors, but I did not expect that all of them would. Many students expressed shock and dismay upon learning the AI could fabricate bogus information, including page numbers for nonexistent books and articles. Some were confused, simultaneously awed and disappointed. Others expressed concern about the way overreliance on such technology could induce laziness or spur disinformation and fake news. Closer to the bone were fears that this technology could take people’s jobs. Students were alarmed that major tech companies had pushed out AI technology without ensuring that the general population understands its drawbacks.

The assignment satisfied my goal, which was to teach them that ChatGPT is neither a functional search engine nor an infallible writing tool.


Detecting AI-Written Content

When ChatGPT hit academia hard at the start of this year, there was much fear among teachers at all grade levels. I saw articles and posts saying it would be the end of writing. Edward Tian, then a Princeton University senior majoring in computer science, built an app that helps detect whether a text was written by a human being or by an artificial intelligence tool like ChatGPT. He has said that the algorithm behind his app, GPTZero, can "quickly and efficiently detect whether an essay is ChatGPT or human written."

I was able to attend an online demo of GPTZero, now that it has been released as a free and paid product, and I also communicated with Tian.

Because ChatGPT has exploded in popularity, it has drawn interest from investors. The Wall Street Journal reported that its parent company, OpenAI, could attract investments valuing it at $29 billion. But ChatGPT has also raised fears that students are using it to cheat on writing assignments.

GPTZero examines two variables in any piece of writing. It looks at a text's "perplexity," which measures its randomness: human-written texts tend to be more unpredictable than bot-produced work. It also examines "burstiness," which measures variance, or inconsistency, within a text, because there is a lot of variance in human-generated writing.
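To make those two measures concrete, here is a minimal, stdlib-only Python sketch of the general idea. This is my own illustration, not GPTZero's actual code: real detectors score perplexity with a large language model, while this toy version uses a unigram word model, and it approximates burstiness as the variance of sentence lengths.

```python
import math
import re
from collections import Counter

def perplexity(text, corpus):
    """Perplexity of `text` under a toy unigram model fit on `corpus`.

    Lower perplexity means more predictable text. Real detectors use a
    large language model here; a unigram model is only illustrative.
    """
    counts = Counter(corpus.lower().split())
    total = sum(counts.values())
    vocab = len(counts)
    words = text.lower().split()
    log_prob = 0.0
    for w in words:
        # Laplace smoothing so unseen words don't zero out the probability
        p = (counts[w] + 1) / (total + vocab)
        log_prob += math.log(p)
    return math.exp(-log_prob / len(words))

def burstiness(text):
    """Variance of sentence lengths: human writing tends to mix short
    and long sentences, so higher variance suggests a human author."""
    sentences = [s for s in re.split(r"[.!?]+", text) if s.strip()]
    lengths = [len(s.split()) for s in sentences]
    mean = sum(lengths) / len(lengths)
    return sum((n - mean) ** 2 for n in lengths) / len(lengths)

uniform = "The cat sat here. The dog sat here. The bird sat here."
varied = "Stop. The dog, startled by thunder, bolted across the yard. Why?"
assert burstiness(varied) > burstiness(uniform)
```

The final assertion shows the intuition: the evenly patterned text scores near zero burstiness, while the text that mixes one-word and long sentences scores much higher.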

Unlike some other tools, the app does not tell you the source of the writing. That is because of the odd situation that writing produced by a chatbot isn't exactly from any particular source.

There are other tools to detect AI writing as well.

Large language models themselves can be trained to spot AI-generated writing if they are trained on two sets of text, one generated by AI and the other written by people. Theoretically, you could teach the model to recognize and detect AI writing.
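A minimal sketch of that two-corpus idea, with a hand-rolled naive Bayes word classifier standing in for the large language model. The sample texts below are invented and far too small for real use; they only show the training shape: one labeled human corpus, one labeled AI corpus.

```python
import math
from collections import Counter

def train(human_texts, ai_texts):
    """Fit per-class word counts from two labeled corpora
    (a stdlib-only naive Bayes sketch, not a real detector)."""
    model = {}
    for label, texts in (("human", human_texts), ("ai", ai_texts)):
        counts = Counter(w for t in texts for w in t.lower().split())
        model[label] = (counts, sum(counts.values()))
    return model

def classify(model, text):
    """Return the label whose training corpus makes `text` most likely."""
    vocab = set(model["human"][0]) | set(model["ai"][0])
    scores = {}
    for label, (counts, total) in model.items():
        score = 0.0
        for w in text.lower().split():
            # Laplace smoothing for words a class has never seen
            score += math.log((counts[w] + 1) / (total + len(vocab)))
        scores[label] = score
    return max(scores, key=scores.get)

# Invented toy corpora, just to illustrate the two labeled sets
human = ["honestly i kinda loved the weird ending",
         "my dog ate my notes again"]
ai = ["as an ai language model i can provide an overview",
      "in conclusion there are several key factors to consider"]
model = train(human, ai)
print(classify(model, "as an ai language model"))  # prints "ai"
```

The same pattern scales up: replace the word counts with a fine-tuned language model and the toy corpora with large labeled datasets, and you have the basic recipe behind learned AI-text detectors.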

Rethinking Accessible Courses

When I was working full-time as an instructional designer, I became very concerned with making courses (especially online courses) accessible. In the early days of this century, the college I worked at was quite focused on making accommodations for students with special needs. That was a quick fix but not a sustainable approach.

Retrofitting online courses became part of my department's purview. Our instructional design philosophy held that accessibility is more than making accommodations. We knew that courses that were accessible for students with particular needs were probably also more accessible for all the other students. There were many small examples of things we did that turned out to be useful to every student in a course.

One semester, now 20 years ago, I decided to provide audio files of my short online lectures and of explanatory talks about some of the more complicated assignments. Some students told me that they would listen to them while driving in the car, commuting on the train, or walking their dogs. Most of these audio files were taken from videos I had made, often with accompanying PowerPoint slides. The visuals were lost, but we all know that a good number of PowerPoint slides are just lecture notes or text, so not all of the visual content was needed.

The fact that students used them this way not only convinced me to continue the practice but made me rethink what I was putting on those slides. Perhaps the truly visual presentations needed to stay truly visual and not be offered as audio files, so that students would have to sit down and view the video version. I was rethinking my use of visuals overall.