ChatGPT - That AI That Is All Over the News

Image: Dear ChatGPT

So far, the biggest AI story of 2023 - at least in the education world - is ChatGPT. Chances are you have heard of it. If you have been under a rock or buried under papers you have to grade, ChatGPT stands for Chat Generative Pre-trained Transformer. It is the newest iteration of the chatbot launched by OpenAI in late 2022.

OpenAI has a whole GPT-3 family of large language models. ChatGPT has gotten attention for its detailed responses and articulate answers across many domains of knowledge. But in Educationland, the buzz is that students will use it to write all their papers. The first article someone sent me had a title like "The End of English Classes."

People started to test it out, and reactions ranged from amazement at how well it worked to criticism of its very uneven factual accuracy.

Others have written about all the issues in great detail, so I don't need to repeat that here, but I do want to summarize some things that have emerged in the few months it has been available to the public, and provide some links for further inquiry.

  • Currently, you can get a free user account at https://chat.openai.com/. I was hesitant at first to register because it required giving a mobile phone number and I don't need to get more spam phone calls, but I finally created an account so I could do some testing (more on that in my next post).
  • OpenAI is a San Francisco-based company doing AI research and deployment and states that their mission is "to ensure that artificial general intelligence benefits all of humanity."
  • "Open" may be a misnomer in that the software is not open in the sense of open source and the chatbot will not be free forever.
  • ChatGPT can write essays and articles, come up with poems and scripts, answer math questions, and write code - all with mixed results.
  • AI chatbots have been around for quite a while. You have probably used one online to ask support questions, and tools like Siri and Alexa are a version of this. Back in the last century, I had high school students making very crude chatbots based on Eliza, an early natural language processing program written at MIT in the mid-1960s (see the small sketch after this list).
  • Schools have been dealing with student plagiarism for as long as there have been schools, but this AI seems to take it to a new level, since OpenAI claims that the content the bot produces is not copied; the bot generates text based on the patterns it learned from its training data.
  • This may be a good thing for AI in general, or it may further fuel fears of an "AI takeover." You can find more optimistic stories about how AI is shaping the future of healthcare, where it can accurately and quickly analyze medical tests and find connections among patient symptoms, lifestyle, drug interactions, and more.
  • I also see predictions that, as AI automates the once humans-only skill of writing, our verbal skills will carry more weight.
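
Since Eliza came up, here is a minimal sketch (in Python, which is my own choice; the pattern rules are made up for illustration) of the kind of pattern-matching chatbot those student projects amounted to. There is no model and no training data - the program just matches the input against a few hand-written patterns and echoes back a canned, reflected reply, which is exactly why ChatGPT feels like such a leap.

    # A minimal Eliza-style chatbot sketch - illustrative only, not the original 1966 MIT program.
    import re

    RULES = [
        (r"i need (.*)", "Why do you need {0}?"),
        (r"i am (.*)", "How long have you been {0}?"),
        (r"my (.*)", "Tell me more about your {0}."),
        (r"(.*)\?", "Why do you ask that?"),
    ]
    DEFAULT = "Please, go on."

    def reply(text):
        text = text.lower().strip()
        for pattern, template in RULES:
            match = re.match(pattern, text)
            if match:
                # Echo the captured phrase back inside a canned response.
                return template.format(*match.groups())
        return DEFAULT

    print(reply("I am worried about grading all these papers"))
    # -> How long have you been worried about grading all these papers?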

You can write to me at serendipty35blog at gmail.com with your thoughts about these chatbot AI programs or any issues where tech and education cross paths for better or worse.

FURTHER INQUIRY

Forbes says that ChatGPT And AI Will Fuel New EdTech Boom because venture capitalists predict artificial intelligence, virtual reality and video-learning startups will dominate the space in 2023.

This opinion piece compares ChatGPT to the COVID pandemic! insidehighered.com/views/2023/02/09/chatgpt-plague-upon-education-opinion

The New York Times podcast, The Daily, did an episode that included tests of the bot. Listen on Apple Podcasts

A teacher friend posted on his blog a reaction to the idea that ChatGPT is the death of the essay. He says, "And here's my point with regard to artificial intelligence: if students are given the chance and the encouragement to write in their own voices about what really matters to them, what possible reason would they have for wanting a robot to do that work for them? It's not about AI signaling the death of writing. It's about giving students the chance to write about things they care enough about not to cheat."

OpenAI is not alone in this AI approach. Not to be outdone, Google announced its own Bard, and Microsoft also has a new AI that can do some scary audio tricks.

People are already creating "gotcha" tools to detect things written by ChatGPT.

I found a lesson for teachers about how to use ChatGPT with students. Here is a result of asking ChatGPT to write a lesson plan on how teachers can use ChatGPT with students.

Do You Own Your Face Online?

Image: Clker-Free-Vector-Images from Pixabay

Who owns the rights to my face? I assumed it was me until I read an article that reminded me that when we create social media accounts, we pretty much agree to grant those platforms a free license to use our content as they wish.

In most cases, you hold the copyright to any content you upload to social media platforms. But when you created your account on Facebook, Twitter, Instagram, TikTok, or any other platform, you agreed to grant them a free license to use your content as they wish. How can they use it? It depends, but did you read the user agreement or just click "continue"?

How would you feel if you saw one of your tweets used in a Twitter ad campaign? Violated? Angry? Excited? Feel as you wish, but don't expect any cut of the ad's revenue.

In that article, the author, Abby, sees a sponsored Instagram Story ad with a video of a woman putting on lip balm. The woman was her. She watched herself apply the balm and smile at the camera, but she had never agreed to appear in a nationwide social campaign. How is this possible?

Usage rights dictate who owns an image or asset, and they determine how and where it is allowed to appear, and for how long.

The author had worked in media and knew that employees are often "pressured" to appear in campaigns, even though it is not part of the full-time job and the work is likely to go uncompensated.

In this case, she had been told to participate in a photoshoot demonstrating the product's healing benefits. She recorded it during the workday, was not paid for it, and believed the campaign would only run on the employer's social media accounts for a few months. But the ad she saw was running more than a year later. Her former employer had probably passed the content along to the skincare company - without her permission.

There's an old saying that if you're not paying for a product, then you are the product. Social media sites like Facebook and Instagram are completely free for the average consumer because advertisers pay for your attention (and sometimes your data). This is not a new model. In commercial TV broadcasting, you watch content for free because there are commercials. A more cynical explanation is that you pay for the privilege of having yourself sold. You are consumed. You are the product. They deliver you to the advertiser. The advertiser is their customer.

Think about that the next time you read - or choose not to read - the terms and conditions and agree with a click.

 

This article is also crossposted at One-Page Schoolhouse

Parental Control of Technology

Image: kids on tech. Photo by Andrea Piacquadio

As the new school year begins for all students this week, Mozilla (the maker of Firefox) has published a series titled "Parental Control" about ways to empower parents to deal with some technology challenges. That sounds like a good thing, but parental control, particularly when it applies to schools, has cons along with its pros.

Many digital platforms offer parental control settings. The most common and most popular allows parents to shield young people from "inappropriate" content. Restricting "mature content" and deciding what counts as "inappropriate" takes us into a controversial area. Who defines what should be restricted? Mozilla says that "the way platforms identify what that means is far from perfect."

YouTube apologized after its family-friendly "Restricted Mode" blocked videos by gay, bisexual, and transgender creators, sparking complaints from users. Restricted Mode is an optional parental-control feature that users can activate to avoid content that has been flagged by an algorithm.

That example takes me back to the earliest days of the Internet in K-12 schools when filters would block searches for things like "breast cancer" because "breast" was on the list of blocked words.
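
For anyone who never saw those early filters in action, here is a minimal sketch of that kind of naive keyword blocklist (in Python; the word list and matching rule are my own illustration, not any vendor's actual product). Because the filter only checks whether a blocked word appears anywhere in the query, a search for "breast cancer" is rejected right along with the content the filter was meant to stop.

    # A naive keyword blocklist of the kind early K-12 web filters used - illustrative only.
    BLOCKED_WORDS = {"breast"}  # hypothetical blocklist entry

    def is_blocked(query):
        # Block the query if ANY word in it appears on the blocklist,
        # with no attention to context.
        return any(word in BLOCKED_WORDS for word in query.lower().split())

    print(is_blocked("breast cancer symptoms"))  # True - a legitimate health search is blocked
    print(is_blocked("chicken recipes"))         # False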

Limiting screen time is another strategy, and it is within a parent's control, but it is certainly controversial within a family. Kids don't like their screen time to be limited.

Mozilla actually had questions for itself about what to call the series. They quote Jenny Radesky, an MD and Associate Professor of Pediatrics-Developmental/Behavioral at the University of Michigan, as saying that “Parental mediation is [a better] term, parental engagement is another – and probably better because it implies meaningful discussion or involvement to help kids navigate media, rather than using controlling or restricting approaches.” She pointed to research that suggests letting children manage their own media consumption may be more effective than parental control settings offered by apps.

The internet has risks, but so do parental controls. Many kids in the LGBTQI+ community can be made vulnerable by tech monitoring tools.

Sensitive information about young people can be exposed to teachers and campus administrators through the school devices they use.

As parents and educators, we want to protect students, especially the youngest ones. We also want, as a society, to instill in younger generations why privacy matters.

RESOURCES

Electronic Frontier Foundation https://www.eff.org/search/site/parents

Mozilla https://blog.mozilla.org/en/internet-culture/deep-dives/parental-controls-internet-safety-for-kids/

Federated Learning

When I first think of federated learning, what comes to mind is something like a college federated department. For example, the history faculty at NJIT and Rutgers University-Newark are joined in a single federated department offering an integrated curriculum and joint undergraduate and graduate degree programs.

Having worked at NJIT, I could see that it made sense to combine the two departments and collaborate. Each had its own specialties, but they were stronger together.

In technology, a federation is a group of computing or network providers agreeing upon standards of operation in a collective fashion, such as two distinct, formally disconnected, telecommunications networks that may have different internal structures.

There is also federated learning, which sounds like something those two history departments are doing, but it is not. This federated learning is a decentralized form of machine learning (ML).

In traditional machine learning, data from several edge devices (such as mobile phones and laptops) is aggregated and brought together on a centralized server for training. Federated learning's main objective is to provide privacy-by-design: a central server just coordinates with local clients to aggregate the model's updates, without ever requiring the actual data (i.e., zero-touch).

I'm not going to go very deep here into things like the three categories (horizontal federated learning, vertical federated learning, and federated transfer learning). As an example, consider federated learning at Google, where it is used to improve models on devices without sending users' raw data to Google servers.

Image: an online comic from Google AI

For people using something like Google Assistant, privacy is a concern. Because federated learning is used to improve "Hey Google," your voice and audio data stay private while Google Assistant still makes use of them.

Federated learning trains an algorithm across multiple decentralized edge devices (such as your phone) or servers that hold local data samples, without exchanging those samples. Compare this to traditional centralized machine learning techniques, where all the local datasets are uploaded to one server.
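
To make that concrete, here is a minimal sketch of federated averaging, the basic idea behind federated learning, using a toy linear model in Python with NumPy (the function names and the made-up data are my own illustration, not Google's implementation). Notice that each client trains on its own data and only the model weights ever travel back to the server.

    # Federated averaging sketch: clients train locally, the server averages their weights.
    import numpy as np

    def local_update(weights, X, y, lr=0.1, epochs=5):
        # Train a copy of the global model on one client's private data.
        w = weights.copy()
        for _ in range(epochs):
            grad = X.T @ (X @ w - y) / len(y)  # gradient of mean squared error
            w -= lr * grad
        return w  # only the weights leave the device, never X or y

    def federated_round(global_weights, clients):
        # Server step: collect locally trained weights and average them.
        updates = [local_update(global_weights, X, y) for X, y in clients]
        return np.mean(updates, axis=0)

    rng = np.random.default_rng(0)
    true_w = np.array([2.0, -1.0])
    # Three "devices", each holding data the server never sees.
    clients = []
    for _ in range(3):
        X = rng.normal(size=(50, 2))
        clients.append((X, X @ true_w + rng.normal(scale=0.1, size=50)))

    weights = np.zeros(2)
    for _ in range(20):
        weights = federated_round(weights, clients)
    print(weights)  # approaches [2.0, -1.0] without pooling any client's raw data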

So, though federated learning is about training ML models efficiently, it is also about data privacy, data security, data access rights, and access to heterogeneous data.


MORE at analyticsvidhya.com...federated-learning-a-beginners-guide