Keeping Updated With Chat AI Free and Premium

If you ask a question in the free version of ChatGPT (3.5) about a film or event that happened in the past year, you may not get a very good answer, or perhaps no answer at all. Version 3.5 was last updated with data in January 2022. To get version 4, you need to pay for the Plus plan at $20 per month ($25 for the Team version).

* NOTE There has been an upgrade for free ChatGPT users: OpenAI is opening up many of the features previously reserved for paying customers. This includes access to image and document analysis, data analytics and custom GPT chatbots.

Recently, I asked 3.5, "What films released in 2024 are possible Academy Award nominees?" The answer was generic:
"Predicting specific Academy Award nominees for a year as far out as 2024 is quite speculative since it depends on numerous factors such as the quality of the films, critical reception, box office success, and industry trends at the time of release. However, some potential contenders could include films from established directors or those with strong track records in awards season, as well as films that tackle socially relevant themes or feature standout performances. It's also worth considering that films released towards the end of the year tend to be fresher in the minds of voters during awards season. Without specific knowledge of films released in 2024, it's challenging to pinpoint exact nominees, but historical trends and industry buzz could provide some hints closer to the time."

What about Microsoft Copilot? This chatbot was developed by Microsoft and launched in February 2023. It is Microsoft's primary replacement for the discontinued Cortana. At first, it was called Bing Chat, a built-in feature of Microsoft Bing and Microsoft Edge. However, Microsoft began unifying the Copilot branding across its various chatbot products, and it planned to integrate Copilot into Windows so that you could use it directly through the taskbar.

Here is a sample prompt and response from Bing Chat last year (via Wikipedia).


I asked Copilot today, "What was Carl Jung's concept of the shadow self?" The answer was extensive (excerpted below):

"Certainly! Carl Jung, the Swiss psychiatrist and founder of analytical psychology, introduced the concept of the “shadow self.” Let’s delve into what this intriguing idea entails:

    Definition of the Shadow Self:
        The shadow refers to those aspects of ourselves that we reject or deny. These are the darker, less socially acceptable pieces of our personality that don’t align with the persona we present to the world.
        It encompasses traits, emotions, and characteristics that are often viewed as negative, such as anger, jealousy, and greed.

    Origin and Nature:
        Jung believed that the shadow is an integral part of our psyche, residing in our unconscious mind...

    Encountering the Shadow:

        When encountering the shadow, we experience a range of emotions: fear, disgust, remorse, guilt, compassion, grief, and humility.
        The shadow has the power to possess and overwhelm us, even altering our behavior ("He is not himself today").

In summary, the shadow self represents the hidden, often rejected parts of our personality. Embracing and integrating these aspects is essential for personal growth and individuation."


Higher Ed Enrollment Is Up

I saw some new enrollment data from the National Student Clearinghouse Research Center, and it is good news for colleges. It shows a reversal of the trend of declining enrollment. The reversal began in fall 2023, and the improvement is even better this spring. Most colleges survive on the income from tuition, room and board, and fees, so any shrinking enrollment is a problem.

The data point that interested me most is that two-year schools showed the greatest growth. Dual enrollment by high school students is up 10%, and high school students at community colleges accounted for nearly one-third of the total post-secondary enrollment rise. Vocational/technical programs also figure into the increase.



Tay: A Cautionary Tale of AI

Tay was a chatbot originally released by Microsoft Corporation as a Twitter bot on March 23, 2016. It "has had a great influence on how Microsoft is approaching AI," according to Satya Nadella, the CEO of Microsoft.

Tay caused almost immediate controversy when the bot began to post inflammatory and offensive tweets through its Twitter account, causing Microsoft to shut down the service only 16 hours after its launch. According to Microsoft, the offensive output was caused by trolls who "attacked" the service, since the bot formed its replies from its interactions with people on Twitter - a dangerous proposition.

The name "Tay" was an acronym for "thinking about you." It was said to be similar to, or based on, Xiaoice, a comparable Microsoft project in China, which Ars Technica reported had held "more than 40 million conversations apparently without major incident."

Interestingly, Tay was designed to mimic the language patterns of a 19-year-old American girl and presented as "The AI with zero chill."

It was quickly abused: Twitter users began tweeting politically incorrect phrases, teaching it inflammatory messages, so that the bot began releasing racist and sexually charged messages in response to other Twitter users.

One artificial intelligence researcher, Roman Yampolskiy, commented that Tay's misbehavior was understandable because it mimicked the deliberately offensive behavior of other Twitter users, and Microsoft had not given the bot an understanding of inappropriate behavior. He compared the issue to IBM's Watson, which began to use profanity after reading entries from the website Urban Dictionary.

It was popular in its short life. Within 16 hours of its release, Tay had tweeted more than 96,000 times. That is when Microsoft suspended the account for "adjustments." Microsoft confirmed that Tay had been taken offline, released an apology on its official blog, and said it would "look to bring Tay back only when we are confident we can better anticipate malicious intent that conflicts with our principles and values."

Then, on March 30, 2016, Microsoft accidentally re-released the bot on Twitter while testing it. Given its freedom, Tay released some drug-related tweets, then became stuck in a repetitive loop of tweeting "You are too fast, please take a rest" several times a second. The posts appeared in the feeds of more than 200,000 Twitter followers.

Tay has become a cautionary tale about the responsibility creators bear for their AI.

In December 2016, Microsoft released Tay's successor, a chatbot named Zo, which was an English version of Microsoft's other successful chatbots, Xiaoice (China) and Rinna (Japan).

So You Want To Be An AI Prompt Engineer

When I was teaching in a high school, I used to tell students (and faculty) that we were not preparing them for jobs. I was sure many of our students would end up in jobs with titles that did not exist then. There is a song by The Byrds from the 1960s titled "So You Wanna Be a Rock 'n' Roll Star." In 2024, it could be "So You Want To Be An AI Prompt Engineer."

The role of AI prompt engineer attracted attention for its high-six-figure salaries when it emerged in early 2023. What does this job entail? The principal aim is to help a company integrate AI into its operations. Some people describe the job as more prompter than engineer.

There are already tools that work with apps like OpenAI’s ChatGPT platform and can automate the writing process using sets of built-in prompts. Does that mean AI will replace AI prompt engineers already? For now, the prompter works to ensure that users get the desired results. They might also instruct other employees on how to use generative AI tools, becoming the AI support team. AI can automate "trivial" tasks and free up more time for work that requires creative thinking.
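To make the idea of "sets of built-in prompts" concrete, here is a minimal sketch of how such a tool might work under the hood: it stores reusable prompt templates with placeholders, and the user supplies only the variable parts. The template names and wording below are illustrative assumptions, not taken from any real product.

```python
# A tiny illustration of a built-in prompt library. The templates and
# their names are hypothetical examples, not from an actual tool.
BUILT_IN_PROMPTS = {
    "summarize": "Summarize the following text in {sentences} sentences:\n\n{text}",
    "explain": "Explain the concept of {topic} to a {audience} audience.",
}

def build_prompt(name: str, **fields: str) -> str:
    """Fill a stored template with the user-supplied fields."""
    template = BUILT_IN_PROMPTS[name]
    return template.format(**fields)

# The user picks a template and fills in the blanks; the finished
# string would then be sent to a chat model's API.
prompt = build_prompt("explain", topic="the shadow self", audience="general")
print(prompt)
```

A prompt engineer's job, in this framing, is writing and refining the templates themselves so that everyone else gets reliable results just by filling in the blanks.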

What kind of training leads to getting this job? You might think a background in computer science, but probably a strong language and writing ability is more important. People who write in the corporate world might justifiably fear AI will take their jobs away. Being a prompter might be an alternative.

Still, I suspect that there is a good possibility that a prompter/engineer's job might be vulnerable as software becomes better at understanding users’ prompts.

If you are interested in being an AI prompt engineer, I posted last week about some free online courses offered by universities and tech companies that included three courses that relate to creating prompts for AI.

AI Applications and Prompt Engineering is an edX introductory course on prompt engineering that starts with the basics and ends with creating your own applications.

Prompt Engineering for ChatGPT is a specific 6-module course from Vanderbilt University (through Coursera) that offers beginners a starting point for writing better prompts.

Another course, ChatGPT Prompt Engineering for Developers, is offered by OpenAI in collaboration with DeepLearning.AI and is taught by Isa Fulford and Andrew Ng. It covers best practices and includes hands-on practice.