Why and When Did Social Media Go Wrong?

Following my last post about how social media is bad for your health (an idea that I think most people would agree with), I also feel that social media has undeniably transformed communication and society in numerous ways. If you assume that is true, then you should ask why and when social media went wrong. This is a cross-post from my Weekends in Paradelle blog.

Despite lots of media attention about the negative effects of social media, it is still widely used. I started thinking about when social media became unhealthy. Any answer is subjective and complex, and it probably depends on individual factors such as personal experiences, societal norms, and technological advancements. While social media offers many benefits, there have been turning points that have contributed to negative perceptions of it.

Here’s my list of some turning points:

Privacy Concerns: As social media platforms evolved and became more integrated into people’s lives, concerns about privacy and data security emerged. High-profile incidents, such as the Cambridge Analytica scandal involving Facebook, raised awareness about the potential misuse of personal data collected by social media companies. This eroded trust among users and led to increased scrutiny of social media platforms’ privacy practices.

Spread of Misinformation and Fake News: Social media has facilitated the rapid spread of misinformation, rumors, and fake news. The ease of sharing content on platforms like Twitter, Facebook, and WhatsApp has made it challenging to verify the accuracy of information, leading to the proliferation of false narratives and conspiracy theories. This phenomenon has had serious consequences, including the exacerbation of social divisions, political polarization, and public health misinformation.

Cyberbullying and Online Harassment: Social media platforms have provided avenues for cyberbullying, harassment, and online abuse. The relative anonymity afforded by the internet, combined with the viral nature of social media, has enabled individuals to target others with hurtful or threatening behavior. This has had particularly harmful effects on young people, leading to mental health issues, social withdrawal, and even suicide in some cases.

Impact on Mental Health: Research has highlighted the negative effects of excessive social media use on mental health, including increased feelings of loneliness, depression, anxiety, and low self-esteem. Factors such as social comparison, cyberbullying, and the pressure to present a curated and idealized version of one’s life contribute to these negative outcomes. Additionally, the addictive nature of social media platforms, characterized by endless scrolling and notifications, can exacerbate feelings of stress and overwhelm.

Erosion of Civil Discourse: Social media was once seen as one way to “democratize” the web. But it has been criticized for contributing to the erosion of civil discourse and the rise of polarized and hostile online environments. Echo chambers and filter bubbles, where users are exposed primarily to viewpoints that align with their own, can reinforce existing biases and prevent constructive dialogue across ideological divides. This has implications for democracy, as it hampers informed decision-making and compromises the ability to find common ground on important societal issues.

So, when and why did social media go wrong?

When I was teaching a graduate course in social media, we talked about its timeline history. That was 2016, and we were only talking about the negative effects as a fairly new point on that timeline. If I were teaching that today, I would need to add developments in the history of social media that mark shifts toward negative effects.

Here is a start on that list:
Proliferation of Platforms: Social media platforms began to gain popularity in the early 2000s with sites like MySpace and Friendster. As more platforms emerged and gained widespread adoption, the sheer volume of social interactions online increased dramatically.

Introduction of News Feeds: The introduction of news feeds, where users could see updates from friends and pages they followed in real-time, marked a significant shift in how people consumed content on social media. This change led to increased time spent on platforms and potentially unhealthy comparison behaviors.

Rise of Smartphones: The widespread adoption of smartphones made access to social media constant and ubiquitous. People could now engage with social media anytime, anywhere, blurring the boundaries between online and offline life.

Algorithmic Changes: Social media platforms began to implement algorithms to curate users’ feeds based on their interests and behaviors. While these algorithms aimed to increase engagement, they also contributed to echo chambers, filter bubbles, and the spread of misinformation.

Data Privacy Concerns: High-profile data breaches and scandals, such as the Cambridge Analytica scandal involving Facebook, highlighted how social media platforms could compromise users’ privacy and security. These revelations eroded trust in social media companies and raised concerns about the ethical implications of their practices.

Overall, while social media has brought about numerous positive advancements in communication and connectivity, its negative effects have become increasingly apparent over time. The exact point at which it became “unhealthy” is difficult to pinpoint, but these developments have collectively contributed to growing concerns about the impact of social media on individuals and society.

The Futures of Distance Education


Embedded below is a video of Bryan Alexander's virtual keynote at the DEC 2024 conference. Bryan is a futurist, researcher, writer, speaker, consultant, and teacher, working in the field of higher education’s future. The event was held at New Jersey's Mercer County Community College (and online).

Though AI was not the theme of the conference, it came up in every session I attended. If you are looking for additional professional development opportunities discussing AI, the Instructional Technology Council is holding a virtual spring summit on Friday, April 12th. It will feature presentations and discussion panels examining the benefits and challenges of AI at community colleges across the country.

 

Watch other sessions

Bryan Alexander speaks widely and publishes frequently, with articles appearing in venues including The Atlantic Monthly and Inside Higher Ed. He has been interviewed by and featured in the New York Times, the Washington Post, MSNBC, the Wall Street Journal, US News and World Report, National Public Radio (2017, 2020, 2020, 2020, 2020), the Chronicle of Higher Education (2016, 2020), the Atlantic Monthly, Reuters, Times Higher Education, the National Association of College and University Business Officers, Pew Research, Campus Technology, The Hustle, Minnesota Public Radio, USA Today, and the Connected Learning Alliance. He recently published Academia Next: The Futures of Higher Education for Johns Hopkins University Press (January 2020), which won an Association of Professional Futurists award. His next book, Universities on Fire: Higher Education in the Age of Climate Crisis, is forthcoming from Johns Hopkins. His two other recent books are Gearing Up For Learning Beyond K-12 and The New Digital Storytelling (second edition). Bryan is currently a senior scholar at Georgetown University and teaches graduate seminars in their Learning, Design, and Technology program.

1970s Computer Clubs


The Apple 1 as displayed at the Computer History Museum

On March 5, 1975, the Homebrew Computer Club first met in a garage near Menlo Park in Silicon Valley, California.

On that day, I was across the country in my last semester at Rutgers. I had taken one course in computer programming, using Fortran, which had been around in some earlier forms since the late 1950s. We used a box of punch cards to create a program. I had looked into the class as an auditor, for no credit and not on my transcript, because I had talked to the professor after an information session he gave, and he was curious to see what an English major would do in his class.

My afterschool and vacation job in high school was doing printing for a liquor distributor. They had a room with huge computers using tape drives and cards, and I would sometimes wander in there and talk to the operator. Of course, I understood nothing about what he was doing. He was in a unique position because no one in the company understood what he was doing except him and his one assistant. And yet those computers printed all the invoices, which I would later have to box up and file in the warehouse. Though they were using the computer to print them all, no one could access that data from their desktop, so if someone wanted a copy of an invoice, they had to dig through a file cabinet.

That 1970 computer was certainly not for personal use, and no one had a personal computer because they did not yet exist. Most of my fellow students didn't imagine we would ever have a computer in our homes. They were gigantic — a computer easily took up an entire room. And they were very, very expensive, costing about a million dollars each. Not even the computer engineers or programmers who made a living working on computers had access to a personal computer.

So this California club served a real need for tech-minded people, many of whom wanted to build personal computers for fun. They decided to start a hobbyist club to trade circuit boards and information and share their enthusiasm. Among the early members were high school friends Steve Jobs and Steve Wozniak. Eventually, they would design and build what they called the Apple I and II computers and bring them to the club to show them off. Lee Felsenstein and Adam Osborne were also members and would create the first mass-produced portable computer, the Osborne 1.

Wozniak would write "The theme of the club was 'Give to help others.' Each session began with a 'mapping period,' when people would get up one-by-one and speak about some item of interest, or a rumor, and have a discussion. Somebody would say, 'I've got a new part,' or somebody else would say he had some new data or ask if anybody had a certain kind of teletype."

I started teaching in a junior high school, in the fall of 1975, and shortly thereafter, the school got a terminal that was connected to a mainframe at some university in New Jersey. It was first used by one of the math teachers for a kind of computer club. I did go to his classroom a few times just to see how it worked but I saw no connection to what I had learned about programming in college.

It would be a few years before the first personal computers appeared in the school. We had a lab that was used for the first actual computer class. It was a classroom full of standalone TRS-80s. TRS stands for Tandy Radio Shack, though later they were nicknamed "Trash 80s." I took a professional development class using those computers where we learned to program in BASIC. I created a vocabulary flashcard program that I was able to use with a few of my English classes during periods when the lab was not being used by the math teacher. The program was crude, and the graphics were basically nonexistent, but the kids and I found it very interesting.

I remember one teacher in the professional development class saying that we would all have to learn to program in the future. I was sure she was wrong. I had no doubt that computers would play a role in our teaching future, but I was also sure that other people would be writing the programs and we would only be users.


The first computer I had in my classroom was an Apple IIe. Since I had some computer background, and more so because I had some interest in learning more, I became the computer coordinator for the building. That meant my computer had two disk drives so that I could copy software that we had purchased and were allowed to copy. MECC was a big source of classroom software back then.

The first computer I bought for home use was the same as what I had in my classroom, which made sense because then I could use the software at home too. This hardware was expensive. I paid more for the Apple dot-matrix printer than I paid for my laptop last year.

We remained an Apple school, and an Apple family, for a few years until a new person moved into the position of district computer coordinator. He swapped out all the Apple computers for what we would call IBM clones, which were the early Windows 95-equipped computers. When I bought my next computer, it was one using Windows 95.

When I left teaching secondary school in 2000 and went to work at NJIT, all the computers used Windows except for the school of architecture, which was an Apple Mac building. They were their own little tech world. And so I lost contact with the Apple world in those days when even TV commercials and print ads would argue about whether you were a Windows or Mac kind of person. I remember one professor saying to me that he was surprised I was not using a Mac because I seemed like "a creative type."

Valentine's Day with Artificial Intelligence

When love was easy. Or at least easier.

Since my dating days were before dating became an online thing and literally before online was a thing, I haven't really kept up with dating and technology. 

I have friends who got divorced and dipped back into dating and used online dating apps. Over 300 million people use dating apps worldwide, according to a 2023 report by Business of Apps. To visualize this figure, it’s almost the entire population of the U.S. or half of Europe’s population.

Tinder is an online dating and geosocial networking application launched in 2012. On Tinder, users “swipe right” to like or “swipe left” to dislike other users’ profiles. A profile includes the user’s photos, a short bio, and some of their interests. Tinder uses a “double opt-in” system, also called “matching,” where two users must like each other before they can exchange messages. In 2022, Tinder had 10.9 million subscribers and 75 million monthly active users.
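The double opt-in mechanic is simple enough to sketch in code. This is only a toy illustration of the idea (the class and method names are my own invention, not anything from Tinder's actual systems): a "like" is recorded one-way, and messaging unlocks only when both sides have liked each other.

```python
# Toy sketch of a "double opt-in" match system: a match exists
# only when both users have liked each other, and messaging is
# gated on that match. Names here are hypothetical, not Tinder's.

class MatchSystem:
    def __init__(self):
        self.likes = set()    # one-way (liker, liked) pairs
        self.matches = set()  # unordered matched pairs

    def swipe_right(self, user, other):
        """Record a like; create a match if the other user already liked back."""
        self.likes.add((user, other))
        if (other, user) in self.likes:
            self.matches.add(frozenset((user, other)))

    def can_message(self, a, b):
        """Messaging is allowed only after a mutual match."""
        return frozenset((a, b)) in self.matches

app = MatchSystem()
app.swipe_right("alice", "bob")
print(app.can_message("alice", "bob"))  # False: only one side has liked
app.swipe_right("bob", "alice")
print(app.can_message("alice", "bob"))  # True: mutual like creates a match
```

Storing the match as an unordered pair (a `frozenset`) is what makes the opt-in symmetric: it doesn't matter who liked whom first.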

Renate Nyborg was Tinder’s first female CEO, but she recently left the popular dating app and launched Meeno, which is described as a relationship-advice app rather than a dating app. For example, you might ask for advice about dealing with your boss. The Meeno app uses artificial intelligence (AI) to help solve relationship problems. She predicts that the future will be less about online dating and more about real-life encounters.

The numbers for online dating are huge, but Nyborg and others see a trend, with Gen Z (18-to-25-year-olds) in particular, toward being more interested in meeting people organically.

When she left Tinder, she said she wanted to use tech to “help people feel less lonely,” and dating is only a part of that. According to a 2023 report on loneliness commissioned by the European Commission, at least 10% of European Union residents feel lonely most of the time. A Pew Research study revealed that 42% of adults surveyed in the US said they had felt lonely during the COVID-19 pandemic. So, Meeno is intended to be your mentor, distinct from a virtual girlfriend, boyfriend, clinical therapist, or coach.

What can AI do in all this? Broadly, AI can speed up the processing behind all these apps. It can quickly analyze user behavior patterns and large datasets to identify potential matches based on shared interests, values, and preferences. AI can filter profiles for inappropriate content, such as nudity or hate speech. It can analyze a user’s swiping patterns, interests, answers to questions, and personality results to make tailored recommendations.
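At its simplest, matching on shared interests is a similarity calculation. The sketch below (my own toy example; real apps use far more elaborate machine-learned models) ranks candidates by Jaccard similarity, the fraction of combined interests that two people share:

```python
# Toy sketch of interest-based match scoring: rank candidates by
# Jaccard similarity (shared interests / all interests combined).
# A real dating app's ranking model would be far more complex.

def jaccard(a, b):
    """Overlap between two interest sets, from 0.0 (none) to 1.0 (identical)."""
    a, b = set(a), set(b)
    return len(a & b) / len(a | b) if a | b else 0.0

def rank_candidates(user_interests, candidates):
    """Return candidate names sorted by interest overlap, best match first."""
    scored = [(jaccard(user_interests, interests), name)
              for name, interests in candidates.items()]
    return [name for _, name in sorted(scored, reverse=True)]

me = {"hiking", "jazz", "cooking"}
pool = {
    "sam": {"jazz", "cooking", "films"},   # shares 2 of 4 combined -> 0.5
    "kai": {"running", "films"},           # shares nothing -> 0.0
}
print(rank_candidates(me, pool))  # ['sam', 'kai']
```

Behavioral signals like swipe history would, in a real system, feed a learned model rather than a fixed formula, but the underlying job is the same: score candidates and surface the highest-scoring ones first.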

There are other apps, like Blush, Aimm, Rizz, and Teaser AI, that use personality tests and physical-type analysis to train AI-powered systems. Some apps use machine learning algorithms to scan for attraction and then suggest images of real people that the app thinks the user might find attractive. These are more for “dating” than for everyday relationships, which is Meeno’s current target.

This post first appeared in a different format on Weekends in Paradelle