Begin. End. The Waning Days of Coding


A piece in The New Yorker (not exactly a technology magazine) titled "A Coder Considers the Waning Days of the Craft" set me thinking about which tech careers will be lost in the near and far future. Yes, artificial intelligence plays into this, but there are other factors too. Coding seems a likely candidate for decline.

The author, James Somers, says that "Coding has always felt to me like an endlessly deep and rich domain. Now I find myself wanting to write a eulogy for it." With his wife pregnant, he wonders whether "...by the time that child can type, coding as a valuable skill might have faded from the world."

It is an interesting read. Kind of a memoir of a coder.

Schools still teach coding. Coders are still working. The question is for how long? Should a student in middle school think about it as a career? I used to tell my middle school students that many of them would go into careers with titles that don't exist today. Who can predict?

Somers concludes:

"So maybe the thing to teach isn’t a skill but a spirit. I sometimes think of what I might have been doing had I been born in a different time. The coders of the agrarian days probably futzed with waterwheels and crop varietals; in the Newtonian era, they might have been obsessed with glass, and dyes, and timekeeping. I was reading an oral history of neural networks recently, and it struck me how many of the people interviewed—people born in and around the nineteen-thirties—had played with radios when they were little. Maybe the next cohort will spend their late nights in the guts of the A.I.s their parents once regarded as black boxes. I shouldn’t worry that the era of coding is winding down. Hacking is forever."

The future of coding is likely to be affected by all of these factors:

Artificial Intelligence and Automation: AI is already influencing coding through tools that assist developers in writing code, debugging, and optimizing algorithms. As AI continues to advance, it may take on more complex coding tasks, allowing developers to focus on higher-level design and problem-solving.

Low-Code/No-Code Development: The rise of low-code and no-code platforms is making it easier for individuals with limited programming experience to create applications. This trend could democratize software development, enabling a broader range of people to participate in creating digital solutions.

Increased Specialization: With the growing complexity of technology, developers are likely to become more specialized in particular domains or technologies. This could lead to a more segmented job market, with experts in areas like AI, cybersecurity, blockchain, etc.

Remote Collaboration and Distributed Development: Remote work has become more prevalent, and this trend is likely to continue. Tools and practices for collaborative and distributed development will become increasingly important.

Ethical Coding and Responsible AI: As technology plays a more central role in our lives, the ethical considerations of coding will become more critical. Developers will need to be mindful of the societal impact of their creations and consider ethical principles in their coding practices.

Continuous Learning: The pace of technological change is rapid, and developers will need to embrace a mindset of continuous learning. Staying updated with the latest tools, languages, and methodologies will be crucial.

Quantum Computing: While still in its early stages, quantum computing has the potential to revolutionize certain aspects of coding, particularly in solving complex problems that are currently intractable for classical computers.

Augmented Reality (AR) and Virtual Reality (VR): As AR and VR technologies become more widespread, developers will likely be involved in creating immersive experiences and applications that leverage these technologies.

Cybersecurity Emphasis: With the increasing frequency and sophistication of cyber threats, coding with a focus on security will be paramount. Developers will need to incorporate secure coding practices and stay vigilant against emerging threats.

Environmental Sustainability: As concerns about climate change grow, there may be a greater emphasis on sustainable coding practices, including optimizing code for energy efficiency and reducing the environmental impact of data centers.

How do I know this? Because I asked a chatbot to tell me the future of coding.

Report: AI and the Future of Teaching and Learning

I see articles and posts about artificial intelligence every day. I have written here about it a lot in the past year. You cannot escape the topic of AI even if you are not involved in education, technology, or computer science. It is simply part of the culture and the media today. I see articles about how AI is being used to translate ancient texts at a speed and accuracy that are simply not possible for humans. I also see articles about companies now creating AI software for warfare. The former is a definite plus, but the latter is a good example of why there is so much fear about AI - justifiably so, I believe.

Many educators' initial reaction to the generative chatbots that became accessible to the public late last year was concern that students were using them to write essays and research papers. That use spread through K-12, into colleges, and even into academic papers written by faculty.

A chatbot powered by reams of data from the internet has passed exams at a U.S. law school after writing essays on topics ranging from constitutional law to taxation and torts. Jonathan Choi, a professor at the University of Minnesota Law School, gave ChatGPT the same test faced by students, consisting of 95 multiple-choice questions and 12 essay questions. In a white paper titled "ChatGPT Goes to Law School," he and his coauthors reported that the bot scored a C+ overall.

ChatGPT, from the U.S. company OpenAI, got most of the initial attention in the early part of 2023. OpenAI received a massive injection of cash from Microsoft. In the second half of this year, we have seen many other AI chatbot players, including Microsoft and Google, which incorporated chatbots into their search engines. OpenAI predicted in 2022 that AI would lead to the "greatest tech transformation ever." I don't know if that will prove to be true, but it certainly isn't unreasonable from the view of 2023.

Chatbots use artificial intelligence to generate streams of text from simple or more elaborate prompts. They don't "copy" text from the Internet (so "plagiarism" is hard to claim) but generate new text based on the data they have been trained on. The results have been so good that educators have warned it could lead to widespread cheating and even signal the end of traditional classroom teaching methods.
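For readers curious about what that prompt-to-text flow looks like from a developer's side, here is a minimal sketch using OpenAI's Python client. The model name and prompts are assumptions made for illustration, not anything taken from the articles discussed here.

```python
# A minimal sketch of prompting a generative chatbot from code.
# Assumes the `openai` Python package (v1+) is installed and an
# OPENAI_API_KEY environment variable is set; the model name and
# prompt text below are illustrative assumptions only.
from openai import OpenAI

client = OpenAI()  # reads OPENAI_API_KEY from the environment

response = client.chat.completions.create(
    model="gpt-4o-mini",  # assumed model name for this example
    messages=[
        {"role": "system", "content": "You are a helpful writing assistant."},
        {"role": "user", "content": "In one short paragraph, describe the future of coding."},
    ],
)

# The reply is generated from patterns in the model's training data,
# not copied from a web page, which is why "plagiarism" is hard to claim.
print(response.choices[0].message.content)
```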

Lately, I see more sober articles about the use of AI: teachers are including lessons on the ethical use of AI by students and are using chatbots to help create their own teaching materials. I know of teachers in K-20 who attended faculty workshops this past summer to try to figure out what to do in the fall.

The U.S. Department of Education recently issued a report on its perspective on AI in education. It includes a warning of sorts: Don’t let your imagination run wild. “We especially call upon leaders to avoid romancing the magic of AI or only focusing on promising applications or outcomes, but instead to interrogate with a critical eye how AI-enabled systems and tools function in the educational environment,” the report says.

Some of the ideas are unsurprising. For example, it stresses that humans should be placed “firmly at the center” of AI-enabled edtech, which echoes an earlier White House “blueprint for AI” that said the same thing. An approach to pedagogy that has been suggested for several decades - personalized learning - might be well served by AI. Artificial assistants might be able to automate tasks, giving teachers more time to interact with students, and AI can give students instant, "tutor-style" feedback.

The report's optimism appears in the idea that AI can support teachers rather than diminish their roles. Still, where AI will be in education in the next year or the next decade is unknown.

Micromobility

Micromobility refers to a range of small, lightweight vehicles that operate at speeds typically below 25 km/h (15 mph) and are driven by users personally. Micromobility devices include bicycles, e-bikes, electric scooters, electric skateboards, shared bicycle fleets, electric pedal-assisted (pedelec) bicycles, and even hoverboards. The term "micromobility" was coined by Horace Dediu in 2017.

There are benefits and challenges for individuals using this type of transportation. The benefits include lower initial, maintenance, fuel, and parking costs in many instances. For some of those options the cost can even be zero, as with fuel for a traditional bicycle or scooter.

There are also benefits and concerns for communities, particularly in urban transportation, where micromobility solutions are changing the way people commute, reducing congestion, improving air quality, and redefining urban mobility.

Some of these modes are electric, some are traditional, such as bicycles, scooters, and skateboards, and some are hybrid. The range of micromobility vehicles offers compact, eco-friendly and convenient options for short-distance travel. By reducing congestion, decreasing carbon emissions and promoting active lifestyles, micromobility has the potential to positively impact urban environments.

One safety concern, particularly in urban areas, is how these vehicles will interact with pedestrians and larger traditional vehicles on roadways.

For more on this topic, see transportation.gov/rural/electric-vehicles/ev-toolkit/electric-micromobility

 


An AI Code of Conduct


In what I consider to be more of a "business" story, Canada's voluntary AI code of conduct is coming. As an article headline says, "not everyone is enthused." Some businesses are concerned that the rules could stifle innovation and dull their competitive edge.

Companies that sign onto the code agree to multiple principles, including that their AI systems be transparent about where and how the information they collect is used, and that there be methods to address potential bias in a system. They also agree to human monitoring of AI systems, and developers who create generative AI systems for public use agree to build them so that anything generated by the system can be detected.

It is interesting that the URL for a story on this, cbc.ca/news/business/ai-code-of-conduct-stopgap, includes the term "stopgap," which is defined as a temporary way of dealing with a problem or satisfying a need.

Something similar to Canada's plan is expected in the United States and in the European Union via the EU Artificial Intelligence Act.