1970s Computer Clubs

Apple I

The Apple I as displayed at the Computer History Museum

On March 5, 1975, the Homebrew Computer Club first met in a garage in Menlo Park, in Silicon Valley, California.

On that day, I was across the country in my last semester at Rutgers. I had taken one course in computer programming, using Fortran, which had been around in earlier forms since the late 1950s. We used a box of punch cards to create a program. I had gotten into the class as an auditor, for no credit and not on my transcript, after talking to the professor following an information session he gave; he was curious to see what an English major would do in his class.

My after-school and vacation job in high school was doing printing for a liquor distributor. They had a room with huge computers using tape drives and punch cards, and I would sometimes wander in there and talk to the operator. Of course, I understood nothing about what he was doing. He was in a unique position because no one in the company except him and his one assistant understood his work. And yet those computers printed all the invoices that I would later have to box up and file in the warehouse. Though the company used the computer to print them all, no one could access that data from a desktop, so if someone wanted a copy of an invoice, they had to dig through a file cabinet.

That 1970 computer was certainly not for personal use, and no one had a personal computer because personal computers did not exist. Most of my fellow students couldn't imagine we would ever have a computer in our homes. Computers were gigantic; one easily took up an entire room. And they were very, very expensive, costing about a million dollars each. Not even the computer engineers and programmers who made a living working on computers had one of their own.

So this California club served a real need. Many tech-minded people wanted to build personal computers for fun, and they decided to start a hobbyist club to trade circuit boards and information and to share their enthusiasm. Among the early members were friends Steve Jobs and Steve Wozniak. Eventually, they would design and build what they called the Apple I and Apple II computers and bring them to the club to show them off. Lee Felsenstein and Adam Osborne were also members and would go on to create the first mass-produced portable computer, the Osborne 1.

Wozniak would write "The theme of the club was 'Give to help others.' Each session began with a 'mapping period,' when people would get up one-by-one and speak about some item of interest, or a rumor, and have a discussion. Somebody would say, 'I've got a new part,' or somebody else would say he had some new data or ask if anybody had a certain kind of teletype."

I started teaching in a junior high school in the fall of 1975, and shortly thereafter the school got a terminal connected to a mainframe at some university in New Jersey. It was first used by one of the math teachers for a kind of computer club. I did go to his classroom a few times just to see how it worked, but I saw no connection to what I had learned about programming in college.

It would be a few years before the first personal computers appeared in the school. We had a lab that was used for the first actual computer class: a classroom full of standalone TRS-80s. TRS stands for Tandy Radio Shack, though the machines were later nicknamed Trash 80s. I took a professional development class using those computers where we learned to program in BASIC. I created a vocabulary flashcard program that I was able to use with a few of my English classes during periods when the lab was not being used by the math teacher. The program was crude, and the graphics were basically nonexistent, but the kids and I found it very interesting.
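That old BASIC program is long gone, but to give a flavor of how simple such a drill was, here is a minimal sketch of the same idea in modern Python rather than TRS-80 BASIC. The word list is invented for illustration; it is not my original vocabulary.

# A bare-bones vocabulary flashcard drill, in the spirit of that old BASIC program.
# The words and definitions are placeholders, not the original list.
WORDS = {
    "laconic": "using very few words",
    "placate": "to soothe or calm someone",
    "candor": "honest, direct speech",
}

def drill() -> None:
    score = 0
    for word, definition in WORDS.items():
        print(f"\nDefine: {word}")
        input("Press Enter to see the answer...")
        print(f"Answer: {definition}")
        if input("Did you know it? (y/n) ").strip().lower() == "y":
            score += 1
    print(f"\nYou knew {score} of {len(WORDS)} words.")

if __name__ == "__main__":
    drill()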

I remember one teacher in the professional development class saying that we would all have to learn to program in the future. I was sure she was wrong. I had no doubt that computers would play a role in our teaching future, but I was also sure that other people would be writing the programs and we would only be users.

Apple IIe

The first computer I had in my classroom was an Apple IIe. Since I had some computer background, and more so because I had an interest in learning more, I became the computer coordinator for the building. That meant my computer had two disk drives so that I could copy software that we had purchased and were allowed to duplicate. MECC was a big source of classroom software back then.

The first computer I bought for home use was the same as what I had in my classroom, which made sense because then I could use the software at home too. This hardware was expensive. I paid more for the Apple dot-matrix printer than I paid for my laptop last year.

We remained an Apple school, and an Apple family, for a few years until a new person moved into the position of district computer coordinator. He swapped out all the Apple computers for what we would call IBM clones, which were the early Windows 95-equipped machines. When I bought my next computer, it was one running Windows 95.

When I left teaching secondary school in 2000 and went to work at NJIT, all the computers used Windows except for the school of architecture, which was an Apple Mac building. They were their own little tech world. And so I lost contact with the Apple world in those days when even TV commercials and print ads would argue about whether you were a Windows or Mac kind of person. I remember one professor saying to me that he was surprised I was not using a Mac because I seemed like "a creative type."

The Campus Security Robot Is On Duty

It can run up to seven miles per hour, and it can swim. It can climb steps and scale hills at a 40-degree gradient. It can be outfitted with sensors, night vision, arms, and deployable drones. It is a robotic dog, a “quadruped” platform developed by Ghost Robotics and enhanced by AT&T that, to date, has been used to patrol military zones. Now the telecommunications giant is pitching a new use for this AI-friendly technology: campus security and safety.


The robot can patrol a perimeter 24/7, spot “unidentified” personnel, and disperse unruly protests. Somewhat Orwellian.

Applying Technology Laws

Huang's Law and Moore's Law are technology "laws." Maybe it is more accurate to say they are observations, but "law" has become attached to them because they appear to remain true.

Moore's law is the observation that the number of transistors in an integrated circuit (IC) doubles about every two years. It is an observation and projection of a historical trend rather than a law of physics: an empirical relationship linked to gains from experience in production.

Gordon Moore, the co-founder of Fairchild Semiconductor and Intel (and former CEO of the latter), posited the idea in 1965 and projected that this rate of growth would continue for at least another decade. In 1975, looking forward to the next decade, he revised the forecast to doubling every two years. His prediction has held since 1975 and has since become known as a "law."

Moore's prediction has been used in the semiconductor industry to guide long-term planning and to set targets for research and development, thus functioning to some extent as a self-fulfilling prophecy.

Huang’s Law has been called the new Moore’s Law. It seems that the old rule of thumb, that the same dollar buys twice the computing power every 18 months, is no longer true.

Huang's law is an observation in computer science and engineering that advancements in graphics processing units (GPUs) are coming at a rate much faster than those in traditional central processing units (CPUs). The observation stands in contrast to Moore's law: Huang's law states that the performance of GPUs will more than double every two years.

Jensen Huang, then CEO of Nvidia, observed at the 2018 GPU Technology Conference that Nvidia's GPUs were "25 times faster than five years ago," whereas Moore's law would have predicted only a ten-fold increase. As microchip components became smaller, it became harder for chip advancement to keep pace with Moore's law.
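The arithmetic behind that comparison is simple exponential growth: after t years, a law that doubles every T years predicts a gain of 2^(t/T). Here is a minimal Python sketch (my own illustration, not anything from Moore or Huang) showing why the 18-month version of Moore's law predicts roughly a ten-fold gain over five years, and what doubling period a 25-fold gain would imply:

import math

# Performance multiple after `years` under a law that doubles every `period` years.
def growth_factor(years: float, period: float) -> float:
    return 2 ** (years / period)

print(f"Moore's law, 18-month doubling, over 5 years: {growth_factor(5, 1.5):.1f}x")  # ~10.1x
print(f"Moore's law, 2-year doubling, over 5 years: {growth_factor(5, 2.0):.1f}x")    # ~5.7x

# Huang's observed 25x over five years implies a much shorter doubling period.
print(f"Implied GPU doubling period: {5 / math.log2(25):.2f} years")  # ~1.08 years

By that arithmetic, Huang's 25x figure amounts to GPU performance doubling roughly every 13 months.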

Huang's Law and Moore's Law are concepts primarily associated with the semiconductor industry and technology advancements. However, their principles can be extended and applied to various domains beyond technology.

You can extend Huang's Law to other fields where exponential growth or improvement is observed. For example, consider advancements in renewable energy efficiency, healthcare outcomes, or educational achievements. The idea is to identify areas where progress follows an exponential curve and apply the principles accordingly.

Both laws highlight the concept of scaling, whether in computational power (Moore's Law) or GPU-driven AI performance (Huang's Law). You could apply this principle to other systems and processes where scaling can lead to significant improvements.

I am imagining a discussion (probably in a classroom setting) about ethical considerations, such as the impact of rapid advancements on society, with a focus on responsible and ethical development in various fields. That is certainly true currently in discussions of AI.

Begin. End. The Waning Days of Coding


A piece in The New Yorker (not exactly a technology magazine) titled "A Coder Considers the Waning Days of the Craft" set me thinking about which tech careers will be lost in the near and far future. Yes, artificial intelligence plays into this, but there are other factors too. Coding seems to be a likely candidate for decline.

The author, James Somers, says that "Coding has always felt to me like an endlessly deep and rich domain. Now I find myself wanting to write a eulogy for it." With his wife pregnant, he wonders whether "...by the time that child can type, coding as a valuable skill might have faded from the world."

It is an interesting read, kind of a coder's memoir.

Schools still teach coding. Coders are still working. The question is: for how long? Should a student in middle school think about it as a career? I used to tell my middle school students that a lot of them would go into careers with titles that don't exist today. Who can predict?

Somers concludes:

"So maybe the thing to teach isn’t a skill but a spirit. I sometimes think of what I might have been doing had I been born in a different time. The coders of the agrarian days probably futzed with waterwheels and crop varietals; in the Newtonian era, they might have been obsessed with glass, and dyes, and timekeeping. I was reading an oral history of neural networks recently, and it struck me how many of the people interviewed—people born in and around the nineteen-thirties—had played with radios when they were little. Maybe the next cohort will spend their late nights in the guts of the A.I.s their parents once regarded as black boxes. I shouldn’t worry that the era of coding is winding down. Hacking is forever."

The future of coding is likely to be affected by all of these factors:

Artificial Intelligence and Automation: AI is already influencing coding through tools that assist developers in writing code, debugging, and optimizing algorithms. As AI continues to advance, it may take on more complex coding tasks, allowing developers to focus on higher-level design and problem-solving.

Low-Code/No-Code Development: The rise of low-code and no-code platforms is making it easier for individuals with limited programming experience to create applications. This trend could democratize software development, enabling a broader range of people to participate in creating digital solutions.

Increased Specialization: With the growing complexity of technology, developers are likely to become more specialized in particular domains or technologies. This could lead to a more segmented job market, with experts in areas like AI, cybersecurity, blockchain, etc.

Remote Collaboration and Distributed Development: Remote work has become more prevalent, and this trend is likely to continue. Tools and practices for collaborative and distributed development will become increasingly important.

Ethical Coding and Responsible AI: As technology plays a more central role in our lives, the ethical considerations of coding will become more critical. Developers will need to be mindful of the societal impact of their creations and consider ethical principles in their coding practices.

Continuous Learning: The pace of technological change is rapid, and developers will need to embrace a mindset of continuous learning. Staying updated with the latest tools, languages, and methodologies will be crucial.

Quantum Computing: While still in its early stages, quantum computing has the potential to revolutionize certain aspects of coding, particularly in solving complex problems that are currently intractable for classical computers.

Augmented Reality (AR) and Virtual Reality (VR): As AR and VR technologies become more widespread, developers will likely be involved in creating immersive experiences and applications that leverage these technologies.

Cybersecurity Emphasis: With the increasing frequency and sophistication of cyber threats, coding with a focus on security will be paramount. Developers will need to incorporate secure coding practices and stay vigilant against emerging threats.

Environmental Sustainability: As concerns about climate change grow, there may be a greater emphasis on sustainable coding practices, including optimizing code for energy efficiency and reducing the environmental impact of data centers.

How do I know this? Because I asked a chatbot to tell me the future of coding.