Trading Kilowatts for Qubits

It has been in the news in the United States all week: the federal government is moving toward a policy that will require power-hungry data centers to get out of the public energy pool and go swim in a plasma of their own making.  Big Tech companies are building and investing in their own energy supplies as they race to meet the huge energy demands of AI computing in 20th-century datacenters.  It's estimated that a "traditional" (non-AI) datacenter rack consumes somewhere between 5 and 15 kilowatts of power -- think central A/C units, commercial clothes dryers, banks of EV chargers.  That same rack, running AI-capable hardware, will consume roughly ten times as much -- up to around 100 kilowatts.

As robust as the United States power grid is, the demand for electricity to power these clustered artificial intelligence entities will exceed its current capacity.  To feed this need for power, Amazon has begun developing small modular reactors (SMRs), while Oracle and OpenAI are working on half-trillion-dollar natural-gas-fueled power plants.  These solutions have their obvious drawbacks:  Amazon's quest for contemporary electricity via the nuclear option will produce some of the most toxic waste ever thrown away, and it will last for thousands of years.  Oracle and OpenAI's investment in huge natural gas energy sources risks not only accelerating climate rot but also exhausting energy supplies at scale.  The risks of expanding the electricity supply on a 20th-century grid are substantial.  If the demand for electricity were reduced, those risks would subside.

The development of quantum computing has quietly been on the rise.  Superconductor-based quantum computers consume about 25 kilowatts in total, most of it for the extreme refrigeration that keeps their processors at temperatures near absolute zero.  The warmer-weather-loving neutral-atom computers operate at around room temperature and use 7 kilowatts (or less) of power.  When optimization tasks or simulations are sent to their quantum algorithms, these computers produce solutions orders of magnitude faster and use a tiny fraction of the energy that a traditional datacenter would require.

Quantum computing can now significantly enhance generative AI through its speed.  Quantum computers analyze data faster and more deeply, and they have given rise to a new class of generative AI called GenQAI (Generative Quantum AI), which uses quantum hardware to iterate on complex problems and generate more human-like reasoning and intuition in AI.

Quantinuum, reported to be one of the world leaders in quantum technology, in November unveiled its Helios system, which has been described as the world's most accurate quantum computer. That quantum instance requires less than 40 kilowatts of power, about the same as a single AI-ready data center rack under an average generative workload.  The company announced last week that it plans to go public with an IPO sometime in the first half of 2026.

With some clairvoyant disruption and a little bit of luck, we'll have frugal quantum computing cottages humming their 4-dimensional power song before we have natural gas caverns and poisonous landfill dirges to endure.


The Y2K38 Bug and the End of 32-bit Unix Time

Y2K, short for "Year 2000," was a potential computer bug caused by how dates were formatted in older software. To save memory, early computers stored years as two digits, like "97" for 1997, which in the new millennium risked systems misreading "00" as 1900 instead of 2000, potentially disrupting anything that depended on accurate dates.
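The failure mode is plain arithmetic. A minimal sketch with hypothetical two-digit records (not any real system's code):

```python
# Years stored as two digits, as many pre-2000 systems did.
birth_year = 97   # stored as "97", meaning 1997
current_year = 0  # stored as "00", meaning 2000

# Two-digit subtraction gives the wrong answer once the century rolls over:
age = current_year - birth_year
print(age)  # -97 instead of 3
```

Any calculation that trusted the two-digit value, from interest accrual to license expiry, inherited the same error.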

Though a kind of panic occurred in 1999, the Y2K issue surfaced in technical literature as early as 1984. Long before it became a global concern, researchers were already flagging the two-digit date flaw. A 1984 book, "Computers in Crisis," outlined how the year 2000 rollover could break financial, governmental, and technical systems if left unaddressed.

In the late 1990s, many feared this glitch could cause widespread failures in banking systems, power grids, transportation networks, and other critical infrastructure. This idea took hold of the public imagination, spawning doomsday predictions, a booming survivalist market, and a massive global push to audit and repair vulnerable systems before the deadline—work that cost an estimated $300B-$500B. 

Because of the extensive preparations, Y2K passed without significant disruptions; however, its legacy endures. The crisis helped modernize global IT systems, accelerated the outsourcing of programming jobs, and exposed society’s dependence on digital infrastructure—prompting long-term shifts in cybersecurity and software maintenance.

The Year 2038 problem is the next potential computer time rollover bug. Many older systems store time as a signed 32-bit integer counting seconds since Jan. 1, 1970. That counter maxes out on Jan. 19, 2038—overflowing into negative time and sending clocks back to December 1901, potentially crashing any older software that depends on accurate dates. The Y2K38 bug is also known as the Year 2038 problem, or the end of 32-bit Unix time.
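The rollover is easy to see with ordinary date arithmetic. A minimal sketch in Python (the two's-complement wraparound is simulated, since Python's own integers don't overflow):

```python
from datetime import datetime, timezone

INT32_MAX = 2**31 - 1  # largest value a signed 32-bit integer can hold

# The last second a signed 32-bit time counter can represent:
last = datetime.fromtimestamp(INT32_MAX, tz=timezone.utc)
print(last)  # 2038-01-19 03:14:07+00:00

# One second later the counter wraps to -2**31, which a 32-bit
# system would interpret as a date in December 1901:
wrapped = INT32_MAX + 1 - 2**32  # simulate the signed overflow
rolled = datetime.fromtimestamp(wrapped, tz=timezone.utc)
print(rolled)  # 1901-12-13 20:45:52+00:00
```

(Negative timestamps convert cleanly on Unix-like systems; some platforms refuse them, which is itself a symptom of the same legacy assumptions.)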


The Light Architecture of Apple's AppleWorks

Apple IIe keyboard, monitor, and floppy disk drive

I found a reference this past week to the original Apple II AppleWorks and it got me thinking about how amazing the program was for its time. The software was light and efficient, and it ran on limited hardware.

The "light architecture" of AppleWorks (developed by Rupert Lissner and released in 1984 for the Apple II) was impressive for several reasons. It was an integrated suite that combined a word processor, database, and spreadsheet into a single application. On the 8-bit Apple II's limited resources (often starting with just 128K of RAM), this level of seamless integration was nothing less than revolutionary. You could easily share data between the modules via a "clipboard."

AppleWorks was written almost entirely in assembly language for the 6502 processor. This gave it incredible speed and efficiency, allowing it to perform complex tasks much faster than programs written in higher-level languages like Pascal.

Its memory management system was highly flexible and sophisticated, allowing it to utilize not just the 128K of the Apple IIe/IIc but also various third-party memory cards. It effectively made up to two megabytes of memory appear as one contiguous space on an 8-bit machine, which was a remarkable technical feat.

It became the "killer application" that extended the life of the Apple II platform well into the late 80s and early 90s. That is when I was using it in my middle school classroom, and as the computer coordinator in my building, I worked with every teacher because they all had at least one Apple IIe in their room.

Although AppleWorks was thought of as something teachers would use most of the time, the user experience was very good, and students would use at least the word processing portion. All three modules shared a consistent, menu-bar-driven user interface with simple text-based controls (often utilizing the Apple II's "MouseText" characters for visual elements like folders and separators). This was highly intuitive and much easier to learn than many contemporary command-line programs. The design prioritized ease of use, making personal computing accessible to a much broader audience, especially in homes and schools.

AppleWorks start screen

Compared to other productivity suites of the time, such as Microsoft Works or early Macintosh software, AppleWorks demonstrated a fundamental shift in design philosophy that prioritized integration and efficiency over raw power. Earlier Apple II programs were often monolithic (like the stand-alone VisiCalc spreadsheet) or required users to switch between separate, disparate programs with different interfaces to move data. This efficiency was AppleWorks' competitive edge, keeping the Apple II relevant years after more powerful Macs and PCs emerged.

In the AppleWorks vs. Microsoft Works battle (Works eventually became AppleWorks' main competitor in the integrated-suite market), Apple demonstrated a different design approach, but it was constrained by the 8-bit Apple II. Microsoft Works and later versions of AppleWorks/ClarisWorks (for Mac and Windows) were developed for 16-bit and 32-bit systems with more abundant memory, faster processors, and graphical user interfaces (GUIs).

The last time I sat down at an Apple IIe was at a tech conference, where it sat in a "museum" display. As crude as it might seem to users almost 50 years later, I still marveled at what it could do. I was one of those people who found so many later programs, such as Microsoft Office, to be bloated memory hogs with more horsepower and features than most users would ever need.

Despite the technical differences, both AppleWorks and Microsoft Works shared the goal of providing an all-in-one, cost-effective, and easy-to-use suite for casual users, students, and small businesses who didn't need the complexity or expense of full-blown professional packages like Microsoft Office or Lotus Symphony. The key difference was that AppleWorks achieved this integration on an extremely limited architecture, which is why its design is often cited as the more remarkable technical feat.

Marian Croak: A Force Behind Modern Communication

Marian Croak, a name that may not be familiar to many, has had a profound impact on the way we communicate today. As a renowned American engineer, Croak has spent her career pushing the boundaries of technology, particularly in the realm of Voice over Internet Protocol (VoIP). With over 200 patents to her name, Croak's work has enabled seamless communication over the internet, revolutionizing the way we connect.

Her U.S. Patent No. 7,599,359 for VoIP (Voice over Internet Protocol) technology was ultimately used to create applications such as Zoom, WhatsApp, and many others.

Born on May 14, 1955, in New York City, Croak's interest in technology was sparked by her father, who built her a chemistry set that led to her early exploration of the sciences. She pursued her passion for problem-solving at Princeton University, where she earned her undergraduate degree in 1977. Later, she received a PhD in Social Psychology and Quantitative Analysis from the University of Southern California.

Croak's career spanned three decades at Bell Labs and AT&T, where she worked on digital messaging applications and VoIP technologies. Her team convinced AT&T to adopt the TCP/IP protocol, which allowed for standardized communication over the Internet. Croak's work on VoIP enabled the conversion of voice data into digital signals, making it possible to transmit voice, text, and video over the internet.

Another of Croak's notable achievements is her patent for text-based donations to charity. Developed in response to Hurricane Katrina, this technology allowed users to donate to organizations using text messaging. The technology was widely used after the 2010 Haiti earthquake, raising over $43 million for relief organizations. Croak received the 2013 Thomas Edison Patent Award for this innovation.

Croak's contributions extend beyond her technical expertise. As a leader at AT&T, she managed over 2,000 engineers and computer scientists, overseeing programs that impacted millions of customers. In 2014, she joined Google as Vice President of Engineering, focusing on expanding internet access and developing Responsible AI.

Throughout her career, Croak has received numerous accolades for her work. She was inducted into the Women in Technology International Hall of Fame in 2016 and the National Inventors Hall of Fame in 2022, becoming one of the first two Black women to receive this honor. She has also been elected to the National Academy of Engineering and the American Academy of Arts and Sciences.

As Croak herself notes, "Inventors are usually people like you. Sometimes they're good at certain things, other times they're not, and that's ok. Just focus on what you want to change, and you become that change and can make that change happen."

Her legacy serves as a testament to the power of innovation and the impact one person can have on the world. As we continue to navigate the complexities of modern communication, we owe a debt of gratitude to pioneers like Marian Croak, who have worked tirelessly to bring people closer together.