Are We Any Closer To Quantum Computing?

quantum computer imagined

In 1981, American physicist and Nobel Laureate Richard Feynman gave a lecture at the Massachusetts Institute of Technology (MIT) in which he outlined a revolutionary idea. Feynman suggested that the strange physics of quantum mechanics could be used to perform calculations. Quantum computing was born. The illustration here shows what one might have imagined it to be back in 1981 - a kind of science-fiction computer.

Quantum computing is a revolutionary area of computing that uses the principles of quantum mechanics to process information in fundamentally different ways than classical computers. In classical computing, information is processed using bits, which are binary and can represent either a 0 or a 1. In quantum computing, however, the fundamental units of information are called qubits. Qubits can exist in a state of 0, 1, or both simultaneously, thanks to a quantum property called superposition. This allows quantum computers to perform multiple calculations at once.
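
Since I just described superposition in words, here is a small sketch of the same idea in Python using NumPy. It only simulates a single qubit on an ordinary computer, so treat it as an illustration of the math, not as code for real quantum hardware.

```python
import numpy as np

# A qubit is a 2-component complex vector; |0> and |1> are the basis states.
ket0 = np.array([1, 0], dtype=complex)

# The Hadamard gate puts a qubit into an equal superposition of 0 and 1.
H = np.array([[1, 1],
              [1, -1]], dtype=complex) / np.sqrt(2)

qubit = H @ ket0                    # state is now (|0> + |1>) / sqrt(2)
probabilities = np.abs(qubit) ** 2  # Born rule: probability = |amplitude|^2

print(probabilities)                # [0.5 0.5] -- equal chance of measuring 0 or 1
```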

I am not a physicist or computer engineer, so I don't want to go too deeply into that realm. Reading about this, I see the word "entanglement" and have some memory of Einstein referring to quantum entanglement as "spooky action at a distance." He was skeptical because it seemed to defy the principles of classical physics and his theory of relativity, but modern experiments have confirmed that entanglement is real and a fundamental aspect of quantum mechanics. In quantum computing, entanglement creates strong correlations between qubits, even when they are far apart.
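
To make "strong correlations" a bit more concrete, the classical simulation below builds the simplest entangled state, a Bell pair. Again, this is an illustration with NumPy rather than anything running on a quantum machine.

```python
import numpy as np

ket0 = np.array([1, 0], dtype=complex)
H = np.array([[1, 1], [1, -1]], dtype=complex) / np.sqrt(2)

# CNOT gate: flips the second qubit only when the first qubit is 1.
CNOT = np.array([[1, 0, 0, 0],
                 [0, 1, 0, 0],
                 [0, 0, 0, 1],
                 [0, 0, 1, 0]], dtype=complex)

# Start both qubits in |00>, put the first into superposition, then entangle them.
two_qubits = np.kron(H @ ket0, ket0)
bell_state = CNOT @ two_qubits

# Only |00> and |11> have nonzero probability: the two measurement outcomes
# always agree, which is the "spooky" correlation entanglement provides.
print(np.abs(bell_state) ** 2)      # [0.5 0.  0.  0.5]
```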

Entanglement enables quantum computers to solve certain types of complex problems much faster than classical computers by leveraging these interconnected qubits. Quantum computers are particularly well-suited to tasks involving massive datasets, optimization problems, simulations, and cryptography. However, they are still in their early stages of development and face challenges such as error rates, stability, and scalability.

In the same way that AI is already in your daily life - even if you don't notice or acknowledge it - quantum computing could be used in everyday activities. It could revolutionize drug discovery and personalized medicine by simulating molecular interactions at an unprecedented speed, leading to faster development of cures and treatments. By solving complex optimization and learning problems, quantum computers could significantly enhance AI's capabilities, leading to smarter assistants and systems.

In cryptography and cybersecurity, current encryption methods could be broken by quantum computers, but quantum techniques could also enable quantum-safe encryption, making online transactions and communications more secure. There's good and bad in almost every discovery.

In logistics, from smarter traffic systems to more efficient delivery routes, quantum computing could optimize operations, reducing fuel consumption, travel times, and costs.

And quantum computing could improve energy solutions, financial modeling, material design, and many things we haven't even considered yet.

Of course, there are challenges. Qubits are highly sensitive to their environment. Even minor disturbances like temperature fluctuations, vibrations, or electromagnetic interference can cause qubits to lose their quantum state—a phenomenon called decoherence. Maintaining stability long enough to perform calculations is a key challenge. Many quantum computers require extremely low temperatures (close to absolute zero) to operate, as qubits need highly controlled environments. Building and maintaining these cryogenic systems is both expensive and challenging.

Small-scale quantum computers exist, but scaling up to thousands or millions of qubits is a monumental task and requires massive infrastructure, advanced error correction mechanisms, and custom hardware, making them cost-prohibitive for widespread adoption.

On the education side of this, quantum computing sits at the intersection of physics, engineering, computer science, and more. A lack of cross-disciplinary expertise will slow down progress in this field.

What Happened to the Internet of Things?


After writing here about how the Internet and websites are not forever, I started looking at some old posts that perhaps should be deleted or updated. With 2200+ posts here since 2006, that seems like an overwhelming and unprofitable use of my time. Plus, maybe an old post has some historical value. But I do know that there are older posts that have links to things that just don't exist on the Internet anymore.

The last post I wrote here labeled "Internet of Things" (IoT) was in June 2021. IoT was on Gartner's trends list in 2012, and my first post about IoT here was in 2009, so I thought a new update was due.

When I wrote about this in 2014, there were around 10 billion connected devices. By 2024, that number had grown to over 30 billion devices, ranging from smart home gadgets (e.g., thermostats, speakers) to industrial machines and healthcare devices. Platforms like Amazon Alexa, Google Home, and Apple HomeKit provide hubs for connecting and controlling a range of IoT devices.

The past 10 years have seen the IoT landscape evolve from a collection of isolated devices to a more integrated, intelligent, and secure ecosystem. Advancements in connectivity, AI, edge computing, security, and standardization have made IoT more powerful, reliable, and accessible, with applications transforming industries, enhancing daily life, and reshaping how we interact with technology. The number of connected devices has skyrocketed, with billions of IoT devices now in use worldwide. This widespread connectivity has enabled smarter homes, cities, and industries.

IoT devices have become more user-friendly and accessible, with smart speakers, wearables, and home automation systems becoming commonplace in households. If you have a washing machine or dryer that reminds you via an app about its cycles, or a thermostat that knows when you are in rooms or on vacation, then IoT is in your home, whether you use that term or not.

Surveying the topic online turned up a good number of things that have pushed IoT forward or that IoT has pushed forward. Most recently, I would say that the 3 big things that have pushed IoT forward are 5G and advanced connectivity, the rise of edge computing, and AI and machine learning integration:

Technological improvements, such as the rollout of 5G networks, have greatly increased the speed and reliability of IoT connections. This has allowed for real-time data processing and more efficient communication between devices.

Many IoT devices now incorporate edge computing and AI to process data locally, reducing the reliance on cloud-based servers. This allows faster decision-making, less latency, and improved security by limiting the amount of data transmitted. IoT devices have increasingly incorporated AI and machine learning for predictive analytics and automation. This shift has allowed for smarter decision-making and automation in various industries, such as manufacturing (predictive maintenance), healthcare (patient monitoring), and agriculture (smart farming).
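
As a rough sketch of what "process data locally" means, here is a hypothetical Python example. The threshold, readings, and message format are invented for illustration; the point is that the device sends one small summary or alert instead of streaming every raw reading to the cloud.

```python
from statistics import mean

ALERT_THRESHOLD = 75.0  # hypothetical limit, e.g., a temperature in degrees Celsius

def process_on_device(sensor_readings):
    """Summarize raw readings locally; only ship a small result upstream."""
    average = mean(sensor_readings)
    if average > ALERT_THRESHOLD:
        return {"type": "alert", "average": average}    # send right away
    return {"type": "summary", "average": average}      # send in a periodic batch

# The device transmits one short message rather than every individual reading.
readings = [71.2, 73.5, 78.9, 80.1, 76.4]
print(process_on_device(readings))  # an "alert" message with the average (about 76.0)
```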

The integration of big data and advanced analytics has enabled more sophisticated insights from IoT data. This has led to better decision-making, predictive maintenance, and personalized user experiences.

One reason why I have heard less about IoT (and written less about it) is that it has expanded beyond consumer devices to industrial applications. I discovered a new term, the Industrial Internet of Things (IIoT), which covers smart manufacturing, agriculture, healthcare, and transportation, improving efficiency and productivity.

Concerns have also emerged. As IoT devices proliferate, so do worries about security. Advances in cybersecurity measures have been implemented to protect data and ensure the privacy of users. The IoT security landscape has seen new protocols and encryption standards being developed to protect against vulnerabilities, with an emphasis on device authentication and secure communication.

The rollout of 5G has enhanced IoT capabilities by providing faster, more reliable connections. This has enabled more efficient real-time data processing for smart cities, autonomous vehicles, and industrial IoT applications, which can now operate at a larger scale and with lower latency.

IoT devices are now able to use machine learning and AI to learn from user behavior and improve their performance. For example, smart thermostats can learn a household’s schedule and adjust settings automatically, while security cameras can differentiate between human and non-human motion.
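
Here is a toy illustration of that kind of learning from behavior: a hypothetical schedule learner that averages the temperature settings a household has chosen at each hour. Real smart thermostats use far more sophisticated models; the class and numbers below are invented.

```python
from collections import defaultdict

class ScheduleLearner:
    """Learn a household's preferred temperature for each hour from past manual adjustments."""

    def __init__(self, default_temp=20.0):
        self.default_temp = default_temp
        self.history = defaultdict(list)  # hour of day -> observed setpoints

    def record(self, hour, setpoint):
        self.history[hour].append(setpoint)

    def suggest(self, hour):
        observed = self.history[hour]
        return sum(observed) / len(observed) if observed else self.default_temp

learner = ScheduleLearner()
for setpoint in (21.0, 21.5, 22.0):
    learner.record(7, setpoint)   # mornings: the household keeps nudging the heat up

print(learner.suggest(7))         # 21.5 -- the learned preference for 7 a.m.
print(learner.suggest(3))         # 20.0 -- no data yet, fall back to the default
```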

Edge computing has allowed IoT devices to process data locally rather than relying solely on cloud-based servers. This reduces latency and bandwidth usage, making it especially beneficial for time-sensitive applications like healthcare monitoring, industrial automation, and smart grids.

Despite the growth, the IoT market faces challenges such as chipset supply constraints, economic uncertainties, and geopolitical conflicts.

Are You Ready For Y2K38?

Do you remember the Y2K scare? It is also known as the Millennium Bug. On this Eve of a new year, I am recalling this scare that stemmed from a widespread concern in the late 1990s that many computer systems would fail when the year changed from 1999 to 2000.

Why? Many older computer systems and software programs represented years using only the last two digits (e.g., "1999" was stored as "99"). It was feared that when 2000 arrived, these systems might interpret "00" as 1900 instead of 2000, leading to several problems.
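
A few lines of Python show how that flawed two-digit assumption plays out; the loan dates here are invented just for illustration.

```python
from datetime import date

def parse_two_digit_year(yy):
    """The flawed Y2K-era assumption: every two-digit year belongs to the 1900s."""
    return 1900 + yy

loan_start = date(parse_two_digit_year(99), 6, 1)  # "99" -> 1999, works as intended
loan_end = date(parse_two_digit_year(0), 6, 1)     # "00" -> 1900 instead of 2000

print(loan_end.year - loan_start.year)  # -99 instead of 1: interest math goes badly wrong
```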

Systems that relied on accurate date calculations could produce errors or fail entirely. For example, financial systems calculating interest rates or loan payments might miscalculate. Concerns arose about critical systems in utilities, transportation, healthcare, and government shutting down. Files or databases might become corrupted due to incorrect data processing.

Probably the greatest concern was in banking and finance where it was feared that miscalculated transactions, stock market crashes, or ATM malfunctions might occur.

Some people predicted power grid failures, water system disruptions, and the collapse of aviation navigation systems and air traffic control.

What if there were malfunctioning military systems, including nuclear launch systems?

And so, billions of dollars were spent worldwide to identify, update, and test potentially vulnerable systems. IT professionals worked tirelessly to ensure compliance before the deadline.

What Happened? The transition to the year 2000 was largely uneventful. A few minor issues were reported, but there were no catastrophic failures. It wasn't that there was no reason to be concerned, but the successful outcome is often credited to the massive preventive effort rather than the fears being overblown.

The Y2K scare highlighted the importance of forward-thinking in software development and helped establish rigorous practices for handling date and time in computing.

If you want to start preparing or worrying now for the next similar scare, the Y2K38 Problem (Year 2038 Issue) arises from how older computer systems store time as a 32-bit integer, counting seconds since January 1, 1970 (Unix time). On January 19, 2038, this count will exceed the maximum value for a 32-bit integer, causing a rollover that could result in misinterpreted dates or system crashes. This potentially affects embedded systems, infrastructure, and older software. Modern systems are increasingly being updated to 64-bit time representations, which kicks the problem far into the future.
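
You can see the 2038 limit for yourself with a few lines of Python and the standard library:

```python
from datetime import datetime, timezone

# A signed 32-bit counter tops out at 2**31 - 1 seconds after the Unix epoch.
MAX_32BIT_SECONDS = 2**31 - 1

print(datetime.fromtimestamp(MAX_32BIT_SECONDS, tz=timezone.utc))
# 2038-01-19 03:14:07+00:00 -- one second later, a 32-bit time value wraps around

# The same moment plus one second is no problem for systems using 64-bit time.
print(datetime.fromtimestamp(MAX_32BIT_SECONDS + 1, tz=timezone.utc))
```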

Natural Language Processing (NLP)

Natural Language Processing (NLP) is a field of artificial intelligence focused on enabling computers to understand, interpret, and generate human language. The "natural" part refers to the goal: this AI language use should be meaningful and contextually relevant. NLP might be used for tasks such as language translation, sentiment analysis, and speech recognition.
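
As a toy example of one such task, here is a lexicon-based sentiment scorer in plain Python. Real NLP systems rely on trained models rather than hand-made word lists; the lists below are invented purely for illustration.

```python
# Count positive and negative words from a tiny hand-made lexicon.
POSITIVE = {"good", "great", "love", "excellent", "happy"}
NEGATIVE = {"bad", "terrible", "hate", "awful", "sad"}

def sentiment(text):
    words = text.lower().split()
    score = sum(w in POSITIVE for w in words) - sum(w in NEGATIVE for w in words)
    return "positive" if score > 0 else "negative" if score < 0 else "neutral"

print(sentiment("I love this great product"))  # positive
print(sentiment("The service was terrible"))   # negative
```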

NLP sample by Seobility - License: CC BY-SA 4.0

Search engines leverage NLP to improve various aspects of search. Understanding what a user means by a search query, what the different pages on the web are about, and what questions those pages answer are all vital aspects of a successful search engine.

According to AWS, companies commonly use NLP for these automated tasks:
•    Process, analyze, and archive large documents
•    Analyze customer feedback or call center recordings
•    Run chatbots for automated customer service
•    Answer who-what-when-where questions
•    Classify and extract text

NLP crosses over into other fields. Here are three.

Computational linguistics is the science of understanding and constructing human language models with computers and software tools. Researchers use computational linguistics methods, such as syntactic and semantic analysis, to create frameworks that help machines understand conversational human language. Tools like language translators, text-to-speech synthesizers, and speech recognition software are based on computational linguistics. 

Machine learning is a technology that trains a computer with sample data to improve its efficiency. Human language has several features like sarcasm, metaphors, variations in sentence structure, plus grammar and usage exceptions that take humans years to learn. Programmers use machine learning methods to teach NLP applications to recognize and accurately understand these features from the start.
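
As a sketch of what "training with sample data" looks like in practice, here is a minimal example that assumes the scikit-learn library is installed; the training sentences and labels are invented for illustration.

```python
from sklearn.feature_extraction.text import CountVectorizer
from sklearn.naive_bayes import MultinomialNB
from sklearn.pipeline import make_pipeline

# A tiny invented training set: texts paired with the labels we want the model to learn.
texts = ["great movie, loved it", "what a wonderful day",
         "terrible food, never again", "this was an awful experience"]
labels = ["positive", "positive", "negative", "negative"]

# Turn words into counts, then learn which words go with which label.
model = make_pipeline(CountVectorizer(), MultinomialNB())
model.fit(texts, labels)

print(model.predict(["loved the wonderful acting"]))  # most likely ['positive']
```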

Deep learning is a specific field of machine learning that teaches computers to learn and think like humans. It involves a neural network that consists of data processing nodes structured to resemble the human brain. With deep learning, computers recognize, classify, and correlate complex patterns in the input data.
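
To show the "nodes structured to resemble the brain" idea, here is a minimal feedforward network in NumPy with random, untrained weights. Real deep learning adjusts these weights from large amounts of data; this sketch only shows how data flows through layers of nodes.

```python
import numpy as np

rng = np.random.default_rng(0)

# A tiny network: 4 inputs -> 3 hidden nodes -> 1 output.
W1, b1 = rng.normal(size=(3, 4)), np.zeros(3)
W2, b2 = rng.normal(size=(1, 3)), np.zeros(1)

def forward(x):
    hidden = np.tanh(W1 @ x + b1)                 # hidden nodes combine the inputs
    return 1 / (1 + np.exp(-(W2 @ hidden + b2)))  # sigmoid squashes the output to 0..1

print(forward(np.array([0.5, -1.0, 2.0, 0.1])))   # a single value between 0 and 1
```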

Overview of NLP