Elgg

Elgg logo (source: elgg.org)

I wrote here about the open-source software called Elgg almost two decades ago. (Not to be confused with elgg.net, a social networking site for educators back around 2006, which no longer exists.) Elgg is open-source social networking software that provides individuals and organizations with the components needed to create an online social environment. It offers blogging, microblogging, file sharing, networking, groups, and a number of other features, and it was the first platform to bring ideas from commercial social networking platforms to educational software. It was founded in 2004 by Ben Werdmuller and Dave Tosh.

I view those older posts and many of the ones on this site that date back almost 20 years as historical documents of a sort. I'm sometimes tempted to update them, and I do sometimes fix a broken image or a proofreading mistake, but they may have some value as documentation of another time in edtech history.

How many of the alternatives to commercial course management systems from my 2006 list still exist? I looked up Elgg to see if it was still in use. The Wikipedia entry shows an impressive list of sites that use Elgg, including Oxfam; the Australian, Dutch, Canadian, and British governments; the New Zealand Ministry of Education; the State of Ohio; the World Bank; UNESCO; and the United Nations Development Programme.

Here is one of those old posts - now just historic.

Elgg is software for building a personal learning landscape. The software is from the United Kingdom. I first saw it mentioned on the Moodle site and thought it was a kind of plug-in to Moodle. It uses blogs, e-portfolios, shared files, RSS feeds, and other "social networking" tools. I thought it had been designed for educational use, but a look through the user list shows a good number of general users.

Their site has a demo community, and their resources/links are organized in an embedded wiki. You can create a free user account and get space for a blog, RSS feeds, an aggregator to read other people's content, and space to store your own resources (files). As a guest, you can still view items made public in user profiles; here's mine.

Since their new release is version 0.601, this is obviously new beta software. So does this replace a Moodle or Blackboard, or supplement it, or serve a different purpose?

My collaborator here, Tim Kellers, installed the Elgg software here at NJIT, so drop by and register if you want to try it out. I also suggest you go to the elgg.net site and create an account so you can become part of that educator community. I have made some interesting contacts outside the United States from there. Right now, I am just having this blog's content mirrored to my Elgg blog account by using an RSS feed (yeah, there are some formatting & image issues doing that).

http://webapps.saugus.k12.ca.us/community - California's Saugus Unified School District uses it, and as you can see, it is a secure environment with user ID and password access. However, take a look at their user-introduction PDF. It's a nice 9-page intro with screenshots. Another K12 district getting ahead of the colleges!

Ready for the test question? Elgg is to Elgg.net as ____ is to Wikipedia. (Answer: MediaWiki)

Well, to deal with that confusion (or further confuse you), elgg.net will now be edufilter.org.

Here's an email that went out to users from the Elgg folks:

Changes are afoot at Elgg.net!
Actually, you've been accustomed to change throughout the existence of the site since we started it in 2004. New features pop up all the time, and we think you'll be pleased to hear that this isn't going to stop soon.
However, we're going to change the name. Next Wednesday, Elgg.net will become Edufilter.org.
This is because, for a lot of people, Elgg.net is Elgg. Granted, it's a confusing name. But Elgg is a free, open source, white label social networking framework that anyone can install on their own servers. Want it running at your institution? Point your elearning folks at http://elgg.org.
Elgg.net, meanwhile, is a social network for education - and therefore, we think Edufilter is probably a better name.
You've probably got concerns, so let's deal with the most important:
#1: We're not going to break any of your links. While the front page of Elgg.net will forward to the main Elgg software homepage, anyone visiting elgg.net/your-username will still get to your page. We have no plans to end this, so if your address is printed on materials, don't worry. Everything's fine.
#2: The site will not be discontinued. It continues to be our flagship installation.
Furthermore, making the site overtly educational means we can give you more directed content and features. Sponsorship opportunities are available; if you'd like to promote your product or service to some of the world's leading lights in elearning, let us know.
Best regards,
The Curverider team


A Few Other Posts About This

https://serendipity35.net/index.php?/archives/489-Putting-All-Your-Educational-Eggs-In-One-Basket.html

https://serendipity35.net/index.php?/archives/83-More-of-the-Competition-in-the-CMS-Market.html

https://serendipity35.net/index.php?/archives/265-A-directory-to-Web-2.0-Companies.html


Trading Kilowatts for Qubits

It has been in the news in the United States all week: the federal government is moving to a policy that will require power-hungry data centers to get out of the public energy pool and go swim in a plasma of their own making. Big Tech companies are building and investing in their own energy supplies as they race to meet the huge energy demands of AI computing in 20th-century datacenters. It's estimated that a "traditional" (non-AI) datacenter rack consumes somewhere between 5 and 15 kilowatts of power; think central A/C units, commercial clothes dryers, banks of EV chargers. That same rack, running AI-capable hardware, will consume roughly ten times as much, up to around 100 kilowatts.

As robust as the United States power grid is, the demand for electricity to power these clustered artificial-intelligence entities will exceed its current capacity. To feed this need for power, Amazon has begun developing small modular reactors (SMRs), while Oracle and OpenAI are working on half-trillion-dollar natural-gas-fueled power plants. These solutions have obvious drawbacks. Amazon's quest for contemporary electricity via the nuclear option will produce some of the most toxic waste ever thrown away, waste that will last for thousands of years. Oracle and OpenAI's investment in huge natural-gas energy sources risks not only accelerating climate damage but also exhausting energy supplies at scale. The risks of expanding the electricity supply on a 20th-century grid are substantial. If the demand for electricity were reduced, those risks would subside.

Meanwhile, the development of quantum computing has quietly been on the rise. Superconducting quantum supercomputers consume about 25 kilowatts in total, largely for the extreme refrigeration needed to run their superconductor-based processors at temperatures near absolute zero. Neutral-atom machines, which prefer warmer surroundings, operate at around room temperature and use 7 kilowatts or less. When optimization tasks or simulations are handed to their quantum algorithms, these computers can produce solutions orders of magnitude faster while using a tiny fraction of the energy a traditional datacenter would require.

Quantum computing can already enhance generative AI significantly through its speed. Quantum computers analyze data faster and more deeply, and they have led to a new class of generative AI called GenQAI (Generative Quantum AI), which uses quantum hardware to iterate on complex problems and produce more human-like reasoning and intuition.

Quantinuum, reported to be one of the world leaders in quantum technology, unveiled its Helios system in November; it has been described as the world's most accurate quantum computer. That machine requires less than 40 kilowatts of power, about the same as the average generative-AI load of a single datacenter rack. The company announced last week that it was going public, with an IPO expected sometime in the first half of 2026.

With some clairvoyant disruption and a little bit of luck, we'll have frugal quantum computing cottages humming their 4-dimensional power song before we have natural gas caverns and poisonous landfill dirges to endure.


The Y2K38 Bug and the End of 32-bit Unix Time

Y2K, short for "Year 2000," was a potential computer bug caused by how dates were formatted in older software. To save memory space, early computers used two-digit years, like "97" for 1997, which in the new millennium risked misreading "00" as 1900 instead of 2000, potentially disrupting systems that depended on accurate dates.
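
To make the flaw concrete, here is a minimal C sketch; the record layout and helper function are hypothetical, but the pattern, storing only two digits and assuming the 1900s, is typical of pre-Y2K code:

```c
#include <stdio.h>

/* Hypothetical pre-Y2K record: only the last two digits of the
 * year are stored, to save space. */
struct record {
    int yy;   /* 97 means 1997; "00" was meant to mean 2000 */
};

/* The naive expansion many old programs effectively used. */
static int full_year(int yy) {
    return 1900 + yy;
}

int main(void) {
    struct record opened = { 97 };  /* account opened in 1997 */
    struct record billed = { 0 };   /* bill issued in 2000    */

    /* "00" expands to 1900, not 2000... */
    printf("billing year: %d\n", full_year(billed.yy));   /* 1900 */

    /* ...so elapsed time across the rollover goes negative. */
    printf("years open:   %d\n",
           full_year(billed.yy) - full_year(opened.yy));  /* -97 */
    return 0;
}
```

The "windowing" repairs of the late 1990s amounted to picking a pivot (say, treating 00-49 as 20xx) rather than widening the stored field.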

Though a kind of panic occurred in 1999, the Y2K issue surfaced in technical literature as early as 1984. Long before it became a global concern, researchers were already flagging the two-digit date flaw. A 1984 book, "Computers in Crisis," outlined how the year 2000 rollover could break financial, governmental, and technical systems if left unaddressed.

In the late 1990s, many feared this glitch could cause widespread failures in banking systems, power grids, transportation networks, and other critical infrastructure. This idea took hold of the public imagination, spawning doomsday predictions, a booming survivalist market, and a massive global push to audit and repair vulnerable systems before the deadline—work that cost an estimated $300B-$500B. 

Because of the extensive preparations, Y2K passed without significant disruptions; however, its legacy endures. The crisis helped modernize global IT systems, accelerated the outsourcing of programming jobs, and exposed society's dependence on digital infrastructure, prompting long-term shifts in cybersecurity and software maintenance.

The Year 2038 problem is the next potential computer time rollover bug. Many older systems store time as a signed 32-bit integer counting seconds since Jan. 1, 1970. That counter maxes out at 2,147,483,647 seconds, reached at 03:14:07 UTC on Jan. 19, 2038, then overflows into negative time and sends clocks back to December 1901, potentially crashing any older software that depends on accurate dates. The Y2K38 bug is also known as the end of 32-bit Unix time and the year 2038 problem.
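
Here is a minimal C sketch of the rollover, assuming a host with a 64-bit time_t so the wrapped 32-bit value can still be converted and printed:

```c
#include <stdio.h>
#include <stdint.h>
#include <time.h>

int main(void) {
    /* A signed 32-bit count of seconds since Jan. 1, 1970 (UTC)
     * tops out at 2^31 - 1 = 2,147,483,647. */
    int32_t t32 = INT32_MAX;

    time_t t = (time_t)t32;               /* widen for printing */
    printf("last valid moment: %s", asctime(gmtime(&t)));
    /* -> Tue Jan 19 03:14:07 2038 */

    /* One more tick wraps the counter. Real systems overflow in
     * place; here the two's-complement wrap is modeled explicitly
     * to avoid signed-overflow undefined behavior in the demo. */
    t32 = (int32_t)((uint32_t)t32 + 1u);  /* now -2,147,483,648 */

    t = (time_t)t32;
    printf("one tick later:    %s", asctime(gmtime(&t)));
    /* -> Fri Dec 13 20:45:52 1901 */
    return 0;
}
```

The fix mirrors Y2K's: widen the counter. That is why modern operating systems have been migrating to a 64-bit time_t, which pushes the next rollover roughly 292 billion years out.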


The Rite of Privacy

Privacy is a cornerstone of personal freedom, yet its meaning and importance have evolved over centuries.

Aristotle viewed the public sphere, or polis, as the space where true freedom and civic life were possible. For him, public life was about participating in politics and achieving lasting accomplishments, while private life was more concerned with household affairs and personal needs. This distinction meant that privacy was often seen as secondary to public engagement, but it also laid the groundwork for later debates about the value of personal space and autonomy. The Romans drew a similar line between public and private spheres: public life was where individuals could gain honor and recognition, while private life was associated with family, home, and personal matters.

Fast-forward a millennium or two, and thinkers like Rousseau saw privacy as a retreat from the pressures of society, a necessary space for self-reflection and authenticity. Hannah Arendt later argued that privacy is essential for forming personal identity and exercising political rights. In 1890, Samuel Warren and Louis Brandeis published "The Right to Privacy" in the Harvard Law Review. By the early part of the 20th century, courts began interpreting the U.S. Constitution to protect an expanded notion of privacy that included personal freedom and dignity.

The history of privacy reveals that it has always been closely tied to personal liberty and the boundaries between the individual and society. From ancient debates about public and private life to modern legal protections, the concept of privacy has continually evolved in response to new challenges, and it remains a vital issue today, shaping debates about technology, freedom, and the rights of individuals in a rapidly changing world. As concerns escalated, privacy came to be recognized as a fundamental human right, and laws and regulations were created to address the risks posed by the spread of computers and by large-scale data collection and storage.

Then came Edward Snowden.

The scale and scope of government surveillance were exposed, and the global privacy debate became inseparable from personal data security. A full five years after those revelations, the European Union's General Data Protection Regulation claimed to set a new global standard for data protection and user rights. Even California, with its trove of data-driven companies, took the GDPR seriously and enacted the California Consumer Privacy Act.

Personal data has become a valuable commodity in the digital economy. Companies collect, analyze, and sell user information to drive advertising, product development, and business strategies.

This shift has made privacy a key economic issue, as individuals must navigate the trade-offs between convenience and control over their data. 

As surveillance and data collection become more widespread, concerns about personal liberty and autonomy have grown. When every action can be tracked, individuals may feel less free to express themselves or make independent choices. These issues are at the heart of modern privacy debates, which are now shaped by rapid technological change, new legal frameworks, and the growing power of data. As personal information becomes more valuable and more vulnerable, understanding how privacy has evolved is crucial for protecting autonomy and freedom.

Privacy is not just a right of the past—it’s a challenge for the future. We all must stay vigilant and informed. Freedom depends on it.