Hello AI, I Am Julia

Julia for data visualization

A few friends and former students who work as programmers have told me recently that I should write about Julia. Julia is not a person but a programming language. One person called it "the new Python," while another called it the "Python killer."

Python is the so-far-unchallenged leader of AI programming languages and is used by almost 90% of data scientists, but it is probably not the future of machine learning. Programming languages, like all languages, fall out of favor and sometimes die. There is not much demand for the COBOL, FORTRAN and BASIC that were being taught when I was an undergrad.

Julia is faster than Python because it was designed from the start for numerical work, such as linear algebra and matrix operations. It is an open-source project with more than a thousand contributors and is available under the MIT license, with the source code on GitHub.

I have learned that you don't need to know programming to do some AI. There are no-code AI tools like Obviously.AI, but programming is still necessary for some development.

The home site for Julia is julialang.org which has a lot of information.

An article I read at pub.towardsai.net led me to investigate a free online course on computational thinking at MIT that is taught using Julia.

This is not a course on programming with Julia, but since almost all data and AI courses are taught in Python (with perhaps a few using R and other languages), a course taught in Julia is unusual. The course uses the spread of COVID-19 as its topic and covers analyzing COVID-19 data, modeling exponential growth, probability, random walk models, characterizing variability, optimization and fitting to data. Through this topic the course teaches how to understand and model exponential functions, which has much broader application to financial markets, compound interest, population growth, inflation, Moore's Law, etc.
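The heart of that kind of modeling is a simple observation: exponential growth becomes a straight line on a log scale, so it can be fit with ordinary linear regression. The MIT course uses Julia, but the same idea can be sketched in a few lines of Python. The case counts below are invented illustration data, not real COVID-19 figures:

```python
import numpy as np

# Invented daily case counts for the first 10 days of an outbreak
days = np.arange(10)
cases = np.array([3, 4, 6, 9, 13, 19, 28, 41, 60, 88])

# Exponential growth y = a * exp(b * t) is linear after a log transform:
# log(y) = log(a) + b * t, so a least-squares line fit recovers the rate b.
b, log_a = np.polyfit(days, np.log(cases), 1)

# The doubling time follows directly from the growth rate
doubling_time = np.log(2) / b
print(f"growth rate ~ {b:.3f}/day, doubling time ~ {doubling_time:.1f} days")
```

The same trick applies to compound interest or population data; only the time series changes.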

Lorenz attractor

Julia used for scientific computing

As that article notes, searching jobs on LinkedIn for “Python Developer” right now turns up about 23,000 results, so there is a market for that skill set today. Searching “Julia Developer” returns few results. But there is a LinkedIn group for Julia developers, called “The Julia Language,” so interest is there and the jobs are beginning to appear. A Julia specialist now has a big advantage in that there are fewer people with that skill set competing for the jobs that do appear. The predictions (always a dangerous thing) are that Julia has a big role to play in the data and AI industry.

Is Technology Destructive By Design?

Technology is good. Technology is bad. Both are true. 

High tech has transformed the world. It has changed our culture, made information accessible to many more people, and altered business, education, and the economy.

I came across the book, Terms of Disservice: How Silicon Valley is Destructive by Design, by Dipayan Ghosh recently. Ghosh was a Facebook public policy adviser who went into government work with President Obama's White House.

The book's title is a play on the terms of service that products present and that users often don't even read. Though you can view this book as being negative about the effects of technology, it actually offers ideas for using technology in positive ways, such as creating a more open and accessible world. That was actually part of the original plan (or dream) for the Internet. The extra level of service he sees as lacking is consumer and civilian protection.

Ghosh is a computer scientist turned policymaker, so much of the book's focus is on industry leaders and policymakers. Technology has done a lot of good, but it has also exacerbated social and political divisions. This year we are hearing again about how technology, in the form of social media and cyberterrorism, has influenced elections. Civilians have, wittingly and unwittingly, given private information to American companies, which was then, wittingly and unwittingly, passed on to terrorist groups and foreign governments.

We have heard this on an almost daily basis, and yet it seems that nothing is being done to stop it.

In an interview with the LA Review of Books, Ghosh was asked about what a broader “digital social contract” would look like. He answered, in part:

"If we can agree that this business model is premised on uninhibited data collection, the development of opaque algorithms (to enable content curation and ad targeting), and the maintenance of platform dominance (through practices that diminish market competition, including raising barriers to entry for potential rivals), then three basic components of possible intervention stand out. First, for data collection and processing, all the power currently lies within corporate entities. For now, Google can collect whatever information it desires. It can do whatever it wants with this data. It can share this information basically with whomever.

Europe’s GDPR has begun to implement some better industry norms. But to truly resolve these problems, we’ll need to transfer more power away from private firms...

We also need more transparency. Basic awareness of how this whole sector works should not be treated as some contrived trade secret. Individual consumers should have the right to understand how these businesses work, and shouldn’t just get opted in by default through an incomprehensible terms-of-service contract. We likewise need much better transparency on how platform algorithms and data-processing schemes themselves work.

And finally, we need to improve market competition. We need data-portability arrangements, interoperability agreements — and most importantly, a serious regulatory regime to contend realistically with monopolistic concentration."

One of the takeaways from this book is that these institutions are destructive by design. It reminds me of the revelations about the American tobacco industry: companies knew their products were addictive and caused health problems, designed the products to increase that addiction, and ignored or even covered up the health concerns. Can the same be said of technology products?

Checklist for That Video Conference

It seems almost all of us have been involved in more video conferences the past five months because of the pandemic. Offices and classrooms are closed, and a lot of work is being done remotely. Learning and workspaces have definitely moved online for many of us. But there are also teleconferences with friends and family that are purely social. I have used Zoom, Google Meet, Webex, Slack and Microsoft Teams for formal presentations, courses, social calls and team meetings.

We are also seeing newscasters and celebrities broadcasting from home with surprisingly varied results and quality. I am no longer surprised to see a well-known person who has the resources and motivation to look good on screen look really bad. There are some basic video tech tips that everyone should follow, but before I get to those I want to list some non-tech items that fall under the heading of being prepared.

  1. Prep your desktop. Do you need notes or a way to take notes? If you use papers or another device make sure they are off-camera and won't block the camera or your microphone.
  2. If you're using a phone or tablet, put it horizontally so that you get a full video frame. Have you ever watched a movie or TV show that was in a vertical format? Of course not - think movie screens.
  3. Hang a "Do not disturb" sign on your office door, or at least warn people that you're going to be on air and need some privacy. Mute other phones nearby. I've been on several calls where someone's desk phone or cell phone rings while they're on their laptop.
  4. How you dress depends on the formality of the conference but avoid pinstripes and checks, which can create distracting moiré patterns on camera, and try not to wear bright white or deep black clothing, because many webcams have automatic exposure settings and will adjust to the brightness or darkness of those colors. 
  5. When you position your camera, try to have it at your eye-level. You might be able to adjust the chair you use for that. You should be sitting straight up not slouched back on a couch. Low and high angles are unflattering. Leave those for horror films. Pros and semi-pros use a tripod to hold their camera or phone and you can get those relatively inexpensively, but there are also less expensive tablet and phone stands.
  6. On the more tech side of preparedness, I would say lighting is at the top of the list. Bad lighting can ruin a video. Most people don't have a studio lighting kit to work with or knowledge of three-point lighting schemes, so here are simple things to do. Do have the main light source behind the camera and pointed at you. That's true when using a sunny window or any kind of lamp. Do not have the light behind you or you might be a silhouette. A single bright light on one side of you might make for a dramatic photograph but not a good video. The light, like the camera, is best at eye level because higher or lower create unflattering shadows on your eyes, nose and chin. Again, leave that to the horror films. I sometimes use a sheet of white poster board to bounce the light on my face for a softer look. You might also point a bright lamp at a plain wall or ceiling to get softer light. If you have a smaller desk lamp around that you can point in different directions (such as a gooseneck one), that can work pretty well. If there are shadows on your face and you're using natural sunlight as your main source you can add the artificial light source to fill in the shadows. If you wear eyeglasses, try to avoid glare and reflections on them. You may have to adjust the angle of the light.
  7. It's a video meeting, but being heard clearly is actually more important most of the time. Some people connect by phone without video, so all they have is your audio. Be in a quiet room. Try to avoid echo, which shows up in empty rooms, halls and bathrooms (yes, I know you sound great singing in the shower, but...). Many people just use the microphone built into their laptop, phone or tablet, which can be fine if you're close enough to it. If being close enough means you end up with a giant closeup of your face on the video, then the microphone is an issue. Many people buy a higher-quality headset, but also try out the earbuds that came with your phone (the optional wireless ones are great).
  8. Position yourself a distance from the camera that gives viewers a head and shoulders shot. Further back is better than too close. (Do consider that microphone though) When talking, look at the camera rather than at the screen of that iPad or laptop so that there is eye contact.
  9. People get concerned about the background in their video and may choose a location based on that rather than on the more important lighting considerations. Zoom and other apps actually allow you to put in a fake background. Newscasters might use a fake studio set, and friends might put themselves on a beach or on the Moon. I'm not a fan of that, and it sometimes causes odd halo effects that are very distracting. My suggestion is to avoid extremes. A blank white wall is not flattering, but neither is a busy background of shelves filled with clutter. I know that many academics and writers like to use bookshelves as their setting.
  10. You should practice with the setup and application you are using. All of the applications have websites with help on how to test your video (such as this one from Zoom). You should be aware of what tools are available and how to use them well before you go on air. Too many people don't know very basic things such as how to mute/unmute, change their ID on the screen, ask a question, use the chat function, etc. Practice may not make perfect but it certainly will make better. There are also many videos on YouTube with tips about the tech side of things and for specific applications. Some application help videos are even specific to certain users. The two screenshots shown here are from a Zoom training video that is for educators. Educators will want to use some of the more advanced tools in the apps, such as screen sharing, breakout rooms, whiteboards, etc. 
  11. Finally, you want to look your best whether this is a job interview or saying hello to your granddaughter. The pros use makeup so that they look good under the bright lights, but for most of us, your ordinary makeup, groomed hair and a video-safe shirt or blouse is enough. If there is some shine on your face from the lighting a simple wipe with a soft cloth might be enough.

Zoom session

Schrodinger's Coin and Quantum Computing

Schrodinger's cat

A cat sits in a box along with some kind of poison that will be released based on the radioactive decay of a subatomic particle. Because these tiny particles are capable of being in multiple states at once (decaying and not decaying at the same time), the poison could simultaneously be released and not released. By extension, the cat could be both dead and not dead.

In 1935, Austrian physicist Erwin Schrödinger spun this scenario. He didn't mean that cats actually can be simultaneously dead and alive; the paradox was that, according to quantum theory, until you opened the box you'd have a cat that was simultaneously dead and alive.

When I first heard it back in high school, I thought of Zen koans and stories that are equally paradoxical and maddening. If a tree falls in the woods and no one is around to hear it, does it make a sound?

Later, I read that Schrödinger was criticizing the "Copenhagen interpretation" which was the prevailing school of thought in quantum mechanics. The Copenhagen interpretation suggested that particles existed in all possible states (different positions, energies, speeds) until they were observed, at which point they collapsed into one set state. But Schrödinger thought that interpretation didn't scale up very well to objects in the visible world.

A clearer analogy for me was when I heard it explained as being like a spinning coin. While it is spinning, it can be heads or tails. We don't know what it is until it falls and stops spinning. No cats are injured in this version. 

I thought about Mr. Schrodinger's cat and about that spinning coin when I was reading something recently about quantum computing. Schrödinger's cat is often used to illustrate the concept of superposition -- the ability for two opposite states to exist simultaneously -- and unpredictability in quantum physics.

Quantum computing is about harnessing and exploiting quantum mechanics in order to process information. The computers we are used to use “bits” that are either zero or one. A quantum computer would instead use quantum bits (qubits). The freaky Schrödinger's-cat part of quantum computers is that they would perform calculations based on the probability of an object's state before it is measured, not just 1s and 0s. That means they would have the potential to process exponentially more data than traditional computers.
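To make the spinning-coin analogy concrete, here is a toy Python sketch of measuring a single qubit. This is my own simplification, not a real quantum library: actual amplitudes are complex numbers and real qubits aren't simulated this way, but it shows how measurement turns a probability into a definite 0 or 1:

```python
import math
import random

random.seed(42)  # make the demo reproducible

def measure(alpha, beta):
    """Collapse the toy state (alpha|0> + beta|1>) to a classical bit.

    The outcome is 0 with probability alpha**2 and 1 with probability
    beta**2 (real amplitudes only, for simplicity).
    """
    assert abs(alpha**2 + beta**2 - 1) < 1e-9, "amplitudes must be normalized"
    return 0 if random.random() < alpha**2 else 1

# Equal superposition, like the spinning coin: alpha = beta = 1/sqrt(2)
alpha = beta = 1 / math.sqrt(2)
counts = [0, 0]
for _ in range(10_000):
    counts[measure(alpha, beta)] += 1

print(counts)  # roughly [5000, 5000]: each outcome about half the time
```

Until `measure` is called, the state carries both possibilities at once; the call is the "opening of the box."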

It has been 85 years but people are still messing around with the whole cat thing. Some physicists have given Schrödinger’s cat a second box to play in. This cat lives or dies in two boxes at once in order to consider quantum entanglement. Entanglement means that observation can change the state of a distant object instantaneously - something that Einstein considered impossible and referred to as “spooky action at a distance.” 

Are we even close to creating a quantum computer? It depends on whom you read.

Here's a leap beyond cats and coins that came to me because I was surfing through channels on the television and saw that Christopher Nolan's film Inception was on.

A character in the film returns home after a long time in the dream world and we are told that a little top that he sets into motion will keep spinning forever if he is still in the dream world. If it stops and falls over, that means he is back in reality. It's like the old pinch yourself to see if you're dreaming.

But the film has a frustrating final shot because it ends before we know what happens to the top. It wobbles but then the film ends. That ending was infuriating to most viewers. It was like the finale of The Sopranos. What happened?

Nolan once spoke at a Princeton University graduation ceremony and said that "The way the end of that film worked, Leonardo DiCaprio’s character Cobb — he was off with his kids, he was in his own subjective reality. He didn’t really care anymore, and that makes a statement: perhaps all levels of reality are valid."

Nolan's point to the graduates? Don't chase dreams; chase realities because, unfortunately, "over time, we started to view reality as the poor cousin to our dreams".

Can you prove that you're not dreaming right now?

That "pinch yourself" thing isn't adequate proof. What if this is a dream that you're stuck in?  Does it matter? If it is, this dream is your reality. 

This sounds like philosophical skepticism, that school of thought that I once had to study in school and that also sent my mind running in circles. It argues that we can't really know that anything is real, because the very possibility of knowledge is in doubt. The side I fell on as a college student was that we couldn't make that judgment of "real" because there isn't enough evidence.

That's enough circles to run around in for today. 


Even cats have been considering what Schrodinger proposed. (image via GIPHY)