A little history …
In the 30+ years I’ve been a working “computer industry professional” I’ve done a lot of jobs, used a lot of software, and spent time teaching other folks how to be “computer professionals.”
I’m also an “amateur historian” – i.e. I enjoy learning about “history” in general. I’ve had real “history teachers” point out that (in general) people are curious about “what happened before them.”
Maybe this “historical curiosity” is one of the things that distinguishes “humans” from “less advanced” forms of life — e.g. yes, your dog loves you, and misses you when you are gone – but your dog probably isn’t overly concerned with how its ancestors lived (assuming that your dog has the ability to think in terms of “history” – but that isn’t the point).
As part of “teaching” I tend to tell (relevant) stories about “how we got here” in terms of technology. Just like understanding human history can/should influence our understanding of “modern society” – understanding the “history of a technology” can/should influence/enhance “modern technology.”
The Problem …
There are multiple “problems of history” — which are not important at the moment. I’ll just point out the obvious fact that “history” is NOT a precise science.
Unless you have actually witnessed “history” then you have to rely on second hand evidence. Even if you witnessed an event, you are limited by your ability to sense and comprehend events as they unfold.
All of which is leading up to the fact that “this is the way I remember the story.” I’m not saying I am 100% correct and/or infallible – in fact I will certainly get something wrong if I go on long enough – any mistakes are mine and not intentional attempts to mislead 😉
Hardware/Software
Merriam-Webster tells me that “technology” is about “practical applications of knowledge.”
random thought #1 – “technology” changes.
“Cutting edge technology” becomes common and quickly taken for granted. The “Kansas City” scene from Oklahoma (1955) illustrates the point (“they’ve gone just about as far as they can go”).
Merriam-Webster tells me that the term “high technology” was coined in 1969 referring to “advanced or sophisticated devices especially in the fields of electronics and computers.”
If you are a “history buff” you might associate 1969 with the “race to the moon”/moon landing – so “high technology” equaled “space age.” If you are an old computer guy – 1969 might bring to mind the Unix Epoch – but in 2022 neither term is “high tech.”
random thought #2 – “software”
The term “hardware” in English dates back to the 15th Century. The term originally meant “things made of metal.” In 2022 the term refers to the “tangible”/physical components of a device – i.e. the parts we can actually touch and feel.
I’ve taught the “intro to computer technology” more times than I can remember. Early on in the class we distinguish between “computer hardware” and “computer software.”
It turns out that the term “software” only goes back to 1958 – invented to refer to the parts of a computer system that are NOT hardware.
The original definition could have referred to any “electronic system” – i.e. programs, procedures, and documentation.
In 2022 – Merriam-Webster tells me that “software” is also used to refer to “audiovisual media” – which is new to me, but instantly makes sense …
ANYWAY – “computer software” typically gets divided into two broad categories – “applications” and “operating systems” (OS or just “systems”).
The “average non-computer professional” is probably unaware and/or indifferent to the distinction between “applications” and the OS. They can certainly tell you whether they use “Windows” or a “Mac” – so saying people are “unaware” probably isn’t as correct as saying “indifferent.”
“Software lets us do something useful with hardware” – an old textbook
The average user has work to get done – and they don’t really care about the OS except to the point that it allows them to run applications and get something done.
Once upon a time – when a new “computer hardware system” was designed, a new “operating system” would also be written specifically for that hardware. e.g. The Mythical Man-Month is required reading for anyone involved in management in general and “software development” in particular …
Some “industry experts” have argued that Bill Gates’ biggest contribution to the “computer industry” was the idea that “software” could be/should be separate from “hardware.” While I don’t disagree – it would require a retelling of the “history of the personal computer” to really put the remark into context — I’m happy to re-tell the story, but it would require at least two beers – i.e. not here, not now
In 2022 there are a handful of “popular operating systems” that also get divided into two groups – e.g. the “mobile OS” – Android, iOS, and the “desktop OS” Windows, macOS, and Linux
The Android OS is the most installed OS if you are counting “devices.” Since Android is based on Linux – you COULD say that Linux is the most used OS, but we won’t worry about things like that.
Apple’s iOS on the other hand is probably the most PROFITABLE OS. iOS is based on the “Berkeley Software Distribution” (BSD) – which is very much NOT Linux, but they share some code …
Microsoft Windows still dominates the desktop. I will not be “bashing Windows” in any form – just point out that 90%+ of the “desktop” machines out there are running some version of Windows.
The operating system that Apple includes with their personal computers in 2022 is also based on BSD. Apple declared themselves a “consumer electronics” company a long time ago — fun fact: the Beatles (yes, John, Paul, George, and Ringo – those “Beatles”) started a record company called “Apple” in 1968 – so when the two Steves (Jobs and Wozniak) wanted to call their new company “Apple Computers” they had to agree to stay out of the music business – AND we are moving on …
On the “desktop,” then, Linux is the rounding error between Windows machines and Macs.
What is holding back “Linux on the desktop?” Well, in 2022 the short answer is “applications” and more specifically “gaming.”
You cannot gracefully run Microsoft Office, Avid, or the Adobe Suite on a Linux based desktop. Yes, there are alternatives to those applications that perform wonderfully on Linux desktops – but that isn’t the point.
e.g. that “intro to computers” class I taught used Microsoft Word and Excel for 50% of the class. If you want to edit audio/video “professionally” then you are (probably) using Avid or Adobe products (read the credits of the next “major Hollywood” movie you watch).
Then the chicken and egg scenario pops up – i.e. the “big application developers” would (probably) release Linux friendly versions if more people used Linux on the desktop – but people don’t use Linux on the desktop because they can’t run all of the application software they want – so the Linux versions never get made.
Yes, I am aware of WINE – but it illustrates the problem much more than acts as a solution — and we are moving on …
Linux Distros – a short history
Note that “Linux in the server room” has been a runaway success story – so it is POSSIBLE that “Linux on the desktop” will gain popularity, but not likely anytime soon.
Also worth pointing out — it is possible to run a “Microsoft free” enterprise — but if the goal is lowering the “total cost of ownership” then (in 2022) Microsoft still has a measurable advantage over any “100% Linux based” solution.
If you are a “large enterprise” then the cost of the software isn’t your biggest concern – “support” is (probably) “Large Enterprise, Inc.’s” largest single concern.
fwiw: IBM and Red Hat are making progress on “enterprise level” administration tools – but in 2022 …
ANYWAY – the “birthdate” for Linux is typically given as 1991.
Under the category of “important technical distinction” I will mention that “Linux” is better described as the “kernel” for an OS and NOT an OS in and of itself.
Think of Linux as the “engine” of a car – i.e. the engine isn’t the “car”, you need a lot of other systems working with and around the engine for the “car” to function.
For the purpose of this article I will describe the combination of “Linux kernel + other operating system essentials” as a “Linux Distribution” or more commonly just “distro.” Ready? ok …
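The “engine vs. car” distinction is easy to see on a running Linux box. A minimal sketch (assuming a modern distro that ships the standard `/etc/os-release` file – older systems may not have it):

```shell
# Print the kernel release - this part is what is actually "Linux"
uname -r

# Print the distribution's identity - supplied by the distro, not the kernel
cat /etc/os-release
```

Two different machines can report the exact same kernel version while `/etc/os-release` names completely different distros – the “engine” is shared, the “car” built around it is not.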
1992 gave us Slackware. Patrick Volkerding started the “oldest surviving Linux distro,” which accounted for an 80 percent share of the “Linux” market until the mid-1990s.
1992 – 1996 gave us SUSE Linux, founded by Thomas Fehr, Roland Dyroff, Burchard Steinbild, and Hubert Mantel. I tend to call SUSE “German Linux” – they were essentially selling the “German version of Slackware” on floppy disks until 1996.
btw: the “modern Internet” would not exist as it is today without Linux in the server room. All of these “early Linux distros” had business models centered around “selling physical media.” Hey, download speeds were of the “dial-up” variety and you were paying “by the minute” in most of Europe – so “selling media” was a good business model …
1993 – 1996 gave us the start of Debian, founded by Ian Murdock. The goal was a more “user friendly” Linux. The first “stable version” arrived in 1996 …
1995 gave us Red Hat Linux — this distro was actually my “introduction to Linux.” I bought a book that included a copy of Red Hat Linux 5.something (I think) and did my first Linux install on an “old” PC, PROBABLY around 2001.
During the dotcom “boom and bust” a LOT of Linux companies went public. Back then it was “cool” to have a big runup in stock valuation on the first day of trading – so when Red Hat “went public” in 1999 they had the eighth-biggest first-day gain in the history of Wall Street.
The run-up was a little manufactured (i.e. they didn’t release a lot of stock for purchase on the open market). My guess is that in 2022 the folks arranging the “IPO” would set a higher initial price or release more stock if they thought the offering was going to be extremely popular.
Full disclosure – I never owned any Red Hat stock, but I was an “interested observer” simply because I was using their distro.
Red Hat’s “corporate leadership” decided that the “selling physical media” business plan wasn’t a good long term strategy – especially as “high speed Internet” access spread across the U.S.
e.g. that “multi hour dial up download” is now an “under 10 minute iso download” – so I’d say the “corporate leadership” at Red Hat, Inc made the right decision.
Around 2003 the Red Hat distro kind of “split” into “Red Hat Enterprise Linux” (RHEL – sold by subscription to the “enterprise software” market) and the “Fedora Project” (meant to be a testing ground for future versions of RHEL as well as the “latest and greatest” Linux distro).
e.g. the Fedora project has a release target of every six months – current version 35. RHEL has a longer planned release AND support cycle – which is what “enterprise users” like – current version 9.
btw – yes RHEL is still “open source” – what you get for your subscription is “regular updates from an approved/secure channel and support.” AlmaLinux and CentOS are both “clones” of RHEL – with CentOS being “sponsored” by Red Hat.
IBM “acquired” Red Hat in 2019 – but nothing really changed on the “management” side of things. IBM has been active in the open source community for a long time – so my guess is that someone pointed out that a “healthy, independent Red Hat” is good for IBM’s bottom line in the present and future.
ANYWAY – obviously Red Hat is a “subsidiary” of IBM – but I’m always surprised when “long time computer professionals” seem to be unaware of the connections between RHEL, Fedora Project, CentOS, and IBM (part of what motivated this post).
Red Hat has positioned itself as “enterprise Linux” – but the battle for “consumer Linux” still has a lot of active competition. The Fedora project is very popular – but my “non enterprise distros of choice” are both based on Debian:
Ubuntu (first release 2004) – “South African Internet mogul Mark Shuttleworth” gets credit for starting the distro. The idea was that Debian could be more “user friendly.” Occasionally I teach an “introduction to Linux” class, and the big differences between “Debian” and “Ubuntu” are noticeable – but very much in the “ease of use” department (i.e. “Ubuntu” is “easier” for new users to learn).
I would have said that “Ubuntu” meant “community” (which I probably read somewhere) but the word is of ancient Zulu and Xhosa origin and more correctly gets translated “humanity to others.” Ubuntu has a planned release target of every six months — as well as a longer “long term support” (LTS) version.
Linux Mint (first release 2008) – Clément Lefèbvre gets credit for this one. Technically Linux Mint describes itself as “Ubuntu based” – so of course Debian is “under the hood.” I first encountered Linux Mint through a reviewer who described it as the best Linux distro for people trying to not use Microsoft Windows.
The differences between Mint and Ubuntu are cosmetic and also philosophical – i.e. Mint will install some “non open source” (but still free) software to improve “ease of use.”
The beauty of “Linux” is that it can be “enterprise level big” software or it can be “boot from a flash drive” small. It can utilize modern hardware and GPUs or it can run on 20 year old machines. If you are looking for specific functionality, there might already be a distro doing that – or if you can’t find one, you can make your own.