buzzword bingo, pedantic-ism, and the internet

just ranting

One of the identifying characteristics of “expert knowledge” is understanding how everything “fits” together. True “mastery” of any field with a substantial “body of knowledge” takes time and effort. Which means there are always more people that “know enough to be dangerous” than there are “real experts.”

Which is really just recognition of the human condition – i.e. if we had unlimited time and energy then there would be a lot more “true experts” in every field.

There is a diminishing return on “additional knowledge” after a certain point. e.g. Does anyone really need to understand how IBM designed Token Ring networks? Well, it might be useful for historic reasons and theoretical discussion – but I’ll go out on a limb and say that if you are studying “networking” today, becoming an “expert” on Token Ring is not worth the time.

There are also a lot of subjects where a slightly “incorrect” understanding is part of the learning process. e.g. Remember that high school chemistry class where you learned about electrons orbiting the nucleus at various discrete “energy levels” like tiny moons orbiting a planet? Then remember that college chemistry class where they told you that isn’t the way it actually is – but don’t worry about it, everyone learns it that way.

(random thought – just because we can’t be sure where something is, doesn’t mean it can be in two spots at the same time – just like that cat in a box – it isn’t half alive and half dead, it is one or the other, we just can’t know which one – and moving on …)

buzzwords vs jargon vs actual understanding

Dilbert’s “pointy-haired boss” is routinely held up for ridicule for “buzzword spouting” – which – in the most negative sense of the concept – implies that the person using “buzzwords” about a “subject” has a very minimal understanding of the “subject.”

Of course the “Dilbert principle” was/is that the competent people in a company are too valuable at their current job – and so cannot be promoted to “management”. Which implies that all managers are incompetent by default/design. It was a joke. It is funny. The reality is that “management” is a different skillset – but the joke is still funny 😉

The next step up is the folks that can use the industry “jargon” correctly. Which simply illustrates that “education” is a process. In “ordinary speech” we all recognize and understand more words than we actively use – the same concept applies to acquiring and using the specific vocabulary/”jargon” of a new field of study (whatever that field happens to be).

However, if you stay at the “jargon speaking” level you have not achieved the goal of “actual understanding” and “applied knowledge.” Yes, a lot of real research has gone into describing the different “levels”/stages in the process – which isn’t particularly useful. The concept that there ARE stages is much more important than the definition of specific points in the process.

pedants

No one wants a teacher/instructor that is a “pedant” – you know, that teacher that knows a LOT about a subject and thinks that it is their job to display just how much they know — imagine the high school teacher that insists on correcting EVERYONE’S grammar ALL THE TIME.

There is an old joke that claims that the answer to EVERY accounting question is “it depends.” I’m fond of applying that concept to any field where “expert knowledge” is possible – i.e. the answer to EVERY question is “it depends.”

(… oh, and pedants will talk endlessly about how much they know – but tend to have problems applying that knowledge in the real world. Being “pedantic” is boring/bad/counterproductive – and ’nuff said)

Of course if you are the expert being asked the question, what you get paid for is understanding the factors that it “depends on.” If you actually understand the factors AND can explain them to someone that isn’t an expert – then you are a rara avis.

In “I.T.” you usually have three choices – i.e. “fast”, “cheap” (as in “low cost”/inexpensive), and “good” (as in durable/well built/”is it heavy? then it is expensive”) – but you only get to choose two. e.g. “fast and cheap” isn’t going to be “good”, “fast and good” isn’t going to be “inexpensive.”

Is “cheap and good” possible? Well, in I.T. that probably implies using open source technologies and taking the time to train developers on the system – so an understanding of “total cost of ownership” probably shoots down a lot of “cheap and good” proposals. It might be the only option if the budget is “we have no budget” – i.e. the proposal might APPEAR “low cost” when the cost is just being pushed onto another area — but that isn’t important at the moment.

internet, aye?

There is an episode of the Simpsons where Homer starts a “dot com” company called Compu-Global-Hyper-Mega-Net – in classic Simpsons fashion they catch the cultural zeitgeist – I’ll have to re-watch the episode later – the point for mentioning it is that Homer obviously knew nothing about “technology” in general.

Homer’s “business plan” was something like saying “aye” after every word he didn’t understand – which made him appear like he knew what he was talking about (at the end of the episode Bill Gates “buys him out” even though he isn’t sure what the company does – 1998 was when Microsoft was in full “antitrust defense by means of raised middle finger” mode – so, yes, it was funny)

(random thought: Microsoft is facing the same sort of accusations with their “OneDrive” product as they did with “Internet Explorer” – there are some important differences – but my guess is THIS lawsuit gets settled out of court 😉 )

ANYWAY – anytime a new technology comes along, things need to settle down before you can really get past the “buzzword” phase. (“buzzword, aye?”) – so, while trying not to be pedantic, an overview of the weather on the internet in 2021 …

virtualization/cloud/fog/edge/IoT

Some (hopefully painless) definitions:

first – what is the “internet”? The Merriam-Webster definition is nice; slightly more accurate might be to say that the internet is the “Merriam-Webster def” plus “that speaks TCP/IP” – i.e. the underlying “language” of the internet is something called TCP/IP

This collection of worldwide TCP/IP connected networks is “the internet” – think of this network as “roads”

Now “the internet” has been around for a while – but it didn’t become easy to use until Tim Berners-Lee came up with the idea for a “world wide web” circa 1989.

While rapidly approaching pedantic levels – this means there is a difference between the “internet” and the “world wide web.” If the internet is the roads, then the web is traffic on those roads.
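To make the roads/traffic split concrete – a minimal sketch in Python, using only the standard library (example.com is a real test domain; everything else here is generic TCP/HTTP):

    # The socket is the "road" (TCP/IP); the HTTP request riding over it
    # is the "traffic" (the web).
    import socket

    # Open a TCP connection to a web server (port 80 is plain HTTP)
    with socket.create_connection(("example.com", 80)) as conn:
        # Send an HTTP request over that TCP connection
        conn.sendall(b"GET / HTTP/1.1\r\nHost: example.com\r\nConnection: close\r\n\r\n")
        response = b""
        while chunk := conn.recv(4096):
            response += chunk

    print(response.decode(errors="replace")[:200])  # first bit of the reply

Strip away the HTTP and the TCP connection is still “the internet” – the web is just (very popular) traffic on it.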

It is “true” to say that the underlying internet hasn’t really changed since the 1980’s – but maybe a little misleading.

Saying that we have the “same internet” today is a little like saying we have the same roads today as we did when the Ford Model-T first rolled out. A lot of $$ has gone into upgrading the “internet” infrastructure since the 1980’s – just like countless $$ have gone into building “infrastructure” for modern automobiles …

Picking up speed – Marc Andreessen gets credit for writing the first “modern” web browser (Mosaic) in the early 1990s. Which kinda makes “web browsers” the “vehicles” running on the “web.”

Britannica via Google tells me that the first use of the term “cyberspace” goes back to 1982 – for convenience we will refer to the “internet/www/browser” as “cyberspace” – I’m not a fan of the term, but it is convenient.

Now imagine that you had a wonderful idea for a service existing in “cyberspace” – back in the mid-1990’s maybe that was like Americans heading west in the mid 19th century. If you wanted to go west in 1850, there were people already there, but you would probably have to clear off land and build your own house, provide basic needs for yourself etc.

The cyberspace equivalent in 1995 was that you had to buy your own computers and connect them to the internet. This was the time when sites like “Yahoo!” and/or “eBay” kind of ruled cyberspace. You can probably find a lot of stories of teenagers starting websites – that attracted a lot of traffic, and then sold them off for big $$ without too much effort. The point being that there weren’t a lot of barriers/rules on the web – but you had to do it yourself.

e.g. A couple of nice young men (both named “Sean” – well, Shawn Fanning and Sean Parker) met in a thing called “IRC” and started a little file sharing project called Napster in 1999 – which is a great story, but it also illustrates that there is “other traffic” on the internet besides the “web” (i.e. Napster connected users with each other – it didn’t actually host the files being shared)

Napster did some cool stuff on the technical side – but had a business model that was functionally based on copyright infringement at some level (no, they were not evil masterminds – they were young men that liked music and computers).

ANYWAY – the point being that the Napster guys had to buy computers/configure the computers/and connect them to the internet …

Startup stories aside – the next big leap forward was a concept called “virtualization”. The short version: hardware processing power grew much faster than typical software workloads required – SO one physical machine would be extremely underutilized and inefficient – then “cool tech advancements” happened and we could “host” multiple “servers” on one physical machine.

Extending the “journey west” analogy – virtualization allowed for “multi-tenant occupation” – at this point the roads were safe to travel/dependable/you didn’t HAVE to do everything yourself. When you got to your destination you could stay at the local bed and breakfast while you looked for a permanent place to stay (or moved on).
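Not a hypervisor, obviously – but a minimal sketch in Python (the server names and load numbers are invented) of the utilization arithmetic that motivated virtualization:

    # Not a hypervisor - just the utilization math that motivated one.
    # Each "server" uses only a fraction of one physical box (numbers invented).
    workloads = {"mail": 0.10, "web": 0.15, "db": 0.25, "files": 0.05}

    # Before virtualization: one dedicated physical machine per server
    avg = sum(workloads.values()) / len(workloads)
    print(f"dedicated: {len(workloads)} machines, average utilization {avg:.0%}")

    # After: pack the same servers onto as few machines as fit (simple first-fit)
    machines = []  # summed load per physical box
    for load in workloads.values():
        for i, used in enumerate(machines):
            if used + load <= 0.8:  # leave some headroom on each box
                machines[i] += load
                break
        else:
            machines.append(load)
    print(f"virtualized: {len(machines)} machine(s) carrying loads {machines}")

Four lightly loaded boxes become one reasonably loaded box – that is the whole pitch.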

… The story so far: we went from slow connections between big time-sharing computers in the 1970’s, to fast connections between small personal computers in the 1990’s, to “you need a computer to get on the web” with a “web infrastructure” consisting mostly of virtualized machines in the early 2000s …

Google happened in there somewhere, which was a huge leap forward in real access to information on the web – another great story, just not important for my story today 😉

they were an online bookstore once …

Next stop 2006. Jeff Bezos and Amazon.com make for (probably) one of the greatest business success stories in recorded history. They had a LONG time where they emphasized “growth” over profit – e.g. when you see comic strips from the 1990’s about folks investing in “new economy” companies that had never earned a profit, Amazon is the success story.

(fwiw: of course there were also a LOT of companies that found out that the “new economy” still requires you to make a profit at some point – the dot.com boom and bust/”bubble” has been the subject of many books – so moving on …)

Of course in the mid-2000’s Amazon was still primarily a “retail shopping site.” The problem facing ANY “retail” establishment is matching customer service/sales demand with employee staffing/scheduling.

If you happen to be a “shopping website” then your way of dealing with “increased customer traffic” is to implement fault tolerance and load balancing techniques – the goal is “fast customer transactions” which equals “available computing resources” but could also mean “inefficient/expensive.”
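For illustration – a toy round-robin load balancer in Python (the server names and health flags are invented; real load balancers are a product category of their own):

    # A toy round-robin load balancer with a dash of fault tolerance:
    # spread requests across a pool, skipping servers marked unhealthy.
    from itertools import cycle

    servers = ["web-01", "web-02", "web-03"]  # hypothetical pool
    healthy = {"web-01": True, "web-02": False, "web-03": True}
    rotation = cycle(servers)

    def route(request_id: int) -> str:
        """Send the request to the next healthy server in the rotation."""
        for _ in range(len(servers)):
            server = next(rotation)
            if healthy[server]:
                return server
        raise RuntimeError("no healthy servers left")

    for i in range(5):
        print(f"request {i} -> {route(i)}")

More capacity means adding names to the pool, and a dead machine just gets skipped – “available computing resources,” at the cost of paying for the spares.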

Real world restaurant example: I’m told that the best estimate for how busy any restaurant will be on any given day is to look at how busy they were last year on the same date (adjusting for weekends and holidays). SO if a restaurant expects to be very busy on certain days – they can schedule more staff for those days. If they don’t expect to be busy, then they will schedule fewer employees.

Makes sense? Cool. The point is that Amazon had the same problem – they had the data on “expected customer volume” and went about building a system that would allow for automatic adjustment of computing resources based on variable workloads.
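Something like this back-of-the-envelope version – every number here is made up; the point is only that last year’s traffic drives this year’s capacity:

    # Back-of-the-envelope capacity planning: use last year's traffic on the
    # same date to decide how much capacity to schedule today (numbers invented).
    last_year_orders = {"2020-11-27": 90_000,  # Black Friday
                        "2020-11-28": 30_000,
                        "2020-11-29": 12_000}

    ORDERS_PER_SERVER_PER_DAY = 5_000  # assumed capacity of one server
    GROWTH = 1.15                      # assume ~15% year-over-year growth

    for date, orders in last_year_orders.items():
        expected = orders * GROWTH
        servers_needed = -(-expected // ORDERS_PER_SERVER_PER_DAY)  # ceiling division
        print(f"{date}: expect ~{expected:,.0f} orders -> schedule {servers_needed:.0f} servers")

Swap “orders” for “requests” and “servers” for “instances” and you have the restaurant trick applied to computing.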

I imagine the original goal might have been to save money by optimizing the workloads – but then someone pointed out that if they designed it correctly then they could “rent out” the service to other companies/individuals.

Back to our “westward expansion” analogy – maybe this would be the creation of the first “hotel chains.” The real story of “big hotel chains” probably follows along with the westward expansion of the railroad – i.e. the railroads needed depots, and those depots became natural “access” points for travelers – so towns grew up around the depots and inns/”hotels” developed as part of the town – all of which is speculation on my part – but you get the idea

The point being that in 2006 the “cloud” came into being. To be clear, the “cloud” isn’t just renting a virtual machine in someone else’s data center – the distinctive part of “cloud services” is the idea of “variable costs for variable workloads.”

Think of the electrical grid – you pay for what you use: use more electricity and your bill goes up, use less and it goes down.

The “cloud” is the same idea – if you need more resources because you are hosting an eSports tournament – then you can use more resources – build out/up – and then scale back down when the tournament is over.
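A toy version of that scale-up/scale-down loop – the thresholds, demand numbers, and hourly rate are all invented:

    # A toy threshold-based autoscaler: add machines when utilization is high,
    # drop them when it is low, and pay only for the hours actually used.
    HOURLY_RATE = 0.10    # $ per instance-hour (invented)
    SCALE_UP_AT = 0.80    # add capacity above 80% utilization
    SCALE_DOWN_AT = 0.30  # shed capacity below 30% utilization

    demand = [1.0, 2.5, 3.8, 2.0, 1.2, 0.4, 0.4]  # work, in "instances needed"
    instances, bill = 1, 0.0

    for hour, work in enumerate(demand):
        utilization = work / instances
        if utilization > SCALE_UP_AT:
            instances += 1  # the eSports tournament is on
        elif utilization < SCALE_DOWN_AT and instances > 1:
            instances -= 1  # tournament over, scale back down
        bill += instances * HOURLY_RATE
        print(f"hour {hour}: demand {work:.1f}, running {instances} instance(s)")

    print(f"total: ${bill:.2f} vs ${len(demand) * 4 * HOURLY_RATE:.2f} for 4 fixed instances")

The bill tracks the demand curve instead of the worst-case peak – that is the “variable costs for variable workloads” part.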

Or if you are researching ‘whatever’ and need to “process” a lot of data – before the cloud you might have had to invest in building your own “super computer” which would run for a couple weeks and then be looking for something to do. Now you can utilize one of the “public cloud” offerings and get your data ‘processed’ at a much lower cost (and probably faster – so again, you are getting “fast” and “inexpensive” but you are using “virtual”/on demand/cloud resources).

If you are interested in the space exploration business – NASA provides a nice example of exactly this kind of on-demand processing.

Fog/Edge/IoT?

The next problem becomes efficiently collecting data while also controlling cost. Remember with the “cloud” you pay for what you use. Saying that you have “options” for your public/private cloud infrastructure is an understatement.

However, we are back to the old “it depends” answer when we get into concepts like “Fog computing” and the “Internet of things”

What is the “Internet of Things”? Well, NIST has an opinion – if you read the definition and say “that is nice but a little vague” – well, what is the IoT? It depends on what you are trying to do.

The problem is that the “how” of “data collection” is obviously dependent on the data being collected. So the term becomes so broad that it is essentially meaningless.

Maybe “Fog” computing is doing fast and cheap processing of small amounts of data captured by IoT devices – as opposed to having the data go all the way out to “the cloud” – we are probably talking about “computing on a stick” type devices that plug into the LAN.
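As a sketch – the sensor name, the readings, and the send_to_cloud() stand-in below are all invented:

    # A toy "fog" node: summarize raw sensor readings locally and ship only
    # the aggregate upstream, instead of sending every data point to the cloud.
    from statistics import mean

    def send_to_cloud(payload: dict) -> None:
        # Stand-in for a real upload; here we just print what WOULD be sent.
        print("uploading:", payload)

    raw_readings = [21.1, 21.3, 21.2, 35.0, 21.4, 21.2]  # a minute of temperatures

    summary = {
        "sensor": "greenhouse-temp-01",  # hypothetical device name
        "avg": round(mean(raw_readings), 2),
        "max": max(raw_readings),
        "alert": any(r > 30 for r in raw_readings),  # flag the spike locally
    }
    send_to_cloud(summary)  # six readings in, one small payload out

The “fast and cheap processing” happens on the stick; the cloud only ever sees (and bills for) the summary.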

Meanwhile “Edge computing” is one for the salespeople – e.g. it is some combination of cloud/fog/IoT – at this point it reminds me of the “Corinthian Leather” Ricardo Montalban was talking about in car commercials way back when 😉

Ok, I’m done – I feel better

SO if you are teaching an online class of substantial length – an entire class only about IoT might be a little pointless. You can talk about various data-collecting sensors and chips/whatever – but simply “collecting” data isn’t the point, you need to DO SOMETHING with the data afterwards.

Of course I can always be wrong – my REAL point is that IoT is a buzzword that gets misused on a regular basis. If we are veering off into marketing and you want to call the class “IoT electric boogaloo” because it increases enrollment – and then talk about the entire cloud/fog/IoT framework – that would probably be worthwhile.

it only took 2400+ words to get that out of my system 😉

