… then THIS poem is directly about Arthur Henry Hallam — “who died suddenly of a cerebral hemorrhage in Vienna in 1833, aged 22.” (Thank you Google and probably wikipedia)
Published in 1850 – which is the same year Alfred Tennyson married Emily Sellwood. Arthur Hallam’s death was 17 years earlier – so the “love” being lost was his best friend.
English has a LOT of words – and also a LOT of meanings/connotations for single words. SO “love” gets used a lot in different contexts – allowing for multiple interpretations.
Any “close reading” requires a consideration of the society in which the author is writing – e.g. ancient Greek men talking about “love” is much different than Victorian England men talking about “love.”
An internet commentary speculated that Arthur Hallam and Alfred Tennyson were such close friends that if Mr Hallam hadn’t died, Alfred Tennyson may never have married – which is simply ridiculous.
Yes, they were very close – but any sense of “modern homoeroticism” is being inserted by modern readers. Arthur Hallam was engaged to Tennyson’s younger sister (obviously before his untimely death). Alfred Tennyson wouldn’t meet his future wife for a couple years after Hallam’s death – as mentioned earlier.
For MOST of human history the idea that it is possible to “love” someone in a “non sexual manner” has been a given. Obviously “love” and “sex” are NOT synonyms – so if Arthur Hallam had lived, Tennyson probably wouldn’t have written “In Memoriam” but he still would have married Emily Sellwood.
Now you can argue about which form of “love” is strongest if you like – but the point (here at least) is that it is possible to love a “best friend” one way and a “romantic partner” another way.
ANYWAY – what I really learned from reciting this poem is that I have no rhythm – or maybe my “rhythm” is from 1950/60 crooners (Crosby/Sinatra/Darin) and not Victorian England 😉
I got around to recording a version of Tennyson’s “Ulysses” ….
also learned how to add subtitles with DaVinci Resolve – which is not complicated but is time consuming. I’m sure there is a better way to create the subtitle file for youtube upload – e.g. there is “markup” in the subtitles which I didn’t intend.
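If the stray markup is the usual HTML-style tags (`<i>`, `<b>`, `<font>`) that some editors embed in exported SRT files, a few lines of Python can strip them before upload. This is a minimal sketch under that assumption – the sample cue text is hypothetical:

```python
import re

def strip_srt_markup(text: str) -> str:
    """Remove HTML-style tags (e.g. <i>, <b>, <font color="...">) from SRT text."""
    return re.sub(r"</?\w+[^>]*>", "", text)

# Hypothetical subtitle cue, as it might appear in an exported .srt file.
# Timestamp lines are untouched because they contain no '<...>' tags.
cue = "42\n00:01:02,000 --> 00:01:05,000\n<i>To strive, to seek,</i> to find"
print(strip_srt_markup(cue))
```

Running the whole `.srt` file through a function like this (and writing the result back out) should give youtube a clean subtitle file.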
The picture is where Tennyson lived from 1853 until his death in 1892. Officially it is “Farringford House, in the village of Freshwater Bay, Isle of Wight” — the house is now a “luxury hotel” – which means it isn’t really open as a “tourist destination” but if you have the resources you might be able to stay there …
Tennyson wrote “Ulysses” in his early 20’s after the death of a close friend. The “narrator” of the poem is supposed to be “old” Ulysses after his return to Ithaca – maybe 50-ish –
ANYWAY – the “Iliad” is about the end of the Trojan war – with the story centering around Achilles (who gets name-dropped near the end of Tennyson’s poem).
The “Odyssey” is a “sequel” to the “Iliad” telling the story of Odysseus’ (original Greek)/Ulysses (Latin/English translation) journey home from Troy – which ends up taking 10 years (after the 10 year siege of Troy)
classical “spoiler alert” — Achilles’ death and Odysseus coming up with the “Trojan horse” idea allowing the Greeks inside the walls of Troy — happen “off screen” between where the Iliad ends and the Odyssey begins
Odysseus returns home alone (he was King of Ithaca when he left to fight against Troy) – and promptly kills all of the “high ranking suitors” that were trying to coerce his wife (Penelope) into marriage (you know, because Odysseus must have died – since everyone else returned from Troy 10 years ago).
some versions of Homer’s “Odyssey” have Ulysses taking care of business and then the “gods” have to intervene to establish peace – some modern scholars argue for various “alternate endings” e.g. it may have ended with the reunion of Odysseus and Penelope —
Tennyson’s Ulysses picks up the story a couple years (“three suns” might be “three years”?) after the Odyssey. With his family taken care of and his son firmly established as the next ruler, Ulysses wants to go on one last adventure.
Now, if you want to nit-pick – the “mariners” mentioned were obviously NOT under Ulysses’ command on the return from Troy (remember, he loses everything – ship, crew, clothes – EVERYTHING – on the way home). However, that doesn’t mean that they hadn’t all fought together during the (10 year) Trojan war …
I’ll point out (again) that Tennyson was in his early 20’s when he wrote Ulysses. Tennyson wouldn’t meet Emily Sellwood until his mid 20’s (whom he would be married to until his death 42 years later) – so maybe faithful Penelope should get a better treatment than just a passing reference as an “aged wife” – but that is just me nit-picking 😉
Come my friends, let us reason together … (feel free to disagree, none of this is dogma)
There are a couple of “truisms” that APPEAR to conflict –
Truism 1:
The more things change the more they stay the same.
… and then …
Truism 2:
The only constant is change.
Truism 1 seems to imply that “change” isn’t possible while Truism 2 seems to imply that “change” is the only possibility.
There are multiple ways to reconcile these two statements – for TODAY I’m NOT referring to “differences in perspective.”
Life is like a dogsled team. If you aren’t the lead dog, the scenery never changes.
(Lewis Grizzard gets credit for ME hearing this, but he almost certainly didn’t say it first)
Consider that we are currently travelling through space and the earth is rotating at roughly 1,000 miles per hour (at the equator) – but sitting in front of my computer writing this, I don’t perceive that movement. Both the dogsled and my relative lack of perceived motion are examples of “perspective” …
Change
HOWEVER, “different perspectives” or points of view isn’t what I want to talk about today.
For today (just for fun) imagine that my two “change” truisms are referring to different types of change.
Truism 1 is “big picture change” – e.g. “human nature”/immutable laws of the universe.
Which means “yes, Virginia, there are absolutes.” Unless you can change the physical laws of the universe – it is not possible to go faster than the speed of light. Humanity has accumulated a large “knowledge base” but “humans” are NOT fundamentally different than they were 2,000 years ago. Better nutrition, better machines, more knowledge – but humanity isn’t much different.
Truism 2 can be called “fashion”/style/“what the kids are doing these days” – “technology improvements” fall squarely into this category. There is a classic PlayStation 3 commercial that illustrates the point.
Once upon a time:
mechanical pinball machines were “state of the art.”
The Atari 2600 was probably never “high tech” – but it was “affordable and ubiquitous” tech.
no one owned a “smartphone” before 1994 (the IBM Simon)
the “smartphone app era” didn’t start until Apple released the iPhone in 2007 (but credit for the first “App store” goes to someone else – maybe NTT DoCoMo?)
SO fashion trends come and go – but the fundamental human needs being served by those fashion trends remain unchanged.
What business are we in?
Hopefully, it is obvious to everyone that it is important for leaders/management to understand the “purpose” of their organization.
If someone is going to “lead” then they have to have a direction/destination. e.g. A tourist might hire a tour guide to “lead” them through interesting sites in a city. Wandering around aimlessly might be interesting for a while – but could also be dangerous – i.e. the average tourist wants some guidance/direction/leadership.
For that “guide”/leader to do their job they need knowledge of the city AND direction. If they have only one or the other (knowledge OR direction), then they will fail at their job.
The same idea applies to any “organization.” If there is no “why”/direction/purpose for the organization then it is dying/failing – regardless of P&L.
Consider the U.S. railroad system. At one point railroads were a huge part of the U.S. economy – the rail system opened up the western part of the continent and ended the “frontier.”
However, a savvy railroad executive would have understood that people didn’t love railroads – what people valued was “transportation.”
Just for fun – get out any map and look at the location of major cities. It doesn’t have to be a U.S. map.
The point I’m working toward is that throughout human history, large settlements/cities have centered around water. Either ports to the ocean or next to riverways. Why? Well, obviously humans need water to live but also “transportation.”
The problem with waterways is that going with the current is much easier than going against the current.
SO this problem was solved first by “steam powered boats” and then railroads. The early railroads followed established waterways connecting established cities. Then as railroad technology matured towns were established as “railway stations” to provide services for the railroad.
Even as the railroads became a major portion of the economy – it was NEVER about the “railroads” it was about “transportation”
fwiw: then the automobile industry happened – once again, people don’t care so much about “cars” – what they want/need is “transportation”
If you are thinking “what about ‘freight’ traffic” – well, this is another example of the tools matching the job. Long haul transportation of “heavy” items is still efficiently handled by railroads and barges – it is “passenger traffic” that moved on …
We could do the same sort of exercise with newspapers – i.e. I love reading the morning paper, but the need being satisfied is “information” NOT a desire to just “read a physical newspaper”
What does this have to do with I.T.?
Well, it has always been more accurate to say that “information technology” is about “processing information” NOT about the “devices.”
full disclosure: I’ve spent a lifetime in and around the “information technology” industry. FOR ME that started as working on “personal computers” then “computer networking”/LAN administration – and eventually I picked up an MBA with an “Information Management emphasis”.
Which means I’ve witnessed the “devices” getting smaller, faster, more affordable, as well as the “networked personal computer” becoming de rigueur. However, it has never been about “the box” i.e. most organizations aren’t “technology companies” but every organization utilizes “technology” as part of their day to day existence …
Big picture: The constant is that “good I.T. practices” are not about the technology.
Backups
When any I.T. professional says something like “good backups” solve/prevent a lot of problems it is essential to remember how a “good backup policy” functions.
Back in the day folks would talk about a “grandfather/father/son” strategy – if you want to refer to it as “grandmother/mother/daughter” the idea is the same. At least three distinct backups – maybe a “once a month” complete backup that might be stored in a secure facility off-site, a “once a week” complete backup, and then daily backups that might be “differential.”
It is important to remember that running these backups is only part of the process. The backups also need to be checked on a regular basis.
Checking the validity/integrity of backups is essential. The time to check your backups is NOT after you experience a failure/ransomware attack.
Of course how much time and effort an organization should put into their backup policy is directly related to the value of their data. e.g. How much data are you willing to lose?
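As a sketch of how a grandfather/father/son rotation might be expressed in code – the tier names and the exact policy (monthly full on the 1st, weekly full on Sundays, daily differentials otherwise) are my own illustration, not a standard:

```python
import datetime

def backup_tier(day: datetime.date) -> str:
    """Classify a date into a grandfather/father/son rotation tier.

    Assumed policy (sketch only): first of the month -> monthly full
    backup stored off-site; Sundays -> weekly full; everything else ->
    daily differential.
    """
    if day.day == 1:
        return "monthly-full (off-site)"
    if day.weekday() == 6:  # Python's weekday(): Monday=0 ... Sunday=6
        return "weekly-full"
    return "daily-differential"

print(backup_tier(datetime.date(2022, 5, 1)))  # first of the month
print(backup_tier(datetime.date(2022, 5, 8)))  # a Sunday
print(backup_tier(datetime.date(2022, 5, 9)))  # an ordinary weekday
```

The real scheduling would live in cron/backup software – the point of the sketch is just that the rotation is a simple, mechanical rule, while *verifying* the resulting backups is the part that takes human discipline.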
Just re-image it
Back in the days of the IBM PC/XT, if/when a hard drive failed it might take a day to get the system back up. After installing the new hard drive, formatting the drive and re-installing all of the software was a time intensive manual task.
Full “disk cloning” became an option around 1995. “Ghosting” a drive (i.e. “cloning”) belongs in the acronym Hall of Fame — I’m told it was supposed to stand for “general hardware-oriented system transfer.” The point being that now if a hard drive failed, you didn’t have to manually re-install everything.
Jump forward 10 years and Local Area Networks are everywhere – Computer manufacturers had been including ‘system restore disks’ for a long time AND software to clone and manage drives is readily available. The “system cloning” features get combined with “configuration management” and “remote support” and this is the beginning of the “modern I.T.” era.
Now it is possible to “re-image” a system as a response to software configuration issues (or malware). Disk imaging is not a replacement for a good backup policy – but it reduced “downtime” for hardware failures.
The more things change …
Go back to the 1980’s/90’s and you would find a lot of “dumb terminals” connecting to a “mainframe” type system (well, by the 1980s it was probably a “minicomputer” not a full blown “mainframe”).
A “dumb terminal” has minimal processing power – enough to accept keyboard input and provide monitor output, and connect to the local network.
Of course those “dumb terminals” could also be “secured” so there were good reasons for keeping them around for certain installations. e.g. I remember installing a $1,000 expansion card into new late 1980’s era personal computers to make it function like a “dumb terminal” – but that might have just been the Army …
Now in 2022 we have “chrome books” that are basically the modern version of “dumb terminals.” Again, the underlying need being serviced is “communication” and “information” …
All of which boils down to: the “basics” of information processing haven’t really changed. The ‘personal computer’ is a general purpose machine that can be configured for various industry specific purposes. Yes, the “era of the PC” has been over for 10+ years but the need for ‘personal computers’ and ‘local area networks’ will continue.
Merriam-Webster tells me that etymology is “the history of a linguistic form (such as a word)” (the official definition goes on a little longer – click on the link if interested …)
The last couple weeks I’ve run into a couple of “industry professionals” that are very skilled in a particular subset of “information technology/assurance/security/whatever” but obviously had no idea what “the cloud” consists of in 2022.
Interrupting and then giving an impromptu lecture on the history and meaning of “the cloud” would have been impolite and ineffective – so here we are 😉 .
Back in the day …
Way back in the 1980’s we had the “public switched telephone network” (PSTN) in the form of (monopoly) AT&T. You could “drop a dime” into a pay phone and make a local call. “Long distance” was substantially more – with the first minute even more expensive.
The justification for higher connection charges and then “per minute” charges was simply that the call was using resources in “another section” of the PSTN. How did calls get routed?
Back in 1980 if you talked to someone in the “telecommunications” industry they might have referred to a phone call going into “the cloud” and connecting on the other end.
(btw: you know all those old shows where they need “x” amount of time to “trace” a call – always a good dramatic device, but from a tech point of view the “phone company” knew where each end of the call was originating – you know, simply because that was how the system worked)
I’m guessing that by the breakup of AT&T in 1984 most of the “telecommunications cloud” had gone digital – but I was more concerned with football games in the 1980s than telecommunications – so I’m honestly not sure.
In the “completely anecdotal” category “long distance” had been the “next best thing to being there” (a famous telephone system commercial – check youtube if interested) since at least the mid-1970s – oh, and “letter writing”(probably) ended because of low cost long distance not because of “email”
Steps along the way …
Important technological steps along the way to the modern “cloud” could include:
the first “modem” in the early 1960s – that is a “modulator”/“demodulator” if you are keeping score. A device that could take a digital signal and convert it to an analog wave for transmission over the PSTN on one end of the conversation and another modem could reverse the process on the other end.
Ethernet was invented in the early 1970’s – which allowed computers on the same local network to talk to each other. You are probably using some flavor of Ethernet on your LAN
TCP/IP was “invented” in the 1970’s then became the language of ARPANET in the early 1980’s. One way to define the “Internet” is as a “large TCP/IP network” – ’nuff said
that web thing
Tim Berners-Lee gets credit for “inventing” the world wide web in 1989 while at CERN. Which made “the Internet” much easier to use – and suddenly everyone wanted a “web site.”
Of course the “personal computer” needed to exist before we could get large scale adoption of ANY “computer network” – but that is an entirely different story 😉
The very short version of the story is that personal computer sales greatly increased in the 1990s because folks wanted to use that new “interweb” thing.
A popular analogy for the Internet at the time was as the “information superhighway” – with a personal computer using a web browser being the “car” part of the analogy.
Virtualization
Google tells me that “virtualization technology” actually goes back to the old mainframe/time-sharing systems in the 1960’s when IBM created the first “hypervisor.”
A “hypervisor” is what allows the creation of “virtual machines.” If you think of a physical computer as an empty warehouse that can be divided into distinct sections as needed then a hypervisor is what we use to create distinct sections and assign resources to those sections.
The ins and outs of virtualization technology is beyond the scope of this article BUT it is safe to say that “commodity computer virtualization technology” was an industry changing event.
The VERY short explanation is that virtualization allows for more efficient use of resources which is good for the P&L/bottom line.
(fwiw: any technology that gets accepted on a large scale in a relatively short amount of time PROBABLY involves saving $$ – but that is more of a personal observation than an industry truism.)
Also important was the development of “remote desktop” software – which would have been called “terminal access” before computers had “desktops.”
e.g. Wikipedia tells me that Microsoft’s “Remote Desktop Protocol” was introduced in Windows NT 4.0 – which ZDNet tells me was released in 1996 (fwiw: some of my expired certs involved Windows NT).
“Remote access” increased the number of computers a single person could support which qualifies as another “industry changer.” As a rule of thumb if you had more than 20 computers in your early 1990s company – you PROBABLY had enough computer problems to justify hiring an onsite tech.
With remote access tools not only could a single tech support more computers – they could support more locations. Sure in the 1990’s you probably still had to “dial in” since “always on high speed internet access” didn’t really become widely available until the 2000s – but as always YMMV.
dot-com boom/bust/bubble
There was a “new economy” gold rush of sorts in the 1990s. Just like gold and silver exploration fueled a measurable amount of “westward migration” into what was at the time the “western frontier” of the United States – a measurable amount of folks got caught up in “dot-com” hysteria and “the web” became part of modern society along the way.
I remember a lot of talk about how the “new economy” was going to drive out traditional “brick and mortar” business. WELL, “the web” certainly goes beyond “industry changing” – but in the 1990s faith in an instant transformation of the “old economy” into a web dominated “new economy” reached zeitgeist proportions …
In 2022 some major metropolitan areas trace their start to the gold/silver rushes in the last half of the 19th century (San Francisco and Denver come to mind). There are also a LOT of abandoned “ghost towns.”
In the “big economic picture” the people running saloons/hotels/general stores in “gold rush areas” had a decent chance of outliving the “gold rush” assuming that there was a reason for the settlement to be there other than “gold mining”
The “dot-com rush” equivalent was that a large number of investors were convinced that a company could stay a “going concern” even if it didn’t make a profit. However – just like the people selling supplies to gold prospectors had a good chance of surviving the gold rush – the folks selling tools to create a “web presence” did alright – i.e. in 2022 the survivors of the “dot-com bubble” are doing very well (e.g. Amazon, Google)
Web Hosting
In the “early days of the web” establishing a “web presence” took (relatively) arcane skills. The joke was that if you could spell HTML then you could get a job as a “web designer” – ok, maybe it isn’t a “funny” joke – but you get the idea.
An in depth discussion of web development history isn’t required – pointing out that web 1.0 was the time of “static web pages” is enough.
If you had a decent internet service provider they might have given you space on their servers for a “personal web page.” If you were a “local” business you might have been told by the “experts” to not worry about a web site – since the “web” would only be useful for companies with a widely dispersed customer base.
That wasn’t bad advice at the time – but the technology needed to mature. The “smart phone” (Apple 2007) motivated the “mobile first” development strategy – if you can access the web through your phone, then it increases the value of “localized up to date web information.”
“Web hosting” was another of those things that was going to be “free forever” (e.g. one of the tales of “dot-com bubble” woes was “GeoCities”). Which probably slowed down “web service provider” growth – but that is very much me guessing.
ANYWAY – in web 1.0 (when the average user was connecting by dial up) the stress put on web servers was minimal – so simply paying to rent space on “someone else’s computer” was a viable option.
The next step up from “web hosting” might have been to rent a “virtual server” or “co-locate” your own server – both of which required more (relatively) arcane skills.
THE CLOUD
Some milestones worth pointing out:
1998 – VMWare “Workstation” released (virtualization on the desktop)
“Google search” was another “industry changing” event that happened in 1998 – ’nuff said
2001 VMWare ESX (server virtualization)
2005 Intel released the first cpus with “Intel Virtualization Technology” (VT-x)
2005 Facebook – noteworthy, but not “industry changing”
2006 Amazon Web Services (AWS)
Officially Amazon described AWS as providing “IT infrastructure services to businesses in the form of web services” – i.e. “the cloud”
NIST tells us that –
Cloud computing is a model for enabling ubiquitous, convenient, on-demand network access to a shared pool of configurable computing resources (e.g., networks, servers, storage, applications, and services) that can be rapidly provisioned and released with minimal management effort or service provider interaction. This cloud model is composed of five essential characteristics, three service models, and four deployment models.
If we do a close reading of the NIST definition – the “on-demand” and “configurable” portions are what differentiates “the cloud” from “using other folks computers/data center.”
I like the “computing as a utility” concept. What does that mean? Glad you asked – e.g. Look on a Monopoly board and you will see the “utility companies” listed as “Water Works” and “Electric Company.”
i.e. “water” and “electric” are typically considered public utilities. If you buy a home you will (probably) get the water and electric changed into your name for billing purposes – and then you will pay for the amount of water and electric you use.
BUT you don’t have to use the “city water system” or local electric grid – you could choose to “live off the grid.” If you live in a rural area you might have a well for your water usage – or you might choose to install solar panels and/or a generator for your electric needs.
If you help your neighbors in an emergency by allowing them access to your well – or maybe connecting your generator to their house – you are a very nice neighbor BUT you aren’t a “utility company” – i.e. your well/generator won’t have the capacity that the full blown “municipal water system” or electric company can provide.
In the same way, if you have a small datacenter and start providing “internet services” to customers – unless you are big enough to be “ubiquitous, convenient, and on-demand” you aren’t a “cloud provider.”
Also note the “as a service” aspect of the cloud – i.e. when you sign up you will agree to pay for what you use, but you aren’t automatically making a commitment for any minimal amount of usage.
As opposed to “web hosting” or “renting a server” where you will probably agree to a monthly fee and a minimal term of service.
Billing options and service capabilities are obviously vendor specific. As a rule of thumb – unless you have “variable usage”, using “the cloud” PROBABLY won’t save you money over “web hosting”/“server rental.”
The beauty of the cloud is that users can configure “cloud services” to automatically scale up for an increase in traffic and then automatically scale down when traffic decreases.
e.g. imagine a web site that has very high traffic during “business hours” but then minimal traffic the other 16 hours of the day. A properly configured “cloud service” would scale up (costing more $$) during the day and then scale down (costing fewer $$) at night.
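A back-of-envelope sketch of that day/night example – the hourly rates here are entirely made up, purely for illustration:

```python
# Compare provisioning for peak capacity 24/7 vs autoscaling.
# All rates are hypothetical, for illustration only.
PEAK_RATE = 2.00   # $/hour while scaled up for business-hours traffic
IDLE_RATE = 0.25   # $/hour while scaled down overnight

peak_hours, idle_hours = 8, 16

flat_cost = PEAK_RATE * 24                                    # peak capacity all day
scaled_cost = PEAK_RATE * peak_hours + IDLE_RATE * idle_hours  # scale with demand

print(f"flat provisioning: ${flat_cost:.2f}/day")
print(f"autoscaled:        ${scaled_cost:.2f}/day")
```

With these made-up numbers autoscaling costs less than half of flat provisioning – which is the whole pitch for “pay for what you use.”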
Yes, billing options become a distinguishing element of the “cloud” – which further muddies the water.
Worth pointing out is that if you are a “big internet company” you might get to the point where it is in your company’s best interest to build your own datacenters.
This is just the classic “rent” vs “buy” scenario – i.e. if you are paying more in “rent” than it would cost you to “buy” then MAYBE “buying your own” becomes an option (of course “buying your own” also means “maintaining” and “upgrading” your own). This tends to work better in real estate where “equity”/property values tend to increase.
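The rent-vs-buy break-even can be sketched with simple arithmetic – every number below is hypothetical, and the model deliberately ignores financing, depreciation, and hardware refresh cycles:

```python
def breakeven_months(monthly_cloud_bill: float,
                     datacenter_capex: float,
                     datacenter_monthly_opex: float) -> float:
    """Months until owning a datacenter beats renting cloud capacity.

    Toy model with made-up inputs: build cost is paid up front, then
    each month you save (cloud bill - datacenter running cost).
    """
    monthly_savings = monthly_cloud_bill - datacenter_monthly_opex
    if monthly_savings <= 0:
        return float("inf")  # renting never costs more; keep renting
    return datacenter_capex / monthly_savings

# Hypothetical: $500k/month cloud bill vs a $12M build-out that
# costs $200k/month to operate -> break-even in 40 months.
print(breakeven_months(500_000, 12_000_000, 200_000))
```

Which is why “build your own” only starts to make sense at very large, very steady scale – the up-front cost has to be amortized against a bill big enough to beat it.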
Any new “internet service” that strives to be “globally used” will (probably) start out using “the cloud” – and then if/when they are wildly successful, start building their own datacenters while decreasing their usage of the public cloud.
Final Thoughts
It Ain’t What You Don’t Know That Gets You Into Trouble. It’s What You Know for Sure That Just Ain’t So
Artemus Ward
As a final thought – “cloud service” usage was $332.3 BILLION in 2021 up from $270 billion in 2020 (according to Gartner).
There isn’t anything magical about “the cloud” – but it is a little more complex than just “using other people’s computers.”
The problem with “language” in general is that there are always regional and industry differences. e.g. “Salesforce” and “SAP” fall under the “cloud computing” umbrella – but Salesforce uses AWS to provide their “Software as a Service” product and SAP uses Microsoft Azure.
I just spent 2,000 words trying to explain the history and meaning of “the cloud” – umm, maybe a cloud by any other name would still be vendor specific
HOWEVER I would be VERY careful with choosing a cloud provider that isn’t offered by a “big tech company” (i.e. Microsoft, Amazon, Google, IBM, Oracle). “Putting all of your eggs in one basket” is always a risky proposition (especially if you aren’t sure that the basket is good in the first place) — as always caveat emptor …
Just some random thoughts – Starting off with a famous quote attributed to Albert Einstein –
If you can’t explain it simply, you don’t understand it well enough
Albert Einstein
Leadership
The Einstein quote came to mind for a “2 drink story” reason that I will not relate here.
I’ve been a “student of leadership” going back to my days playing “high school sports.” Athletics can become a “leadership classroom” – with “wins/losses” providing feedback – and obvious “leadership” lessons involved in “team performance”.
If a team is going to be “successful” then the “coach” needs to tailor their “coaching” to the level of the athletes. e.g. Coaching a group of 10 year old athletes will obviously be different than coaching a group of 20 year old athletes.
SO in “leadership education” they might call this “situational leadership.” In coaching this is the old “you need to master the basic skills first” concept.
You need to master crawling before you learn to walk. You need to master walking before you can run. Then riding a bike might take care of itself when/if you are ready – assuming you have “learned how to learn.”
Teaching
The task facing the coach/teacher/leader becomes helping the athletes/students/employees “master” the required skills.
The thought on my mind is that how much the coach “knows” isn’t as important as how much they can help the athlete learn.
“Playing” a sport requires different skills than “coaching” a sport. Just because someone was a great athlete does NOT mean they can teach those skills to others. Just because someone wasn’t a great athlete doesn’t mean they won’t be a great coach.
(… examples abound of both “great athletes” becoming great coaches, “great athletes” becoming “meh” coaches, as well as “average athletes” becoming great coaches – but that isn’t important at the moment)
Of course having great athletes can make an average coach look like a great coach – but that also isn’t my point today.
I’ve watched a lot of “video lectures” given by highly qualified instructors. Occasionally I run into an instructor/presenter where the only thing I get from their presentation is that THEY appear to know a lot – i.e. they didn’t “teach me” anything.
e.g. one instructor seemed to be reading from the manual – I’m sure in their head they were “transferring information” but the lessons were unwatchable. IF I want to read the manual – I can find the manual and read it. What I want from an instructor is examples illustrating the material NOT just a recitation of the facts.
Again, a presenter/teacher bombarding the audience with the breadth and depth of their knowledge might be satisfying to the presenter’s ego – but not much else.
I’m a fan of “storytelling” as an instructional tool – but that means “tell relevant stories that illustrate a point” NOT “vent to a captive audience.”
Education
Tailoring your message to the audience is probably “presenting 101.” It could also be “coaching 101” and “teaching 101.”
“Education” then becomes the end product of coaching/teaching/leadership and is ALWAYS an individualized process.
The worst coach/teacher might still have the occasional championship athlete/high achieving student. My experience has been that the “bad” coach/teacher tends to blame the athletes/students when things go wrong but takes all the credit if something goes right.
MEANWHILE – the “good” coaches/teachers are tailoring their instruction to the level of their athletes/students and recognize that, while getting an education is always an “individual process”, the “process of education” is a “group effort.”
Even if you go to the library and get a book on a subject – someone had to write the book for you to learn the material.
Learning to Teach
Those “bad” coaches/teachers PROBABLY don’t really understand their sport/subject – which is part of what Mr Einstein’s quote points out.
I have had “not so good” teachers tell me a subject is “easy” and that the class needs to memorize the textbook. Yes, the subject might be “easy” to some students – but not ALL of the students – and rote memorization as a means of mass instruction isn’t a particularly effective use of time.
I have also had excellent teachers tell me THEY learn something each time they teach a class. They don’t try to impress with their “vast knowledge.” They will try to teach the students what is “important” (some memorization might be required but not as the major form of instruction). These instructors tend to be realistic about how much can be “taught” and emphasize the individual effort required to “learn” anything.
“You will get out of it what you put into it” is imprinted in my mind for some reason. This has morphed into my personal philosophy that “grades in a class tend to be an indication of effort and interest NOT intelligence.” Not everyone can get an “A” in every class, but if they put forth the effort everyone can “pass” the class.
ANYWAY – If someone teaches for 5 years and then looks back at their first year and DOESN’T see improvement in both teaching skills and mastery of the subject – well, they have 1 year of experience 5 times NOT “5 years” experience.
I use a free service called https://letsencrypt.org/ for my “personal” sites’ SSL/TLS certs. The automated cert install usually sets up automatic renewal at 3-month intervals. I’m honestly not sure of the details – mostly because it has “just worked” for my minimal usage.
I’ve been “testing” the setup of streaming from 1 server to another server – and again, it has MOSTLY been error free. There is 20ish hours of music on a playlist – and the playlist loops – it seems to work fine 95% of the time with no effort on my part.
BUT the music stops playing on a regular basis – my suspicion is that the “sending” machine freezes up for some reason. Rebooting and restarting the stream has fixed the issue.
Since I’m using a “non enterprise” solution – my guess is that there is a small bug/memory leak in the code that isn’t a factor 99.99% of the time. Since this application has always been temporary – I can’t complain too much 😉
HOWEVER – I ran into a problem with the “receiving” machine giving me a “certificate expired” type error and not playing the stream through the web browser as intended.
Yes, the cert did seem to have expired – but then when I manually renewed it the problem persisted. The stream would play when I accessed it directly (i.e. the “sending” and “receiving” parts seem to be working fine). BUT then accessing it through the web page results in the cert expired error.
My next guess was that the cert was being cached somewhere along the line – cleared browser/machine/network caches and tried multiple machines. No luck.
SO then I just combined the “send/receive” on the same machine and it works as expected. Of course this meant creating a new certificate for the new site – so the original problem is, well, still a problem but not worth the time to troubleshoot …
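For anyone chasing a similar “cert expired but I just renewed it” mystery – one useful sanity check is to look at which certificate the server is *actually presenting*, independent of the browser. A minimal Python sketch (the hostname is a placeholder, not my actual streaming server):

```python
# Check which certificate a server is actually presenting - useful when
# a renewed cert doesn't seem to "take" and you suspect caching somewhere.
# The hostname in the commented example is a placeholder.
import socket
import ssl
from datetime import datetime, timezone

def parse_not_after(not_after: str) -> datetime:
    """Parse a cert's 'notAfter' field, e.g. 'Jun  1 12:00:00 2025 GMT'."""
    naive = datetime.strptime(not_after, "%b %d %H:%M:%S %Y %Z")
    return naive.replace(tzinfo=timezone.utc)

def cert_expiry(host: str, port: int = 443) -> datetime:
    """Connect over TLS and return the expiry of the presented certificate."""
    ctx = ssl.create_default_context()
    with socket.create_connection((host, port), timeout=10) as sock:
        with ctx.wrap_socket(sock, server_hostname=host) as tls:
            cert = tls.getpeercert()
    return parse_not_after(cert["notAfter"])

# Example (requires network access):
# print(cert_expiry("example.com"))
```

If this reports the NEW expiry date while the browser still complains, the stale cert is being served from a cache or proxy in between; if it reports the OLD date, the renewed cert never made it into the server config.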
Another of my quixotic projects is making a video production of reading Ralph Waldo Emerson’s essays – which I’m sure would appeal to maybe a handful of folks worldwide.
BUT considering my track record in “estimating market potential” something that I think has zero potential would probably perform better than the ideas that I think have a HUGE potential.
Anyway – good ol’ Mr Emerson pointed out that –
“The man who knows how will always have a job. The man who knows why will always be his boss.”
-Ralph Waldo Emerson
Of course a “good boss” will make sure that the folks doing the “how” have an understanding of the “why” – but that isn’t the point.
On my mind this morning is the difference between knowing “how” to do something, actually doing something, AND understanding “why” something is done.
story time
In “modern America” my guess is that the “average American” has no idea what actually happens when they flip on a light switch.
Of course everyone understands that you flip a switch, turn a knob, push a button – and “electricity” causes the light to come on. I’m NOT saying that the “average” American is uneducated or unintelligent – I’m just saying that the average American (probably) has no idea “why” the light comes on.
Ok, this isn’t meant to be an insult or a negative comment about “education in America” – just pointing out that the “modern electrical grid” involves a lot of parts. Truly understanding the “why” of electricity takes some effort.
experience vs “time in service”
Again, in “modern America” being an “electrician” requires specific training – e.g. here in Ohio, Google tells me that schools offer “electrician training” programs ranging from 9 months to 2 years.
After graduation our aspiring electrician probably has a good understanding of “how” to “work with the electrical grid” – and can perform work “up to code.”
While our young electrician understands the “how” of the job, they probably don’t understand the “why” of EVERYTHING in the codes.
Again, I’m not trying to insult electricians – just pointing out that some things won’t become “obvious” until you have some experience doing the job.
SO what is “obvious” to that wise old electrician that has been doing the job for 20 years PROBABLY isn’t going to be (as) “obvious” to the electrician with 1 year of experience.
Of course it is POSSIBLE (and probable) that SOME long time professionals will never progress in the understanding of their profession past the bare minimum. (A small percentage will (probably) be incompetent, but true incompetence isn’t the issue here.)
Yes, this falls into the “insult” category – e.g. it is possible to be in a position for 5 years, and not learn anything. SO that would be “5 years “time in service” but functionally “1 year of experience 5 times” NOT “5 years of experience.”
Just showing up for work everyday doesn’t mean you are going to automatically improve.
teaching and understanding
There are a couple verses in the “Old Testament” that come to mind (emphasis obviously mine):
9 Only take heed to thyself, and keep thy soul diligently, lest thou forget the things which thine eyes have seen, and lest they depart from thy heart all the days of thy life: but teach them thy sons, and thy sons’ sons;10 specially the day that thou stoodest before the Lord thy God in Horeb, when the Lord said unto me, Gather me the people together, and I will make them hear my words, that they may learn to fear me all the days that they shall live upon the earth, and that they may teach their children.
Deuteronomy 4:9-10 (AKJV)
Notice that the command for parents to teach their children is meant to benefit BOTH the parents AND the children. The “secular” thought is that “to teach is to learn twice.”
Of course there are always “effective teachers” and “not as effective teachers.” Albert Einstein liked to point out that:
“If you can’t explain it simply, you don’t understand it well enough.”
Albert Einstein
SO to teach requires understanding and the act of teaching can/should increase understanding – both for the student AND the teacher.
I’ve been “teaching” in a formal “classroom” sense for almost 10 years. Looking back at those early classes – I definitely learned more than (some) of my students.
I had been working “in the field” for 15 years, had numerous industry certifications, and a Masters degree – so MY learning was mostly about the “why” – but there were still areas within the field that I didn’t REALLY understand.
Ok, the students didn’t know enough to notice that I didn’t know enough – but you get the idea.
This is the old “no one knows everything” sort of idea – at the time I simply didn’t know that I didn’t know 😉
ANYWAY – just kinda random thoughts – one more quote that I would have attributed to Mark Twain (he said something similar – but probably not this exact quote) – Harry Truman liked the quote, so did John Wooden – and Earl Weaver used it as the title of his autobiography:
It’s What You Learn After You Know It All That Counts
Yes, there is a difference between “statistics” and “analytics” – maybe not a BIG difference but there is a difference.
“Statistics” is about collecting and interpreting “masses of numerical data.” “Analytics” is about logical analysis – probably using “statistics”.
Yeah, kinda slim difference – the point being that there is a difference between “having the numbers” and “correctly interpreting the numbers.”
“Data analysis” becomes an exercise in asking questions and testing answers – which might have been how a high level “statistician” described their job 100 years ago – i.e. I’m not dogmatic about the difference between “statistics” and “analytics”, just establishing that there are connotations involved.
Analytics and Sports
Analytics as a distinct field has gained popularity in recent years. In broad strokes the fields of “data science”, “artificial intelligence”, and “machine learning” all mean “analytics.”
For a while the term “data mining” was popular – back when the tools to manage “large data sets” first became available.
I don’t want to disparage the terms/job titles – the problem is that “having more data” and having “analysis to support decisions” does not automatically mean “better leadership.”
It simply isn’t possible to ever have “all of the information” but it is very easy to convince “management types” that they have “data” supporting their pet belief.
e.g. I always like to point out that there are “trends” in baby name popularity (example site here) – but making any sort of conclusion from that data is probably specious.
What does this have to do with “sports” – well, “analytics” and sports “management” have developed side by side.
Baseball’s word for the concept of “baseball specific data analysis” dates back to 1982 – about the time that “personal computers” were starting to become affordable and usable by “normal” folks.
My round about point today is that most “analytics” fall into the “descriptive” category by design/definition.
e.g. if you are managing a ‘sportball’ team and have the opportunity to select players from a group of prospects – how do you decide which players to pick?
Well, in 2022 the team is probably going to have a lot of ‘sportball’ statistics for each player – but do those statistics automatically mean a player is a “good pick” or a “bad pick”? Obviously not – but that is a different subject.
The team decision process will (probably) include testing players physical abilities and watching the players work out – but neither of those 100% equates to “playing the game against other skilled opponents.”
That player with great statistics might have been playing against a lower level of competition. That player that has average “physical ability test scores” might be a future Hall of Famer because of “hidden attributes.”
i.e. you can measure how fast an athlete can run, and how high they can jump – but you can’t measure how much they enjoy playing the game.
MEANWHILE back at the ranch
Now imagine that you are an athlete and you want to improve your ‘sportball’ performance. How do you decide what to work on?
Well, the answer to that question is obviously going to be very sport AND athlete specific.
However, your ‘sportball’ statistics are almost certainly not going to help you make decisions on how/what you should be trying to develop – i.e. those statistics will be a reflection of how well you have prepared, but do not directly tell you how to prepare.
Bowling
Full disclosure – I am NOT a competitive bowler. I have participated/coached other sports – but I’m a “casual bowler.” i.e. if I have misinterpreted the sport, please let me know 😉
Now imagine that someone has decided that they want to improve their “bowling average” – how should they approach the problem?
Step 1 would be to establish a baseline from which improvements can be measured.
Step 2 would be to determine what you need to “work on” to improve your scores from Step 1.
Step 3 would be to establish a schedule of “practices” to work on the items from Step 2.
Step 4 would be to re-test the items from Step 1 and adjust steps 2 and 3 accordingly.
Sure, I just described the entire field of “management” and/or “coaching” – but how well a manager/coach helps athletes through the above (generic) process will be directly reflected in wins/losses in competition.
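The four steps above can be sketched as a simple loop. Everything here – the function names, the data shapes – is illustrative pseudocode on my part, not any real coaching system:

```python
# Illustrative sketch of the baseline -> focus -> practice -> re-test loop.
# All names and numbers are hypothetical examples.
from statistics import mean

def baseline(scores):
    """Step 1: establish a baseline from recent game scores."""
    return mean(scores)

def pick_focus(stats):
    """Step 2: pick the weakest area to work on (lowest stat in the dict)."""
    return min(stats, key=stats.get)

def practice_plan(focus, sessions=4):
    """Step 3: schedule practice sessions aimed at the chosen focus."""
    return [f"session {i + 1}: drill {focus}" for i in range(sessions)]

def improved(old_baseline, new_scores):
    """Step 4: re-test - did the average actually move?"""
    return mean(new_scores) > old_baseline
```

The point of the loop structure is Step 4 feeding back into Steps 2 and 3 – if the re-test shows no improvement, you pick a different focus or a different practice plan, not more of the same.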
Remember that the old axiom that “practice makes perfect” is a little misleading:
Practice does not make perfect. Only perfect practice makes perfect.
-Vince Lombardi
Back to bowling – bowling every week might be fun, but won’t automatically mean “better performance.”
Keeping track of your game scores might be interesting, but also won’t automatically mean “better scores.”
I’m told that the three factors for the “amateur bowler” to work on are:
first ball pin average
single pin spare %
multipin spare %
In a “normal” game there are 10 pins possible each frame. The bowler gets two balls to knock down all 10.
If your “first ball pin average” is 10, then you are a perfect bowler – and knock all the pins down every frame with your first ball.
To be honest I haven’t seen any real data on “first ball pin averages” – it probably exists in much the same manner that “modern baseball statistics” can be derived from old “box scores” – but I’m told that a first ball pin average around 9 is the goal.
If you consistently average 9 pins on your first throw – then you have a consistent “strike” delivery.
Which then means that IF you consistently knock down 9 pins – you will have to pickup “single pin spares” on a regular basis.
Then “multipin spares” are going to be an exercise in statistics/time and fate. Obviously if you average 9 pins on your first ball, the number of “multipin spare” opportunities should be relatively small.
SO those are the data points being tracked with my “bowling analytics” application.
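A minimal sketch of computing those three numbers from per-frame records. The frame representation here (first-ball pin count plus a “spare converted” flag) is my assumption for illustration, not the application’s actual data model:

```python
# Compute the three tracked stats from per-frame records.
# Each frame is (first_ball_pins, spare_converted); the flag is ignored
# on a strike (10 pins) since there is no spare attempt. This layout is
# an illustrative assumption, not any real scoring format.

def bowling_stats(frames):
    first_balls = [first for first, _ in frames]
    first_ball_avg = sum(first_balls) / len(frames)

    # A spare attempt happens whenever the first ball leaves pins standing:
    # 9 pins down -> single pin spare attempt; fewer -> multipin attempt.
    single_attempts = [(f, made) for f, made in frames if f == 9]
    multi_attempts = [(f, made) for f, made in frames if f < 9]

    def pct(attempts):
        if not attempts:
            return None  # no attempts -> percentage is undefined
        return sum(1 for _, made in attempts if made) / len(attempts)

    return {
        "first ball pin average": first_ball_avg,
        "single pin spare %": pct(single_attempts),
        "multipin spare %": pct(multi_attempts),
    }
```

For example, a four-frame stretch of strike / missed single-pin / converted single-pin / converted multipin spare gives a first ball pin average of 8.75, a single pin spare % of 0.5, and a multipin spare % of 1.0 – which is exactly the kind of breakdown a raw game score hides.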
Merriam-Webster tells me that the word “meme” dates all the way back to 1976 (coined by Richard Dawkins in The Selfish Gene) with the meaning of “unit of cultural transmission.”
Apparently the “-eme” suffix “indicates a distinctive unit of language structure.” Dr Dawkins combined the Greek root “mim-” (meaning “mime” or “mimic”) with “-eme” to create “meme.”
Then this interweb thing hit full stride and minimally edited images with captions with (maybe) humorous intent became a “meme.”
Humor is always subjective, and with brevity (still) being the soul of wit – most “memes” work on a form of “revelatory” humor. The humor comes in “discovering” the connection between the original image, and then the edited/altered image.
We aren’t dealing in high level moral reasoning or intense logic – just “Picture A”, “Picture B”, brain makes connection – grin/groan and then move on. By definition “memes” are short/trivial.
Which makes commenting on “memes” even more trivial. Recently on “social media platform” I made an off the cuff comment about a meme that amounted to (in my head) “A=B”, “B=C”, “A+B+C<D.”
Now, I readily admit that my comment was not logical – but from a certain perspective “true” – if not directly provable from the supplied evidence. It was a trivial response meant to be humorous – not a “grand unifying theory of everything.”
… and of course I (apparently) offended someone to the point that they commented on my comment – accusing me of “not understanding the meme.”
Notice the “apparently” qualifier – it is POSSIBLE that they were trying to be funny. My first reaction was to explain my comment (because obviously no one would intentionally be mean or rude on the interweb – i.e. the commenter on my comment must have simply misunderstood my comment 😉 ) – BUT that would have been a fourth-level of trivialness …
HOWEVER the incident got me thinking …
geeks and nerds
Another doctor gets credit for creating the term “nerd” (Dr Seuss – If I Ran the Zoo, 1950). It didn’t take on the modern meaning implying “enthusiasm or expertise” about a subject until later years. The term “dork” came about in the 1960’s (… probably as a variation on “dick” – and quickly moving on …) – meaning “odd, socially awkward, unstylish person.”
“Geek” as a carnival performer biting the heads of chickens/snakes, eating weird things, and generally “grossing out” the audience goes back to 1912. Combined with another term it becomes synonymous with “nerd” – e.g. “computer geek” and “computer nerd.”
random thought: if you remember Harry Anderson – or have seen re-runs of Night Court – his standup act consisted of “magic” and some “carnival geek” bits, he didn’t bite the heads of any live animals though (which would have gotten him in trouble – even in the 1980’s). Of course youtube has some video that I won’t link to – search for “Harry Anderson geek” if curious …
I think a nerd is a person who uses the telephone to talk to other people about telephones. And a computer nerd therefore is somebody who uses a computer in order to use a computer.
Douglas Adams
ANYWAY – to extend the Douglas Adams quote – a “nerd” might think that arguing about a meme is a good use of their time …
Which brings up the difference between “geeks” and “nerds” – I have (occasionally) used Sheldon and Leonard from the Big Bang Theory to illustrate the difference – with Sheldon being the “geek” and Leonard the “nerd”. Both of their lives revolve around technology/intellectual pursuits but they “feel” differently about that fact – i.e. Sheldon embraces the concept and is happily eccentric (“geek”) while Leonard feels self-conscious and awkward (“nerd”).
SO when I call myself a “computer geek” it is meant as a positive descriptive statement 😉 – yes, I am aware that the terms aren’t AS negative as they once were, I’m just pointing out that my life has ended up revolving around “computers” (using them/repairing them) and it doesn’t bother me …
Though I suppose “not being able to use a computer” in 2022 is in the same category that “not able to ride a horse” or “can’t shoot a rifle” would have been a couple hundred years ago … in a time when being “adorkable” is an accepted concept – calling yourself a “geek” or “nerd” isn’t as bad as it used to be — umm, in any case when I say “geek” I’ve never bitten the head off anything (alive or dead), I did perfect biting into and tearing off a part of an aluminum can back in high school – but that is another story …
reboots
While I did NOT comment on the comment about my comment – I did use the criticism of my comment as an opportunity for self-examination.
Background material: The meme in question revolved around “movie franchise reboots.” (again, trivial trivialness)
In 2022 when we talk about “movie franchise reboots” the first thing that is required is a “movie franchise.”
e.g. very obviously “Star Trek” got a JJ Abrams reboot. Those “Star Wars” movies were “sequels” not “reboots” but the less said about JJ Abrams and that franchise the better
the big “super hero” franchises have also obviously been rebooted –
Batman in the 1990’s played itself out – then we got the “Batman reboot” trilogy directed by Christopher Nolan,
Superman in the 1970’s/80’s didn’t get a movie franchise reboot until after Christopher Reeve died
Spider-Man BECAME a movie franchise in the 2000’s, then got a reboot in 2012, and another in 2016/2017
SO the issue becomes counting the reboots – i.e. Batman in the 1990’s (well, “Batman” was released in 1989) had a four movie run with three different actors as Batman. I’m not a fan of those movies – so I admit my negative bias – but they did get progressively worse …
Oh, and if we are counting “reboots” do you count Batman (1966) with Adam West? Probably not – it exists as a completely separate entity – but if you want to count it I won’t argue – the relevant point is that just “changing actors” doesn’t equal a “reboot” – restarting/retelling the story from a set point makes a “reboot.”
However, counting Superman “reboots” is just a matter of counting actor changes – e.g. Christopher Reeve made 4 Superman movies (which also got progressively worse) – “Superman Returns” (2006) isn’t a terrible movie – but it exists in its own little space because it stomped all over the Lois Lane/Superman relationship – then we have the Henry Cavill movies that were central to DC comics attempt at a “cinematic universe.”
We can also determine “reboots” by counting actors with Spider-Man. Of course the Spider-Man franchise very much illustrates that the purpose of the “movie industry” is to make money – not tell inspiring stories, raise awareness, or educate the masses – make money. If an actor becomes a liability – they can be replaced – it doesn’t matter if you setup another movie or not 😉
There are other not so recent franchises – “Tarzan” was a franchise, maybe we are stretching to call Wyatt Earp a franchise, how about Sherlock Holmes?
The Wyatt Earp/OK corral story is an example of a “recurring story/theme” that isn’t a franchise. Consider that “McDonald’s” is a franchise but “hamburger joint” is not …
Then we have the James Bond franchise.
The problem with the “Bond franchise” is that we have multiple “actor changes” and multiple “reboots.” i.e. Assuming we don’t count Peter Sellers’ 1967 “Casino Royale” there have been 6 “James Bond” actors. Each actor change wasn’t a “reboot” but just because they kept making sequels doesn’t mean they had continuity.
The “Sean Connery” movies tell a longform story of sorts – with Blofeld as the leader of Spectre. The “James Bond” novels were very much products of the post WWII/Cold War environment – but the USSR was never directly the villain in any of the movies, the role of villain was usually Spectre in some form.
The easy part: The “Daniel Craig” Bond movies were very obviously a reboot of the Blofeld/Spectre storyline.
The problem is all of those movies between “On Her Majesty’s Secret Service” (1969) and “Casino Royale” (2006).
“Diamonds are Forever” (1971) was intended to finish the story started in “On Her Majesty’s Secret Service” – i.e. Bond gets married (and retires?), then Blofeld kills Bond’s wife as they leave the wedding – then bad guys drive away – Bond holds his dead wife while saying “We have all the time in the world.” – roll credits.
(fwiw: Except for the obviously depressing ending “On her Majesty’s Secret Service” is actually one of the better Bond movies)
Then George Lazenby (who had replaced Sean Connery as Bond) asked for more money than the studio was willing to pay – and they brought back Sean Connery for a much more light hearted/cartoonish Bond in “Diamonds are Forever.” (did I mention the profit making motive?)
Of course “Diamonds are Forever” starts out with Bond hunting down and killing Blofeld – but that is really the only reference we get to the previous movie – SO reboot? this particular movie maybe, maybe not – but it did signify a “formula change” if nothing else.
Any attempt at “long form storytelling” was abandoned in preference for a much more “cartoony” James Bond. MOST of the “Roger Moore” Bond movies have a tongue-in-cheek feeling to them.
The “70’s Bond movies” became progressively more cartoonish – relying more on gadgets, girls, and violence than on storytelling (e.g. two of the movies “The Spy Who Loved Me” and “Moonraker” are basically the same plot). There are a few references to Bond having been married but nothing that would be recognized as “character development” or continuity – it could be argued that each movie did a “soft reboot” to the time after “Diamonds are Forever”, but simply saying that the “continuity” was that there was no “continuity” is more accurate.
Then we got the “80’s Bond” – “For Your Eyes Only” intentionally backed off the gadgets and promiscuity – Bond visits his wife’s grave and Blofeld makes a (comic) appearance in the “Bond intro action sequence” – so I would call this one a “soft reboot” but not a complete relaunch.
The same goes for Timothy Dalton’s Bond movies – not a full blown restart, but a continuation of the “upgrading” process – still no memorable continuity between movies – (he only did two Bond movies).
Pierce Brosnan as Bond in “GoldenEye” (1995) qualifies as another actor change and “soft reboot” – Bond is promiscuous and self-destructive but it is supposed to be as a reaction to his job, not because being promiscuous and self-destructive is cool – but we were back to the tongue in cheek – gadget fueled Bond (two words: “invisible car”).
The Daniel Craig Bond movies certainly fit ANY definition of a reboot. “No Time to Die” (2021) was the last Bond movie for Mr Craig – but what direction the “franchise” is going is all just speculation at the moment …
ANYWAY – comparing 27 Bond movies over 58ish years to the modern “Super hero” reboots – was the gist of my trivial answer to a trivial meme (which only took 1,700+ words to explain 😉 )
… this was kind of a “web design” exercise/proof of concept
functionally this is a “web radio” front end – there is an “on/off” button that toggles playing the stream
I’m told that the Safari web browser might have issues playing the stream – so this is kind of a “beta test” request
the music is all “public domain” – the pictures of paintings are examples of the “Hudson River School” movement from the mid-19th century – i.e. they aren’t related to the music, just some files I had available
I’m always quick to point out that I am NOT a graphic designer – e.g. as I was cobbling the front end together it occurred to me that having the “on/off” toggle at the top (and bigger) might be a better option
SO if you could take a look, honest feedback would be appreciated – MOSTLY I’m curious if it works – the pictures should rotate through 10 images, “current song” info should update, there should be music, is the on/off toggle obvious enough …