Category: history

  • memoirs of an adjunct instructor or What do you mean “full stack developer?”

    During the “great recession” of 2008 I kind of backed into “teaching.”

    The small company where I was the “network technician” for 9+ years wasn’t dying so much as “winding down.” I had ample notice that I was becoming “redundant” – in fact the owner PROBABLY should have “let me go” sooner than he did.

    When I was laid off in 2008 I had been actively searching/”looking for work” for 6+ months – I certainly didn’t think I would be unemployed for an extended period of time.

    … and a year later I had gone from “applying at companies I want to work for” to “applying to everything I heard about.” When I was offered an “adjunct instructor” position with a “for profit” school in June 2009 – I accepted.

    That first term I taught a “keyboarding class” – which boiled down to watching students follow the programmed instruction. The class was “required” and to be honest there wasn’t any “teaching” involved.

    To be even MORE honest, I probably wasn’t qualified to teach the class – I have an MBA and had multiple CompTIA certs at the time (A+, Network+) – but “keyboarding” at an advanced level isn’t in my skill set.

    BUT I turned in the grades on time, and that “1 keyboarding class” grew into teaching CompTIA A+ and Network+ classes (and eventually Security+, and the Microsoft client and server classes at the time). fwiw: I taught the Network+ class so many times during those 6 years that I have parts of the book memorized.

    Lessons learned …

    Before I started teaching I had spent 15 years “in the field” – which means I had done the job the students were learning. I was a “computer industry professional teaching adults changing careers how to be ‘computer industry professionals’”

    My FIRST “a ha!” moment was that I was “learning” along with the students. The students were (hopefully) going from “entry level” to “professional” and I was going from “working professional” to “whatever comes next.”

    Knowing “how” to do something will get you a job, but knowing “why” something works is required for “mastery.”

    fwiw: I think this same idea applied to “diagramming sentences” in middle school – to use the language properly it helps to understand what each part does. The fact I don’t remember how to diagram a sentence doesn’t matter.

    The “computer networking” equivalent to “diagramming sentences” is learning the OSI model – i.e. not something you actually use in the real world, but a good way to learn the theory of “computer networking.”
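
    For anyone that has never seen it – a quick sketch of the seven OSI layers with the textbook “usual suspects” at each layer (a little Python list, because why not):

        # The seven layers of the OSI model, bottom to top, with
        # textbook example protocols/technologies at each layer
        OSI_LAYERS = [
            (1, "Physical", "cables, hubs, radio signals"),
            (2, "Data Link", "Ethernet frames, MAC addresses, switches"),
            (3, "Network", "IP addressing, routers"),
            (4, "Transport", "TCP, UDP"),
            (5, "Session", "connection setup/teardown"),
            (6, "Presentation", "encryption, compression, character encoding"),
            (7, "Application", "HTTP, SMTP, DNS"),
        ]

        for number, name, examples in OSI_LAYERS:
            print(f"Layer {number}: {name} - {examples}")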

    When I started teaching I was probably at level 7.5 of 10 on my “OSI model” comprehension – after teaching for 6 years I was at a level 9.5 of 10 (10 of 10 would involve having things deeply committed to memory which I do not). All of which is completely useless outside of a classroom …

    Of course most students were coming into the networking class with a “0 of 10” understanding of the OSI model BUT had probably set up their home network/Wi-Fi.

    The same as above applies to my understanding of “TCP/IP networking” and “Cyber Security” in general.

    Book Learning …

    I jumped ship from the “for profit school” where I was teaching in 2015 for a number of reasons. MOSTLY it was because of “organizational issues.” I always enjoyed teaching/working with students, but the “writing was on the wall” so to speak.

    I had moved from “adjunct instructor” to “full time director” – but it was painfully obvious I didn’t have a future with the organization. e.g. During my 6 years with the organization we had 4 “campus directors” and 5 “regional directors” — and most of those were “replaced” for reasons OTHER than “promotion.”

    What the “powers that be” were most concerned with was “enrollment numbers” – not education. I appreciate the business side – but when “educated professionals” (i.e. the faculty) are treated like “itinerant labor”, well, the “writing is on the wall.”

    In 2014 “the school” spent a lot of money setting up fiber optic connections and a “teleconferencing room” — which they assured the faculty was for OUR benefit.

    Ok, reality check – yes I understand that “instructors” were their biggest expense. I dealt with other “small colleges” in the last 9 years that were trying to get by with fewer and fewer “full time faculty” – SOME of them ran into “accreditation problems” because of an over reliance on “adjuncts” – I’m not criticizing so much as explaining what the “writing on the wall” said …

    oh, and that writing was probably also saying “get a PhD if you want a full time teaching position” — if the “school” had paid me to continue my education or even just to keep my skills up to date, I might have been interested in staying longer.

    Just in general – an organization’s “employees” are either their “biggest asset” OR their “biggest fixed cost.” From an accounting standpoint both are (probably) true (unless you are “Ivy League” school with a huge endowment). From an “administration” point of view dealing with faculty as “asset” or “fixed cost” says a LOT about the organization — after 6 years it was VERY clear that the “for profit” school looked at instructors as “expensive necessary evils.”

    COVID-19 was the last straw for the campus where I worked. The school still exists but appears to be totally “online” –

    Out of the frying pan …

    I left “for profit school” to go to teach at a “tech bootcamp” — which was jumping from “bad situation to worse situation.”

    The fact I was commuting an hour and a half and was becoming more and more aware of chronic pain in my leg certainly didn’t help.

    fwiw: I will tell anyone that asks that a $20 foam roller changed my life — e.g. “self myofascial release” has general fitness applications.

    I was also a “Certified Strength and Conditioning Specialist” (CSCS) in a different life – so I had a long history of trying to figure out “why I had chronic pain down the side of my leg” – when there was no indication of injury/limit on range of motion.

    Oh, and the “root cause” was tied into that “long commute” – the human body isn’t designed for long periods of “inaction.” The body adapts to the demands/stress placed on it – so if it is “immobile” for long periods of time – it becomes better at being “immobile.” For me that ended up being a constant dull pain down my left leg.

    Being more active and five minutes with the foam roller after my “workout” keeps me relatively pain free (“it isn’t the years, it’s the mileage”).

    ANYWAY – more itinerant-level “teaching” gave me time to work on “new skills.”

    I started my “I.T. career” as a “pc repair technician.” The job of “personal computer technician” is going (has gone?) the way of “television repair.”

    Which isn’t good or bad – e.g. “personal computers” aren’t going away anymore than “televisions” have gone away. BUT if you paid “$X” for something you aren’t going to pay “$X” to have it repaired – this is just the old “fix” vs “replace” idea.

    The cell phone as 21st Century “dumb terminal” is becoming reality. BUT the “personal computer” is a general purpose device that can be “office work” machine, “gaming” machine, “audiovisual content creation” machine, or “whatever someone can program it to do” machine. The “primary communication device” might be a cell phone, but there are things a cell phone just doesn’t do very well …

    Meanwhile …

    I updated my “tech skill set” from “A+ Certified PC repair tech” to “networking technician” in the 1990s. Being able to make Cat 5/6 twisted pair patch cables still comes in handy when I’m working on the home network but no one has asked me to install a Novell Netware server recently (or Windows Active Directory for that matter).

    Back before the “world wide web” stand alone applications were the norm. e.g. If you bought a new PC in 1990 it probably came with an integrated “modem” but not a “network card.” That new PC in 1990 probably also came with some form of “office” software – providing word processing and spreadsheet functions.

    Those “office” apps would have been “stand alone” instances – which needed to be installed and maintained individually on each PC.

    Back in 1990 that application might have been written in C or C++. I taught myself “introductory programming” using Pascal mostly because “Turbo Pascal” came packaged with tools to create “windows” and mouse control. “Pascal” was designed as a “learning language” so it was a little less threatening than C/C++ back in the day …

    random thought: If you wanted “graphical user interface” (GUI) functionality in 1990 you had to write it yourself. One of the big deals with “Microsoft Windows” was that it provided a uniform platform for developers – i.e. developers didn’t have to worry about writing the “GUI operating system hooks” they could just reference the Windows OS.

    Apple Computers also had “developers” for their OS – but philosophically “Apple Computers” sold “hardware with an operating system included” while Microsoft sold “an operating system that would run on x86 hardware” – and since x86 hardware was kind of a commodity (read that as MUCH less expensive than “Apple Computers”), that “IBM PC” story ended up making Microsoft, Inc a lot of money — and made for a fun documentary to show students bored of listening to me lecture …

    What users care about is applications/”getting work done” not the underlying operating system. Microsoft also understood the importance of developers creating applications for their platform.

    fwiw: “Microsoft, Inc” started out selling programming/development tools and “backed into” the OS market – which is a different story.

    A lot of “business reference applications” in the early 1990s looked like Microsoft Encarta — they had a “user interface” providing access to a “local database.” — again, one machine, one user at a time, one application.

    N-tier

    Originally the “PC” was called a “micro computer” – the fact that it was self contained/stand alone was a positive selling point. BEFORE the “PC” a larger organization might have had a “terminal” system where a “dumb terminal” allowed access to a “mainframe”/mini computer.

    SO when the “world wide web” happened and “client server” computing became mainstream, the “N tier” computing model became popular.

    N-tier might be the “presentation” layer/web server, the “business logic” layer/a programming language, and then the “data” layer/a database management system.
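
    To make those tiers concrete – a toy sketch with all three tiers in one Python file (standard library only; in a real “N tier” app each tier would typically run on its own server, and the table/field names here are invented for illustration):

        import sqlite3

        # --- data tier: the database management system (sqlite3 standing in) ---
        def init_db():
            conn = sqlite3.connect(":memory:")
            conn.execute("CREATE TABLE orders (item TEXT, qty INTEGER)")
            conn.execute("INSERT INTO orders VALUES ('widget', 3)")
            conn.commit()
            return conn

        # --- business logic tier: the rules live here, not in the UI or the DB ---
        def order_summary(conn):
            rows = conn.execute("SELECT item, qty FROM orders").fetchall()
            return [{"item": i, "qty": q, "big_order": q > 2} for i, q in rows]

        # --- presentation tier: normally a web server returning HTML/JSON ---
        def render(summaries):
            for s in summaries:
                flag = " (big order)" if s["big_order"] else ""
                print(f"{s['item']}: {s['qty']}{flag}")

        render(order_summary(init_db()))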

    Full Stack Developer

    In the 21st Century “stand alone” applications are the exception – and “web applications” the standard.

    Note that applications that allow you to download and install files on a personal computer are better called “subscription verification” applications rather than “N Tier.”

    e.g. Adobe allows folks to download their “Creative Suite” and run the applications on local machines using computing resources from the local machine – BUT when the application starts it verifies that the user has a valid subscription.

    An “N tier” application doesn’t get installed locally – think Instagram or X/Twitter …

    For most “business applications” designing an “N tier” app using “web technologies” is a workable long term solution.

    As application functionality got divided into tiers, the “developer” job also differentiated – “front end” for the user facing aspects and “back end” for the database/logic aspects.

    The actual tools/technologies continue to develop – in “general” the “front end” will involve HTML/CSS/JavaScript and the “back end” involves a combination of “server language” and “database management system.”

    Languages

    Java (the language maintained by Oracle – not “JavaScript”, which is also known as ECMAScript) has provided “full stack development” tools for almost 30 years. The future of Java is tied into Oracle, Inc but neither is gonna be “obsolete” anytime soon.

    BUT if someone is competent with Java – then they will describe themselves as a “Java developer” – Oracle has respected industry certifications

    I am NOT a “Java developer” – but I don’t come to “bury Java” – if you are a computer science major looking to go work for “large corporation” then learning Java (and picking up a Java certification) is worth your time.

    Microsoft never stopped making “developer tools” – “Visual Studio” is still their flagship product BUT Visual Studio Code is my “go to” (free, multi-platform) programming editor in 2024.

    Of course Microsoft wants developers to develop “Azure applications” in 2024 – C# provides easy access to a lot of those “full stack” features.

    … and I am ALSO not a C# programmer – but there are a lot of C# jobs out there as well (I see C# and other Microsoft ‘full stack’ tech specifically mentioned with Major League Baseball ‘analytics’ jobs and the NFL – so I’m sure the “larger corporate” world has also embraced them)

    JavaScript on the server side has also become popular – Node.js — so it is possible to use JavaScript on the front and back end of an application. opportunities abound

    My first exposure to “server side” programming was PHP – I had read some “C” programming books before stumbling upon PHP, and my first thought was that it looked a lot like “C” – but then MOST computer languages look a lot like “C.”

    PHP tends to be the “P” part of the LAMP stack acronym (“Linux OS, Apache web server, MySQL database, and PHP scripting language”).

    Laravel as a framework is popular in 2024 …

    … for what it is worth MOST of the “web” is probably powered by a combination of JavaScript and PHP – but a lot of the folks using PHP are unaware they are using PHP, i.e. 40%+ of the web is “powered by WordPress.”

    I’ve installed the LAMP stack more times than I can remember – but I don’t do much with PHP except keep it updated … but again, opportunities abound

    Python on the other hand is where I spend a lot of time – I find Django a little irritating, but it is popular. I prefer Flask or Pyramid for the “back end” and then select a JavaScript front end as needed.
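
    A minimal Flask “back end” looks something like this (the route and the joke here are made up for illustration – this is a sketch, not one of my actual apps):

        from flask import Flask, jsonify  # pip install flask

        app = Flask(__name__)

        # one JSON endpoint - a JavaScript front end would fetch() this
        @app.route("/api/joke")
        def joke():
            return jsonify({"joke": "I only know 25 letters of the alphabet - I don't know y."})

        if __name__ == "__main__":
            app.run(debug=True)  # development server only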

    e.g. since I prefer “simplicity” I used “mustache” for template presentation with my “Dad joke” and “Ancient Quote” demo applications
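
    fwiw: in Python-land “chevron” is one implementation of mustache – a tiny sketch of the idea (the template and the joke are invented here, not from my demo apps):

        import chevron  # pip install chevron - one Python "mustache" implementation

        template = "<h1>{{title}}</h1><p>{{joke}}</p>"
        context = {
            "title": "Dad Joke of the Day",
            "joke": "Why do seagulls fly over the sea? If they flew over the bay they'd be bagels.",
        }

        print(chevron.render(template, context))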

    Python was invented with “ease of learning” as a goal – and for the most part it succeeds. The fact that it can also do everything I need it to do (and more) is also nice 😉 – and yes, jobs, jobs, jobs …

    Databases

    IBM Db2, Oracle, and Microsoft SQL Server are in the category of “database management system royalty” – obviously they have a vast installation base and “large corporate” customers galore. The folks in charge of those systems tend to call themselves “database managers.” Those database managers probably work with a team of Java developers …

    At the other end of the spectrum the open source project MySQL was “acquired” by Sun Microsystems in 2008 which was then acquired by Oracle in 2010. Both “MySQL” and “Oracle” are popular database system back ends.

    MySQL is an open source project that has been “forked” into “MariaDB” (stewarded by the MariaDB Foundation).

    PostgreSQL is a little more “enterprise database” like – also a popular open source project.

    MongoDB has become popular and is part of its own “full stack” acronym MEAN (MongoDB, Express, Angular, and Node) – MongoDB is a “NoSQL” database which means it is “philosophically” different than the other databases mentioned – making it a great choice for some applications, and not so great for other applications.
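
    To illustrate that “philosophical” difference – a relational database wants a table schema up front, while MongoDB just stores free-form documents. A quick pymongo sketch (assumes a MongoDB server running on localhost; the data is invented):

        from pymongo import MongoClient  # pip install pymongo

        client = MongoClient("mongodb://localhost:27017/")
        db = client["demo"]

        # no CREATE TABLE - the collection appears on first insert, and
        # documents in the same collection don't have to share fields
        db.people.insert_one({"name": "Ada", "languages": ["Python", "SQL"]})
        db.people.insert_one({"name": "Linus", "editor": "vim"})

        print(db.people.find_one({"name": "Ada"}))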

    To be honest I’m not REALLY sure if there is a big performance difference between database management back ends. Hardware and storage space are going to matter much more than the database engine itself.

    “Big Corporate Enterprise Computing” users aren’t as concerned with the price of the database system – they want rock solid dependability. If there was a Mount Rushmore of database management systems – Db2, Oracle, and Microsoft SQL Server would be there …

    … but MariaDB is a good choice for most projects – easy to install, not terribly complicated to use. There is even a nice web front end – phpMyAdmin

    I’m not sure if the term “full stack developer” is gonna stick around though. Designing an easy to use “user interface” is not “easy” to do. Designing (and maintaining) a high performing database back end is also not trivial. There will always be room for specialists.

    “Generalist developer” sounds less “techy” than “full stack developer” – but my guess is that the “full stack” part is going to become superfluous …

  • Sisyphus, “Say Anything”, The Seeker

    The tragic part of living a life of “quiet desperation” (in the Henry David Thoreau sense) is usually the lost opportunity to do good as opposed to “intentional malice.”

    For sweetest things turn sourest by their deeds;
    Lilies that fester smell far worse than weeds.

    Sonnet 94 (William Shakespeare)

    In 2023 Merriam-Webster tells us that a “tragedy” is “a disastrous event : CALAMITY”

    Back in Mr Shakespeare’s time a “tragedy” was closer to “a medieval narrative poem or tale typically describing the downfall of a great man” (Merriam-Webster definition 2C – and I used the term as in Merriam-Webster definition 3: “tragic quality or element”)

    fwiw: Mr Shakespeare’s plays tend to be divided into “tragedy”, “comedy”, and “histories” – kind of the broad “genres” of his time. In Shakespearean “tragedy” a lot of people will be dead at the end of the play, in a “comedy” folks will pair up/get married, and “histories” were obviously “based on a true story” BUT tended to be presented to “please the sponsor” much more than be an accurate representation of historic events …

    Sisyphus

    The Ancient Greek concept of tragedy would have required a “great man” – to suffer a great downfall BUT more along the Merriam-Webster 2A definition (“a serious drama typically describing a conflict between the protagonist and a superior force (such as destiny) and having a sorrowful or disastrous conclusion that elicits pity or terror”)

    Ancient Greek “tragedy” tends to involve a “mostly admirable” king/leader that does nothing “wrong” but still suffers because of a relatively small character flaw – e.g. the hero tries to avoid his “destiny”/fate and ends up bringing about his fate BECAUSE he tried to avoid it.

    Wikipedia tells us that, in Greek mythology, Sisyphus was the king of Corinth, punished in Tartarus by being cursed to roll a huge boulder up a hill for eternity.

    BUT the myth of Sisyphus is more of a “cautionary tale” about divine justice rather than a “tragedy” – the “lesson” the Ancient Greeks were passing along with the myth of Sisyphus was probably “don’t mess with the ‘gods’” not “don’t fight your fate”

    The punishment aspect of the myth of Sisyphus is always that he is sentenced to an endless AND pointless task – just pushing the boulder up a hill might not seem that bad, but being forced to do it FOREVER for no reason, well, that wouldn’t be any fun …

    Lloyd Dobler

    Now, the “average Ancient Greek” was a subsistence farmer (well, the “average Ancient human” was also a subsistence farmer – but that isn’t important).

    Life as a “subsistence farmer” (i.e. trying to live off of growing your own food) probably sounds “hard” to modern humans – but it would have had the advantage of a clear purpose/reason for daily labor (i.e. “survival” – feed yourself and your family).

    Fast forward to the 20th Century and there are still subsistence farmers – but they tend to be in what gets called “developing nations” in 2023.

    (aside: The concept of “Third World” nations is a relic of the “Cold War” – i.e. countries could be divided into “us” vs “them” with “not us or them” being the “Third World” – of course those countries were probably NOT “us” OR “them” because they were “undeveloped” – but now I feel like I’m going in circles.)

    Just like in “ancient times” the average “modern” subsistence farmer is most concerned with survival – and that daily struggle for survival is an obvious “purpose for work.”

    In the “developed world” the “people” can still be divided between “haves” and “have nots” – but the daily struggle for “food” has been replaced by a “subsistence paycheck” in exchange for labor.

    Of course the “problem” for “modern workers” can become CHOOSING a profession — i.e. again, for most of human existence the problem was growing enough food to survive – not “self-fulfillment”

    The last half of the 20th Century saw a lot of “progress” but human nature didn’t change. We “know” more and we “have” more in the “developed world” but humans are still the same “stuff” we have always been.

    Better nutrition and health care means the average height and weight have increased – people are bigger and healthier but still the same ol’ “people.”

    The unintended consequence of material prosperity has been to replace the “fight for survival” with a “search for meaning.”

    A lot of folks have ALWAYS managed to avoid the subject – and those are the folks leading the “unexamined life” (which Socrates said “is not worth living”) or “lives of quiet desperation” (as Mr Thoreau put it).

    The late 20th century version of that struggle is found in “Say Anything” (1989) when the protagonist points out:

    “I don’t want to sell anything, buy anything, or process anything as a career. I don’t want to sell anything bought or processed, or buy anything sold or processed, or process anything sold, bought, or processed, or repair anything sold, bought, or processed.”

    –Lloyd Dobler

    The Seeker

    From a “big picture history” point of view the rise and fall of “great societies”/Empires can be seen as a failure of “values.”

    Yes, different cultures have different concepts of “normal” – BUT for them to be a “culture” they have a “set of shared attitudes, values, goals, and practices.”

    It should be obvious that just living in the same geographic region does NOT make a “culture” – unless you count hating ‘those people’ as a “culture”

    I won’t bother with multiple examples – e.g. “Arabs” and “Jews.”

    On a MUCH smaller scale I laughed at myself when I didn’t apply for a “tech job” with a school system in southwestern Ohio because THEY were rivals with US in high school sports (ok, there were other reasons as well – but the friendly sports rivalry was my first thought when I saw the job posting).

    “The Who” (one of those “rock & roll” bands) serves as a modern cultural example of that human “desire for meaning” and “belonging” – one of their songs asks the big question but American poet E.E. Cummings asked a similar question in 1923:

    seeker of truth

    follow no path
    all paths lead where

    truth is here

    e.e. cummings

    The obvious problem for “seekers” is that it is possible to be deceived into thinking “truth is here” when it isn’t – this verse comes to mind

    I tend to be suspicious of ANYONE that asks me to “trust them” about ANYTHING without any proof/verification – but that is just me (Luke 6:43-45 also comes to mind)

    Just because someone believes something and is sincere DOES NOT mean the belief is “true” – it is possible to be “sincerely wrong” …

    of course I could ALWAYS be wrong so you shouldn’t trust me on that –

  • Science Fiction, “social commentary”, and “Politics”

    FIRST I will say that I am a fan of William Shatner OC. The “OC” stands for “Order of Canada” – which is an honor of merit bestowed by the Canadian government.

    The 2019 announcement specifically mentions Captain James T Kirk/Star Trek but these types of honors tend to be conferred because of a combination of “entertainment and philanthropy” e.g. The motto of the “Order of Canada” is “DESIDERANTES MELIOREM PATRIAM” (They desire a better country)

    Not being a Canadian – I had to look up the “Order of Canada.” I was trying to figure out if there is a formal address for “Officers of the Order of Canada” (umm, no? I’m still not sure – apparently Canada uses “Honorable” and “Right Honorable” for certain positions/persons – but I don’t think “OC” comes along with an honorific, but again I’m not 100% on that one way or the other …)

    fwiw: Mr Shatner pointed out that being “knighted” is mostly for citizens of Great Britain. SO Mr Shatner is not “Sir William” (and I’m told that Canadian citizens are not eligible for the top two levels of the Order of the British Empire)

    fwiw 2: Article 1 Section 9 Clause 8 of the U.S. Constitution prohibits the United States gov’ment from conferring “Titles of Nobility” – but several prominent Americans have been awarded “KBE” by the United Kingdom. The “KBE” usually gets described as an “honorary knighthood” – they get the award from the Crown and can put “KBE” after their name if they want, but don’t get the official honorific of “Sir/Dame.”

    Star Trek

    “Star Trek” TOS (the original series) ran for 3 seasons (79 official episodes) and then there was an “animated series” that ran for 22 episodes.

    It is part of the legend of “Star Trek” that the show ran for 3 seasons and was CANCELLED each season — organized “fan letter” campaigns convinced network decision makers to bring the show back for another season after the season 1 and 2 cancellations.

    BUT while the fan letter campaign might have convinced network executives to keep the show on the air, it couldn’t convince them to invest money in the show. e.g. if you watch the TOS episodes in order you will notice a drop in “production value” in many season 3 episodes.

    The rumor was that Mr Shatner and Leonard Nimoy were the only actors “getting raises” – and most of season 3 is just not as good as seasons 1 and 2 for various reasons. Of course “not very good” Star Trek is still better than a lot of shows – I’m not being overly critical, but, two words: “Spock’s Brain” (season 3 episode 1)

    I have always had the impression that William Shatner has a passion for performing – which is why he has 250 credits to his name. Leonard Nimoy went from Star Trek to “Mission Impossible” and has 136 credits. DeForest Kelley was 10 years older than both Mr Shatner and Mr Nimoy – and was certainly the more “established” actor when Star Trek TOS started (not surprisingly considering the time and popular tastes – he was in a lot of westerns) – and has 133 credits to his name.

    ANYWAY – My very roundabout point is that while William Shatner OC will be remembered as “Captain Kirk,” his long and distinguished career included a LOT more than JUST “Star Trek” – e.g. Mr Shatner’s portrayal of the very “not Captain Kirk” character “Denny Crane” won him a Primetime Emmy in 2004 AND 2005, and don’t forget the “exceptionally 80’s” TJ Hooker.

    The motivation for the blog post was a meme with Mr Shatner asking “When did Star Trek become political?”

    “When did Star Trek become political?”

    William Shatner OC

    There are a LOT of responses belittling Mr Shatner – with the general theme being something like “Star Trek is the most political show in the history of television!”

    While I understand what folks “mean” when they say that Star Trek was/is “political” I have to disagree because, well, they are simply wrong.

    Science Fiction in general

    We should probably define some terms:

    Merriam-Webster tells us that Science Fiction = “fiction dealing principally with the impact of actual or imagined science on society or individuals or having a scientific factor as an essential orienting component”

    The important part of the “science fiction” definition is of course the “science” part — i.e. just because a story takes place in “outer space”, has “ray guns” and/or spaceships does NOT automatically mean it is “science fiction.”

    e.g. A lot of those “serial” films like Flash Gordon or Buck Rogers are more “space fantasy” than science fiction. “Star Wars” (the original trilogy) is very much “space fantasy” – and the broad thematic similarities between Flash Gordon and Star Wars should be obvious (heroes going off on a mission to save life as we know it).

    To be clear I am not criticizing any of the above – they are entertaining and have had societal influence – but they could just as easily take place “once upon a time in a land far far away” (e.g. sounds a lot like “A long time ago, in a galaxy far, far away”).

    Social Commentary

    Going back to our definition – notice the “impact … on society or individuals” part. If someone is telling stories about the impact of “whatever” on “society and individuals” they are almost certainly engaging in “social commentary” —

    e.g. H.G. Wells typically gets credit for “inventing” the genre with “The Time Machine” in 1895. — Mr Wells’ time traveler (an inventor/scientist) went into a distant future where humanity had destroyed the societies of his time (war is bad) and there are two surviving “classes” of humans – one above ground and the other below … so “science” plus “social commentary” has ALWAYS been the recipe for “science fiction”

    There are multiple “sub genres” of “science fiction” that I will just wave at as we go by – i.e. a comprehensive discussion on all things “science fiction” is beyond the scope of this little blog …

    BUT “social commentary” is NOT “politics” — e.g. if you want to say that “Star Trek” has always been a commentary on modern society – then I would tend to agree.

    The movies with “the original cast” also fit into that model — i.e. they are broad “social commentary” about issues of the day but are NOT “political”

    Politics

    Obviously now we need to define politics – the first recorded use of the word in English goes back to 1529 – with an “art or science of government” meaning.

    The roots of “politics” go back to the Ancient Greek “polis”/city state – so when Aristotle said that “Man is by nature a political animal” he was saying that men are capable of communication and moral reasoning — therefore they can create governments/societies based on that moral reasoning (i.e. “politics”)

    fwiw: Aristotle wasn’t a fan of “democracies” because they tend to decay into chaos – so his use of the word “political” was descriptive in a general sense – neither positive nor negative – though he was obviously biased toward the Greek polis (constitutional republic) as an ideal.

    The word “politics” gets thrown around a lot – as it is used in “modern times” it can be understood as the practical process of “who gets what, how much they get, and when do they get it” — i.e. if you have scarce resources there will ALWAYS be “politics” to deal with – whether you are talking about a small business or the Federal government EVERYBODY can’t get EVERYTHING they want NOW – so “politics” happen.

    “Science fiction” might tell a story where the “social commentary theme” is “racism is bad” or “war is bad but sometimes necessary” – but would NOT advance a specific set of policy principles or advocate for (or against) a current political figure.

    Sure someone COULD tell a thinly veiled story pushing a specific political agenda and pretend it is “science fiction” – but that is more accurately called “propaganda” not “fiction”

    If we went through all of the TOS episodes we could PROBABLY find an underlying “social commentary” in each one – some are more overt than others – but it is there if you look for it (an exercise for a time when I have more time on my hands).

    Fashions change – social commentary endures

    It can also be fun to point out the “science fiction fashion victims” – just like you can point out the “historical epic fashion victims” – i.e. any television show or movie tends to reflect the time it was made.

    SO we get miniskirts and beehive hairdos in TOS and somehow all of the aliens look like humans from 1960s North America who all speak English on every “M” class planet they stumble upon after travelling at multiples of the speed of light to get there. Oh, and all alien species are able to interbreed (and fall in love with Captain Kirk) – and those sideburns …

    BUT this is all part of the suspension of disbelief – we can also point at “Doctor Zhivago” (1965) as a great movie about the Russian revolution (1917-1923) with a cast full of actors with “1965” hairstyles – enjoy the movie, don’t worry about the hairstyles

    It is also fun to compare the “tech” from TOS to the “tech” in TNG — One of my favorites is the concept of the “paperless society” – in all of TOS episodes and movies if you see a “dead tree” book on the Enterprise it is probably a “plot element” – they read off of screens a lot, and they use (what we would call) “tablets” a lot. BUT Captain Jean Luc Picard had his leather bound edition of the “Complete Works of Shakespeare”

    in the “just for fun” category Pavel Chekov could illustrate the potential dangers of working in “political” jokes – e.g. the character was introduced in an attempt to appeal to younger viewers and also as a little “Cold War” reference.

    According to Mr Chekov EVERYTHING was invented in Russia – which is still funny as a running gag, but during the Stalin era Russian history was periodically rewritten to conform to the current political environment …

    Scott: [raising his glass] Now this is a drink for a man.
    Chekov: Scotch?
    Scott: Aye.
    Chekov: It was invented by a little old lady from Leningrad.

    “The Trouble With Tribbles” Season 2 Episode 15

    BUT yes, I am nitpicking — my original point was that Star Trek TOS is “social commentary” and it remains popular BECAUSE it was NOT “political” — which was probably what Mr Shatner was saying — if he actually said the “When did Star Trek become political?” line …

  • Markets and Competition

    True innovation is rare. Ecclesiastes 1:9 is several thousand years old and tells us that “The thing that hath been, it is that which shall be; and that which is done is that which shall be done: and there is no new thing under the sun.”

    Of course when we aren’t talking about “big picture life” – innovation on a smaller scale happens every once in a while. Historians can argue about the number of truly “world changing innovations” – things like development of agriculture, domestication of animals, improvements in building materials, etc but that isn’t what I’m concerned with today.

    Markets

    I enjoyed The Outfit (2022) – which is nominally about a master English tailor who has ended up in a small shop in mid-1950s Chicago (Mark Rylance’s character describes himself as a “cutter”). It is one of those rare “character driven” gangster movies – it has a “tense” energy and we get some “action” – I liked it.

    Near the end of the movie a character complains about how her “organization” had been ignored until they started making some “real” money. Which is plot driven exposition as much as anything.

    THEN I saw a promo for a new “streaming series” – where the main character makes the same complaint – something like “no one paid attention till we started making money, now everyone wants to take over.”

    Ok, both of those examples are “plot driven” but it is important to recognize that the complaint both (fictional characters) are making is that they “innovated”, created a “new” market segment, and then when that market segment became increasingly popular – competitors entered the marketplace.

    “Imitation is the sincerest form of flattery that mediocrity can pay to greatness”

    – Oscar Wilde

    This is the same concept found in the “innovation acceptance curve.” The “innovation acceptance curve” looks like the classic “normal distribution” bell curve – with “innovators” and “early adopters” on one side and “laggards” on the other – and “early” and “late” majority in the middle.

    My point is that there is probably a similar “number of competitors” curve that mirrors the “innovation acceptance curve.”

    Cell Phones

    Think “cell phones” – the first “cell phone” was invented in 1973. In the 1980’s cell phones were extremely rare – and if someone had one it probably looked like a World War 2 “walkie talkie.” By the end of the 1990s cell phones were common. The first iPhone was released in 2007 – which sparked another “innovation acceptance curve” for “smart phones.”

    Look at the “cell phone market” – Nokia dominated the early stages of the acceptance curve – but back in the 1990s the “cell phone service providers” tended to “give away” the phone in exchange for the monthly service fee.

    I’m sure there were a LOT of other companies making “cell phones” in those “early adopter”/”early Majority” days – and there were obviously other innovators (BlackBerry comes to mind).

    If we set the “way back machine” to 2005 (2 years before the iPhone) and asked a random sampling of cell phone users if they would ever think of paying $500 for a cell phone – the response would have been an overwhelming “no”, simply because the average user only used their cell phone to take the occasional low resolution picture and make phone calls.

    (fwiw: I used to leave my flip phone in the car 99% of the time – because that was where I would need it, and the battery retained a charge for weeks at a time)

    Of course in 2005 folks might have also carried around a laptop, a “personal digital assistant”, and/or a dedicated MP3 player (the first Apple iPod was released in 2001 – but “portable personal music players” had been around for years).

    The point here is that “innovation” is NOT always “market driven.” Successful innovation that results in “market disruption” is about providing something the “masses” didn’t realize they needed.

    “If I had asked people what they wanted, they would have said faster horses.”

    -Henry Ford

    Legendary Apple co-founder Steve Jobs once said that he didn’t rely on “market research” when developing new products. I’m not questioning Mr. Jobs – but (my opinion) his “genius” was in seeing what people “needed” which was often different than what they thought they wanted.

    Apple, Inc under Mr. Jobs was also known for making superior quality products that fell into the “elegant” category – i.e. achieving “product elegance” required a lot of “product testing” and development. SO Steve Jobs didn’t come back to Apple from the “wilderness” in 1997 and hand down from on high the “iMac”, and then the “iPod”, and finally the “iPhone” – but he did create the innovation environment that made them possible.

    Market Leaders and Innovation

    After Apple disrupted the cell phone market by introducing the “iPhone” – Google, Inc (which had acquired the Android operating system back in 2005) responded, and the HTC Dream became the first Android “smart phone” (September 2008)

    In 2023 Android OS is the most popular operating system in the world with 70% of the market share. Apple iOS has 28% of the market.

    From a “device” point of view – Samsung is the largest “Android” device manufacturer. Apple iOS is “closed source”/proprietary so obviously all “legal” iPhones are running iOS.

    From a “profitability” point of view – Apple, Inc is making a good living off of selling iPhones for $1,000, and the “App Store” brings in $billions a year. So at the moment they are happily perched atop the “market profitability leader” stack – i.e. they don’t have the largest number of “devices” but they dominate the “top end” of the market and are far and away the most profitable.

    i.e. you can buy a $50 Android smartphone and you can probably find a $100 iPhone, but it will be several “generations” old …

    If you are curious about that other 2% of the mobile market (Android 70%, Apple 28%, other 2%?) – well, in 2023 I’m not sure –

    Microsoft tried to have a “mobile” version of Windows for a long time – but Microsoft announced “end of life” for Windows Mobile back in 2017, which means 2022 was when Microsoft support ended.

    BlackBerry is also still around – so that 2% is mostly old Microsoft Mobile and BlackBerry devices.

    The “modern business” cliche is that companies must “innovate or die” – but any “market” will tend to be irrational/unpredictable at a basic level because, well, “people” are involved.

    “Innovation” for the sake of “innovation” is a bad idea – hey, if it ain’t broke, don’t perform radical surgery trying to “fix” it. “Intelligent innovation” with an eye on shifting market demands is always a good “long term” plan.

    What Happened to Nokia?

    ANYWAY – there is a very good documentary on the “Rise and Fall of Nokia Mobile” (2017)

    Just like our fictional “market creators” at the start of this article – Nokia was an innovator and dominated the early mobile industry, then the market got big and profitable and then what happened …

    Well, Nokia is a case study for why “market share dominance” does not always equal “profitability” – but the answer to what “happened” to Nokia is that Microsoft acquired their mobile phone business in 2013.

    You can still buy a “Nokia” phone – they even have the classic “flip phone” – but the Finnish telecom company “Nokia” doesn’t make phones in 2023.

    I’m not giving anything away by pointing out that the “old Nokia” employees blamed the “fall of Nokia” on the Microsoft acquisition – i.e. there is a LOT of “Microsoft as evil American corporation” bashing in the documentary – and for-what-it-is-worth they are probably right in their criticism of the contrasting corporate cultures.

    BUT “Microsoft/Nokia” isn’t at the top of “worst mergers” of all time by any measure (hey, someone is gonna have to do something SPECTACULARLY stupid on a “Biblical” scale to be worse than AOL/Time Warner).

    With 20/20 hindsight – “Nokia mobile” might be in exactly the same spot they are NOW if the Microsoft deal hadn’t happened – i.e. making mid-range Android phones. They certainly didn’t have the resources to compete with Apple and Google for users – so at some point they would (probably) have stopped trying to develop their own mobile OS and thrown in with Google/Android and be exactly where they are today.

    Competition

    Healthy competition drives intelligent innovation. At a “nation state” level this means that “protectionism” is usually a bad idea.

    The “usually” qualifier sneaks in there because of “national security.” Outside of a “national security” concern the best thing for “politicians” to do in regards to “market competition” is “as close to nothing as possible.”

    Yes, rules need to be enforced. Criminal activity should be dealt with as “criminal activity” NOT as an excuse for politicians to “wet their beak” meddling in market regulation. e.g. politicians are great at throwing money at bad ideas and extremely bad at encouraging actual “market innovation.”

    (just in general the most cost effective thing the “gov’ment” can do is “have a contest” and then encourage the free market to solve the problem and win the contest)

    Of course “cronyism” is ALWAYS bad at any level. The Venezuelan oil industry under Hugo Chavez becomes the cautionary tale of “cronyism” disguised as “nationalization.” e.g. no, a “centrally controlled economy” run by “human experts” won’t work on a national scale – and only the greedy and ignorant will try to tell the “masses” that you can get “something for nothing.”

    Acres of Diamonds

    Russell Conwell (February 15, 1843 – December 6, 1925) is remembered for giving a speech called “Acres of Diamonds” (feel free to read the lecture at your leisure)

    One of the lessons that could be taken from Acres of Diamonds is that the best “market” for someone looking to “innovate” and “compete” is the market that they know best.

    I say, beware of all enterprises that require new clothes, and not rather a new wearer of clothes.

    – Henry David Thoreau

    Just because someone else is doing something similar doesn’t mean that there isn’t room in the marketplace for your idea. e.g. Everyone told Dave Thomas that the United States didn’t need ANOTHER hamburger chain – but in 1969 he started “Wendy’s Old Fashioned Hamburgers” in Columbus, Ohio.

    Mr Thomas had worked for the real Colonel Sanders and Kentucky Fried Chicken before starting Wendy’s – so he didn’t need “new clothes”, he understood fast food franchising and customer service. btw – Dave Thomas at Wendy’s deserves credit for perfecting the “pick up window” and the “salad bar” among other things.

    When Jack Welch was running G.E. they encouraged suggestions/feedback from “ordinary” workers – the idea being that the person that knows how to do the job “better” is probably the person doing the job.

    Yes, for every suggestion “introduced into production” G.E. probably had hundreds of “impractical” suggestions – but that is like saying that most rocks in the diamond mine are not diamonds; you don’t stop mining for diamonds because of the “not diamonds”

    (any organization that encourages suggestions should also have a way of quickly evaluating those suggestions – I’d be happy to take a big consulting fee to figure out a way, but with modern I.T. there are a lot of easily implemented solutions).

    Textbooks will throw around terms like “unique selling proposition” (USP) – which boils down to “just because other folks are doing it doesn’t mean your slightly different idea won’t work.”

    Ideally your idea will do “something” different/better/cheaper — but the fact that a LOT of other folks are doing “whatever” just means that there is a DEMAND for “whatever.” i.e. if you think that you have a truly unique/innovative idea that no one else has thought of – you might be wrong.

    It is POSSIBLE that your idea has been tried (and failed) OR that there simply isn’t a profitable market for “whatever.” This is where doing a “competitor analysis” becomes informative – if you can’t find ANY competitors then I’d be worried …

    e.g. not surprisingly McDonald’s sells the most hamburgers in the United States but there are 91,989 other “hamburger restaurant businesses” in the U.S. and the number continues to grow.

    I don’t know if I would suggest starting a “hamburger restaurant” if you have 20 years of completely unrelated experience – but this is where “franchising” tries to fill in the knowledge/experience gaps for prospective entrepreneurs.

    Probably having a good location is just as important as having a recognizable brand – e.g. if I have been driving for 8 hours and I’m hungry and have to use the restroom, and “anonymous greasy spoon truck stop” is the only place in sight, it would look REAL attractive …

    Unlimited Demand

    Usually when doing a competition assessment you will factor in the impact that changes in price will have on “market demand” as well as the cost of “switching.”

    Specifics aren’t important – this is where the textbooks will talk about “elasticity” – but the core idea is that changes in price can have a large impact on “demand.”

    i.e. if you are selling “product x” for $1, something happens, and you need to start charging $2 to stay in business. There are 3 possibilities – you could lose customers, you could retain the same number of customers, or (in rare situations) you might gain customers.
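
    For the back-of-the-napkin version – a little sketch using the standard “midpoint” elasticity formula (every number here is invented):

        # "arc" (midpoint) price elasticity of demand - invented numbers
        def elasticity(p1, q1, p2, q2):
            pct_q = (q2 - q1) / ((q1 + q2) / 2)
            pct_p = (p2 - p1) / ((p1 + p2) / 2)
            return pct_q / pct_p

        # price doubles from $1 to $2, weekly sales drop from 100 units to 60
        print(f"elasticity: {elasticity(1.00, 100, 2.00, 60):.2f}")  # -0.75 (inelastic)
        print("old revenue:", 1.00 * 100)  # 100.0
        print("new revenue:", 2.00 * 60)   # 120.0 - fewer customers, more revenue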

    Your customer reaction to the price change will probably revolve around the “cost” of switching. e.g. how much do competitors charge and how much trouble is it to switch to one of those competitors?

    To make up a story – imagine “local gas station” increases their prices. Some folks won’t notice because it is inconvenient to go somewhere else, and some will rearrange their lives so that they never have to buy gas at that location again – and probably the only way a “local gas station” INCREASES customers is if traffic patterns change.

    Of course if a competitor is charging 10¢ less and is just across the street – well, that competitor will have long lines and probably put the first station out of business. btw – the cost of gas at “big chain” stations tends to reflect local taxation just as much as the cost of the gasoline – but that is another subject.

    BUT if the price of gas gets too high – folks will buy more gas efficient vehicles and cut down on their driving – so gasoline does not have UNLIMITED demand.

    The number of items with “unlimited” demand is kind of small – “air” comes to mind, but even then “no longer breathing” is a drastic option, and when “basic necessities” become scarce the breakdown of civil society is gonna happen (riots/war/anarchy).

    On a less apocalyptic level – “entertainment” tends to have unlimited demand and also zero switching costs. This is (probably) obvious – the challenge for “creators” becomes not just “making an entertaining video” but finding an audience.

    A tiny audience could equal “profitability” – if production costs are controlled and enough “sponsors”/subscribers found. A large audience could equal “huge losses” – if production costs are high and “advertising”/subscribers are not “large enough.”

    The same math applies to podcasts, broadcast/cable stations, and motion pictures. When “Superman Returns” was released in 2006 it had a $200 million budget. When it made almost $400 million worldwide it returned a profit, but not enough – e.g. a planned sequel was cancelled

    At the same time “The Devil Wears Prada” was released with a $35 million budget. It would make $327 million in box office – AND be considered a huge success, making back more than 9x its budget.

    (umm, it isn’t important that I’ve seen one of those movies and it isn’t the one about the fashion industry – the point is one that Disney, Inc is relearning in 2023, i.e. heavily marketing a polished piece of tripe doesn’t make the tripe into a hamburger)

  • Talking Football – August 2023

    Back when the B1G was actually 10 teams – “three yards and a cloud of dust” was sometimes used to describe the offensive philosophy of most coaches in the conference.

    The forward pass might have been added in 1906, but to paraphrase a coach “three things can happen when you throw the ball, and two of them are bad” – and of course that same coach lived by the “off tackle” play (in his defense Woody Hayes believed that “off tackle” could be adjusted as needed – in the same way that Vince Lombardi described the “power sweep” as “running to daylight”)

    Philosophy

    I’d argue that “ball control and defense” is still a sound starting point for a coaching philosophy – but it obviously won’t win video games where running and defense are afterthoughts.

    Remember the point of a football game is NOT “score as many points as possible.” The goal in football is to score MORE points than the other team.

    Example: quick, which NFL team holds the record for “points scored per game”?

    If you said that the 1950 L.A. Rams scored 38.8 points per game then you are truly a football historian.

    Of course if you also knew that those 1950 Rams went 9-3 in the regular season and then lost the (pre Super Bowl era) NFL Championship to the Cleveland Browns (Rams 28 – Browns 30) then you are probably a Cleveland Browns fan …

    Team Game

    The point is that football is a “team game” – i.e. offense isn’t more important than defense. This idea that “defense matters just as much as offense” applies to MOST team sports.

    At various points in modern sports history “genius coaches” have come up with the idea to “emphasize” offense over defense – and they tend to score a lot of points, but give up more points than they score.

    To be fair – coaching philosophies like “run and gun” (basketball) and “run and shoot” in football came about as creative ways to deal with a lack of “player size.”

    If you put on your “defensive coordinator” hat and imagine the offense that is hardest to defend – you will probably come up with some version of an “option” offense (i.e. an offense where the play can change in reaction to the defense). The classic “triple option”/wishbone offense comes to mind – which is still successfully used at various levels.

    BUT all of the above goes out the window when you start talking about “professional sports” where “big and fast” players are the norm. Yes, there are still different coaching philosophies – but dealing with an organizational lack of “size and speed” goes away when you can just “draft”/hire big and fast players

    (btw: Glenn Ellison – the football coach not the economist – earned “Ohio Coach of the year” in 1961 for developing the Run N’ Shoot offense at Middletown High School in Ohio – his book on the offense is available on Amazon.

    As I remember the story he also advocated putting the “best 11” players on offense and trying to outscore the opponent. I don’t think he ever had an “undefeated season” but his “run n’ shoot” teams were always competitive.

    Ohio high school football didn’t start having “playoffs” until the 1970s – BUT I will just point out that his offensive philosophy has won a lot of “State Championships” at the High School level. At the NFL level it was kind of a “fad offense” until defensive coordinators “figured it out”)

    Turnovers

    The “traditional Big 10” offense implies a field position philosophy. Part of that philosophy is a practical recognition of traditional Big 10 “winters” and general “not southern California” weather patterns which account for SOME of those “traditional” low scoring games.

    Remember the point of football is to “score MORE points than the opponent” – we could express that as a Win (W) happens if Points Scored (PS) minus Points given up (PGU) is greater than 0

    W = (PS – PGU) > 0

    the “Win” equation

    Simple enough – the nuance comes in when we recognize that EITHER team can score on any play. This is the dreaded “turnovers” statistic.

    To expand the equation, “Points Scored” can be broken down into “offensive points scored” and “defensive points scored” and “Points given up” broken down into “offensive” and “defensive” (and no I didn’t forget about “special teams” – feel free to add them as their own category or combine them with either offense or defense)

    Then a statistic like “net points off turnovers” could be positive (if the team minimizes turnovers and/or creates more net turnovers) or negative (the inverse)
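
    e.g. a quick sketch of that bookkeeping using the “Win” equation from above (all of the point totals are invented):

        # the "Win" equation plus "net points off turnovers" - invented numbers
        def win(points_scored, points_given_up):
            return (points_scored - points_given_up) > 0

        our_pts_off_their_turnovers = 10   # e.g. a pick-six and a field goal
        their_pts_off_our_turnovers = 3    # e.g. a field goal after our fumble

        net_turnover_pts = our_pts_off_their_turnovers - their_pts_off_our_turnovers
        print("net points off turnovers:", net_turnover_pts)  # +7
        print("win?", win(24, 21))  # True - and that +7 was the margin of victory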

    Saying that “turnovers” can decide a game is obvious – but from a “team” point of view what matters is how they react to the turnover more than the turnover itself – which is another subject for another post.

    It is a cliche to say that “every play matters” and then point out that most football games are “decided” by 5 plays.

    TODAY I’m just pointing out that “field position football” revolves around the idea that the key to winning is not making “big mistakes” close to your end zone (and giving up points).

    I suppose a true “field position” practitioner would try to “surprise punt” if they are inside their own 10 yard line and try to reverse the field position – but you will never see that in the NFL simply because the athletes involved all have “big play – 90 yard touchdown” potential.

    ANYWAY that “elongated sphere” tends to be slippery in bad weather and bounces funny even in the best of conditions – so “ball control” (don’t turn the ball over on our side of the field) and “defense” (don’t give up big plays) remains a sound coaching philosophy starting point …

    Playbooks

    Imagine that “Team A” gets their hands on a copy of “Team B’s” offensive playbook – does “Team A” get a substantive advantage?

    Well, no. The specific formations/plays don’t matter as much as “real time general tendencies.”

    IF a player has a “tell” then that is going to be useful – i.e. if a receiver only puts in his mouthpiece when it is a ‘passing play’ and holds his mouthpiece in his hand when it is a ‘running play’ then THAT is actionable intelligence.

    Trying to recognize tendencies is the point of “film study” in the NFL.

The “old school football” idea that you can tell the other team what you are gonna do and they still won’t be able to stop it – MIGHT still work if your players are physically MUCH superior to your opponent.

Just having the opponent’s “playbook” is useless – knowing the opponent’s “tendencies” is priceless.

This is the substance of “traditional rivalry” games in any sport at any level – i.e. both teams are well acquainted with the other team’s players and tendencies, so we get the basis for another cliche about “throwing out the win/loss records” because it is “rivalry week.”

At the pro level it tends to be EXTREMELY difficult to “blow out” the same team multiple times in the same year. Yes, statistically “good teams” are going to beat the “not as good teams” on a regular basis – but if they play two times a year every year the chances of two “easy victories” decrease – after all they are all “professional athletes” on both sides of the ball.

    Divisions and Television Rights

When you are talking about “College football” in 2023 there are obvious divisions – the “small school, no athletic scholarships” folks are still called “Division III” (250 schools), and there are 169 “Division II” schools (about 60% of DII athletes get “athletic aid”).

    I will say that the athletic facilities of the “average DIII school” are probably a little nicer than the “average large high school” – and yeah, the best large high school programs might be “competitive” against the average DIII team — but DIII is still “college football”

    For what it is worth – the NCAA runs the “college football” championship playoffs in DII and DIII. “Division I” football has “Champions” going back to 1869. The “National Collegiate Athletics Association” was founded in 1906 – but “Division I” football is still kind of an outlier in the overall “college sports” landscape.

    From a business point of view this “outlier” status is interesting because the NCAA does NOT control the television rights.

Until the mid 1980s the NCAA had control over which teams would appear on television. Which might sound like a “monopoly” if you are not working at the NCAA – and the Supreme Court of the United States agreed in 1984 when they ruled 7-2 in favor of the lawsuit Georgia and Oklahoma (well, the Universities in those States – but it might as well have been the general population) had brought against the NCAA challenging control of “college football on television.”

    (random great line from the lawsuit = “we thought that NCAA stood for ‘Never Compromise Anything Anytime’”)

    The 1984 ruling opened up the television market for individual athletic programs – but (as I remember it) conferences inherited a lot of the “television” control that the NCAA used to have – but that would obviously only apply to “conference games” and certainly didn’t preclude individual Universities from signing contracts for “non conference games”

    ANYWAY in the 1980’s “regional coverage” was the rule – probably an example of the last days of NCAA television control – but you could watch college football all day if you wanted.

In 1991 Notre Dame Football signed an exclusive contract with NBC for national coverage of their home games – which illustrates the history/popularity of “Notre Dame football” as well as recognizing their on-the-field success.

In 1993 ESPN started broadcasting “Thursday night college football” – which still seems to feature teams I’ve never heard of on a regular basis. It became a “weekend preview” show just as much as a competitive football game.

The Big 10 had ceased being a 10 team league when Penn State (which along with Notre Dame had up till that point been an “independent” football program) joined in 1990. Penn State football wasn’t fully integrated into the Big 10 schedule until 1993.

    The addition of Penn State to the “Big 10” seemed natural – if not inevitable just from the geography involved – i.e. Pennsylvania is in the “mid west” along with Ohio, Michigan, Indiana, Illinois, Minnesota, Wisconsin and Iowa (out of order from memory – did I mention I live in Ohio?).

FWIW: There were (*cough*) rumors (*cough*) of Notre Dame football “flirting” with the Big 10 in the early 1980s – I honestly don’t know how close Notre Dame Football came to joining the Big 10, but that would have felt like “organic growth” as well. Notre Dame “athletics” joined the Big East (everything except football) in 1995 and then made the same deal with the ACC in 2013.

“The Big 10” remained 11 teams as the conference took a little risk by starting the “Big 10 Network” in 2006. Ok (pun alert) it may not have been a “big risk” but it also wasn’t a guaranteed success.

The problem with running any “television network” is content. ESPN had successfully launched “ESPN Classic” in 1995 – which had proved that there was a market demand for “classic sports coverage.”

    “ESPN Classic” shutdown in 2021 – probably in part because of the success of “conference television networks” – but that is just me guessing. I wasn’t a huge fan of “ESPN Classic” but I remember watching a rebroadcast of a “game from the 1980s” and getting drawn into the broadcast like it was a live event (since I didn’t remember who won the game).

The next Big 10 addition was Nebraska in 2011 – which again still felt like “organic growth” – but by this time the “Big 10” network was a success and my guess is that “folks in charge” started seeing the possibility of a truly “national conference” – but I’m just guessing again.

    The Big 10 adding Rutgers and Maryland in 2014 only really makes sense if you have “coast to coast” conference aspirations.

btw: I am not criticizing either school. I was stationed in Maryland when I was in the Army – I like Maryland – e.g. The Maryland Terrapins beat Indiana to win their (basketball) National Championship in 2002.

    I’m just pointing out that they may not be on the same “major sports” level as the other teams added to the Big 10 but their addition makes sense if you are building a “national conference.”

    The “Big 10 Network” was a joint venture (51% for the conference and 49% for Fox) with Fox Sports in 2006. In 2022 the conference signed a $7.5 billion deal that was described as using an “NFL approach” i.e. with multiple networks not just Fox Sports.

    With all of the above in mind – well, adding USC, UCLA, Washington, and Oregon in 2024 begins to look like “part of a plan.”

In the “just my opinion” category, ‘super conferences’ have become easier to manage/pull off because of modern technology. With 18 teams it really becomes a “League” with two “conferences” – which is a time tested formula for pro-sports in the U.S. – I’m not in the “predictions” game so I’ll wait and see how they implement the 18 team “B1G” conference …

  • Random thoughts on Time, Distance, and Faster Than Light Travel

    The good folks at Merriam-Webster give us 14 definitions for “time” as a noun, another 5 as a verb, and then 3 more as an adjective.

A quick peek at the etymology tells us that “time” came into the English language by way of Old English (and Old Norse) words for “tide.”

    That “time” and “tide” are related shouldn’t surprise anyone — after all “time and tide wait for no man” is one of those “proverb” things. e.g. If you make your living next to/on a large body of water then the tide going out and coming in probably greatly influences your day to day activities as much as the sun rising and setting.

    From an “exploration” point of view “precision time keeping” was essential for sailors because they could use it to determine their longitude. Not being a sailor or even mildly comfortable on a boat that doesn’t have a motor – I’m told you can use a sextant to determine your latitude using the moon and stars.

    Obviously in 2023 GPS is used for most voyages. Some high up officials in the U.S. Navy pointed out that we should still teach “basic seamanship.”

    I’ve had a career that revolves around “fixing” things because, well, things break — so teaching basic navigation without GPS sounds obvious. e.g. the U.S. Army initial entry training (“basic training”) used to spend a little bit of time teaching the POGs (“persons other than grunts”) how to read a map and use a compass.

“Way back when” I was trained as a medic – which used to mean nine weeks of “basic” and then another period of “AIT” (Advanced Individual Training) — all of which I seem to remember took 6 months in real time. In 2023 Google tells me that the “11B Infantry” training is “One Station Unit Training” lasting 22 weeks.

    The “Distance” Problem

Before the “industrial revolution” in the 18th century gave us things like trains, and eventually planes and automobiles – the fastest a human being could travel on land was on the back of a horse.

Which basically meant that the “average human being” would live and die within 20 miles of where they were born. Since MOST people were “subsistence farmers” they probably didn’t have a pressing need to travel exceptionally far.

    Of course “ancient peoples” probably formed the first “cities” as equal parts “areas of mutual protection” AND “areas of commerce” — so the “local farmers market” today might be described as an example of the foundation of modern society – “people gotta eat” and “people like to socialize” …

    Those ancient subsistence farmers no doubt figured out the cycles of the moon as well as the yearly seasons so they could optimize the output of their farms. Those folks not concerned with the tides still had to “plant” and “harvest” – so “time management” was a consideration even if precise time keeping wasn’t an issue.

    Those Ancient Greeks even went so far as to create the idea of a “decisive battle” so they could decide conflicts and get back to their farms with minimal disruption (i.e. if you don’t plant, you can’t expect to harvest) – but that is another story.

    The point being that “time” was a constant – how we “redeem the time” is up to the individual – but part of being human is dealing with the inevitability of “time passing.”

The relationship between “distance” (d), “speed” (s), and “time” (t) is probably still a “middle school” math exercise (d = s × t) which I won’t go into – but it is hard to overstate how much “fast and safe high speed travel” changed human society.

My favorite example is “transcontinental” travel in North America. Before the U.S. completed the first transcontinental railroad in 1869 the fastest trip from “coast to coast” would take 6 months – e.g. you could probably take a train to Nebraska in a couple days, but then the trip from Nebraska to the west coast would take several months. Or you could sail around South America (Cape Horn) which would also take 6 months (it was probably safer but much more expensive).
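
To put the d = s × t relationship to work on that example – a back-of-the-envelope Python sketch (the ~3,000 mile distance and the speeds are rough assumptions on my part, not historical measurements):

    distance_miles = 3000  # rough coast-to-coast figure

    speeds_mph = {
        "horse/wagon (~25 miles per day)": 25 / 24,
        "1870s train (~25 mph average)": 25,
        "car (~65 mph)": 65,
        "jet (~500 mph)": 500,
    }

    for mode, mph in speeds_mph.items():
        days = distance_miles / mph / 24  # t = d / s, converted to days
        print(f"{mode}: about {days:.1f} days")

The horse/wagon line works out to about 120 days of continuous travel – consistent with “several months” – while the 1870s train pace lands right around “about a week.”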

btw: Canada’s “transcontinental railroad” was completed in 1885 – and is still in operation. Parts of what was the U.S. transcontinental railroad are still around – but the rise of “automobiles” and the Interstate highway system made “interstate railway passenger travel” unprofitable.

AFTER the transcontinental railroads you could travel coast to coast in about a week. The original “intent” of the U.S. transcontinental railroad was that it would open up trade with Asia – i.e. goods shipped in from the “far east” could be shipped across the U.S. — the bigger impact ended up being allowing immigrants from Europe to settle “out west” – which is, again, another story.

It is safe to say that the “problem” of distance for “human travel” was solved by the industrial revolution. e.g. Google tells me I can DRIVE from southwestern Ohio to California in 2 days – although I could hop on a plane and travel from CVG to LAX in about 6 hours if I was pressed for time.

If I wanted to go to Chicago (298 miles from Cincinnati) the drive is about 6 hours – the plane trip would still take about 4 hours door to door, and with the cost of gas (if I schedule far enough in advance) flying would probably be cheaper than driving.

The point being that “Travelling around the world” in ANY amount of time USED to be an unthinkable adventure because of the distances involved and the lack of safe/speedy travel options – now it is about time management and deciding on how comfortable you wanna be while you travel (and of course whether you want to be shot at when you get where you are going 😉) — and THAT is another story …

    Faster Than Light

    Back when I was teaching the “Network+” class multiple times a term – the textbook we used would start out comparing/contrasting common “networking media.” The three “common” media covered were 1. coaxial – one relatively large copper cable, 2. unshielded twisted pair (UTP) – 8 smaller copper wires twisted together in pairs, and then 3. “fiberoptic” cable – thin “optical fiber” strands (“glass”).

SO I would lecture a couple hours on the costs/benefits/convenience of the three “media types” – spoiler alert: most “local area networks” are using some flavor of UTP because it still hits that sweet spot between cost/speed/convenience. The takeaway from that “intro to networking class” about “fiberoptic cabling” was that it was exceptionally fast, but more expensive, and harder to install than the other two.

The “exceptionally fast” part of fiberoptic cabling is because we are dealing with the speed of light. Yes, there are other factors in network “speed” but physics 101 tells us that it is not possible to go faster than the speed of light (which is about 300,000 kilometers per second or 186,000 miles per second).

    (oh, and the “slow” part of most “computing systems/networks” is the human beings involved in the process – so UTP is just fine for 99% of the LAN implementations out there – but once again, that is another story)

    I’m not a physicist but saying that the speed of “light” is the speed of energy without mass is accurate enough for today. The point being that unless you can “change the rules of the universe” as we understand them today – it is NOT possible to go faster than light (FTL).

There was a lot of optimism that “science” would solve the “interstellar distance” problem during the “space race” period of human history. But “interstellar distance” is mind-bogglingly huge compared to terrestrial travel – AND we keep hitting that hard barrier of the speed of light.
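
To put some rough numbers on “mind-bogglingly huge” – a quick Python sketch of the one-way trip time to the nearest star (Proxima Centauri, about 4.24 light years away); the speed choices below are mine, just for scale:

    LIGHT_YEARS_TO_PROXIMA = 4.24

    # for scale: Voyager 1 is coasting along at roughly 0.00006c
    for fraction_of_c in (1.0, 0.1, 0.01, 0.00006):
        years = LIGHT_YEARS_TO_PROXIMA / fraction_of_c
        print(f"at {fraction_of_c:g}c: about {years:,.0f} years one way")

Even at an (impossible) full light speed the trip is over 4 years – at anything like current spacecraft speeds it is tens of thousands of years.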

Of course neither “subsistence farmers” NOR “trained thinkers” 2,000 years ago comprehended the size of the earth in relation to the rest of the universe – “educated types” probably thought it was round, and might have had a good idea of the earth’s circumference – but travelling “around the world” would have been the stuff of fantasy.

    Some well meaning folks were predicting “moon tourism” by the end of the 20th century – and I suppose the distance isn’t the problem with “moon tourism” so much as “outer space” being VERY non-conducive to human life (read that as “actively hostile” to human life).

Gene Roddenberry (probably) came up with the idea for “Star Trek” as a direct result of the “moon mania” of the late 1960’s. Yes, “Star Trek” was conceived of as a “space western” so it was never a “hard” science fiction program – so the “Star Trek” universe tends to get a pass on the FTL issue.

    After all humanity had created jet engines that allowed us to break the speed of sound, wouldn’t it be natural to assume that someone would come up with FTL engines? With that in mind “dilithium crystals” fueling warp drive engines that allow our adventurers to go multiples of the speed of light doesn’t sound that far-fetched.

Folks were using “Mach 2” to signify multiples of the speed of sound – why not use “Warp speed” for multiples of the speed of light?

    It is easy to forget that “the original series” (TOS) was “cancelled” each year it was produced – after seasons 1 and 2 a fan letter writing campaign convinced the network folks to bring the show back. TOS was always best when it concentrated on the characters and stayed away from the “hard science” as much as possible.

    BUT I’m not picking on “Star Trek” – just pointing out the physics …

    Time Travel

Mr Einstein’s theory sometimes involves a “thought experiment” where we have two newborn babies (or feel free to think of newborn kittens/puppies/hamsters/whatever if the “baby” example gets in the way) AND we put one of the newborns on a “spaceship” and accelerate that ship “close to the speed of light” (we can’t actually go the speed of light – we are just getting as close as possible).

When our imaginary thought experiment ship returns – the newborn on the ship doesn’t appear to have aged but the newborn that stayed behind is now extremely old. This is the “twin paradox” and a lot of folks smarter than me have spent considerable time examining the question.
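
The arithmetic behind the thought experiment is the standard time-dilation factor from special relativity – here is a quick Python sketch (the speeds and the 50 earth-year trip are numbers I picked for illustration):

    import math

    def ship_years(earth_years, v_fraction_of_c):
        # elapsed time on the ship = elapsed earth time * sqrt(1 - v^2/c^2)
        return earth_years * math.sqrt(1 - v_fraction_of_c ** 2)

    for v in (0.5, 0.9, 0.99, 0.999):
        print(f"v = {v}c: 50 earth years -> {ship_years(50, v):.1f} ship years")

At 99.9% of light speed, 50 years on earth works out to about 2.2 years on the ship – hence the old twin and the young twin.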

    The point is that Mr Einstein’s theory does not allow for “travelling backwards” in time.

    Again, “Star Trek” (TOS) became famous for slingshotting the Enterprise around the sun, and going faster than the speed of light (“light speed break-away factor“) to travel backwards in time.

Of course, if you have “suspended disbelief” and have accepted that the warp drive engines can routinely achieve multiples of the speed of light – then the “Star Trek” writers are just engaged in good storytelling, and again, interesting characters and good stories have always been the best part of the “Star Trek” universe.

    btw: the most plausible “time travel” in a TOS episode was “The City on the Edge of Forever” – that is the one with Joan Collins for casual fans (season 1 episode 28). It tends to be listed near the top of “best episode” lists for TOS.

I seem to remember someone asking Stephen Hawking about the possibility of time travel “way back when” (btw: Mr Hawking was a Star Trek fan and has the distinction of being the only “celebrity guest star” to play themselves – TNG Season 6, episode 26 – Data on the holodeck playing poker with Albert Einstein, Isaac Newton, and Stephen Hawking) – as I remember it, Mr Hawking’s response was something along the lines of “if you could travel faster than the speed of light, then time travel might be possible”

    Of course that is probably the same as him saying “… and it is also possible that monkeys might fly out of my butt …” – but you know, it is entertainment not “hard science.”

    While I’m at it

The “time traveler” in H.G. Wells’ “The Time Machine” explains traveling in time as travelling in another “dimension” – since humanity had created machines to let us travel in the other dimensions (up, down, side to side – i.e. length, width, height) then travel through “time” would just require a new machine.

    That “time travel device” just becomes an element of good storytelling – i.e. best practice is to tell what it does and NOT spend a lot of time explaining HOW it works.

Doctor Who and the TARDIS (“Time And Relative Dimensions In Space”) get a short explanation when required – and they added the ability to travel instantaneously through time AND space, probably both as a storytelling device and as a nod to Mr Einstein’s “space-time” concept.

“Planet of the Apes” (1968) used the basic “twin paradox” idea – but then “something happened” and rather than landing on a distant planet they end up back on earth.

In the 1970 sequel “Beneath the Planet of the Apes” the “rescue team” has followed the first group – and this time they say they were caught in a “tear in the fabric of space time” or something. Of course they conveniently land in the same general area as the first crew and everyone speaks English.

There were three more “Planet of the Apes” sequels – they travel back in time in “Escape from the Planet of the Apes” (1971) – I don’t think they bother to explain how they got back, but I haven’t been able to sit through it recently.

    I think “Planet of the Apes” (2001) was a victim of a writer’s strike – it isn’t particularly re-watchable for any number of reasons – not least of which is that they jump through illogical hoops to have Mark Wahlberg end up back in the present with a monkey Lincoln memorial.

The Andy Serkis as Caesar “Planet of the Apes” trilogy doesn’t bother with the “time travel” trope – substituting an “engineered virus” that (unintentionally) kills most of humanity and makes the surviving humans less intelligent.

    “The Final Countdown” (1980) has an aircraft carrier go back in time to 1941 just before the attack on Pearl Harbor. The movie revolves around the “can/should they change history by intercepting the Japanese attack on Pearl Harbor” question – you can watch it for free on Tubi.com if interested.

    This time around the time travel is a “finger of God” sort of thing – as I remember it a mysterious storm just appears and the 1980’s era aircraft carrier ends up in 1941. I’ll just point out that it is “plausible” but won’t spoil the ending …

Fred Ward, who died in 2022, had a long career as a “character actor.” He tried to make the move from “grizzled nice guy co-star/sidekick” to “leading man” multiple times in the 1980’s. He appears destined to be remembered as Kevin Bacon’s co-star in “Tremors” (1990) – he was one of those “instantly recognizable face but you might not be able to recall his name” actors.

    Mr Ward starred in several movies that qualify as “cult classics” (i.e. well made movies that didn’t find a mass audience at the time of release but continue to be popular years later). Mr Ward’s “time travel” movie was 1982’s “Timerider: The Adventure of Lyle Swann” – which isn’t available streaming, but has a blu-ray release which probably illustrates the “cult classic” concept better than anything

As I remember it (I haven’t seen the movie in years) – Mr Ward is a dirt bike rider that accidentally gets sent back in time (to the 1870s American West) by a “secret government experiment” of some kind which he accidentally stumbles into — the memorable part is that they manage to slip in a version of the classic time-travel “grandfather” paradox.

    Normally the “grandfather” paradox is similar to “Back to the Future” where the time traveler does something to keep their ancestors from meeting/reproducing/whatever. “Timerider” is the other option – where he ends up being his own great-great-grandfather – enjoying the movie doesn’t revolve around that point and it looks like the movie is still being sold on blu-ray in Italy and Spain, so …

The whole “time travel machine” trope got called out for its inherent silliness with “Bill and Ted’s Excellent Adventure” (1989) – the movie is funny on multiple levels, and it is safe to say it skewered the whole “travel in time and change events” movie genre — “Bill and Ted’s Bogus Journey” (1991) takes the joke even further but it suffers a little from “sequel-itis” …

    I’ll finish with a nod toward “Land of the Lost” both the 1974-1977 kids tv show and the 2009 Will Ferrell movie – where they “slip through” rips in time or something.

    I suppose the “science” behind the movie/series is similar to “Indiana Jones and the Dial of Destiny” where it is implied that there are “rips in time” or something that can be predicted and then travelled through.

Yes, I am ignoring the various “multiverse” shows out there – simply because they are just modern “deus ex machina” plots. Worth noting because they reflect humanity’s desire to be able to go back and “fix” the past, but they quickly wore out their novelty …

  • Mr Warhol and photography copyrights

    Since Andy Warhol died in 1987 – the Supreme Court was probably/technically ruling against his “estate” in their recent decision.

    Mr Warhol had used a photograph of Prince (“The Artist”) in a 1980’s painting (“Orange Prince”) – money changed hands among the concerned parties back in the early 1980’s so there wasn’t any problem until Prince Rogers Nelson died in 2016 and the Warhol image was used in some publications

the crux of the issue was that back in the early 1980s Mr Warhol had paid for “one time use” of the photograph – SO was using Warhol’s painting in a magazine 30 years later a violation of the photographer’s copyright?

Obviously the issue was convoluted enough that it ended up before the Supreme Court – so I won’t try to summarize it here – short form: the Supreme Court said “yes, the usage violated the copyright holder’s rights”

    Wait, what about Prince…

    now, ordinary folks might ask – what about the estate of “Prince Rogers Nelson” shouldn’t they have been involved somehow? well, again, the case was about COPYRIGHT – so it is the COPYRIGHT holder that was seeking redress

    SO when Prince’s music was played (assuming his estate still owned the copyrights) – THEY got paid, but the copyright holder of the photograph was/is the photographer.

    Just like in the music industry where “every time the music is played, SOMEONE gets paid” because of copyright – in the photography business “every time the picture gets used, someone gets paid” i.e. the copyright owner.

    Of course there is also the concept of “work for hire” – e.g. when Perry White sent out cub reporter Jimmy Olsen – the pictures Jimmy took belonged to the newspaper because they were paying young Mr Olsen to do a job.

Peter Parker on the other hand was a freelance photographer for J. Jonah Jameson at the Daily Bugle – so Mr Parker got paid for his photographs and probably retained the rights to his work.

    I suppose if we could find a real copy of the Daily Planet the copyright notice on a picture Jimmy Olsen took would say “Copyright YEAR Daily Planet publishing” but a real copy of the Daily Bugle with a picture from Peter Parker would say “Copyright YEAR Peter Parker”

In either case Superman or Spider-Man weren’t getting paid because they were performing in the public arena. Maybe they would have received a “session fee” if they arranged a time and intentionally posed in front of the camera – but you get the idea …

    Public Photographs

    just in general if you are a “public person” doing your thing “in public” then photographs taken of you “in public” are the property of the photographer – e.g. this is how “paparazzi” make a living

    if you go to a Taylor Swift concert and take pictures of the performance – then YOU own the photographs and can do what you want with them.

    which means that it is possible for an artist to violate the copyright law by using a picture of themselves without the permission of the photographer. It happens on a regular basis.

of course there is also the “Dave Chappelle” solution where the performer can prohibit phones/cameras at the performance as a condition of entry — but that is an additional expense and MOST of the time performers want the publicity when they are “performing.”

    when they AREN’T performing is when the “negative” side of fame becomes an issue – but that is a different subject.

    Copyright

    The point of having “copyright laws” is to allow artists to profit from their creative work.

    There are folks out there that will argue that copyright laws “stifle creativity.” Well, you don’t need to be a student of history to see through that strawman argument.

Consider Mr Shakespeare – writing 400+ years ago before “copyright laws” – how did he make “money?” Well, his “acting companies” had “benefactors” – which was why they were the “Lord Chamberlain’s Men” and then when King James I became their benefactor in 1603 they became the “King’s Men.” They also received money from performing productions/ticket sales.

    The idea of “publishing rights” back then was non-existent. The moveable type printing press had only made it to Europe in 1455 – so obviously “copyrights” were not an issue.

    Which means there were no “professional writers” back then – maybe a lot of “playwrights” and folks that had time to “write” as a hobby, but it was not possible to “make a living” as a “writer.”

    “If you would not be forgotten, as soon as you are dead and rotten, either write things worth reading, or do things worth writing.”

    Benjamin Franklin

    It should be pointed out that Mr Franklin made his fortune as a PRINTER. Ol’ Ben was obviously a gifted writer – but he made money by printing and selling his writing – so he understood the need for “copyright laws” as a profit incentive to creatives.

  • The GREATEST movie of ALL TIME

    well, the obvious problem with the title is “how do you define ‘great’?”

    of course everyone that has answered the question has been “correct” – “greatness” is determined by individual tastes. Consider that the credit for creating the “modern summer blockbuster” belongs to “Jaws” (1975) – which was the “greatest box office success” of all time until “Star Wars” (1977) – but if we did a survey of “movie critics” my guess is that neither movie would be in the top 10 if the question is “Name the greatest movie of all time”.

    Box Office

Using “raw box office” as a measure of greatness has obvious problems. Most obvious is that “ticket prices” have increased greatly – e.g. in 1940 you could buy a movie ticket for $0.25 – a quarter of a dollar; in 2023 it is considerably more.

If you want to use “ticket sales” as a measure of “greatness” OTHER problems pop up. In this case “modern movies” expect to make MOST of their ticket sales in the first two weeks of release, will probably not be in wide theatrical release after four weeks, and will probably be available for “home consumption” (in the form of a digital download) within a few months of release.

    Before the mid 1980’s “home consumption” of a “major movie” would have been to show it on network television. There were “annual events” for some traditional favorites – “The Wizard of Oz” (1939) was shown annually from 1959 to 1991, “The Ten Commandments” (1956) is still shown annually around “Easter” Time.

Once upon a time “Gone With the Wind” (1939) had been shown in the same theater for decades – so it is the hands down, never gonna be beat “ticket sales” champion movie of ALL TIME.

    Awards

    Remember that ANY “awards show” is inherently biased. The “Academy Awards” in particular are an “industry insider” group that – for the most part – gives out awards to other “industry insiders.”

    SO I notice the Academy Awards when they come out – but I do not consider them a “measure of greatness.” I’m not saying the awards are “not important” – certainly they are important to the folks that get nominated and/or win. I’m just pointing out that the awards are “voted on” by some group and are NOT useful for comparative purposes – e.g. if “movie A” won an Oscar but “movie B” did not win any awards does it automatically mean that “movie A” is BETTER than “movie B”? Nope.

    Categories

    Is being “ground breaking” the measure of “greatness?” “Birth of a Nation” (1915) helped create the “cinematic vocabulary” we take for granted (but the ending is obviously ‘problematic’) – “Citizen Kane” (1941) also broke ground on “camera movement and special effects” (which is why the ‘movie critics’ tend to love Orson Welles in general and “Citizen Kane” in particular) – “Casablanca” (1942) is in a category all its own but I’ll hold it up as an example of “script greatness.”

to be fair (and for convenience) – there need to be multiple categories – maybe a “greatest movie BEFORE ‘television’” category (because the “studio movie” standards had to be raised when folks could get “basic entertainment” for free over the air – e.g. a lot of those “old movies” from the 30’s and 40’s feel like “television productions” in terms of length and content – e.g. “Frankenstein” (1931) and “Bride of Frankenstein” (1935) are around 1 hour each – watching them back to back tells a complete story)

then we need to have a “greatest movie under the ‘studio’ system” AND “production code” category – if you are thinking “production code? what is that?” – well, there was a time when ALL movies were “general admission” – the MPAA didn’t come up with the “rating” system until 1968, BEFORE 1968 the “Production Code” was a form of self-censorship that put restrictions on “language and behavior” (e.g. try finding a “major U.S. movie” from before 1968 with profanity or nudity – I always love to point out “The Dirty Dozen” (1967) as working very hard to not use profanity)

    oh, and then there are the “not in English movies” – “Breathless” (1960) is a great movie (French crime drama). Akira Kurosawa’s work (Japanese director) had a HUGE influence on American cinema – e.g. even casual “western fans” have probably heard that “The Magnificent Seven” was based on Kurosawa’s “Seven Samurai”

    Personal Bias

    Since I was young and impressionable in the 1970’s the work of Steven Spielberg, George Lucas, and Francis Ford Coppola has a special place in the “nostalgia chest” – intellectually I can say that “Schindler’s List” (1993) is Mr Spielberg’s “greatest artistic achievement” while still saying I love “Jaws” and “Close Encounters of the Third Kind”.

    The Godfather and The Godfather part II are great movies – but my personal favorite “Coppola” movie is “Apocalypse Now.”

    As for Mr Lucas – “American Graffiti” (1973) is still a lot of fun to watch (and it foreshadows the “story telling” techniques used in the “Star Wars” franchise – at one level you can say that Mr Lucas was exploring the relationship between “man and machine” in both movies). “The Empire Strikes Back” is arguably a “better” movie than “Star Wars” or “Return of the Jedi”, but c’mon they didn’t even blow up a Death Star!

    No discussion on “big budget blockbusters” would be complete without mentioning James Cameron – I was blown away by the 3D effects in “Avatar” (2009) and “Titanic” (1997) was so full of special effects that people don’t think of it as being full of “special effects” (e.g. no, they did not build a replica of the Titanic – it was mostly “computer generated images” (CGI) – and that CGI was part of why it was the “most expensive movie” of all time back in the 20th Century).

    BUT my favorite “James Cameron” movies are “The Terminator” (1984) and “Aliens” (1986) – as always YMMV

  • Marketing and Propaganda

    In its best form “marketing”/”advertising” is just “information”

    If you have a great product that does “whatever” the best use of your “marketing” budget is to build awareness of the products benefits among folks that need to do “whatever it is that your product does”

    e.g. say you make beer or running shoes – and your goal is to continue to sell beer or running shoes.

    Spending time educating potential customers about the benefits of your beer or running shoes is gonna be much more effective than – I don’t know, randomly pushing a social agenda.

    e.g. The “craft beer” industry got its start by educating folks on how “good beer” should taste. The “athletic shoe” business had to educate/inform how their shoes improved performance.

    Leadership

    This is where competent leadership would say “hey, we are NOT a social advocacy company — we sell beer (or running shoes) so we are gonna concentrate on making the best beer (or running shoes) and leave the social advocacy for other folks”

    That doesn’t mean your company can’t be a “force for positive change” — i.e. being a “good corporate citizen” is always “good business.” It just means that your company has a product to sell and that shouldn’t involve “propaganda.”

    Donating to charities or allowing employees “personal time” to volunteer will have intangible benefits — but taking a “corporate stance” on “controversial” issues with marketing decisions is a pointless gamble.

    Studio System

    For most of the 20th century the above would PROBABLY have qualified as “corporate dogma” for MOST large corporations.

    Back in the old “movie studio system” where actors were “under contract” – the studio made an effort to control the public image of “movie stars” and wouldn’t let the actors express “controversial opinions.”

    why? because folks on both sides of the issue were potential customers – an actor expressing an opinion would (probably) offend SOMEONE – and that would mean “lower sales”

    Yes, they were selling an illusion, but the point was that the studio was NOT in the “advocacy business” – they were selling “escapism”/”entertainment”

    Michael Jordan pointed out that he intentionally was NOT “political” because “Republicans and Democrats both buy shoes” (or something along those lines).

    Freedom of Speech

    The modern business of sport is inherently tied to the “endorsement deal.” I don’t know if anyone can truly claim to have “invented” the idea of celebrity endorsements – i.e. the birth of “mass media” and “marketing” go hand in hand.

    Babe Ruth was the best baseball player in the world (and an all time great) at a time when “mass media” was shifting from newspapers to radio. Baseball was helped by radio, which meant that Babe Ruth’s value as a “celebrity endorser” was helped by radio. BUT while the Babe endorsed everything from “cereal to Girl Scout cookies to soap” I’m not sure if he made more money from “playing baseball” or from endorsements.

    Arnold Palmer (professional golf great) on the other hand made much more money from “endorsements” than he did from winning golf tournaments. This time Mr Palmer benefited from the growth of “television.”

    If a “modern sports star” was looking for an “endorsement” role model – Mr Palmer is probably hard to beat. I’m not a golfer – but I still think of his commercials for a particular motor oil when I’m buying oil.

Of course the “products” that Arnold Palmer was selling were “golf” AND “Arnold Palmer” – I’m sure he had opinions on the controversial subjects of his day, and I’m sure he contributed to multiple charities – he just kept those opinions separate from his “golf professional image.”

    In 2023, I’m not opposed to an athlete expressing an opinion on “controversial subjects” – I just prefer that they have an educated opinion on the subject BEFORE they comment.

Of course then “product endorsements” might be impacted by an athlete expressing their opinions. This withdrawal of “corporate approval” is NOT an attack on “freedom of speech” – again, the “company” needs to remember that it is in the business of selling a “product” NOT producing propaganda.

    You keep using that word …

    Propaganda is “ideas, facts, or allegations spread deliberately to further one’s cause or to damage an opposing cause” — so is “propaganda” a form of “marketing?”

well, maybe – “propaganda” USUALLY has a very negative connotation. Propaganda is biased and “selectively true” – i.e. trying to present YOUR idea/product in the best possible way – which might also be true of “marketing.” BUT propaganda allows for “allegations” meant to “damage the opposition” – which implies (at best) unethical behavior, which tends to be counterproductive in the long term.

    Again “Good marketing” starts with a quality product/service. The goal is to educate folks on how YOUR product can help them solve a problem NOT convince them that your competitors are evil.

Maybe if you have an inferior product/idea then selling “fear uncertainty and doubt” (FUD) is your only option — but wise leadership would better serve a company by “repositioning” the product or developing a better product/idea.

    Marketing is NOT Manipulation

    My point is that “marketing” should equal “education” but NOT “manipulation.”

If a group of “corporate executives” is sitting around thinking “We have the most popular product in the land. We have so much market share it is hard for any new marketing campaign to make a BIG difference one way or the other – you know what we should do? How about we hire a ‘spokesperson’ to advocate for a ‘controversial’ subject!” – well, it is probably time to get some new “corporate executives.”

    I cannot think of ANY product at ANY time that has been so popular that the parent company could try to “force feed” a radical agenda to their customers without losing a significant market share.

    If a company has “monopoly power” then their “marketing” doesn’t matter – but if there are multiple competitors and the cost of switching is just “I’m never buying that brand again – I will buy this other brand readily available from a competitor that hasn’t insulted my intelligence/integrity” – well, you will probably get “new executives” when the ones that made the terrible marketing blunder get fired

  • Profit Margins

    If a company is “profitable” over a long period of time that PROBABLY means it is “well run” or “managed properly.”

    Of course we need to define “long period of time” — in a healthy economy companies will come and go just by the natural cultural shifts and technological advances.

e.g. Thirty years ago multiple companies were making a nice profit from selling “long distance” phone service. Then the “interweb” exploded and “cell phones” became ubiquitous and I’m not sure anyone sells “long distance” phone service anymore.

    Prices

    the price of whatever “product/service” that “profitable company” makes is gonna be influenced by a wide range of variables

A company can’t “lose a little money” on each transaction and expect to stay in business – so MOST reasonable people can appreciate that the idea of “profit” is not evil. However calculating acceptable “profit margins” (in the real world) is harder than plugging numbers into a formula (something like “profit margin = (revenue – cost)/revenue”)
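
For what it is worth, that formula in code form – a minimal Python sketch with made-up numbers, just to show why “losing a little money on each transaction” never works out:

    def profit_margin(revenue, cost):
        # margin expressed as a fraction of revenue
        return (revenue - cost) / revenue

    print(f"{profit_margin(100.0, 80.0):.0%}")   # 20% - profitable
    print(f"{profit_margin(100.0, 105.0):.0%}")  # -5% - losing a little on every sale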

First – the sector/industry in which the company is competing influences the idea of acceptable “profit margins.”

e.g. the “oil industry” has to include some % for finding/acquiring “more oil” – the “lumber industry” has to include some % for “planting trees” – the “pharmaceutical industry” has to include some % for “research, development, and approval” of new drugs

    Second – “marginal utility” comes into play and really messes with “prices.”

    How much “the market” is willing to pay for a product is influenced by how much of that product they “need.”

    Remember there is a difference between “need” and “want.” Real “needs” are things like food/water/shelter. Needs are (relatively) limited. “Wants” on the other hand are unlimited – but will vary wildly between individuals.

    e.g. an individual that is hungry, cold, and lost in the wilderness would be willing to pay much more for a “plate of beans by the fire” than someone that is living in a nice warm house with plenty of food.

    The “value” of diamonds and water are another classic example – if you are dying of thirst, you will “pay” for water and (probably) aren’t concerned with diamonds of any quality. But if you have all the water you need (you know, it tends to fall out of the sky in certain places) – then “shiny things” like diamonds are worth a lot more.

    Cost

    Of course just because “water” can be obtained for free – that doesn’t mean there isn’t a “market” for water. The problem with water is that it is easily contaminated. Historically “dirty water” has been the cause of a LOT of epidemics – which is another subject.

    “Water” may be obtained for “free” – but “clean potable water” doesn’t happen by accident. SO “bottled water” is its own little industry. The larger point being that the “product cost” is not directly linked to the “product price.”

    The same would be true for diamonds – i.e. raw diamonds require some additional work to become “jewelry.”

    SO with any product the company selling the products has other “production costs” than just “materials.”

    If those additional costs are managed poorly – then a product that costs $0 could be sold for “$large number” and the company might NOT be “profitable.”

    OR if those additional costs are managed properly – then the “total cost of production” might be lower so the “product price” might be lower AND the company would be “profitable.”

    Of course it is also possible for a company to have “record profits” despite poor management — but those tend to be short lived “bubbles.”

    As for the stock market: what the “stock market investor” wants to see in a company is “slow and steady” long term growth. Meanwhile the “stock market speculator” is looking for “wild swings” in profits.

The “intelligent investor” will do more “investing” than “speculation” – I think someone won a Nobel Prize in economics for pointing out that “diversification” was a good thing (Harry Markowitz shared the 1990 prize for his work on portfolio selection) – which is basically saying that a little “speculation” is a good thing for “long term profits.” This is why “investment professionals” will talk about “risk appetite.”

    In an ideal case our ‘well managed company’ would see slow and steady profit growth year over year. Each year may not set a new “record” for profits, but the graph line would be sloping upwards.

Meanwhile that “hot new company” in an “emerging industry” PROBABLY won’t show profits at all for the first few years – but that doesn’t mean an investor shouldn’t risk a small % …

SO “diversification” is going to look different for different investors at different points in their lifetime – but the “big idea” is that (from a financial planning point of view) you should never put EVERYTHING into anything …

    Government intervention

    My internal alarms start going off anytime a “government official” starts talking about a company/industry having “record profits” and how this isn’t “fair” to the public.

    Well, we have the “history of socialism/communism in the 20th Century” to point out the dangers of “centrally planned economies.”

    If you want to argue that the USSR and Maoist China were not “true communism” – fine. I understand the difference between the “speculative economics” that Karl Marx wrote about and the “real world implementation” of tyranny done under his name – that isn’t the point.

The point is that any human government intervention into individual sectors of the economy tends to be counterproductive. Modern economies are vast and complex and change at a pace faster than human government can effectively regulate.

I can appreciate the goal of “fairness” – but the problem is human nature and “information flow.” The purpose of government is NOT to make society “fair.” That simply is not possible with human government.

    I’m not questioning the “intent” of attempts at socialism – I’m pointing out the failures of trying to arbitrarily change human nature and the problems of “scarcity.”

Mr Marx expected “capitalism” to solve the “scarcity” problem – and then “communism” would happen naturally. I tend to disagree with his hypothesis that if all of humanity’s basic needs were met then we would live together in peace and harmony – again, “human nature” comes into play.

    But it is pretty to think that Mr Marx wasn’t completely wrong (but again, 20th Century history isn’t on his side)

    The most terrifying words in the English language are: I’m from the government and I’m here to help.

    Ronald Reagan

    All of which means that by the time “government reacts” the problem has probably changed and any government regulation is going to be pointless and ineffective.

    This is why “do something” legislation after a “disaster” might actually make the root problem worse — and “don’t make things worse” is probably a good goal for any human government.

    Of course being able to tell the difference between “we must act now” and “it is better to do nothing” is VERY hard. It does illustrate why “politicians” tend to be despised and true “statesmen”/leaders are few and far between …