Once more unto the breach …

  • Vince Lombardi – Speech 1970

    SO “back in the day” (in another lifetime, in a small town in southwestern Ohio when I might have described myself as an “athlete”) a high school teacher gave me a mimeograph (“ditto sheet”) copy of a speech by Vince Lombardi.

    Yes, that is the “Vince Lombardi” for whom the “Lombardi Trophy” is named.

    The speech came to mind because I used the “winning is a habit” line (again).

    Mr Lombardi gave the speech in July of 1970. I’m guessing at the time of the speech he was planning on coaching in the NFL that year, but he died in September 1970 (colon cancer, he was 57).

    The mimeograph copy I had was PROBABLY a transcription of the speech. From a “document” point of view that means that “paragraph breaks” were a little arbitrary – i.e. the “ditto sheet” version was a couple VERY large blocks of text.

    The full speech was probably around 50 minutes (5,000 words, “paid after dinner speech” length) – again, just me guessing after spending 10 years teaching/getting paid to talk.

    I did a little more light editing this morning – changed the font, increased the font size for readability, added more paragraph breaks. The U.S. Copyright Act of 1976 started automatically granting “copyright protection” to any and all “creative works.”

    Before 1976, to get copyright protection you needed to place a “copyright notice” on the work in question – which means I’m 99.99% sure THIS speech is in the public domain.


    I want to talk a little bit about attaining a goal, a success, what I think it is. I want to say first that I think you’ve got to pay a price for anything that’s worthwhile, and success is paying the price. You’ve got to pay a price to win, you’ve got to pay a price to stay on top, and you’ve got to pay a price to get there. Success is not a sometime thing; it is an all the time thing. In other words, you don’t do what is right once in a while but all of the time. Success is a habit, just like winning is a habit. Unfortunately, so is losing. So it has been the American zeal, gentlemen, to be first in everything that we do and to win and to win and to win.

    Vince Lombardi

    Random thoughts

    There is a lot of “meat” in the speech which is still valid in the 21st century.

    Vince Lombardi often gets depicted as “legendary football coach” standing on the sidelines and yelling. Leadership styles are obviously influenced by personality – and Mr Lombardi was certainly “explosive.”

    BUT his success did not come from “yelling on the sideline.” We could fill up a small library with books “related to” Vince Lombardi – so he made that transition from “Hall of Fame coach” to “cultural icon” at some point.

    I’ve read a few Lombardi biographies so some random thoughts:

    • He was an assistant coach at the U.S. Military Academy West Point when they were still a national football power – under “legendary” coach Earl “Red” Blaik
    • coaching in the NFL was (probably) a second choice – i.e. there were rumors that “major colleges” at the time wouldn’t want to hire an Italian head coach – I’m not making any accusations, but it was a different time.
    • it is easy to forget that “college football” was more popular than the NFL “back then” – the rumor is that Earl Blaik encouraged Vince Lombardi to take an assistant coach job in the NFL
    • Woody Hayes (as the story goes) called Vince Lombardi the best coach he ever met – Mr Hayes is another example of “great football coach” whose “sideline antics” got a lot of press, but had little to do with his success (but a lot to do with his “fall from grace” – umm, ’nuff said)
    • The NY Giants won the NFL championship in 1956 – with Vince Lombardi as offensive coordinator and Tom Landry as defensive coordinator — Mr Landry would win a few games (and 2 Super Bowls) as head coach for the Dallas Cowboys, ’nuff said
    • There was a LOT less money floating around the sport of football “back then” – pro football was NOT a “full time”/year round job for a lot of players from that era – but I wouldn’t let the lack of money romanticize that time as some sort of “when the game was pure” era …
    • from an “armchair amateur historian” point of view – the fact that OTHER coaches considered Vince Lombardi a great coach says a lot more than any win/loss record. I’m sure they didn’t all LIKE him, but they RESPECTED him …

    Would Vince Lombardi be successful in 2024?

    Short answer: Yes.

    The game is obviously very different. There is a lot more competition, players make a LOT more money, but (just me guessing) Vince Lombardi would have adjusted.

    Bill Parcells had a very “Vince Lombardi” coaching style and I would describe (waiting to be inducted into the Hall of Fame) Bill Belichick as another example of a “Lombardi like” approach to the game.

    Again, “sideline personality” is an increasingly small part of the game of football. e.g. You have to pay the price to win.

    Sports Psychology

    Another famous “Lombardi quote” (when he was coaching in Green Bay) was that he wanted players to place the Green Bay Packers “third” on their list of priorities.

    What should be first and second on the list? “God” and “family.”

    This is important as the “balance point” to another famous “Lombardi quote”: “Winning isn’t everything. It is the only thing.”

    From a “practical sports psychology” point of view – those concepts meet at a point where “playing performance” is very high.

    i.e. “football” is important, but not the REASON for existence. Relationships OFF the field are MORE important than on the playing field – but those “on the field” duties shouldn’t be neglected.

    Lose a football game and you shouldn’t be happy, but it isn’t the end of the world. The same applies to “winning a game” – yes, enjoy when you win, but it isn’t permanent.

    The “desired performance state” is where the athlete can go at “full speed” but still be in control. That involves “being in the moment” and not worrying about past OR future possibilities.

    Mistakes are going to happen – but don’t let the “last play” (good OR bad) get in the way of the “current play.” e.g. ok, you messed up, don’t spend time apologizing, worry about getting the next one right – there is plenty of time AFTER the game to dissect what went right/wrong

    i.e. save the “After Action Review” for AFTER the action …

    Of course an “elite athlete” doesn’t achieve that without a lot of work/practice. They can’t just “show up” and expect to win.

    e.g. You have to pay the price to win.

    If there are “life lessons” to be learned from “sports”, then that is still a big one …

    Success consists of getting up just one more time than you fall.

    Oliver Goldsmith

    Management Theory

    There is a lot of talk about how “leadership theory” changes between generations.

    Tom Landry once said that when HE played the game, if the coach had told them to lay on the ground while coaches kicked them in the stomach, THEY would have done it.

    The point being that “back then” players didn’t dare question coaches. Of course “coach” was supposed to have a reason for doing what he did – but he wasn’t expected to share that reason with players.

    THAT type of “centralized command and control” was the norm when Vince Lombardi was coaching.

    Obviously trust has to be earned – and no, I don’t think Tom Landry had coaches kicking him in the stomach. I’m guessing that Tom Landry had players asking him “why are we doing this?”

    Of course “American History” is kind of centered on “questioning authority” – but that is a different subject.

    Random thought: One of the “colorful” personalities in American Revolutionary history was Inspector General Friedrich Wilhelm von Steuben who CLAIMED to be a Prussian officer. He wrote the “Regulations for the Order and Discipline of the Troops of the United States.” The rumor is that General von Steuben complained about “American troops” always wanting to know “why”/asking questions – i.e. as opposed to the obedience of Prussian troops …

    (… btw: the “von” part of his name implies that he was an “aristocrat” – which would have been expected of an “officer” in Prussia/Europe – BUT he probably wasn’t. Like I said, he was a “colorful” personality …)

    MEANWHILE …

    SO Vince Lombardi’s “leadership style” was typical for his generation — but again, he was a “teacher of football.” His view of human nature was that humans are naturally lazy (in general) and need to be “encouraged” to work.

    Of course I’m sure he “encouraged” individual players differently – recognizing that the way a “rookie” needs to be “encouraged” is different than the way a “veteran player” needs to be “encouraged.”

    Putting a label on his management style isn’t important – the grand “management” concept is ALWAYS that “management equals communication.”

    “Basics of leadership 101” in the 21st Century PROBABLY starts off with the concept that “folks” are going to be better “employees” if they understand the “why” of their job.

    From an “amateur armchair historian” point of view – I would argue that understanding the “big picture” has been the ideal/goal for MOST of human history. It was only after the industrial revolution allowed “management” to “deskill” labor by extreme job specialization that phrases like “that isn’t my job” became possible.

    Random thought: IF I was ever shown a “job description” for a job, there was always an “other duties as assigned” line – which basically meant my job was to do what they told me to do.

    THAT concept might be a good dividing line between “skilled” versus “unskilled” labor – i.e. if they can train your replacement in a short amount of time, you are VERY replaceable.

    How do they learn the “why?” Well, obviously someone needs to teach them – and making sure that happens is “management.”

  • Dumbledore vs Gandalf

    A “social media” post had a poll going about who would win between “Professor Albus Dumbledore” (from the “Harry Potter” books) and Gandalf the grey/white (from The Lord of the Rings – LotR).

    Well, I didn’t bother voting in the poll – I think Dumbledore was winning – but that isn’t the point.

    Polls

    The “winner” of ANY poll is going to be based on the survey/poll group. This particular poll is fun because it allows “fans” to be “fans” – i.e. fans of the Harry Potter books are obviously going to choose Dumbledore, and fans of LotR are obviously going to choose Gandalf.

    Short answer: my bias is for Gandalf. BUT there are assumptions to be explained.

    Movies vs Books

    The Harry Potter MOVIES had the luxury of the author still being around. Ms Rowling didn’t write the screenplays – but she provided “assistance”/input to make sure the movies basically kept to the plots of the novels. The point being that “Dumbledore in the books” is pretty much the same as “Dumbledore in the movies.”

    J.R.R. Tolkien died in 1973. Professor Tolkien sold the “film, stage and merchandising rights” to United Artists in 1969. The “internet version” of the story is that he sold the rights because of inheritance tax issues. I have no idea what the deal was – but it sounds like he made a good decision – he got £104,000 (adjusted for inflation around £1.2 million) AND secured royalties for any future productions.

    The “book to movie” translation always comes with “storytelling issues.” What works in “book” can be hard to bring to the screen. Which means there are major differences between “Gandalf in the books” and “Gandalf in the movies.”

    Of course Peter Jackson’s LotR is great – and the “core story” is intact. Both the movies and the books tell an epic story of a battle between good and evil.

    The BIG difference between LotR book and movie is “character arcs.” Professor Tolkien was writing an “epic” with “epic heroes” – you know, big, bold, and confident. While Peter Jackson tried to make the characters a little less “big, bold, and confident” – which of course also allows the actors to “act” …

    SO Gandalf in the movies is not as “powerful” as Gandalf in the books. Ok, Gandalf is obviously not “weak” in the movies – however, the character is thousands of years old; he is NOT human. “Wizards” in LotR are “created beings”/“agents of the divine.”

    Think of the end of The Fellowship of the Ring where Gandalf fights the Balrog. In both movie and book Gandalf emerges victorious. In the movies he dies and is reborn as “Gandalf the White,” BUT in the books he doesn’t die. The implication in the books is closer to “leveling up” – he gets promoted, not “reborn.”

    Man vs the Divine

    For what it is worth: I’m not sure that wizards in LotR can “die.” They have a physical form that can be destroyed e.g. Saruman at the end of The Return of the King (book) – but is that a “permanent death” or just a temporary inconvenience?

    The Iliad (another epic) comes to mind. Hector (the hero of Troy) vs Achilles (the Greek hero) isn’t a fair fight in the original version – i.e. Achilles is part human and part “divine.” SO “mortal vs divine being” is never going to end well for the mortal.

    The “movie” version of the Iliad (Troy – 2004) includes a great fight scene between Achilles (Brad Pitt) and Hector (Eric Bana) – but when Hector meets Achilles in the original text, Hector runs, and Achilles chases …

    What made the ancient Greek “gods” divine was their long life. Which brings up another point – IF “being” is eternal and they get into a disagreement with “mortal” – then all the “eternal” needs to do is wait for the mortal to shuffle off the mortal coil. That is kind of a theme running through the Iliad – but I’m wandering off on a tangent …

    It’s Time!

    Dumbledore vs Gandalf as a contest between skilled professionals (or chess/checkers/pick a game) might be a toss up. Neither is “all powerful” – Dumbledore is a human with “magic powers” while Gandalf IS a “magic being.”

    Which is probably why the poll caught my attention in the first place.

    Of course Dumbledore can and does die – so in a contrived “battle to the death” Dumbledore doesn’t have a chance. e.g. Gandalf could go away for 500 years, come back to visit Dumbledore’s grave, and say “I win!”

    Voldemort vs Sauron is also a no-contest for the same reasons. What would happen if Voldemort managed to get hold of the One Ring? Sauron feared someone using THE ring against him, but would even an exceptionally powerful mortal have been able to control the ring, or would he simply become a more powerful Gollum?

    The Ring of power might extend life but The Odyssey comes to mind. Odysseus (the man who gave us the Trojan Horse) had the opportunity to stay with Calypso (a nymph/minor goddess) – she even promised him eternal life. Odysseus desperately wanted to get back to his (mortal) wife – but also implied is that he was wise enough to see that unintended consequences are inevitable. i.e. “eternity” in a mortal body that continues to age wouldn’t be any fun (e.g. Tithonus)

    Anyway, if Dumbledore and Gandalf actually met they would probably play chess, drink wine, and swap stories about little folk – not have a fight to the death …

  • value, price gouging, and monopolies

    Every “introduction to economics” book will have a chart showing the relationship between “supply” and “demand.” Pointing out that as supply (of a manufactured product) goes up, unfulfilled demand goes down probably sounds obvious, but the relationship works both ways — i.e. if “demand” goes down, then “supply” will decrease.

    However the relationship between supply and demand quickly gets “weird” in the real world when talking about specific products. The problem isn’t the relationship between “supply” and “demand” – no, the problem in “real world” applications is that there tend to be “alternative choices.”

    For a real-world example just go to the “cereal aisle” of any “supermarket” in these United States. A well stocked supermarket will have abundant choices of various “breakfast cereals” – IF a specific type/brand isn’t available, then there are obviously alternative choices.

    Of course ONE of the “alternative choices” is always “don’t buy breakfast cereal.”

    Value

    With that “economics 101” review out of the way, the “big picture” of “supply and demand” always includes a third variable: “price.”

    Again, the “economics 101” textbook might have a dotted line pointing at the intersection of the “supply” and “demand” lines marked as “optimal price” but “real world” product pricing is REALLY weird/unpredictable.

    The problem here is that “value” and “price” are not ALWAYS the same. That econ 101 text MIGHT have a chart labeled “price elasticity” — and maybe a nice equation but that isn’t important at the moment.

    The “big picture” is that changes in price can impact both supply AND demand.

    Remember that cereal aisle? If “consumer” has budgeted $X to spend on “favorite cereal” and the price of that product changes to $2X then they MIGHT decide to buy a different cereal (or not buy anything).

    The “value” of the cereal didn’t change, just the “price” – and when value becomes less than the price (everything else being equal) the consumer will make an alternative choice.
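    The “alternative choice” logic above can be sketched in a few lines of Python. This is a toy model with made-up numbers (the item names, dollar values, and the `best_choice` helper are all hypothetical), not a claim about how real consumers compute anything:

```python
# Toy model of the "cereal aisle" decision: the consumer buys the item
# with the largest surplus (value minus price) - and "buy nothing" is
# always an option with a surplus of zero. All numbers are made up.
def best_choice(prices: dict, values: dict):
    surpluses = {item: values[item] - prices[item] for item in prices}
    item, surplus = max(surpluses.items(), key=lambda kv: kv[1])
    return item if surplus > 0 else None  # None = "don't buy cereal"

values = {"favorite": 5.00, "store brand": 3.50}

# favorite at its normal price ($X) wins ...
print(best_choice({"favorite": 4.00, "store brand": 3.00}, values))

# ... but at $2X the alternative wins - even though no "value" changed
print(best_choice({"favorite": 8.00, "store brand": 3.00}, values))
```

    Note that doubling the favorite’s price flips the choice without touching the “value” side at all – which is exactly the cereal aisle point.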

    Marginal Utility

    Humans tend to be “predictably irrational” – which (for my purpose today) means that “pricing” can be used to manipulate consumer choices.

    What is something “worth?” Well, the answer is always “it depends.”

    e.g. How much would you pay for an umbrella on a sunny spring day walking in a park? How much would you pay for an umbrella on a rainy day when walking to a job interview?

    Well, if you brought an umbrella with you, both times the answer is probably $0.

    Walking in a park on a sunny day an umbrella might actually be a nuisance – so you aren’t even going to consider the purchase.

    BUT if that job interview is for your “dream job” and showing up looking like you just walked out of a rainstorm isn’t an option – then paying a LOT for that umbrella becomes an option …

    Price gouging

    Now imagine that you are the person SELLING umbrellas.

    If it is going to cost you $something to haul the umbrellas out of the park – and no one is buying umbrellas – you might start lowering your asking price. If folks see a “Free Umbrellas” sign then they might take an umbrella even if it is going to be inconvenient to carry …

    If your “umbrella stand” is on a busy corner in “big city” then you will maximize your profits by adjusting the asking price based on the weather forecast. This is just “good business.”

    The cool sounding term for adjusting price according to demand is “surge pricing.”

    Is surge pricing “fair?” Well, as always “it depends.” Fair to whom?

    The concept of “price gouging” implies taking advantage of folks in an emergency situation.

    Getting caught in the rain without an umbrella is (probably) NOT an “emergency situation” – so the umbrella stand isn’t “price gouging” customers.

    The real question on “price gouging” becomes “who gets to decide.”

    The United States Constitution has a “Commerce Clause”(Article I, Section 8, Clause 3) granting the Federal Government the power to “Regulate Commerce with foreign Nations, and among the several States.”

    Notice the “among the several States” wording equals “interstate commerce.” This is the clause that justifies the Federal gov’ment building INTERSTATE highways. However, Commerce within a State is going to be regulated by THAT specific State.

    SO who gets to decide if something is “price gouging?” Well, if the commercial transaction is between citizens of the same State, then THAT State gets to decide.

    If the commercial transaction is between citizens from DIFFERENT States, then the State where the transaction took place PROBABLY still gets to decide the matter. BUT a lot of high paid lawyers will probably argue about it …

    Again, just charging different prices at different times is not “price gouging” – e.g. airlines have been doing that for years.

    Modern technology has made “dynamic pricing” possible, which isn’t good or bad – just another example of “caveat emptor” …

    Price fixing

    That “econ 101” textbook might also have a section about “monopoly pricing.”

    The basic idea being that if one entity has total control over a “resource”/product then that entity can charge whatever price they want. If it is one organization then this might get called “monopoly” pricing, if it is a collection of organizations the tactic might get called “price fixing.”

    Is that good or bad? Well, obviously if “price gouging” is also taking place it is very bad. It is also possible that prices might be “fixed” low in certain areas to prevent competition entering the market – which would also be bad.

    Now in a relatively free market – IF “price fixing” is going on, the unintended consequence might be to encourage alternatives.

    This is why in the “real world” examples of “monopoly pricing” becoming “price gouging” are hard to find. e.g. The “monopoly price gouger” is going to encourage competition to enter the market – which will end their monopoly.

    Add in that the gov’ment bureaucracy tends to be naturally slow and inefficient and by the time the typical “antitrust” case gets settled the “market” has innovated and ended the “monopoly.”

    IF the goal of “antitrust legislation” is to protect/benefit the consumer then it (probably) doesn’t have a great list of achievements. I’m not arguing for OR against “antitrust” laws – just pointing out that they are not a “fairness” magic wand …

    Again, “competition” in the market is good for consumers – so encouraging innovation and competition should take priority over “punishing” successful companies for being market leaders.

    “Price gouging” is always bad – but State governments are responsible for deciding who is “price gouging.”

    Price fixing may or may not be price gouging – but neither one is legal. “Price fixing” requires a cartel/coalition within a market.

    The chances of a “secret cartel”/cabal manipulating prices on a large scale are the stuff of “thriller fiction” novels, not real world economics …

    Gov’ment intervention

    Near the end of the “econ 101” textbook you might get a chapter on the pros and cons of “government intervention” in the marketplace. The U.S. Federal Reserve and “central banks” in general probably get mentioned …

    From a “theory” point of view a gov’ment should be able to easily influence things like inflation and employment – in THEORY.

    Limits on “real time market information” are the existential problem. It just isn’t possible to know EVERYTHING about a large, far flung, economy.

    Gov’ment also tends to be “reactive” – i.e. something bad happens, everyone says “there should be a law to prevent that from happening!” Then by the time a law gets passed attempting to deal with “issue” things have changed and the law is pointless and ineffective.

    The MOST effective thing “large bureaucracy” can do to “help” the economy (in general) is “stay out of the way.” Encourage research and reward innovation – don’t try to pick “winners” or push a pet agenda.

    Of course the government should be involved in the “economy” but that role should be “regulator”/referee NOT “market maker”/head coach.

  • look for the union label, corporate profits, and inflation

    A “meme” post caught my attention. I’ve seen various versions – but the gist is always that “corporate profits” are the cause of “inflation.”

    From a “marketing” point of view the meme does a lot of things right – the version that caught my attention grabbed its audience by stating that “The profits of the top 6 most profitable corporations” had increased by some “huge %” – THEN the meme tries to connect “corporate profits” with the recent “higher than normal” inflation.

    Now, the INTERESTING part is that the meme is plausible but also “fact free.” Just who are these 6 corporations? Is it possible for their profitability to cause “inflation?”

    Profits

    What exactly are these “profits?” Google tells me “Profit = Selling Price – Cost Price.”

    Imagine a small business selling “product.” If the small business makes the “product” then the “cost” will include raw materials and labor. For the small business to stay “in business” they need to sell the product for more than it cost them to produce.

    e.g. if raw materials = 30 and labor = 40 – “cost” = 70. IF the business sells the product for 100 then “profits per unit” = 30.

    Then “Profit percentage” = (Profit/Cost Price) x 100. In this case:

    Profit percentage = (30/70) x 100 ≈ 43%
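    The arithmetic above fits in a few lines of Python (all numbers are the hypothetical ones from the example):

```python
# Hypothetical small-business example from above.
raw_materials = 30
labor = 40

cost = raw_materials + labor          # "Cost Price" = 70
selling_price = 100

# Profit = Selling Price - Cost Price
profit = selling_price - cost         # 30 per unit

# Profit percentage = (Profit / Cost Price) x 100
profit_percentage = (profit / cost) * 100

print(f"profit per unit: {profit}")
print(f"profit percentage: {profit_percentage:.0f}%")  # rounds to 43%
```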

    The obvious question NOW becomes: is that profit % good or bad? Unfortunately the answer is “we don’t know.”

    Well, a highly paid financial consultant would say “It depends.” Which is kind of the answer to ANY “financial accounting” question.

    Of course pointing out the factors that profitability “depends” on is the more useful answer. THAT answer will vary depending on the “business industry/sector” – e.g. “costs” are much different for a car company than a pizza company.

    A business being “profitable” is (usually) a sign that it is being well managed. No business will stay a “going concern” long if they LOSE money on EVERY transaction.

    BUT a “well managed” company that makes a small % profit on each transaction …

    Oh, and if that all above sounds fascinating you might want to look into “corporate finance” as a career 😉

    Corporations

    The history of “corporations” is mildly interesting – but not important here.

    In 2024 “corporations” exist as a way for “business” to raise “capital.” A corporation’s “initial public offering” (IPO) involves selling “stock” to “investors.”

    THOSE investors aren’t guaranteed anything (as opposed to “corporate bonds” – did I mention that “corporate finance” is a career field).

    If they aren’t guaranteed ANYTHING why would anyone gamble on an IPO? Well, the obvious answer is that (assuming that the corporation meets some basic financial reporting requirements) the stock becomes an asset that can be traded on a “stock market.”

    The “corporation” gets the cash from the IPO – but the “share holders” can then buy and sell shares among themselves. Which is kind of a big deal in 2024 (and obviously “stock market investment” is beyond the scope of this article.)

    Rule #1 of the “stock market” might be “buy low and sell high” – i.e. the “profits” concept applies to stock trading as well.

    … and what is a big factor in how investors value shares of a corporation’s stock? Profitability.

    How do corporations use the money from an IPO/stock offering? To grow/expand their business.

    Eventually the “profitable corporation” MIGHT distribute “profits” to share holders in the form of “dividends.”

    The grand point being that “corporations” are not evil OR good – they are an investment tool. Corporate profits are also not evil OR good – “profitability” is a function of management and the “business sector.”

    Gov’ment Regulation

    Unrestrained human greed is never a good thing.

    The history of the United States “economy” is a story of “booms” and “busts.” Those swings in the business cycle illustrated an inverse relationship between “unemployment” and “inflation.”

    During a “boom” the unemployment rate would go down, but then inflation would go up. Then during a “bust” unemployment went up, and inflation would come down.

    Random thought: There is a scene in “Support Your Local Sheriff” (1969) that (humorously) shows the impact “boom times” could have on “consumer prices” – a “mining boom town” has trouble hiring a Sheriff (for “plot” reasons) – James Garner’s character decides to take the job in part because it comes with room and board (and he had just paid a huge amount for a plate of beans).

    In 1913 the Federal Reserve was founded with a “mission” of trying to “smooth out” the business cycles.

    The “economics” textbooks will say that the Fed’s goal is (around) 5% unemployment and (around) 2% inflation. How well the Fed has achieved those goals is debatable – BUT that is another topic.

    Obviously if the Fed is making decisions based on “unemployment” and “inflation” rates they need a method of calculating unemployment and inflation.

    Unemployment seems simple enough – but it is a little more complex than just counting people “out of work” – fwiw: the Fed has considered 5% as “full employment” because in a large economy there will always be people entering/leaving the work force. e.g. The unemployment rate at the height of the Great Depression (1933) was 25%, but wage income for employed workers also fell 43% between 1929-1933. Things were bad …

    Calculating inflation is even more complicated – first the Bureau of Labor Statistics determines the “consumer price index” (CPI) — which is a “measure of the average change over time in the prices paid by urban consumers for a market basket of consumer goods and services.”

    That CPI “basket of goods” contains 85,000 items spread out over various “sectors” of the economy. That number is important – I’ll mention that again later …

    IF the CPI goes up that equals “inflation.” If it goes down that is called “deflation” (the last time the U.S. experienced “deflation” was 2009 – the unemployment rate peaked at 9.9% that year).
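    The “basket of goods” idea can be sketched as a weighted average of price changes. This is a toy illustration, NOT the actual BLS methodology – the items, prices, and weights below are all invented – but it shows why one product doubling in price barely moves the overall index:

```python
# Toy "basket of goods" price index (the real CPI tracks ~85,000 items
# with survey-based weights). All prices and weights are made up.
base_prices    = {"cereal": 4.00, "gasoline": 3.00, "rent": 1500.00}
current_prices = {"cereal": 8.00, "gasoline": 3.10, "rent": 1530.00}
weights        = {"cereal": 0.01, "gasoline": 0.04, "rent": 0.95}

def inflation_pct(base: dict, current: dict, weights: dict) -> float:
    """Weighted average of each item's relative price change, in percent."""
    return 100 * sum(weights[k] * (current[k] / base[k] - 1) for k in base)

# Cereal DOUBLED in price, but with a tiny weight the overall index
# barely notices - one product can't move "inflation" by itself.
print(f"inflation: {inflation_pct(base_prices, current_prices, weights):.2f}%")
```

    With these made-up weights the index rises about 3% even though cereal went up 100% – the cereal maker contributed only 1 point of that.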

    BTW: the short explanation of the “Great Recession” revolves around “sub prime mortgages” – no, it wasn’t the fault of “free market capitalism,” it was “unrestrained greed” fed by poorly thought out gov’ment intervention in the housing market.

    i.e. the gov’ment was REQUIRING banks to give loans to folks that couldn’t afford to pay them back – not surprisingly, when the whole thing exploded it caused a lot of problems. It became a worldwide financial crisis because those “sub prime loans” were “securitized” and sold on “exchanges” — again, all fed by greed.

    I like to say that the BEST role for “government” to play in the economy is “referee.” Too many unintended consequences are possible when the gov’ment starts CHOOSING “winners and losers” on a large scale. Yes, the economy needs “regulation” but NOT “central planning.”

    Unions

    The first labor union in the United States was the “Federal Society of Journeymen Cordwainers,” founded in Philadelphia in 1794.

    The United States was primarily an “agricultural economy” (as in most people working on/around farms) until the early 20th Century. Which kinda meant that the demand for “labor unions” wasn’t high.

    It is interesting that the first labor union was ruled a “criminal conspiracy” in 1806. Functionally that ruling made attempts at “organizing labor” a crime. It wasn’t until 1842 when “precedent” was set “de-criminalizing” union membership.

    AND the history of organized labor is also interesting – but not important at the moment.

    I’m not ignoring the sometimes adversarial relationship between “management” and “labor.” Just like corporations, “labor unions” are NOT inherently “good” OR “bad.” Ideally “corporation management” and “labor unions” should work together for mutual benefit.

    BUT “greed” is never good. i.e. “Labor” is just as susceptible to “greed” as “management.”

    Both “labor” and “management” will better serve the “organization” if they understand each others function. The relationship is not “zero sum” or even “either/or.”

    Having an understanding of how the “corporation makes money” will help “labor” communicate with “management.” Of course “management” ALSO needs to appreciate the work performed by “labor.” …

    Corporate Profits

    “Large Corporate profitability” tends to involve a LOT of “Generally accepted accounting principles” (GAAP). The point being that a “multi-billion dollar corporation” is going to generate “profits” from a lot of sources.

    Again, if you find “corporate finance” interesting there are a lot of career options. MY guess is that any of the top 10 “most profitable” corporations COULD adjust their profits up or down (using GAAP) without doing anything illegal.

    ANY “global corporation” will have multiple “books” depending on the audience – i.e. one set of “books” for the U.S. Federal gov’ment, one for each State where they do business, one for “management decision making”, and the “books” for whatever other nation-states they do business in.

    Oh, and remember that 85,000 sources in the CPI “basket?” That large number of sources used to calculate “inflation” kinda makes it hard for any single “corporation” to have a large impact on the number.

    THEN the large number of “corporations” competing in a particular “business sector” makes it even harder for 1 corporation’s profits to impact inflation.
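The arithmetic behind that point can be sketched with a toy weighted price index (my made-up numbers, NOT the BLS methodology — just an illustration of why one seller’s price hike barely moves the overall number):

```python
# Illustrative weighted price index: each category's price change is
# weighted by its share of total spending, then summed.

def price_index(weights, price_changes):
    """Weighted average of price changes; weights should sum to 1."""
    return sum(w * p for w, p in zip(weights, price_changes))

# Hypothetical basket: 4 categories, one corporation controls ~1% of it.
weights       = [0.40, 0.30, 0.29, 0.01]   # shares of total spending
price_changes = [0.02, 0.03, 0.02, 0.25]   # a 25% hike by the 1% player

inflation = price_index(weights, price_changes)
# the 25% hike contributes only 0.25 * 0.01 = 0.0025 to the total
```

Even a dramatic price increase by one “corporation” gets diluted by its tiny weight in the basket.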

    There are also laws against “price fixing” (the good ol’ “Sherman Antitrust Act”) – so if a bunch of “cereal makers” got together and decided to “raise prices” to increase “corporate profits” the Federal Gov’ment would NOT be pleased.

    The “market” tends to prevent “excess profits” in established industries. Plain old “competition” between corporations will prevent anyone from “price gouging” —

    e.g. I went to the store to buy “breakfast cereal” and there was an entire aisle dedicated to “cereals” at various price ranges – I bought the ‘store brand’ because it was “good enough quality” and 1/3 the price of “brand name”

    MAYBE that “brand name” was engaging in “price gouging” but it is also (probably) a superior product to “store brand” – either way that ONE corporation isn’t going to impact the CPI/inflation

    Top 6 corporations 2023

    My original thought was “who are these 6 corporations that are supposed to be causing inflation?”

    Well, the MOST profitable corporation in the world is ..(drum roll) Saudi Aramco. Obviously this “oil” stuff is in high demand, and Saudi Arabia has vast oil reserves. BUT they aren’t an American Corporation. I’m also not sure if their “profitability” changes much year over year — Saudi Arabia is a founding member of OPEC. ’nuff said

    Of course “increased energy costs” are a HUGE factor in recent inflation across all sectors of the U.S. economy. IF the top 6 profitable corporations were “energy companies” then MAYBE they would deserve a look to see if they are “price gouging.”

    However, none of the top 10 most profitable corporations are “energy companies.”

    the list:

    1. Apple, Inc
    2. Microsoft, Inc
    3. Alphabet, Inc (Google)
    4. Berkshire Hathaway, Inc (Warren Buffett’s company)
    5. Industrial and Commercial Bank of China
    6. JPMorgan Chase
    7. China Construction Bank
    8. META (Facebook)
    9. NVIDIA Corp
    10. Amazon.com, Inc

    Under “just my opinion” – I’m not a fan of “Apple, Inc’s” products BECAUSE I think they are over-priced and not “developer friendly.” The latest iPhone PROBABLY qualifies as a “luxury” item – but it isn’t a source of “inflation.” They do make very good “consumer electronics” though …

    Looking at the rest of the list – Alphabet, META, and Amazon might actually help LOWER the CPI/inflation by making it easier for OTHER companies to sell products.

    Berkshire and JPMorgan’s profits are very much “stock market” related – which might impact folks’ 401k retirement planning but aren’t moving the dial on the CPI

    Corporations 5 and 7 are obviously based in China — one more under “just my opinion” – ANY company data from Chinese corporations requires an asterisk – maybe an “approved by the Chinese Communist Party” disclaimer

    The “global supply line” issues are part of the inflation story – but again, it is hard to blame any single corporation for those issues …

    Supply and Demand

    The “introduction to economics” textbook would also have a section talking about the relationship between “supply” of a product and “unfulfilled demand” for a product. e.g. as “Supply” goes up the “unfulfilled demand” goes down.

    The “slightly unintuitive” concept is that “price” is a third variable NOT ALWAYS related to “supply and demand”

    e.g. Q: if “company” raises the price of their product (and keeps supply constant) how will that impact demand? A: it is impossible to tell.

    Remember the cereal aisle – If “company” raises their prices, then customers MIGHT buy a lower priced alternative product or maybe not buy anything at all.

    This is “price elasticity” – and is another subject 😉
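Without wandering too far into that other subject, the basic “price elasticity” calculation is simple enough to sketch (the prices and quantities below are invented for illustration):

```python
# Price elasticity of demand: % change in quantity demanded
# divided by % change in price.

def elasticity(p0, p1, q0, q1):
    pct_q = (q1 - q0) / q0   # percent change in quantity
    pct_p = (p1 - p0) / p0   # percent change in price
    return pct_q / pct_p

# "Elastic" good (plenty of substitutes on the cereal aisle):
# a 10% price hike costs the brand 25% of its sales.
e_cereal = elasticity(4.00, 4.40, 1000, 750)   # about -2.5

# "Inelastic" good: the same 10% hike barely dents sales.
e_staple = elasticity(4.00, 4.40, 1000, 980)   # about -0.2
```

A magnitude greater than 1 means customers flee to alternatives; less than 1 means they mostly grumble and pay.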

    now if there is only ONE company making “product” – and they keep their prices “high” – all that company is doing is encouraging competition to enter the market.

    I’ll point at “personal computer” sales in the early 1990s as a (kind of) recent example – the cost to buy the “parts” for an “IBM PC compatible” personal computer was relatively low compared to the selling price.

    IBM being “IBM” made the PC a standard piece of office equipment – but in 2005 IBM sold off their “personal computer division” (to a Chinese company – Lenovo)

    The “IBM price point” encouraged a LOT of “PC clone companies” — e.g. Some young college student at the University of Texas started building and selling PCs out of his dorm room in 1984. In 2024 Michael Dell is worth $96.5 billion …

  • Movies, Television, and Streaming

    Correlation never equals causality.

    Maybe that one line sums up “logic 101” and/or “statistics 101.”

    The example I used to hear was that there was a positive correlation between ice cream sales and drowning. As ice cream sales increase so does the number of deaths by drowning.

    BUT eating ice cream does not CAUSE drowning deaths — i.e. when is more ice cream sold? in the summer. When do more people go swimming? in the summer.

    There is also data out there connecting “eating cheese” and “strangulation” — but again, eating cheese does NOT cause strangulation.
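The ice cream example can be simulated in a few lines of Python — the monthly numbers below are made up, with “temperature” as the hidden confounder driving both series:

```python
# Two series that never cause each other, but both follow "summer,"
# will still show a strong correlation.

def pearson(xs, ys):
    """Pearson correlation coefficient of two equal-length sequences."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = sum((x - mx) ** 2 for x in xs) ** 0.5
    sy = sum((y - my) ** 2 for y in ys) ** 0.5
    return cov / (sx * sy)

temps     = [30, 35, 45, 60, 70, 85, 90, 88, 75, 60, 45, 32]  # °F by month
ice_cream = [t * 10 for t in temps]   # sales driven by temperature
drownings = [t // 5 for t in temps]   # swimming (and deaths) driven by temperature

r = pearson(ice_cream, drownings)     # strongly positive, close to 1
```

A strong r between ice cream and drownings says nothing about causation — drop in the third variable (temperature) and the “mystery” disappears.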

    This concept is important – just in general – but also when talking about the rise of “streaming” and “movie theater” attendance.

    Movies

    When going to the “movies” first became a cultural event 100ish years ago it was a much different experience. Back in that “golden era” of movie theaters folks would go as a WEEKLY “family night out” — there might have been a news reel, a cartoon, and then a feature presentation.

    Other “family entertainment” options might have been staying home and listening to the radio. “Live theater”, and musical concerts might have been an option IF they happened to be in town. Back at that time the “Circus” coming to town would have been a much bigger deal.

    The primary source of “news” would have still been print newspapers – and “sports” like boxing, horse racing, baseball, college football were popular – again either on the radio or attending live events.

    BUT “the movies” were the bread and butter of family entertainment.

    Television

    The “golden age of radio” was relatively short – from the late 1920s to the 1950’s. Radio and movies might have been in the same general “entertainment” markets but they are much different “experiences.”

    “Visuals AND sound” tends to beat “just sound” — BUT “going to the movies” would have been an EVENT, while turning on the radio an everyday experience.

    When Television became popular in the 1950s it ended the “golden age” of radio – and also forced the “movie industry” to adapt.

    e.g. hunt up some old “B” Westerns and you’ll discover that they tend to be about an hour long – and the “weekly serial” adventure/cliff hanger shorts tend to be 20 to 45 minutes. Which sounds a LOT like “television” program lengths to the “modern audience.”

    A lot of those “B” Western stars also had radio shows – and the popular show made the jump from radio to television. There was still a sizable market for both television and radio in the early days. The popular shows probably had a comic book and/or daily newspaper comic strip as well.

    The “point” being that folks wanted “entertainment” NOT a specific TYPE of entertainment.

    Television ended the “weekly ritual” of going to the movies.

    The “movie industry” responded by increasing the “production value” of movies. Movies were “bigger” and “better” than television programming.

    The “movie” advantage was still the bigger screen and the EVENT status. The product required to attract the audience into the theaters obviously changed – gimmicks like 3D, “Technicolor”, CinemaScope came and went.

    Now, the one 20th Century invention that can rival television for “cultural impact” is the automobile. I would tend to argue that the increased “mobility” automobiles allowed makes them the most influential and/or culturally transformational. BUT the point is arguable.

    The “automobile” changed “dating and mating” rituals. PART of that change involved “going to the movies.” At the height (in the 1950s) there were 4,000 “drive in” movie theaters spread across the U.S.

    All of those Baby-boomers doing their thing would have found the “drive in” the more economical option. The post war economic boom created “teenagers” who would have had “going to the movies” as an option to “get away from parents” and be, well, “teenagers.”

    The “movie theater business” was disrupted by a Supreme Court ruling in 1948. United States v. Paramount, decided May 4, 1948, effectively ended the “studio system” – “studios” would no longer be allowed to own “theaters.”

    An unintended consequence of ending the “studio system” was that a lot of “talent” was released from contracts, studios opened up their film libraries and/or sold them to television stations. The number of “regular moviegoers” decreased from 90 million in 1948 to 46 million in 1958. Television ownership went from 8,000 in 1946 to 46 million in 1960.

    SO if you REALLY want to put a date on the START of the death of the “movie theater business” – May 4, 1948

    Cable, VCRs, DVDs …

    Of course “movie theaters” have had a long slow decline. To coin a phrase: The reports of the “movie theater’s death” have been greatly exaggerated …

    Cable TV rolled across the U.S. starting in the 1970’s. HBO came along in 1972.

    “You want romance? In Ridgemont? We can’t even get cable TV here, Stacy, and you want romance!”

    Fast Times at Ridgemont High 1982

    Drive in theaters continued to close – but they haven’t disappeared yet.

    By the 1970’s television had replaced “the movies” in terms of “cultural impact” – BUT the “birth of the blockbuster” illustrated that “the movies” weren’t dead yet.

    Of course the typical “movie theater” has not made a large % of their profits from SHOWING movies for a long time – i.e. theaters tend to make money at the concession stand NOT from ticket sales.

    The fact that “going to the movies” was still a distinct experience from “watching at home” kept the theaters in business.

    Movie studios were gifted a new revenue stream in the 1980s when “VCR” ownership created the “VHS/Video Rental Store.”

    Again, “seeing it in the theater” with a crowd on the big screen with “theater quality sound” is still a distinct experience.

    DVD’s provided superior picture AND sound compared to VHS – and the DVD quickly replaced the VCR. The “Rental Store” just shifted from VHS tapes to DVD’s.

    BUT the BIG impact of DVD’s was their durability and light weight. DVDs could be played multiple times without loss of quality (VHS tapes degraded a little each viewing), AND they could even be safely (cheaply) mailed.

    Netflix started in 1997. The “Reed Hastings/Netflix story” is interesting – but not important at the moment.

    From a “movie theater” point of view – “The Phantom Menace” being released as a “digital” film in 1999 was a “transitional moment.”

    The music industry as a whole bungled their “digital” transition to the point that a couple generations of folks have grown up expecting “music” to be “free.” THAT is a different subject —

    I’ll point out that a “digital product” can easily be reproduced without loss of quality. If I have a “digital” copy of “media” I can easily reproduce exact duplicates. No need for a “manufacturing” and a “shipping” process – just “copy” from 1 location to the new location. Exact copy. Done.

    For the “movie industry” in the short term the transition to “digital” helped lower distribution costs. Copies of films didn’t need to be created and shipped from theater to theater in “cans of film” – just copy the new movie to the digital projector’s hard drive and you are all set.

    The combination of the “home computer” and “internet access” also deserve the “cultural shift” label – but it was really “more of the same” done “faster and cheaper.”

    Streaming

    It is trendy to blame “streaming” movies for the death of “theaters” — but hopefully by this point I’ve made the point that “streaming” is not the CAUSE of the decline of theaters. At best the “rise of streaming” and the “decline of theaters” are correlated – BUT (all together now)

    Correlation never equals causality.

    “Streaming” deserves credit for killing “Movie rental stores” — but the “theater experience” is still the “theater experience”

    MY issue with “going to the theater” is that ticket prices have pretty much kept up with inflation. Which kinda means a generic “family of four” has to take out a small loan to “go to the movies.”

    I’m placing the recent decline in theater attendance on “inflation” and “bad product.”

    Yes, the “movie industry” has been churning out self-righteous garbage NOT “entertainment.”

    BUT there is still a demand for “family friendly entertainment” — “Inside Out 2” setting box office records illustrates my point

    Old Theaters …

    I like not having to wait in line – but also kinda miss the “old theater” feel. That 20 screen “mega plex” is nice but there is still room for renovated “old theaters” if they can be updated without losing their “charm.”

    To be clear the “charm” of old theaters does NOT include “uncomfortable seats” and feet sticking to the floor. If someone tries to “rehab” a theater I’d spend most of the money on the bathrooms and comfortable seating

    Folks need to feel “safe” AND “comfortable” – then if the popcorn is a little stale it doesn’t matter …

  • memoirs of an adjunct instructor or What do you mean “full stack developer?”

    During the “great recession” of 2008 I kind of backed into “teaching.”

    The small company where I was the “network technician” for 9+ years wasn’t dying so much as “winding down.” I had ample notice that I was becoming “redundant” – in fact the owner PROBABLY should have “let me go” sooner than he did.

    When I was laid off in 2008 I had been actively searching/”looking for work” for 6+ months – certainly didn’t think I would be unemployed for an extended period of time.

    … and a year later I had gone from “applying at companies I want to work for” to “applying to everything I heard about.” When I was offered an “adjunct instructor” position with a “for profit” school in June 2009 – I accepted.

    That first term I taught a “keyboarding class” – which boiled down to watching students follow the programmed instruction. The class was “required” and to be honest there wasn’t any “teaching” involved.

    To be even MORE honest, I probably wasn’t qualified to teach the class – I have an MBA and had multiple CompTIA certs at the time (A+, Network+) – but “keyboarding” at an advanced level isn’t in my skill set.

    BUT I turned in the grades on time, that “1 keyboarding class” grew into teaching CompTIA A+ and Network+ classes (and eventually Security+, and the Microsoft client and server classes at the time). fwiw: I taught the Network+ so many times during those 6 years that I have parts of the book memorized.

    Lessons learned …

    Before I started teaching I had spent 15 years “in the field” – which means I had done the job the students were learning. I was a “computer industry professional teaching adults changing careers how to be ‘computer industry professionals’”

    My FIRST “a ha!” moment was that I was “learning” along with the students. The students were (hopefully) going from “entry level” to “professional” and I was going from “working professional” to “whatever comes next.”

    Knowing “how” to do something will get you a job, but knowing “why” something works is required for “mastery.”

    fwiw: I think this same idea applied to “diagramming sentences” in middle school – to use the language properly it helps to understand what each part does. The fact I don’t remember how to diagram a sentence doesn’t matter.

    The “computer networking” equivalent to “diagramming sentences” is learning the OSI model – i.e. not something you actually use in the real world, but a good way to learn the theory of “computer networking.”

    When I started teaching I was probably at level 7.5 of 10 on my “OSI model” comprehension – after teaching for 6 years I was at a level 9.5 of 10 (10 of 10 would involve having things deeply committed to memory which I do not). All of which is completely useless outside of a classroom …

    Of course most students were coming into the networking class with a “0 of 10” understanding of the OSI model BUT had probably set up their home network/Wi-Fi.

    The same as above applies to my understanding of “TCP/IP networking” and “Cyber Security” in general.
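For reference, the seven OSI layers can be jotted down as a quick lookup table — the examples are the usual textbook placements (some protocols arguably span layers):

```python
# The OSI model, bottom to top, with familiar textbook examples.
OSI_LAYERS = [
    (1, "Physical",     "Cat 5/6 cabling, hubs"),
    (2, "Data Link",    "Ethernet frames, MAC addresses, switches"),
    (3, "Network",      "IP, routers"),
    (4, "Transport",    "TCP, UDP"),
    (5, "Session",      "session setup and teardown"),
    (6, "Presentation", "encryption, character encoding"),
    (7, "Application",  "HTTP, DNS, SMTP"),
]

def layer_of(keyword):
    """Return (number, name) of the first layer whose examples mention keyword."""
    for num, name, examples in OSI_LAYERS:
        if keyword.lower() in examples.lower():
            return num, name
    return None
```

Like diagramming sentences, nobody “uses” this on the job — but it gives troubleshooting a shared vocabulary (“sounds like a layer 2 problem”).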

    Book Learning …

    I jumped ship from the “for profit school” where I was teaching in 2015 for a number of reasons. MOSTLY it was because of “organizational issues.” I always enjoyed teaching/working with students, but the “writing was on the wall” so to speak.

    I had moved from “adjunct instructor” to “full time director” – but it was painfully obvious I didn’t have a future with the organization. e.g. During my 6 years with the organization we had 4 “campus directors” and 5 “regional directors” — and most of those were “replaced” for reasons OTHER than “promotion.”

    What the “powers that be” were most concerned with was “enrollment numbers” – not education. I appreciate the business side – but when “educated professionals” (i.e. the faculty) are treated like “itinerant labor”, well, the “writing is on the wall.”

    In 2014 “the school” spent a lot of money setting up fiber optic connections and a “teleconferencing room” — which they assured the faculty was for OUR benefit.

    Ok, reality check – yes I understand that “instructors” were their biggest expense. I dealt with other “small colleges” in the last 9 years that were trying to get by with fewer and fewer “full time faculty” – SOME of them ran into “accreditation problems” because of an over reliance on “adjuncts” – I’m not criticizing so much as explaining what the “writing on the wall” said …

    oh, and that writing was probably also saying “get a PhD if you want a full time teaching position” — if “school” would have paid me to continue my education or even just to keep my skills up to date, I might have been interested in staying longer.

    Just in general – an organization’s “employees” are either their “biggest asset” OR their “biggest fixed cost.” From an accounting standpoint both are (probably) true (unless you are “Ivy League” school with a huge endowment). From an “administration” point of view dealing with faculty as “asset” or “fixed cost” says a LOT about the organization — after 6 years it was VERY clear that the “for profit” school looked at instructors as “expensive necessary evils.”

    COVID-19 was the last straw for the campus where I worked. The school still exists but appears to be totally “online” –

    Out of the frying pan …

    I left “for profit school” to go to teach at a “tech bootcamp” — which was jumping from “bad situation to worse situation.”

    The fact I was commuting an hour and a half and was becoming more and more aware of chronic pain in my leg certainly didn’t help.

    fwiw: I will tell anyone that asks that a $20 foam roller changed my life — e.g. “self myofascial release” has general fitness applications.

    I was also a certified “strength conditioning professional” (CSCS) in a different life – so I had a long history of trying to figure out “why I had chronic pain down the side of my leg” – when there was no indication of injury/limit on range of motion.

    Oh, and the “root cause” was tied into that “long commute” – the human body isn’t designed for long periods of “inaction.” The body adapts to the demands/stress placed on it – so if it is “immobile” for long periods of time – it becomes better at being “immobile.” For me that ended up being a constant dull pain down my left leg.

    Being more active and five minutes with the foam roller after my “workout” keeps me relatively pain free (“it isn’t the years, it’s the mileage”).

    ANYWAY – more itinerant level “teaching” gave me time to work on “new skills.”

    I started my “I.T. career” as a “pc repair technician.” The job of “personal computer technician” is going (has gone?) the way of “television repair.”

    Which isn’t good or bad – e.g. “personal computers” aren’t going away anymore than “televisions” have gone away. BUT if you paid “$X” for something you aren’t going to pay “$X” to have it repaired – this is just the old “fix” vs “replace” idea.

    The cell phone as 21st Century “dumb terminal” is becoming reality. BUT the “personal computer” is a general purpose device that can be “office work” machine, “gaming” machine, “audiovisual content creation” machine, or “whatever someone can program it to do” machine. The “primary communication device” might be a cell phone, but there are things a cell phone just doesn’t do very well …

    Meanwhile …

    I updated my “tech skill set” from “A+ Certified PC repair tech” to “networking technician” in the 1990s. Being able to make Cat 5/6 twisted pair patch cables still comes in handy when I’m working on the home network but no one has asked me to install a Novell Netware server recently (or Windows Active Directory for that matter).

    Back before the “world wide web” stand alone applications were the flavor of the week. e.g. If you bought a new PC in 1990 it probably came with an integrated “modem” but not a “network card.” That new PC in 1990 probably also came with some form of “office” software – providing word processing and spreadsheet functions.

    Those “office” apps would have been “stand alone” instances – which needed to be installed and maintained individually on each PC.

    Back in 1990 that application might have been written in C or C++. I taught myself “introductory programming” using Pascal mostly because “Turbo Pascal” came packaged with tools to create “windows” and mouse control. “Pascal” was designed as a “learning language” so it was a little less threatening than C/C++ back in the day …

    random thought: If you wanted “graphical user interface” (GUI) functionality in 1990 you had to write it yourself. One of the big deals with “Microsoft Windows” was that it provided a uniform platform for developers – i.e. developers didn’t have to worry about writing the “GUI operating system hooks” they could just reference the Windows OS.

    Apple Computers also had “developers” for their OS – but philosophically “Apple Computers” sold “hardware with an operating system included” while Microsoft sold “an operating system that would run on x86 hardware” – since x86 hardware was kind of a commodity (read that as MUCH less expensive than “Apple Computers”). It’s the “IBM PC” story that ended up making Microsoft, Inc a lot of money — which was a fun documentary to show students bored of listening to me lecture …

    What users care about is applications/”getting work done” not the underlying operating system. Microsoft also understood the importance of developers creating applications for their platform.

    fwiw: “Microsoft, Inc” started out selling programming/development tools and “backed into” the OS market – which is a different story.

    A lot of “business reference applications” in the early 1990s looked like Microsoft Encarta — they had a “user interface” providing access to a “local database.” — again, one machine, one user at a time, one application.

    N-tier

    Originally the “PC” was called a “micro computer” – the fact that it was self contained/stand alone was a positive selling point. BEFORE the “PC” a larger organization might have had a “terminal” system where a “dumb terminal” allowed access to a “mainframe”/mini computer.

    SO when the “world wide web” happened and “client server” computing became mainstream the “N tier” computing model became popular as a concept.

    N-tier might be the “presentation” layer/web server, the “business logic” layer/a programming language, and then the “data” layer/a database management system
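That separation can be sketched in a few lines — a toy example using Python’s built-in sqlite3 rather than any particular framework (the “quotes” table and all the names here are invented for illustration):

```python
# A minimal "N tier" sketch: each tier only talks to the one below it.
import sqlite3

# --- data tier: a throwaway in-memory database ---
db = sqlite3.connect(":memory:")
db.execute("CREATE TABLE quotes (author TEXT, text TEXT)")
db.execute("INSERT INTO quotes VALUES ('Lombardi', 'Winning is a habit.')")

def fetch_quotes(author):
    """Data access only - the SQL lives here and nowhere else."""
    rows = db.execute("SELECT text FROM quotes WHERE author = ?", (author,))
    return [r[0] for r in rows]

# --- business-logic tier: rules and shaping, no SQL, no HTML ---
def quote_report(author):
    quotes = fetch_quotes(author)
    return {"author": author, "count": len(quotes), "quotes": quotes}

# --- presentation tier: formatting only ---
def render(report):
    lines = [f"<h1>{report['author']} ({report['count']})</h1>"]
    lines += [f"<p>{q}</p>" for q in report["quotes"]]
    return "\n".join(lines)

html = render(quote_report("Lombardi"))
```

The payoff of the layering: you can swap the in-memory database for a real server, or the HTML for a phone app, without touching the middle tier.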

    Full Stack Developer

    In the 21st Century “stand alone” applications are the exception – and “web applications” the standard.

    Note that applications that allow you to download and install files on a personal computer are better called “subscription verification” applications rather than “N Tier.”

    e.g. Adobe allows folks to download their “Creative Suite” and run the applications on local machines using computing resources from the local machine – BUT when the application starts it verifies that the user has a valid subscription.

    An “N tier” application doesn’t get installed locally – think Instagram or X/Twitter …

    For most “business applications” designing an “N tier” app using “web technologies” is a workable long term solution.

    When we divided the application functionality the “developer” job also differentiated – “front end” for the user facing aspects and “back end” for the database/logic aspects.

    The actual tools/technologies continue to develop – in “general” the “front end” will involve HTML/CSS/JavaScript and the “back end” involves a combination of “server language” and “database management system.”

    Languages

    Java (the language maintained by Oracle not “JavaScript” also known as ECMAScript) has provided “full stack development” tools for almost 30 years. The future of Java is tied into Oracle, Inc but neither is gonna be “obsolete” anytime soon.

    BUT if someone is competent with Java – then they will describe themselves as a “Java developer” – Oracle has respected industry certifications

    I am NOT a “Java developer” – but I don’t come to “bury Java” – if you are a computer science major looking to go work for “large corporation” then learning Java (and picking up a Java certification) is worth your time.

    Microsoft never stopped making “developer tools” – “Visual Studio” is still their flagship product BUT Visual Studio Code is my “go to” (free, multi-platform) programming editor in 2024.

    Of course Microsoft wants developers to develop “Azure applications” in 2024 – C# provides easy access to a lot of those “full stack” features.

    … and I am ALSO not a C# programmer – but there are a lot of C# jobs out there as well (I see C# and other Microsoft ‘full stack’ tech specifically mentioned with Major League Baseball ‘analytics’ jobs and the NFL – so I’m sure the “larger corporate” world has also embraced them)

    JavaScript on the server side has also become popular – Node.js — so it is possible to use JavaScript on the front and back end of an application. opportunities abound

    My first exposure to “server side” programming was PHP – I had read some “C” programming books before stumbling upon PHP, and my first thought was that it looked a lot like “C” – but then MOST computer languages look a lot like “C.”

    PHP tends to be the “P” part of the LAMP stack acronym (“Linux OS, Apache web server, MySQL database, and PHP scripting language”).

    Laravel as a framework is popular in 2024 …

    … for what it is worth MOST of the “web” is probably powered by a combination of JavaScript and PHP – but a lot of the folks using PHP are unaware they are using PHP, i.e. 40%+ of the web is “powered by WordPress.”

    I’ve installed the LAMP stack more times than I can remember – but I don’t do much with PHP except keep it updated … but again, opportunities abound

    Python on the other hand is where I spend a lot of time – I find Django a little irritating, but it is popular. I prefer Flask or Pyramid for the “back end” and then select a JavaScript front end as needed

    e.g. since I prefer “simplicity” I used “mustache” for template presentation with my “Dad joke” and “Ancient Quote” demo applications

    Python was invented with “ease of learning” as a goal – and for the most part it succeeds. The fact that it can also do everything I need it to do (and more) is also nice 😉 – and yes, jobs, jobs, jobs …

    Databases

    IBM Db2, Oracle, Microsoft SQL server are in the category of “database management system royalty” – obviously they have a vast installation base and “large corporate” customers galore. The folks in charge of those systems tend to call themselves “database managers.” Those database managers probably work with a team of Java developers …

    At the other end of the spectrum the open source project MySQL was “acquired” by Sun Microsystems in 2008 which was then acquired by Oracle in 2010. Both “MySQL” and “Oracle” are popular database system back ends.

    MySQL is an open source project that has been “forked” into the “MariaDB foundation.”

    PostgreSQL is a little more “enterprise database” like – also a popular open source project.

    MongoDB has become popular and is part of its own “full stack” acronym MEAN (MongoDB, Express, Angular, and Node) – MongoDB is a “NoSQL” database which means it is “philosophically” different than the other databases mentioned – making it a great choice for some applications, and not so great for other applications.

    To be honest I’m not REALLY sure if there is a big performance difference between database management back ends. Hardware and storage space are going to matter much more than the database engine itself.

    “Big Corporate Enterprise Computing” users aren’t as concerned with the price of the database system they want rock solid dependability – if there was a Mount Rushmore of database management systems – DB2, Oracle, and Microsoft SQL server would be there …

    … but MariaDB is a good choice for most projects – easy to install, not terribly complicated to use. There is even a nice web front end – phpMyAdmin

    I’m not sure if the term “full stack developer” is gonna stick around though. Designing an easy to use “user interface” is not “easy” to do. Designing (and maintaining) a high performing database back end is also not trivial. There will always be room for specialists.

    “Generalist developer” sounds less “techy” than “full stack developer” – but my guess is that the “full stack” part is going to become superfluous …

  • Plot holes and “Star Wars” …

    “Telling stories” is a euphemism for “lying.”

    “Lying” obviously requires a “lie” to build around – with the definition of “lie” (the third definition from Merriam-Webster: “to make an untrue statement with intent to deceive”) being the relevant point.

    Note that INTENT is required. SO it is POSSIBLE for someone to “tell a story” that is not true, and not be “lying.”

    “Telling tall tales” has probably been a kind of “sport” to rascals, rogues, and tramps as long as there have been “rascals, rogues, and tramps.” Maybe a form of good-natured “rite of passage” – e.g. think “wide-eyed novice” listening intently to “grizzled veteran” telling “stories” that get more and more “factually challenged.”

    IF at SOME point the “grizzled veteran” passes a point where the “wide eyed novice” gets the joke – then everyone laughs. The “novice” isn’t as wide-eyed and is on their way to “veteran” status.

    (of course if “wide-eyed novice” DOESN’T get the joke – then, well, that is a different problem)

    “Campfire stories” take on a general form. SOMETIMES there is a kernel of truth – i.e. “legends” are born in the “additions” to the TRUE story. It is probably in those “additions” that we can track “cultural value changes.”

    Art reflects …

    Does life imitate art, or does art imitate life?

    And the answer is, well, “yes.”

    We can quickly get lost in definitions – e.g. what is “art?” How about if we agree that “art REFLECTS an IDEAL of life.” Art must be “created,” which requires a “creator” — i.e. the “art” reflects the character of the “artist”/creator.

    Creativity is allowing oneself to make mistakes. Art is knowing which ones to keep.

    Scott Adams

    Since the “artist” does not exist in a cultural vacuum the “art” ends up reflecting the society in which the artist lives.

    Plot and Story

    The difference between “plot” and “story” is that “plot” requires causality.

    e.g. “A” happens, then “B” happens, then “C” happens is a “story” but NOT a plot.

    If “A” happens, then “B” happens BECAUSE of A, and then “C” happens because of “B” (or “A” or “A&B” – depending on just how complicated you wanna get) – THAT is “plot”

    Someone “telling stories” will have a “plot” but there will be intentional “plot holes” testing the listener’s level of gullibility.

    e.g. grizzled veteran: “There I was – just me and my horse, supplies running out, horse almost dead. Suddenly, I was attacked by a gang of 40 cut-throats that would kill me just for my boots.

    I shot the nearest one in the leg, jumped on my horse and headed up the mountain. Now, those cut-throats were REALLY angry and were threatening to bury me up to my neck and leave me to die. SO I managed to find a small cave where they could only get at me 1 or 2 at a time – let my horse go and waited for them to find me. I was down to just 3 bullets and my knife.

    Sure enough, they found me, and then …”

    wide-eyed novice: “… and then?”

    grizzled veteran: “well, I died of course” (laughter, insults, etc)

    (and when that former wide-eyed novice has become “grizzled veteran” they will probably tell the same story to the next batch of wide-eyed novices …)

    Stories …

    If everyone involved KNOWS the story being told is just a “story” then the audience can willingly engage in “suspension of disbelief” and just enjoy the story.

    The required amount of “disbelief” will obviously vary based on genre. The folks “performing” aren’t “intentionally” trying to deceive – they are engaging in “storytelling.”

    e.g. the audience at a performance of Hamlet doesn’t ACTUALLY believe that they are watching a “Prince of Denmark” wrestling with the fact that his Uncle may or may not have murdered the former King (Hamlet’s father). Hopefully, the audience puts aside “critical thinking” and plays along with the story.

    Obviously the folks putting on the performance try their best to be convincing. The highest praise that can be given to a “working actor” MIGHT be that they are ALWAYS “convincing” no matter what role they are playing.

    (fwiw: playing “Hamlet” is considered a test of an actor’s acting ability – this is probably why you see so many “famous movie stars” attempt the role. I have seen a LOT of versions of Hamlet – and most of them are “ok.”

    If I’m watching “Hamlet” and I think “that is so and so TRYING to do Hamlet” – then that qualifies as an “ok performance” — but if I forget that it is “BIG NAME” playing Hamlet, then that is a “VERY good” performance … and moving on)

    Random thought: Strange Brew (1983) borrows plot elements from Hamlet – catching the “Hamlet” references elevated the movie from “cute buddy comedy” to “funny at multiple levels” – and yes, INTENTIONAL plot holes-a-plenty …

    Star Wars plot holes …

    I have been re-examining WHY I loved the original “Star Wars” trilogy. In part this is because of the “fan reaction” to the latest “Star Wars product.”

    Apparently others have done this “re-examination” as well. One such re-examination was trying to point out “plot holes” in “Star Wars” (1977)

    In particular they didn’t like the fact that if the “Empire” had blown up the “escape pod” at the beginning, the movie ends there – i.e. blow up the escape pod with R2-D2 and C-3PO and there is no story.

    BUT that is NOT a “plot hole” – yes, the movie turns on that point BUT it also helps establish that the “Empire” are the bad guys.

    The scene could easily have been taken out – but it serves a “storytelling” purpose. The “Empire” is the “evil authoritarian organization” – notice that the anonymous characters WOULD have blown up the “escape pod” IF they had detected “life forms.” i.e. the anonymous characters’ (lack of) action illustrates that “fate”/luck is gonna be part of the story.

    “Fate” interferes throughout “Star Wars” – with “Stormtrooper” marksmanship being another great example (e.g. they are extremely precise when shooting at “not major characters” but can’t hit anything important when a “major character” is involved)

    Now, if the movie was trying to be “gritty and realistic” then “fate interfering” might constitute “plot hole.”

    I also like to point out that R2-D2 in the “Star Wars universe” is an “agent of fate” or the “finger of the divine” — apparently immortal and all-knowing. Seriously, notice how many times R2 is instrumental in things “working out” for the heroes.

    Sure, R2 gets “blown up” a lot – but always returns good as new. If “Star Wars” was hardcore science fiction THAT would be a HUGE plot hole – but since it is a space fairy tale set in a galaxy far, far away, it’s just part of the suspension of disbelief.

    BUT if you want to talk about REAL plot holes – I have always been (mildly) bothered by the fact that after the heroes escape the Death Star – and KNOW they are being tracked – they (apparently) go straight to the Rebel Base.

    By this point George Lucas has done a masterful job of storytelling – and the fact that the Empire easily tracks the heroes to the Rebel Base – setting up the climactic battle – is easily overlooked.

    Ok, Leia tells Han they are being tracked – Han doesn’t believe her, but even if there is a slight possibility of them being tracked then they should logically have gone ANYWHERE else except the Rebel Base.

    THEN when they are far away from danger AND the Rebel Base – they could have easily transferred the data as required. Or maybe find the tracking device – and send it ANYWHERE else than the Rebel Base.

    “You’re gonna need a bigger boat.”

    Chief Brody

    The “Battle of Yavin” is kind of like the scuba tank exploding at the end of Jaws (1975). If the audience has to THINK about it, then it becomes a problem.

    If we have been guided along properly then we are probably “all in” on that plot hole. The plot hole goes completely unnoticed and even gets cheered when told by “expert storyteller.”

    I suppose “storytelling 101” always starts with some form of “show don’t tell” – if the “plot” requires 120 minutes of talking heads then you are telling a much different type of story than if you have “action”/pause/more action/short pause/etc.

    None of this is a secret. The audience’s expectations on the ratio of “drama” to “relief” are determined by genre — if you are doing “romantic comedy period piece” then long periods of “talking heads” are expected, BUT if you are doing “space fairy tale” then keep the “talking heads delivering exposition” to a minimum …

    it is the genre, silly …

    I’m also fond of pointing out that there is plenty of room for different stories and genres – but trying to fit “agenda” into “genre” is almost always a recipe for commercial failure.

    random thought: a famous “hamburger chain” started offering salads back in the late 1980s. I think they were responding to “market demand” for “healthier” options. They are a worldwide operation that regularly introduces new items to their menu – so offering salads wasn’t a “bad” idea

    the funny thing was that those “hamburger chain salads” could be LESS healthy than the “regular menu” (with salad it is usually the “dressing” that becomes the problem – which has a lot of fat and calories …)

    the same chain sells a “fish sandwich” – that is very popular but definitely NOT the “healthy option”

    HOWEVER “hamburger chain” never lost sight of the fact that their core product is “meat and potatoes” – they make $$ selling hamburgers and fries

    NOW imagine that the “hamburger chain” powers that be decide to turn the menu over to someone that HATES hamburgers and fries – or thinks that “salads” are why people go to “hamburger chain” – well, things aren’t going to go well

    the “new menu maker” might blame the customers for NOT wanting to eat bad salads instead of hamburgers – but that is not gonna change the customers’ preference.

    “New menu maker” will almost certainly get bombarded with criticism from lovers of “hamburger and fries” – and sales/profits will plummet.

    Of course the folks that hired “new menu maker” will defend their decision – but that just means that THEY are (probably) the franchise’s (REAL) problem, not the “new menu maker” and certainly NOT the fans …

    if you want another “movie franchise” example – compare and contrast the first “Matrix” (1999) with “Matrix Resurrections” (2021) – notice the difference in the ratio of “action” to “exposition” …

  • What is the purpose of amateur sports?

    Maybe the first question becomes “Do amateur sports have a purpose?”

    The numbers fluctuate but there are AROUND 1 million high school football players each year in the United States.

    Around 7.8% of those high school football players will play in college (at any level).

    Less than 0.5% of those college players will make an NFL roster.

    For baseball the percentages are even worse – 1 in 200 high school players will get drafted to play “professional baseball” (around 0.5% – yes, that means “minor leagues”).

    Around 1% of high school basketball players will play Division I college basketball. Out of every 10,000 High School basketball players 2 or 3 will play in the NBA.

    The point being that if “getting a scholarship” or “going pro” is the “purpose” of playing amateur sports – then a large number of athletes are chasing a fantasy.
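    To make the football funnel above concrete – a quick back-of-the-envelope sketch in Python (the percentages are the rough figures quoted above, so the outputs are estimates, not exact counts):

```python
# Rough funnel using the approximate percentages quoted above
high_school = 1_000_000          # ~1 million U.S. high school football players per year
college = high_school * 0.078    # ~7.8% go on to play in college (at any level)
nfl = college * 0.005            # <0.5% of college players make an NFL roster

print(round(college))  # ~78,000 college players
print(round(nfl))      # ~390 NFL roster spots from a cohort of a million
```

    Roughly 390 out of a million – about 1 in 2,500 – which is why “going pro” makes a poor “purpose” for playing.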

    BUT are those “ordinary players” wasting their time playing a sport? Oh, and what about those sports where “going pro” isn’t an option?

    Purpose

    In the U.S. “organized amateur sports” tend to be associated with secondary education/”high schools.”

    The “why” sports are associated with high schools has a lot to do with “organization” by proximity. After the Civil War “disorganized” sports began popping up. Those early ‘amateur athletics’ weren’t much more than ‘pickup games’ with the teams representing “communities.”

    The “point” of those games was simply friendly competition and entertainment.

    Does “competition” have a purpose? Well, the short answer is “yes.”

    Iron sharpeneth iron; so a man sharpeneth the countenance of his friend.

    Proverbs 27:17

    BUT there are “healthy” and “unhealthy” variants of “competition.”

    The goal of ANY competition is NOT just to “win” but to “win within the rules.” HEALTHY competition will make everyone involved “better” – in that Proverbs 27:17 way.

    UNHEALTHY competition is the “law of the jungle” or “winning at any cost.” This isn’t just “cheating” but also potentially trying to harm the opposition.

    To be clear, there is a BIG difference between “competing hard” and “winning at any cost.” Wanting to win isn’t wrong, but being so obsessed with winning that you are willing to “cheat” is missing the point of the competition.

    An individual’s “self worth” should NEVER come from winning an athletic contest. The individual has inherent worth because they are a human being NOT because they are good at “sport ball.”

    The players will change but the sport and/or team will continue. Which means in the grand scheme of things victory is never “total” and defeat is never “final.”

    Losing a “sport ball” contest does NOT diminish a human being’s worth. Winning does not excuse bad behavior.

    Teenagers

    In the middle of the 20th Century the post WW2 baby boom and economic prosperity helped create a new demographic called “teenagers.”

    Yes, there have always been 13 to 19 year olds – but in the 1950s they got disposable income and cars. Along with rock & roll music came “organized high school sports.”

    In general terms the core motivation of ‘administrators’ organizing those high school sports was (and still is) the welfare of the “student athlete.”

    Establishing “rules” for sports, certifying “officials” to enforce those rules, and then providing a structure for HEALTHY competition required “organization.”

    i.e. the students were going to compete, “organizing” the competition helped keep that competition healthy. To keep competition “fair” things like “divisions” and “age restrictions” were also required.

    Fast forward 70+ years and “scholastic sports” is a massive industry. However, the PURPOSE of that industry is still healthy (fair) competition.

    The joy of competition comes from preparing and then competing. Having a competition goal, putting in the time and effort to prepare for that competition, and then competing teaches a long list of positives. Winning a close contest against an opponent of equal ability is satisfying BUT losing a close contest to “honorable opponent” is NOT dissatisfying (disappointing? yes – but the “joy” comes from preparing and competing hard – “winning” is a byproduct of the process)

    Meanwhile dominating an outclassed opponent is about as satisfying as taking out the garbage. Something was accomplished, but there isn’t a great deal of “joy” involved.

    Respecting and liking an opponent just makes beating them more fun. If the opponent is inept or “out of their league,” then beating them isn’t particularly satisfying …

    Fair?

    I’ve thrown that term “fair” out there several times – what does it mean?

    Well, “fair competition” is between “peers”/equals. This is obviously why there are “weight classes” and “age divisions” in sports like boxing and wrestling.

    Again, the point of “competition” is to push each other to higher levels NOT just “winning.”

    An athlete that intentionally goes in search of “less skilled” opponents for easy victories will never be forced to “push themselves.”

    One more time – no human being’s “purpose” is “beating up on lower skilled opponents.” The “athlete” that INTENTIONALLY seeks out a lower level of competition has once again missed the point or lost their way.

    Lessons learned from competition

    I am always quick to point out that the most valuable thing I learned from “amateur sports” was that “success” is a process.

    Setting a goal, coming up with a plan to achieve that goal, and then following through on the plan are “transferable” life skills.

    Of course OTHER folks doing the same thing will mean that sometimes you get knocked on your duff – however you get the chance to get back up or you can stay “knocked down.”

    “I don’t pity any man who does hard work worth doing. I admire him. I pity the creature who does not work, at whichever end of the social scale he may regard himself as being.” 

    Theodore Roosevelt

    Healthy competition in TEAM sports provides obvious life lessons – with positive socialization, and working together towards a common goal immediately coming to mind.

    BUT remember UNHEALTHY competition involves trying to “win at any cost” and disrespecting the opposition.

    “Winning by cheating” is by definition self-destructive. Unethical competition might work in the “short term” but “being a jerk” will catch up with the cheater eventually …

    I understand there are “well intended” folks that push various flavors of “non competitive” sports. If the goal of the “event” is “socialization” and/or “exercise” then running around on a field for 40 minutes might be useful.

    There is no reason to keep score at such events OR give EVERYONE a trophy at the end of the year. Non-competition means “no winners” NOT “everyone is a winner.”

    I’m not a big fan of “organized youth sports” (whatever age that may be). Organization will always imply competition of some kind. If the lesson learned is “I win by doing nothing but showing up” then “they” are creating self-esteem sinkholes not healthy individuals.

    Of course “youth sports” can be a good or a bad experience for the “youths” – BUT the “youths” should be the focus.

    random thought: From an “athletic standpoint” – the “future professional athlete” is probably exceptional at every level they participate. However that doesn’t mean that they are exceptional BECAUSE they started playing “sport ball” before they could walk …

    ANYWAY

    Sports was/is the original “reality” television – amateur sports have a larger purpose only to the point that they teach a work ethic and social skills. Participating (or NOT participating) in “sport” will never impact the “value” of an individual as a human being.

    The opportunity to compete against peers is “positive” on a grand scale. Meanwhile, claiming that “unfair competition” must be allowed so that a “fraction of society” can feel “good” about themselves is counter-productive on a grand scale ….

  • Mr. Shakespeare, marketing, and the “Western”

    A lifetime ago I worked as a “student employee” as an undergrad. I was helping out the “system administration” folks – and ended up doing low level “desktop support” for faculty members.

    random thought: I remember running the big ol’ suitcase-size “VHS video” camera when they gave a presentation about this new “internet” thing that the college was joining. That was “pre – world wide web” and you needed to use “command line” utilities to move around.

    Thinking back to that presentation – the presenter was talking about using FTP and email (again, there was an “Internet” before there was the “world wide web”). One of the sites they talked about was in London (England) and you could download the complete works of Shakespeare!

    Needless to say, I was impressed – but at that time the “general public” didn’t have access to the Internet. Only military bases and academic institutions were granted access – but the network was growing.

    As I remember the debate – the folks running “academic institutions” seemed to think that if the Internet was opened up to the “general public” it would be overrun by advertisers/porn/spam – and of course they were correct. BUT what really caused the Internet to explode was making it “easy to use” for non-computer experts – i.e. the “world wide web.”

    Hamlet and John Wayne

    ANYWAY – one of the “faculty members” whose office computer I visited way-back-when was in the “theater” department. He had pictures of Hamlet AND John Wayne on his wall.

    I had read Hamlet (for the first time) when I was in the Army, and grew up a John Wayne fan – so I asked him about the pictures. Obviously the Prof knew much more about both than I did at the time – as I remember it he said something like “Shakespeare is a lot more ‘rough and tumble’ than you might think” – and also that John Wayne was more complex.

    Fast forward a lifetime of study — and Mr Shakespeare and John Wayne were both working within “frameworks” catering to an audience. Mr Shakespeare wanted folks to buy tickets to performances of his plays, and Mr Wayne wanted folks to buy tickets to watch his movies.

    BOTH were working in “genres.” John Wayne is most remembered for his work in “westerns” but he made a lot of “war” movies and a handful of “detective” movies – he has 184 credits listed on IMDb.

    random thought: the joke was that John Wayne played the same character in every movie – i.e. “John Wayne” – which is a little unfair, but “funny because of the truth involved.” Mr Wayne’s Academy Award winning performance was playing a very NOT “John Wayne” role – Rooster Cogburn in “True Grit” (1969)

    random thought part 2: at the moment I can only think of 2 “fictional John Wayne character names” – Ethan Edwards in “The Searchers” (1956) and Rooster Cogburn – illustrating that “John Wayne” was what audiences paid to see … of course he also played Davy Crockett in “The Alamo” (1960) and Genghis Khan in “The Conqueror” (1956) — yes, that was John Wayne as the Great Khan – mid-western drawl and all (not one of his better movies)

    The “genres” Mr Shakespeare was dealing with were PRIMARILY designed to attract an audience. e.g. early on the audience would have gone to a “comedy”/”tragedy” or a “history” play not specifically a play by “William Shakespeare”

    The super short “intro to Shakespeare” class would point out that what distinguished “comedies” and “tragedies” was the ending of the play – a comedy would end at the altar (folks getting married) and the tragedy would end at the crypt (folks dead).

    The “histories” were similar to what we expect from modern “biopics” – they covered “themes” but weren’t always exactly “true.” More “based on a true event” than “actually true.” Again, Mr. Shakespeare was writing for an AUDIENCE – not pushing any agenda (except maybe “sell tickets”).

    Go beyond the “intro” level and Mr Shakespeare’s comedies changed over the course of his career. The “early comedies” might have a “fantasy” aspect (e.g. “A Midsummer Night’s Dream” – the “lovers” go into the forest, things get weird, but are sorted out for a happy resolution in the morning). The “late romances” would have “fantasy” aspects core to the story (e.g. “The Tempest” – Prospero is literally a “wizard” with a “spirit servant” – but things also happily sort themselves out by the end).

    The “entertainment industry” of the Elizabethan era being what it was – Mr Shakespeare wouldn’t have been able to remain a going concern without “patrons” backing his work. i.e. there was no “long tail” market – no “sub rights” to sell.

    I’ve never seen an in depth analysis or a “profit and loss” statement from Shakespeare’s time — I don’t think the “patrons” expected to get a return on their investment OTHER than good seats at play performances. The fact that Mr Shakespeare “retired” at 47 implies the plays were commercially successful (and he died at 52).

    random thought: the cause of death for Mr. Shakespeare is still a mystery. There are theories that he died after a drunken binge, that he had syphilis, or that he might have been murdered! BUT it was 1616, who knows …

    the “Western” …

    ANYWAY – someone (recently) came up with a “greatest western movies” of all time type list. All such lists tend to be a little “arbitrary” – but also tend to be “interesting.” The list itself wasn’t what caught my attention – i.e. just what makes a “western” a “western?”

    When Mr Shakespeare died, “working in the entertainment industry” wasn’t a highly esteemed profession. His funeral was that of a “wealthy local retiree” not a “celebrity.” Literary immortality for Mr Shakespeare happened AFTER his death when his friends and admirers collected his works for publication.

    Remember that “movable type printing” was perfected 150 years or so earlier – so it was an established technology but more importantly there was a growing market for “printed books.”

    What does that have to do with “westerns?” Well, multiple zeitgeists probably collided in the last half of the 19th Century – the industrial revolution increased city populations, gave folks more “free time”, and increased disposable/discretionary income (as opposed to agricultural work).

    Combine that with “public education” – and you have what the corporate types would call a “growing market segment” – i.e. folks with money in their pocket looking for something to buy.

    Random thought: ANOTHER “old prof” back in the day liked to point out that the “printing press” had a lot of unintended consequences. Their theory was that people stopped “sitting around the fire” telling stories because they had “books” that they could go off and read by themselves – I think the point was that “humans are natural storytellers” or something BUT “fear of public speaking” is always high on the list of “common phobias.”

    random thought part 2: I don’t think people fear “public speaking” – what they fear is “being embarrassed in public” – e.g. a certain amount of “stage fright” is kinda required, if the speaker isn’t a LITTLE worried then they will be exceptionally boring – as everyone that has had to listen to “boring speaker” drone on, and on, and on understands … BUT “boring” might come from arrogance OR lack of preparation – neither of which is predestined

    SO “lower cost printing” meets “public demand” and the “pulp magazines” were born. The “pulp” part was a reference to the low quality paper used in the printing process – and the content tended to be of similar quality.

    Now, “sex” and “violence” are part of human history — just having “sex and violence” in a book doesn’t make it “low quality”, it obviously depends on how the “sex and violence” is presented.

    If you have some form of “action/consequence” then you MIGHT have a work of “high literary quality” BUT if the work is just “descriptions of explicit sex” polite society might call that “pornography.”

    Same idea with “violence” – and I will wave at the trend of “violence porn” without comment beyond it might have some sex/nudity, but is just “pointless violence.”

    I seem to remember hearing that Sam Peckinpah got criticized for showing “blood” in “The Wild Bunch” back in 1969 (which really just looks like ketchup on shirts) – umm, slippery slope and all that …

    MEANWHILE …

    “Pulp” magazines needed content and humans have always loved reading/hearing about “exotic locations” – so the “American West” after the Civil War was the source of a LOT of “colorful pseudo historical” characters.

    William “Buffalo Bill” Cody and his “Wild West Show” helped create the specific “idea” of the “western” as a distinct genre. But Buffalo Bill serves as an example of the trend – not the source.

    The world’s first “modern celebrity” was Samuel Clemens (Mark Twain) – the quintessential storyteller, both in print and on stage. Mr Clemens was more famous as a “travel writer” during his lifetime than for “Huckleberry Finn” – “Roughing It” (published in 1872) was his semi-autobiographical contribution to “books about the west.”

    Again, Mark Twain is an example not the source. The IDEA of a “frontier” separating “polite society” from the “unknown” is (probably) as old as human beings.

    Even the “idea” of “the west” as being “unknown”/terra incognita goes back to “ancient times.” My pet theory is that this “west” as “frontier” involves the rising of the sun (in the “east”) and the setting of the sun (in the “west”) – but I’m just guessing …

    The specific “western frontier” for the United States is obviously based on the fact that the original “13 Colonies” were on the eastern coast of the continent.

    Expansion “west” was initially a slow process for “American History class” reasons. This is where we start bumping up against the problems defining the “western” genre.

    Stories set in “Colonial Times”, “Pioneer Times” (the initial slow move west), and the Civil War period, PROBABLY don’t fit into a narrow definition of “the western.”

    e.g. at one point Ohio was the “western frontier” – and having grown up and living in Ohio I can say we have a lot of “history” – the story of “Blue Jacket” and the Shawnee people is historically interesting – I’m just not calling it a “western” …

    Pop Culture

    The U.S. Bureau of the Census declared the “frontier” closed in 1890 (as in “no longer a discernible demarcation between frontier and settlement”).

    Not surprisingly, the “western” in pop culture became popular AFTER the frontier closed. Again, folks looking for “entertainment” tend to look to the “unknown”/unusual – i.e. if you were living on the “frontier” you probably didn’t have much interest in reading first hand accounts of “frontier life” – even if they were available.

    The “American Wild West” period is usually dated from “after the Civil War” (1865ish) to the turn of the century.

    Zane Grey published his first novel in 1903. Mr. Grey’s name is synonymous with “western” – but again, SOME of his stories could be more accurately called “frontier”/pioneer stories.

    “Max Brand” however was a pen name for Frederick Schiller Faust. Mr Faust wrote 300+ novels under various pen names – “Max Brand” was pure “western” genre written in a “pulp” fashion.

    Then Louis L’Amour (200 million books sold) started writing when the “western” was a fully formed pop culture concept. Mr L’Amour preferred saying he wrote “western stories” not “westerns” — which brings us back to the initial problem …

    Radio, Movies, and TV …

    All of this talk about “literary genres” is nice – but it is all precursor to the TRULY mass media of modern times.

    The western quickly found its way to the silver screen. The “B” western being a great example of “pulp western” plots with visuals.

    Radio brought the western into folks’ homes – “Return with us now to those thrilling days of yesteryear …” – e.g. both the Lone Ranger and Gunsmoke started out as “radio shows”

    When sound and pictures came into folks’ homes – so did the western. With the 1950s being the “golden age” of TV westerns — which is another subject …

    Two World Wars and millions of Americans going overseas would change American society, and the “western” changed with it.

    The movies labelled “spaghetti westerns” (in the late 1960s and 1970s) were truly “multinational” projects – the “man with no name” trilogy being a good example – filmed in Spain, Italian director, American actors. The legend is that the multinational cast members would say their lines in their native language, and then be dubbed over as needed – which gives the films a VERY distinctive look …

    random thought: The fact that several of Akira Kurosawa’s samurai movies were made into “westerns” illustrates both “underlying themes” AND the versatility of the “western” as a genre – both “The Magnificent Seven” and “A Fistful of Dollars” are based on Kurosawa movies (though Sergio Leone denied the connection).

    Did the western die?

    There was almost a decade gap between “The Outlaw Josey Wales” (1976) and “Pale Rider”/“Silverado”/“Rustlers’ Rhapsody” (all 1985).

    Did the “western” die? Well, if you define “western” as a story with “cowboy hats and horses in a specific time period” then the answer is “maybe.”

    From a “movie business” point of view – when a large % of TV shows were westerns and multiple “westerns” would be released each year then the “cost of production” for a “western” wasn’t particularly high compared to a “non western.”

    i.e. a lot of sets could be reused and “talent” was available – so “movie company” could “send the crew” out to the “back lot” and make a movie on time and under budget.

    BUT if everything has to be built from scratch and talent selected/hired – well, things get expensive/”unprofitable” fast.

    SO it would be more accurate to say that the “western” fell out of fashion much more than “died.”

    Some other movie franchises were also wildly popular at the time (“Star Wars” 1977, “The Empire Strikes Back” 1980, and “Return of the Jedi” 1983). “Raiders of the Lost Ark” (1981) has a LOT of “western” elements but isn’t a “western.”

    The 1980s “action movie” isn’t TOO far removed from “pulp western” plots. Clint Eastwood’s career is intertwined with the “western” — I like to point out that “Dirty” Harry Callahan is basically the “man with no name” as a “Police Detective” inside a bureaucracy …

    the stories we tell …

    All of which means the “western” as a genre is a little hard to define – AND that it isn’t going away anytime soon because it is part of the “American myth” and “foundation legend”

    I should point out the difference between “myth” (completely fabricated) and “legend” (there is a “historic source” but stuff has been added over the years).

    e.g. the story of King Arthur and the Knights of the Round Table is the stuff of “legend” – i.e. there PROBABLY was a historic source for “Arthur” but the story as it is told today says more about the people telling the story than it does about that historic figure.

    e.g. there is apparently no historic basis for “Robin Hood and his Merry Men” – but it does help explain how the U.K. became the U.K. so we could call it a “modern myth”

    The “western” is both “myth” AND “legend” —

    The “myth” might sound like “plucky pioneers endured hardship, overcame nature, with the intent of building a nation” — which isn’t totally “false” but if you had interviewed the folks “going west” they were PROBABLY doing it MOSTLY out of their own self-interest not pursuing some grand ideal of a new nation.

    The number of “western legends” is legion – Davy Crockett swinging his rifle (“Betsy”) on the parapets of the Alamo immediately comes to mind.

    ANY “quick draw gun fight” story is pure “legend” (e.g. Wyatt Earp’s advice for a gun fight was: “take your time and hit what you are aiming at” – which is much easier said than done …).

    Billy the Kid as “frontier Robin Hood” has about as much truth to it as “Robin Hood.” Henry McCarty was a real person – but more thug than folk hero. fwiw: he pops up in the (I enjoyed it) movie “Old Henry” (2021) –

    while I’m at it, Wyatt Earp was an interesting individual – but nothing like the classic TV series “The Life and Legend of Wyatt Earp” — again, THAT story says much more about 1950’s America than the real-life Wyatt Earp …

    I could go on, but won’t 😉


  • genre twists and franchise changes

    Re-watched the original “Mad Max” (1979) – available on various “streaming services.”

    Now, the ORIGINAL “Mad Max” was/is a “low budget” Australian movie. It didn’t get “distributed” in the U.S. “back in the day” – which was why “Mad Max 2” (1981) was released as “The Road Warrior” (1982) in the U.S.

    The “low budget” nature distracted me when I watched “Mad Max” on home video (probably in the late 1980s). I’m guessing that the version I saw had been “edited” somewhere along the way – because (if memory serves) it was shorter than 90 minutes.

    There is a section of the movie where they establish the “bad guys” as VERY bad — which (when it was obvious what was going on and that it was going to last a while) I fast-forwarded through this time around – it wasn’t “explicit” so much as “unpleasant.”

    The “low budget” nature of the movie precluded the sort of “makeup” effects common in bigger-budget movies. I was reminded of Oedipus Rex (the Ancient Greek play) – there was plenty of “implied off stage” violence – but they didn’t/couldn’t show it ON stage.

    The often replayed scene from “Mad Max” is the finale – where Max comes across the last “bad guy” (who has obviously just murdered someone and is trying to steal the dead man’s boots). No spoiler – the “bad” guy (who Max had arrested earlier in the movie and then the “courts” released) pleads for his life saying that he is “sick” and that the “court says I’m not responsible for my actions.”

    Yeah, Max gives the guy a choice – and then drives away. Remember “Mad Max” is set in a “dystopian future” but it reflects a “society without the rule of law.” “Max” crosses the “line” but only after he has been driven to it by the (VERY) bad guys.

    good guys vs bad guys

    “Mad Max” unintentionally hit a lot of the “mythic storytelling” points – and then they INTENTIONALLY hit more of those “mythic hero story” elements in “The Road Warrior.”

    In true “vengeance genre” fashion Max is the “good man” pushed “too far” who then takes matters into his own hands.

    Charles Bronson made a LOT of movies (161 credits on IMDB) – some of those movies are very good – “The Magnificent Seven”, “The Great Escape”, “The Dirty Dozen”, and “Once Upon a Time in the West.” If Mr Bronson had stopped making movies after the 1960s (all of those mentioned were made in that decade), he would still deserve a place in the “Action movie Hall of Fame”

    (random thought: if there isn’t an “Action movie Hall of Fame” there needs to be …)

    BUT then the 1970s happened – the same decade that would give us “The Godfather”, “Jaws”, and “Star Wars” gave us “Death Wish” (1974).

    I have to admit that I have NOT seen the original “Death Wish.” I saw one of the sequels when it was on cable – but by that time the 1980’s action movie and “horror” films had made the “one man on a vengeance mission” even MORE cliche.

    Vengeance is Mine, and recompense;
    Their foot shall slip in due time;
    For the day of their calamity is at hand,
    And the things to come hasten upon them.

    Deuteronomy 32:35

    BUT again, Mr Bronson played the “good guy pushed too far.”

    fwiw: the Judeo-Christian “turn the other cheek” ethic doesn’t mean the “bad guys” get away with anything – e.g. the pull quote … ’nuff said

    random thought: A character in “The Dirty Dozen” THINKS he is the “hand of God” carrying out punishment – but the character is nuts

    ANYWAY The fact that there were 5 “Death Wish” movies says more about the business of low-quality exploitation movies than anything else (people kept buying tickets, the movies kept making a profit, they kept making more sequels) – but “human vengeance is never finished” might be the message (if there is a message …)

    Dwayne Johnson (“The Rock”) made a “vengeance genre” flick called “Faster” (2010) which drives home the unending nature of “vengeance” — so the movie becomes a good example of “twisting” a genre a little. All of the “vengeance” elements are there AND they added some “philosophical meat” – Google tells me the movie made a small profit, but wasn’t one of Mr Johnson’s bigger “box office” hits

    The MBA in me wants to point out that Faster made an $11 million profit on a $24 million budget so the return on investment (ROI) as a % might have been higher than some of those close to $billion box office movies.
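    That ROI comparison can be sketched in a few lines of Python. The “Faster” figures are the ones quoted above; the blockbuster figures are purely hypothetical, just to illustrate why a small movie’s return as a percentage can beat a giant one’s:

```python
# Rough ROI sketch. The "Faster" numbers come from the text above;
# the "blockbuster" numbers are hypothetical, for comparison only.
def roi_percent(profit_millions: float, budget_millions: float) -> float:
    """Return profit as a percentage of production budget."""
    return 100.0 * profit_millions / budget_millions

faster = roi_percent(11, 24)         # $11M profit on a $24M budget
blockbuster = roi_percent(100, 300)  # hypothetical: $100M profit on $300M

print(f"Faster ROI: {faster:.1f}%")                    # ~45.8%
print(f"Hypothetical blockbuster ROI: {blockbuster:.1f}%")  # ~33.3%
```

    (Real studio accounting is far murkier than this – marketing costs, theater splits, and “Hollywood accounting” all muddy the numbers – but the basic point stands.)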

    random thought: that “low budget” but high ROI % was where “Hollywood schlock” legend Roger Corman made a living – Google tells me he had an estimated net worth of $200 million when he died in May 2024 …

    the repentant gunfighter

    IF the “good guys” act just like the “bad guys” what is the difference between the two?

    Well, that is a good question. No, I’m not going to try to summarize all of human existence/experience.

    From a MOVIE morality point of view the difference is “intent” and “motivation.”

    e.g. Max does what he does BECAUSE of what the “bad guys” did. The bad guys did what THEY did because, well, they are “bad.”

    The “psych 101” concept of a “sociopath” involves not feeling remorse. Ever. If a “sociopath” gets caught doing a “bad thing” then they might feel bad about being “caught” but not about what they did.

    This idea is the “psychology” behind the “repentant gunfighter” genre. “Shane” (1953) is a classic example (of course the book is “better” but the movie is good in its own right).

    e.g. it is implied that “Shane” had done a lot of “bad things” until he decided he wouldn’t. Shane “turned away” from being a gun for hire … and “plot happens” … and Shane has to face another “gun for hire” in the climax.

    The implied difference between “Shane” and the “bad gunfighter” (played by Jack Palance) is that the “bad guy” enjoys killing, and Shane is a “soldier” doing a required task (and he is just very good at the task).

    The legend of John Henry “Doc” Holliday comes to mind. Ol’ Doc was a dentist until he came down with tuberculosis. Since no one wants to go to a dentist with tuberculosis, Doc became a professional gambler and (sometimes) gunfighter.

    His expectation being that one day he would get into a gunfight with someone faster or more accurate than him and the tuberculosis would no longer be a problem. His final words (as he was dying of tuberculosis in a hospital bed) were “This is funny.” c’est la vie

    The important part of the above is that the sociopath (by definition) cannot be “rehabilitated” because they never feel remorse – they can never “repent” because (in their head) they have no reason to “repent.”

    There are a lot of “click bait” sociopath tests that might be amusing – but if you want to know if someone is a “sociopath” all you need to do is ask them. They will (probably) gladly tell you that EVERYONE thinks/acts the way they do and if someone doesn’t, well, they are fools.

    BUT be careful, “sociopaths” (by definition) are also master manipulators – but it is hard to “hide” sociopathic behavior. Paying more attention to what folks “do” than to what they “say” is always good advice, but especially true of “sociopaths”

    … and the “good guy” always understands that (but doesn’t enjoy it)

    “You can’t serve a writ to a rat”

    – Rooster Cogburn

    Oh, and I’ll kind of wave in the direction of “The Outfit” (2022) as another example of the “repentant gunfighter” genre with a “twist” …

    franchises

    The entire concept of a “franchise business” is that customers know what to expect. The “franchise” provides information on “processes” as well as “resources” and (probably) marketing on a large scale.

    e.g. if you go into ANY establishment calling itself a “coffee shop” you expect certain things – obviously a variety of “coffee” and probably some sort of pastry/sandwich selection.

    BUT if you go into a “Starbucks” franchise the expectations will be for specific drinks and food prepared in a uniform manner. The idea being that visiting a “Starbucks” franchise in Los Angeles should be a similar experience to visiting a “Starbucks” franchise in Roanoke (or pick any other location).

    The Starbucks folks might say they are selling an “experience” BUT the true value of being a franchise is probably in the “name recognition.”

    If you try to open a coffee shop that looks just like “Starbucks” but isn’t – if/when they find out about it – the legal department at Starbucks, Inc will send you a nice letter telling you that you are violating various laws and you should cease and desist

    The “franchise” problem is that just “looking like a Starbucks” does not guarantee the coffee/food will meet expectations. There are around 16,000 Starbucks in the U.S. and (around) 9,000 of those are run by “corporate.” Those 7,000 other locations are “independently owned and operated” – i.e. THEY might do things slightly differently than “corporate” BUT the “core experience” should fall into a certain range of expectations

    SO the same idea holds true for “entertainment franchises.” The problem for “entertainment franchise” is that folks adding to the “franchise” need to understand the “core product.”

    Imagine a group of talented musicians who decide to go on tour with a “Sound of the 1960’s” tour (or pick any decade you like) – folks buying tickets are going to expect what? well, probably music from the 1960s

    Now imagine a group like “1964 The Tribute” – folks buying tickets are going to expect what? Probably music specifically from The Beatles.

    Folks going to a “Tarzan” movie are gonna expect certain “Tarzan” elements – folks going to a “Sherlock Holmes” movie are gonna expect different elements than the Tarzan folks.

    I was trying to think of a “long running” franchise that has stayed true to its “core” and the BEST example I could think of was Scooby-Doo.

    no, seriously – the “core element” of Scooby-Doo has always been a “boy and his dog” — i.e. Shaggy and Scooby are “core elements”, everything else can be added/removed but you always need those two characters — if you try to twist the franchise into “angry girl power show” then, well, you get the “Velma” series – which is only tangentially associated with “Scooby-Doo” as a franchise

    bad product

    I don’t think fans blame “franchise” for “bad product” – again, this is kind of the “franchise” concept we have come to expect.

    Fans understand that MANY establishments are independently owned/operated. BUT that doesn’t really matter – if a location consistently underperforms, then it will lose customers to other locations.

    The job of weeding out the “underperformers” that hurt the franchise brand name belongs to “corporate.”

    If “corporate” isn’t up to the task – well, franchises come and go on a regular basis …

    fwiw: yes, “Star Wars” as a franchise abandoned its core audience a few years back. They are selling “feces in a nice box” and seem to think they are defecating gold nuggets. News of developing “Star Wars” projects falls into the same category as a lot of the prescription drug commercials I see where I have no idea what they treat (but the guys cuddling and engaging in p.d.a. imply I’m not the target market)

    The “history” lesson is (probably) that “franchises” come and go. Long running franchises are exceptionally rare because “time and fate” happen to us all.

    Now, if “Red Lobster” (first franchise opened in 1968 in Lakeland, Florida) were to disappear I would take notice – but wouldn’t be terribly sad about the franchise demise.

    “Burger Chef” used to be a national chain, then closed their last location in 1996. I’m told a “Burger Chef” like location existed for another couple years due to a long franchise agreement – i.e. it looked like a “Burger Chef”, had a similar sign as “Burger Chef” but called itself something NOT “Burger Chef.”

    Southwestern Ohio used to be the “world headquarters” for “Ponderosa Steakhouse, Inc” so we had access to a LOT of locations “back in the day.” “Ponderosa” was always fun – the food quality/quantity to price ratio was always high – but the possibility of “screaming baby” also tended to be high. In 2024 Google tells me there is a Ponderosa around Columbus somewhere (a little too far for me to drive – but next time I’m in Columbus …).

    The point (if I had one) being that “franchise death” tends to be a long slow process. The beginning of the slippery slope of franchise death is probably barely perceptible – but once it starts it is hard to stop (you know “slippery slope” and all that) and accelerates quickly

    The good news for “entertainment franchises” is that “rebooting” the franchise is just a single good project away — e.g. no one will remember “Velma” in a few years, and Scooby-Doo and Shaggy will continue onto new projects.

    The “core elements” of “Star Wars” were NEVER exclusive to “Star Wars” – so Disney, Inc can be “Disney, Inc” all it wants. Fans looking for “steak and potatoes” will just go somewhere else …