Author: Les

  • Movies, Television, and Streaming

    Correlation never equals causality.

    Maybe that one line sums up “logic 101” and/or “statistics 101.”

    The example I used to hear was that there was a positive correlation between ice cream sales and drowning. As ice cream sales increase so does the number of deaths by drowning.

    BUT eating ice cream does not CAUSE drowning deaths – i.e. when is more ice cream sold? In the summer. When do more people go swimming? In the summer. A third factor (summer weather) drives both numbers.

    There is also data out there connecting “eating cheese” and “strangulation” — but again, eating cheese does NOT cause strangulation.
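
    If you want to see that “lurking variable” idea in numbers, here is a minimal Python sketch – the data is made up purely for illustration (and assumes numpy is installed), with a third factor driving both series:

      # A made-up "lurking variable" demo: summer heat drives BOTH
      # ice cream sales and swimming (and therefore drownings).
      import numpy as np

      rng = np.random.default_rng(42)
      temperature = rng.uniform(50, 95, size=365)  # daily highs (F)

      # Each series depends on temperature -- NOT on the other one.
      ice_cream_sales = 20 * temperature + rng.normal(0, 100, 365)
      drownings = 0.05 * temperature + rng.normal(0, 0.5, 365)

      # Strong positive correlation, zero causal link between the two.
      r = np.corrcoef(ice_cream_sales, drownings)[0, 1]
      print(f"correlation: {r:.2f}")  # roughly 0.7 -- strongly correlated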

    This concept is important – just in general – but also when talking about the rise of “streaming” and the decline of “movie theater” attendance.

    Movies

    When going to the “movies” first became a cultural event 100ish years ago it was a much different experience. Back in that “golden era” of movie theaters folks would go as a WEEKLY “family night out” — there might have been a news reel, a cartoon, and then a feature presentation.

    Other “family entertainment” options might have been staying home and listening to the radio. “Live theater” and musical concerts might have been an option IF they happened to be in town. Back at that time the “Circus” coming to town would have been a much bigger deal.

    The primary source of “news” would have still been print newspapers – and “sports” like boxing, horse racing, baseball, and college football were popular – again either on the radio or at live events.

    BUT “the movies” were the bread and butter of family entertainment.

    Television

    The “golden age of radio” was relatively short – from the late 1920s to the 1950s. Radio and movies might have been in the same general “entertainment” markets but they were much different “experiences.”

    “Visuals AND sound” tends to beat “just sound” — BUT “going to the movies” would have been an EVENT, while turning on the radio was an everyday experience.

    When Television became popular in the 1950s it ended the “golden age” of radio – and also forced the “movie industry” to adapt.

    e.g. hunt up some old “B” Westerns and you’ll discover that they tend to be about an hour long – and the “weekly serial” adventure/cliff hanger shorts tend to be 20 to 45 minutes. Which sounds a LOT like “television” program lengths to the “modern audience.”

    A lot of those “B” Western stars also had radio shows – and the popular shows made the jump from radio to television. There was still a sizable market for both television and radio in the early days. The popular shows probably had a comic book and/or daily newspaper comic strip as well.

    The “point” being that folks wanted “entertainment” NOT a specific TYPE of entertainment.

    Television ended the “weekly ritual” of going to the movies.

    The “movie industry” responded by increasing the “production value” of movies. Movies were “bigger” and “better” than television programming.

    The “movie” advantage was still the bigger screen and the EVENT status. The product required to attract the audience into the theaters obviously changed – gimmicks like 3D, “Technicolor”, and CinemaScope came and went.

    Now, the one 20th Century invention that can rival television for “cultural impact” is the automobile. I would tend to argue that the increased “mobility” automobiles allowed makes them the most influential and/or culturally transformational. BUT the point is arguable.

    The “automobile” changed “dating and mating” rituals. PART of that change involved “going to the movies.” At the height (in the 1950s) there were 4,000 “drive in” movie theaters spread across the U.S.

    All of those Baby-boomers doing their thing would have found the “drive in” the more economical option. The “teenagers” created by the post-war economic boom would have had “going to the movies” as an option to “get away from parents” and be, well, “teenagers.”

    The “movie theater business” was disrupted by a Supreme Court ruling in 1948. United States v. Paramount, decided May 3, 1948, effectively ended the “studio system” – studios would no longer be allowed to own “theaters.”

    An unintended consequence of ending the “studio system” was that a lot of “talent” was released from contracts, and studios opened up their film libraries and/or sold them to television stations. The number of “regular moviegoers” decreased from 90 million in 1948 to 46 million in 1958. Television ownership went from 8,000 sets in 1946 to 46 million in 1960.

    SO if you REALLY want to put a date on the START of the death of the “movie theater business” – May 3, 1948.

    Cable, VCRs, DVDs …

    Of course “movie theaters” have had a long slow decline. To borrow a phrase: the reports of the “movie theater’s death” have been greatly exaggerated …

    Cable TV rolled across the U.S. starting in the 1970’s. HBO came along in 1972.

    “You want romance? In Ridgemont? We can’t even get cable TV here, Stacy, and you want romance!”

    Fast Times at Ridgemont High 1982

    Drive in theaters continued to close – but they haven’t disappeared yet.

    By the 1970’s television had replaced “the movies” in terms of “cultural impact” – BUT the “birth of the blockbuster” illustrated that “the movies” weren’t dead yet.

    Of course the typical “movie theater” has not made a large % of its profits from SHOWING movies for a long time – i.e. theaters tend to make money at the concession stand NOT from ticket sales.

    The fact that “going to the movies” was still a distinct experience from “watching at home” is what kept “theaters” in business.

    Movie studios were gifted a new revenue stream in the 1980s when “VCR” ownership created the “VHS/Video Rental Store.”

    Again, “seeing it in the theater” with a crowd on the big screen with “theater quality sound” is still a distinct experience.

    DVDs provided superior picture AND sound compared to VHS – and the DVD quickly replaced the VCR. The “Rental Store” just shifted from VHS tapes to DVDs.

    BUT the BIG impact of DVDs was their durability and light weight. DVDs could be played multiple times without loss of quality (VHS tapes degraded a little with each viewing), AND they could even be safely (cheaply) mailed.

    Netflix started in 1997. The “Reed Hastings/Netflix story” is interesting – but not important at the moment.

    From a “movie theater” point of view – “The Phantom Menace” being released as a “digital” film in 1999 was a “transitional moment.”

    The music industry as a whole bungled their “digital” transition to the point that a couple generations of folks have grown up expecting “music” to be “free.” THAT is a different subject —

    I’ll point out that a “digital product” can easily be reproduced without loss of quality. If I have a “digital” copy of “media” I can easily reproduce exact duplicates. No need for a “manufacturing” and a “shipping” process – just “copy” from 1 location to the new location. Exact copy. Done.
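
    Here is a minimal sketch of what “exact copy” means in practice – Python standard library only, and the file names are hypothetical:

      # Duplicate a file, then PROVE the copy is bit-for-bit identical
      # by comparing cryptographic hashes. File names are hypothetical.
      import hashlib
      import shutil

      def sha256_of(path: str) -> str:
          """Return the SHA-256 hex digest of a file's contents."""
          h = hashlib.sha256()
          with open(path, "rb") as f:
              for chunk in iter(lambda: f.read(65536), b""):
                  h.update(chunk)
          return h.hexdigest()

      shutil.copyfile("movie_master.mp4", "movie_copy.mp4")

      # Identical digests == exact duplicate. No generational loss,
      # no matter how many times the copy is repeated.
      assert sha256_of("movie_master.mp4") == sha256_of("movie_copy.mp4")
      print("exact copy verified")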

    For the “movie industry” in the short term the transition to “digital” helped lower distribution costs. Copies of films didn’t need to be created and shipped from theater to theater in “cans of film” – just copy the new movie to the digital projector’s hard drive and you are all set.

    The combination of the “home computer” and “internet access” also deserve the “cultural shift” label – but it was really “more of the same” done “faster and cheaper.”

    Streaming

    It is trendy to blame “streaming” movies for the death of “theaters” — but hopefully by now I’ve made the point that “streaming” is not the CAUSE of the decline of theaters. At best the “rise of streaming” and the “decline of theaters” are correlated – BUT (all together now)

    Correlation never equals causality.

    “Streaming” deserves credit for killing “Movie rental stores” — but the “theater experience” is still the “theater experience”

    MY issue with “going to the theater” is that ticket prices have pretty much kept up with inflation. Which kinda means a generic “family of four” has to take out a small loan to “go to the movies.”

    I’m placing the recent decline in theater attendance on “inflation” and “bad product.”

    Yes, the “movie industry” has been churning out self-righteous garbage NOT “entertainment.”

    BUT there is still a demand for “family friendly entertainment” — “Inside Out 2” setting box office records illustrates my point

    Old Theaters …

    I like not having to wait in line – but also kinda miss the “old theater” feel. That 20 screen “mega plex” is nice but there is still room for renovated “old theaters” if they can be updated without losing their “charm.”

    To be clear the “charm” of old theaters does NOT include “uncomfortable seats” and feet sticking to the floor. If someone tries to “rehab” a theater I’d spend most of the money on the bathrooms and comfortable seating

    Folks need to feel “safe” AND “comfortable” – then if the popcorn is a little stale it doesn’t matter …

  • memoirs of an adjunct instructor or What do you mean “full stack developer?”

    During the “great recession” of 2008 I kind of backed into “teaching.”

    The small company where I was the “network technician” for 9+ years wasn’t dying so much as “winding down.” I had ample notice that I was becoming “redundant” – in fact the owner PROBABLY should have “let me go” sooner than he did.

    When I was laid off in 2008 I had been actively searching/“looking for work” for 6+ months – certainly didn’t think I would be unemployed for an extended period of time.

    … and a year later I had gone from “applying at companies I want to work for” to “applying to everything I heard about.” When I was offered an “adjunct instructor” position with a “for profit” school in June 2009 – I accepted.

    That first term I taught a “keyboarding class” – which boiled down to watching students follow the programmed instruction. The class was “required” and to be honest there wasn’t any “teaching” involved.

    To be even MORE honest, I probably wasn’t qualified to teach the class – I have an MBA and had multiple CompTIA certs at the time (A+, Network+) – but “keyboarding” at an advanced level isn’t in my skill set.

    BUT I turned in the grades on time, and that “1 keyboarding class” grew into teaching CompTIA A+ and Network+ classes (and eventually Security+, and the Microsoft client and server classes at the time). fwiw: I taught the Network+ so many times during those 6 years that I have parts of the book memorized.

    Lessons learned …

    Before I started teaching I had spent 15 years “in the field” – which means I had done the job the students were learning. I was a “computer industry professional teaching adults changing careers how to be ‘computer industry professionals’”

    My FIRST “a ha!” moment was that I was “learning” along with the students. The students were (hopefully) going from “entry level” to “professional” and I was going from “working professional” to “whatever comes next.”

    Knowing “how” to do something will get you a job, but knowing “why” something works is required for “mastery.”

    fwiw: I think this same idea applies to “diagramming sentences” in middle school – to use the language properly it helps to understand what each part does. The fact I don’t remember how to diagram a sentence doesn’t matter.

    The “computer networking” equivalent to “diagramming sentences” is learning the OSI model – i.e. not something you actually use in the real world, but a good way to learn the theory of “computer networking.”

    When I started teaching I was probably at level 7.5 of 10 on my “OSI model” comprehension – after teaching for 6 years I was at a level 9.5 of 10 (10 of 10 would involve having things deeply committed to memory which I do not). All of which is completely useless outside of a classroom …

    Of course most students were coming into the networking class with a “0 of 10” understanding of the OSI model BUT had probably set up their home network/Wi-Fi.

    The same as above applies to my understanding of “TCP/IP networking” and “Cyber Security” in general.
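
    For anyone who never sat through one of those Network+ lectures, the OSI model is just a 7-layer mental map. A quick Python sketch (the protocol examples are the usual textbook ones):

      # The OSI model as a simple lookup table -- layer 7 down to layer 1.
      OSI_MODEL = {
          7: ("Application",  "HTTP, SMTP, DNS"),
          6: ("Presentation", "TLS, JPEG, character encoding"),
          5: ("Session",      "NetBIOS, RPC"),
          4: ("Transport",    "TCP, UDP"),
          3: ("Network",      "IP, ICMP, routers"),
          2: ("Data Link",    "Ethernet, MAC addresses, switches"),
          1: ("Physical",     "cables, hubs, radio waves"),
      }

      for layer in sorted(OSI_MODEL, reverse=True):
          name, examples = OSI_MODEL[layer]
          print(f"Layer {layer}: {name:<12} e.g. {examples}")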

    Book Learning …

    I jumped ship at the “for profit school” where I was teaching in 2015 for a number of reasons. MOSTLY it was because of “organizational issues.” I always enjoyed teaching/working with students, but the “writing was on the wall” so to speak.

    I had moved from “adjunct instructor” to “full time director” – but it was painfully obvious I didn’t have a future with the organization. e.g. During my 6 years with the organization we had 4 “campus directors” and 5 “regional directors” — and most of those were “replaced” for reasons OTHER than “promotion.”

    What the “powers that be” were most concerned with was “enrollment numbers” – not education. I appreciate the business side – but when “educated professionals” (i.e. the faculty) are treated like “itinerant labor”, well, the “writing is on the wall.”

    In 2014 “the school” spent a lot of money setting up fiber optic connections and a “teleconferencing room” — which they assured the faculty was for OUR benefit.

    Ok, reality check – yes I understand that “instructors” were their biggest expense. I dealt with other “small colleges” in the last 9 years that were trying to get by with fewer and fewer “full time faculty” – SOME of them ran into “accreditation problems” because of an over-reliance on “adjuncts” – I’m not criticizing so much as explaining what the “writing on the wall” said …

    oh, and that writing was probably also saying “get a PhD if you want a full time teaching position” — if “the school” had paid me to continue my education or even just to keep my skills up to date, I might have been interested in staying longer.

    Just in general – an organization’s “employees” are either its “biggest asset” OR its “biggest fixed cost.” From an accounting standpoint both are (probably) true (unless you are an “Ivy League” school with a huge endowment). From an “administration” point of view, dealing with faculty as “asset” or “fixed cost” says a LOT about the organization — after 6 years it was VERY clear that the “for profit” school looked at instructors as “expensive necessary evils.”

    COVID-19 was the last straw for the campus where I worked. The school still exists but appears to be totally “online” –

    Out of the frying pan …

    I left “for profit school” to go to teach at a “tech bootcamp” — which was jumping from “bad situation to worse situation.”

    The fact I was commuting an hour and a half and was becoming more and more aware of chronic pain in my leg certainly didn’t help.

    fwiw: I will tell anyone that asks that a $20 foam roller changed my life — e.g. “self myofascial release” has general fitness applications.

    I was also a “certified strength and conditioning specialist” (CSCS) in a different life – so I had a long history of trying to figure out “why I had chronic pain down the side of my leg” – when there was no indication of injury/limit on range of motion.

    Oh, and the “root cause” was tied into that “long commute” – the human body isn’t designed for long periods of “inaction.” The body adapts to the demands/stress placed on it – so if it is “immobile” for long periods of time – it becomes better at being “immobile.” For me that ended up being a constant dull pain down my left leg.

    Being more active and five minutes with the foam roller after my “workout” keeps me relatively pain free (“it isn’t the years, it’s the mileage”).

    ANYWAY – more itinerant-level “teaching” gave me time to work on “new skills.”

    I started my “I.T. career” as a “pc repair technician.” The job of “personal computer technician” is going (has gone?) the way of “television repair.”

    Which isn’t good or bad – e.g. “personal computers” aren’t going away anymore than “televisions” have gone away. BUT if you paid “$X” for something you aren’t going to pay “$X” to have it repaired – this is just the old “fix” vs “replace” idea.

    The cell phone as 21st Century “dumb terminal” is becoming reality. BUT the “personal computer” is a general purpose device that can be “office work” machine, “gaming” machine, “audiovisual content creation” machine, or “whatever someone can program it to do” machine. The “primary communication device” might be a cell phone, but there are things a cell phone just doesn’t do very well …

    Meanwhile …

    I updated my “tech skill set” from “A+ Certified PC repair tech” to “networking technician” in the 1990s. Being able to make Cat 5/6 twisted pair patch cables still comes in handy when I’m working on the home network but no one has asked me to install a Novell Netware server recently (or Windows Active Directory for that matter).

    Back before the “world wide web” stand alone applications were the flavor of the week. e.g. If you bought a new PC in 1990 it probably came with an integrated “modem” but not a “network card.” That new PC in 1990 probably also came with some form of “office” software – providing word processing and spreadsheet functions.

    Those “office” apps would have been “stand alone” instances – which needed to be installed and maintained individually on each PC.

    Back in 1990 that application might have been written in C or C++. I taught myself “introductory programming” using Pascal mostly because “Turbo Pascal” came packaged with tools to create “windows” and mouse control. “Pascal” was designed as a “learning language” so it was a little less threatening than C/C++ back in the day …

    random thought: If you wanted “graphical user interface” (GUI) functionality in 1990 you had to write it yourself. One of the big deals with “Microsoft Windows” was that it provided a uniform platform for developers – i.e. developers didn’t have to worry about writing the “GUI operating system hooks” they could just reference the Windows OS.

    Apple Computers also had “developers” for their OS – but philosophically “Apple Computers” sold “hardware with an operating system included” while Microsoft sold “an operating system that would run on x86 hardware” – since x86 hardware was kind of a commodity (read that as MUCH less expensive than “Apple Computers”). That is the “IBM PC” story that ended up making Microsoft, Inc a lot of money — which was a fun documentary to show students bored of listening to me lecture …

    What users care about is applications/”getting work done” not the underlying operating system. Microsoft also understood the importance of developers creating applications for their platform.

    fwiw: “Microsoft, Inc” started out selling programming/development tools and “backed into” the OS market – which is a different story.

    A lot of “business reference applications” in the early 1990s looked like Microsoft Encarta — they had a “user interface” providing access to a “local database.” — again, one machine, one user at a time, one application.

    N-tier

    Originally the “PC” was called a “micro computer” – the fact that it was self contained/stand alone was a positive selling point. BEFORE the “PC” a larger organization might have had a “terminal” system where a “dumb terminal” allowed access to a “mainframe”/mini computer.

    SO when the “world wide web” happened and “client server” computing became mainstream, the “N tier” computing model became popular as a concept.

    N-tier might be the “presentation” layer/web server, the “business logic” layer/a programming language, and then the “data” layer/a database management system.
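
    Here is a minimal sketch of those three tiers in one Python file – Flask and SQLite are stand-ins for the web server and database management system, and the quotes.db file is hypothetical:

      # A toy "N-tier" app in one file: presentation tier (Flask route),
      # business logic tier (plain function), data tier (SQLite).
      # In a real deployment each tier could live on its own server.
      import sqlite3
      from flask import Flask, jsonify

      app = Flask(__name__)

      # --- data tier: talk to the database, nothing else ---
      def fetch_random_quote() -> str:
          con = sqlite3.connect("quotes.db")  # hypothetical database file
          row = con.execute(
              "SELECT text FROM quotes ORDER BY RANDOM() LIMIT 1"
          ).fetchone()
          con.close()
          return row[0] if row else "no quotes yet"

      # --- business logic tier: rules live here, not in the SQL or the route ---
      def present_quote(text: str) -> dict:
          return {"quote": text.strip(), "length": len(text)}

      # --- presentation tier: accept the request, return the response ---
      @app.route("/quote")
      def quote():
          return jsonify(present_quote(fetch_random_quote()))

      if __name__ == "__main__":
          app.run(debug=True)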

    Full Stack Developer

    In the 21st Century “stand alone” applications are the exception – and “web applications” the standard.

    Note that applications that allow you to download and install files on a personal computer are better called “subscription verification” applications rather than “N Tier.”

    e.g. Adobe allows folks to download their “Creative Suite” and run the applications on local machines using computing resources from the local machine – BUT when the application starts it verifies that the user has a valid subscription.

    An “N tier” application doesn’t get installed locally – think Instagram or X/Twitter …

    For most “business applications” designing an “N tier” app using “web technologies” is a workable long term solution.

    When we divided the application functionality the “developer” job also differentiated – “front end” for the user facing aspects and “back end” for the database/logic aspects.

    The actual tools/technologies continue to develop – in “general” the “front end” will involve HTML/CSS/JavaScript and the “back end” involves a combination of “server language” and “database management system.”

    Languages

    Java (the language maintained by Oracle, not “JavaScript,” which is also known as ECMAScript) has provided “full stack development” tools for almost 30 years. The future of Java is tied into Oracle, Inc but neither is gonna be “obsolete” anytime soon.

    BUT if someone is competent with Java – then they will describe themselves as a “Java developer” – and Oracle has respected industry certifications.

    I am NOT a “Java developer” – but I don’t come to “bury Java” – if you are a computer science major looking to go work for “large corporation” then learning Java (and picking up a Java certification) is worth your time.

    Microsoft never stopped making “developer tools” – “Visual Studio” is still their flagship product BUT Visual Studio Code is my “go to” (free, multi-platform) programming editor in 2024.

    Of course Microsoft wants developers to develop “Azure applications” in 2024 – C# provides easy access to a lot of those “full stack” features.

    … and I am ALSO not a C# programmer – but there are a lot of C# jobs out there as well (I see C# and other Microsoft ‘full stack’ tech specifically mentioned with Major League Baseball ‘analytics’ jobs and the NFL – so I’m sure the “larger corporate” world has also embraced them)

    JavaScript on the server side has also become popular – Node.js — so it is possible to use JavaScript on the front and back end of an application. opportunities abound

    My first exposure to “server side” programming was PHP – I had read some “C” programming books before stumbling upon PHP, and my first thought was that it looked a lot like “C” – but then MOST computer languages look a lot like “C.”

    PHP tends to be the “P” part of the LAMP stack acronym (“Linux OS, Apache web server, MySQL database, and PHP scripting language”).

    Laravel as a framework is popular in 2024 …

    … for what it is worth MOST of the “web” is probably powered by a combination of JavaScript and PHP – but a lot of the folks using PHP are unaware they are using PHP, i.e. 40%+ of the web is “powered by WordPress.”

    I’ve installed the LAMP stack more times than I can remember – but I don’t do much with PHP except keep it updated … but again, opportunities abound

    Python on the other hand is where I spend a lot of time – I find Django a little irritating, but it is popular. I prefer Flask or Pyramid for the “back end” and then select a JavaScript front end as needed.

    e.g. since I prefer “simplicity” I used “mustache” for template presentation with my “Dad joke” and “Ancient Quote” demo applications
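
    For the curious, a minimal sketch of a “mustache” render in Python – this assumes the pystache package, and the joke data is made up (NOT the actual demo application):

      # Mustache templates are "logic-less": the template only has
      # placeholders and sections; any decisions happen in the code.
      # Assumes the pystache package (pip install pystache).
      import pystache

      PAGE = "<h1>{{title}}</h1><ul>{{#jokes}}<li>{{.}}</li>{{/jokes}}</ul>"

      html = pystache.render(PAGE, {
          "title": "Dad Jokes",
          "jokes": [
              "I used to hate facial hair, but then it grew on me.",
              "I am reading a book about anti-gravity. Impossible to put down.",
          ],
      })
      print(html)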

    Python was invented with “ease of learning” as a goal – and for the most part it succeeds. The fact that it can also do everything I need it to do (and more) is also nice 😉 – and yes, jobs, jobs, jobs …

    Databases

    IBM Db2, Oracle, Microsoft SQL server are in the category of “database management system royalty” – obviously they have a vast installation base and “large corporate” customers galore. The folks in charge of those systems tend to call themselves “database managers.” Those database managers probably work with a team of Java developers …

    At the other end of the spectrum the open source project MySQL was “acquired” by Sun Microsystems in 2008 which was then acquired by Oracle in 2010. Both “MySQL” and “Oracle” are popular database system back ends.

    MySQL is an open source project that has been “forked” into MariaDB (stewarded by the “MariaDB Foundation”).

    PostgreSQL is a little more “enterprise database” like – also a popular open source project.

    MongoDB has become popular and is part of its own “full stack” acronym MEAN (MongoDB, Express, Angular, and Node) – MongoDB is a “NoSQL” database which means it is “philosophically” different than the other databases mentioned – making it a great choice for some applications, and not so great for other applications.
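
    To make “philosophically different” concrete, here is a hedged sketch of the same record stored both ways – assuming a local MongoDB server and the pymongo driver (the names and data are made up):

      # Same data, two philosophies. SQL fixes the shape up front (schema);
      # MongoDB stores self-describing documents with no fixed shape.
      # Assumes a local MongoDB server and pymongo (pip install pymongo).
      import sqlite3
      from pymongo import MongoClient

      # Relational: declare the table, then fit rows into it.
      con = sqlite3.connect(":memory:")
      con.execute("CREATE TABLE quotes (author TEXT, text TEXT)")
      con.execute("INSERT INTO quotes VALUES (?, ?)",
                  ("Seneca", "Luck is what happens when preparation meets opportunity."))

      # Document: just store the object -- extra fields need no schema change.
      db = MongoClient("mongodb://localhost:27017").demo
      db.quotes.insert_one({
          "author": "Seneca",
          "text": "Luck is what happens when preparation meets opportunity.",
          "tags": ["stoic", "ancient"],  # a field the SQL table never declared
      })
      print(db.quotes.find_one({"author": "Seneca"}))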

    To be honest I’m not REALLY sure if there is a big performance difference between database management back ends. Hardware and storage space are going to matter much more than the database engine itself.

    “Big Corporate Enterprise Computing” users aren’t as concerned with the price of the database system – they want rock solid dependability. If there was a Mount Rushmore of database management systems, DB2, Oracle, and Microsoft SQL Server would be on it …

    … but MariaDB is a good choice for most projects – easy to install, not terribly complicated to use. There is even a nice web front end – phpMyAdmin

    I’m not sure if the term “full stack developer” is gonna stick around though. Designing an easy to use “user interface” is not “easy” to do. Designing (and maintaining) a high performing database back end is also not trivial. There will always be room for specialists.

    “Generalist developer” sounds less “techy” than “full stack developer” – but my guess is that the “full stack” part is going to become superfluous …

  • Plot holes and “Star Wars” …

    “Telling stories” is a euphemism for “lying.”

    “Lying” obviously requires a “lie” to build around – with the definition of “lie” (the third definition from Merriam-Webster: “to make an untrue statement with intent to deceive”) being the relevant point.

    Note that INTENT is required. SO it is POSSIBLE for someone to “tell a story” that is not true, and not be “lying.”

    “Telling tall tales” has probably been a kind of “sport” to rascals, rogues, and tramps as long as there have been “rascals, rogues, and tramps.” Maybe a form of good-natured “rite of passage” – e.g. think “wide-eyed novice” listening intently to “grizzled veteran” telling “stories” that get more and more “factually challenged.”

    IF at SOME point the “grizzled veteran” passes a point where the “wide eyed novice” gets the joke – then everyone laughs. The “novice” isn’t as wide-eyed and is on their way to “veteran” status.

    (of course if “wide-eyed novice” DOESN’T get the joke – then, well, that is a different problem)

    “Campfire stories” take on a general form. SOMETIMES there is a kernel of truth – i.e. “legends” are born in the “additions” to the TRUE story. It is probably in those “additions” that we can track “cultural value changes.”

    Art reflects …

    Does life imitate art, or does art imitate life?

    And the answer is, well, “yes.”

    We can quickly get lost in definitions – e.g. what is “art”? How about if we agree that “art REFLECTS an IDEAL of life.” Art must be “created,” which requires a “creator” — i.e. the “art” reflects the character of the “artist”/creator.

    Creativity is allowing oneself to make mistakes. Art is knowing which ones to keep.

    Scott Adams

    Since the “artist” does not exist in a cultural vacuum the “art” ends up reflecting the society in which the artist lives.

    Plot and Story

    The difference between “plot” and “story” is that “plot” requires causality.

    e.g. “A” happens, then “B” happens, then “C” happens is a “story” but NOT a plot.

    If “A” happens, then “B” happens BECAUSE of A, and then “C” happens because of “B” (or “A” or “A&B” – depending on just how complicated you wanna get) – THAT is “plot”

    Someone “telling stories” will have a “plot” but there will be intentional “plot holes” testing the listener’s level of gullibility.

    e.g. grizzled veteran: “There I was – just me and my horse, supplies running out, horse almost dead. Suddenly, I was attacked by a gang of 40 cut-throats that would kill me just for my boots.

    I shot the nearest one in the leg, jumped on my horse and headed up the mountain. Now, those cut-throats were REALLY angry and were threatening to bury me up to my neck and leave me to die. SO I managed to find a small cave where they could only get at me 1 or 2 at a time – let my horse go and waited for them to find me. I was down to just 3 bullets and my knife.

    Sure enough, they found me, and then …”

    wide eyed novice: ” … and then?”

    grizzled veteran: “well, I died of course” (laughter, insults, etc)

    (and when that former wide-eyed novice has become “grizzled veteran” they will probably tell the same story to the next batch of wide-eyed novices …)

    Stories …

    If everyone involved KNOWS the story being told is just a “story” then the audience can willingly engage in “suspension of disbelief” and just enjoy the story.

    The required amount of “disbelief” will obviously vary by genre. The folks “performing” aren’t “intentionally” trying to deceive – they are engaging in “storytelling.”

    e.g. the audience at a performance of Hamlet doesn’t ACTUALLY believe that they are watching a “Prince of Denmark” wrestling with the fact that his Uncle may or may not have murdered the former King (Hamlet’s father). Hopefully, the audience puts aside “critical thinking” and plays along with the story.

    Obviously the folks putting on the performance try their best to be convincing. The highest praise that can be given to a “working actor” MIGHT be that they are ALWAYS “convincing” no matter what role they are playing.

    (fwiw: playing “Hamlet” is considered a test of an actor’s acting ability – this is probably why you see so many “famous movie stars” attempt the role. I have seen a LOT of versions of Hamlet – and most of them are “ok.”

    If I’m watching “Hamlet” and I think “that is so and so TRYING to do Hamlet” – then that qualifies as an “ok performance” — but if I forget that it is “BIG NAME” playing Hamlet, then that is “VERY good” performance … and moving on)

    Random thought: Strange Brew (1983) borrows plot elements from Hamlet – catching the “Hamlet” references elevated the movie from “cute buddy comedy” to “funny at multiple levels” – and yes, INTENTIONAL plot holes-a-plenty …

    Star Wars plot holes …

    I have been re-examining WHY I loved the original “Star Wars” trilogy. In part this is because of the “fan reaction” to the latest “Star Wars product.”

    Apparently others have done this “re-examination” as well. One such re-examination was trying to point out “plot holes” in “Star Wars” (1977)

    In particular they didn’t like the fact that if the “Empire” had blown up the “escape pod” at the beginning then the movie ends right there – i.e. blow up the escape pod with R2-D2 and C-3PO and there is no story.

    BUT that is NOT a “plot hole” – yes, the movie turns on that point BUT it also helps establish that the “Empire” are the bad guys.

    The scene could easily have been taken out – but it serves a “storytelling” purpose. The “Empire” is the “evil authoritarian organization” – notice that the anonymous characters WOULD have blown up the “escape pod” IF they had detected “life forms.” i.e. the anonymous character’s (lack of) action illustrates that “fate”/luck is gonna be part of the story.

    “Fate” interferes throughout “Star Wars” – with “Stormtrooper” marksmanship being another great example (e.g. they are extremely precise when shooting at “not major characters” but can’t hit anything important when a “major character” is involved)

    Now, if the movie was trying to be “gritty and realistic” then “fate interfering” might constitute “plot hole.”

    I also like to point out that R2-D2 in the “Star Wars universe” is an “agent of fate” or the “finger of the divine” — apparently immortal and all-knowing. Seriously, notice how many times R2 is instrumental in things “working out” for the heroes.

    Sure, R2 gets “blown up” a lot – but always returns good as new. If “Star Wars” was hard core science fiction THAT would be a HUGE plot hole – but since it is a space fairy tale set in a galaxy far, far away, it is just part of the suspension of disbelief.

    BUT if you want to talk about REAL plot-holes – I have always been (mildly) bothered by the fact that after the heroes escape the Death Star – and KNOW they are being tracked – that they (apparently) go straight to the Rebel Base.

    By this point George Lucas has done a masterful job of storytelling – and the fact that the Empire easily tracks the heroes to the Rebel Base – setting up the climactic battle – is easily overlooked.

    Ok, Leia tells Han they are being tracked – Han doesn’t believe her, but even if there is a slight possibility of them being tracked then they should logically have gone ANYWHERE else except the Rebel Base.

    THEN when they are far away from danger AND the Rebel Base – they could have easily transferred the data as required. Or maybe find the tracking device – and send it ANYWHERE else than the Rebel Base.

    “You’re gonna need a bigger boat.”

    Chief Brody

    The “Battle of Yavin” is kind of like the air tank exploding at the end of Jaws (1975). If the audience has to THINK about it, then it becomes a problem.

    If we have been guided along properly then we are probably “all in” on that plot hole. The plot hole goes completely unnoticed and even gets cheered when told by “expert storyteller.”

    I suppose “storytelling 101” always starts with some form of “show don’t tell” – if the “plot” requires 120 minutes of talking heads then you are telling a much different type of story than if you have “action”/pause/more action/short pause/etc.

    none of this is a secret. Audience expectations on the ratio of “drama” to “relief” are determined by genre — if you are doing “romantic comedy period piece” then long periods of “talking heads” are expected, BUT if you are doing “space fairy tale” then keep the “talking heads delivering exposition” to a minimum …

    it is the genre, silly …

    I’m also fond of pointing out that there is plenty of room for different stories and genres – but trying to fit “agenda” into “genre” is almost always a recipe for commercial failure.

    random thought: a famous “hamburger chain” started offering salads back in the late 1980s. I think they were responding to “market demand” for “healthier” options. They are a worldwide operation that regularly introduces new items to their menu – so offering salads wasn’t a “bad” idea

    the funny thing was that those “hamburger chain salads” could be LESS healthy than the “regular menu” (with salad it is usually the “dressing” that becomes the problem – which had a lot of fat and calories …)

    the same chain sells a “fish sandwich” – that is very popular but definitely NOT the “healthy option”

    HOWEVER “hamburger chain” never lost sight of the fact that their core product is “meat and potatoes” – they make $$ selling hamburgers and fries

    NOW imagine that the “hamburger chain” powers that be decide to turn the menu over to someone that HATES hamburgers and fries – or thinks that “salads” are why people go to “hamburger chain” – well, things aren’t going to go well

    the “new menu maker” might blame the customers for NOT wanting to eat bad salads instead of hamburgers – but that is not gonna change the customers’ preferences.

    “New menu maker” will almost certainly get bombarded with criticism from lovers of “hamburger and fries” – and sales/profits will plummet.

    Of course the folks that hired “new menu maker” will defend their decision – but that just means that THEY are (probably) the franchise’s (REAL) problem, not the “new menu maker” and certainly NOT the fans …

    if you want another “movie franchise” example – compare and contrast the first “Matrix” (1999) with “Matrix Resurrections” (2021) – notice the difference in the ratio of “action” to “exposition” …

  • What is the purpose of amateur sports?

    Maybe the first question becomes “Do amateur sports have a purpose?”

    The numbers fluctuate but there are AROUND 1 million high school football players each year in the United States.

    Around 7.8% of those high school football players will play in college (at any level).

    Less than 0.5% of those college players will make an NFL roster.

    For baseball the percentages are even worse – 1 in 200 high school players will get drafted to play “professional baseball” (around 0.5% – yes, that means “minor leagues”).

    Around 1% of high school basketball players will play Division I college basketball. Out of every 10,000 High School basketball players 2 or 3 will play in the NBA.
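
    Putting the football numbers together – rough, back-of-the-envelope arithmetic using the figures quoted above:

      # Back-of-the-envelope math on the football funnel above.
      # All inputs are the rough figures quoted in the text.
      high_school_players = 1_000_000
      college_rate = 0.078  # ~7.8% play in college (at any level)
      nfl_rate = 0.005      # <0.5% of college players make an NFL roster

      college_players = high_school_players * college_rate  # 78,000
      nfl_players = college_players * nfl_rate              # ~390 (upper bound)

      print(f"college players: {college_players:,.0f}")
      print(f"NFL (upper bound): {nfl_players:,.0f}")
      print(f"overall odds: about 1 in {high_school_players / nfl_players:,.0f}")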

    The point being that if “getting a scholarship” or “going pro” is the “purpose” of playing amateur sports – then a large number of athletes are chasing a fantasy.

    BUT are those “ordinary players” wasting their time playing a sport? Oh, and what about those sports where “going pro” isn’t an option?

    Purpose

    In the U.S. “organized amateur sports” tend to be associated with secondary education/”high schools.”

    The “why” sports are associated with high schools has a lot to do with “organization” by proximity. After the Civil War “disorganized” sports began popping up. Those early ‘amateur athletics’ weren’t much more than ‘pickup games’ with the teams representing “communities.”

    The “point” of those games was simply friendly competition and entertainment.

    Does “competition” have a purpose? Well, the short answer is “yes.”

    Iron sharpeneth iron; so a man sharpeneth the countenance of his friend.

    Proverbs 27:17

    BUT there are “healthy” and “unhealthy” variants of “competition.”

    The goal of ANY competition is NOT just to “win” but to “win within the rules.” HEALTHY competition will make everyone involved “better” – in that Proverbs 27:17 way.

    UNHEALTHY competition is the “law of the jungle” or “winning at any cost.” This isn’t just “cheating” but also potentially trying to harm the opposition.

    To be clear, there is a BIG difference between “competing hard” and “winning at any cost.” Wanting to win isn’t wrong, but being so obsessed with winning that you are willing to “cheat” is missing the point of the competition.

    An individual’s “self worth” should NEVER come from winning an athletic contest. The individual has inherent worth because they are a human being NOT because they are good at “sport ball.”

    The players will change but the sport and/or team will continue. Which means in the grand scheme of things victory is never “total” and defeat is never “final.”

    Losing a “sport ball” contest does NOT diminish a human being’s worth. Winning does not excuse bad behavior.

    Teenagers

    In the middle of the 20th Century the post WW2 baby boom and economic prosperity helped create a new demographic called “teenagers.”

    Yes, there have always been 13 to 19 year olds – but in the 1950s they got disposable income and cars. Along with rock & roll music came “organized high school sports.”

    In general terms the core motivation of ‘administrators’ organizing those high school sports was (and still is) the welfare of the “student athlete.”

    Establishing “rules” for sports, certifying “officials” to enforce those rules, and then providing a structure for HEALTHY competition required “organization.”

    i.e. the students were going to compete, “organizing” the competition helped keep that competition healthy. To keep competition “fair” things like “divisions” and “age restrictions” were also required.

    Fast forward 70+ years and “scholastic sports” is a massive industry. However, the PURPOSE of that industry is still healthy (fair) competition.

    The joy of competition comes from preparing and then competing. Having a competition goal, putting in the time and effort to prepare for that competition, and then competing teaches a long list of positives. Winning a close contest against an opponent of equal ability is satisfying BUT losing a close contest to “honorable opponent” is NOT dissatisfying (disappointing? yes – but the “joy” comes from preparing and competing hard – “winning” is a byproduct of the process)

    Meanwhile dominating an outclassed opponent is about as satisfying as taking out the garbage. Something was accomplished, but there isn’t a great deal of “joy” involved.

    Respecting and liking an opponent just makes beating them more fun. If the opponent is inept or “out of their league,” then beating them isn’t particularly satisfying …

    Fair?

    I’ve thrown that term “fair” out there several times – what does it mean?

    Well, “fair competition” is between “peers”/equals. This is obviously why there are “weight classes” and “age divisions” in sports like boxing and wrestling.

    Again, the point of “competition” is to push each other to higher levels NOT just “winning.”

    An athlete that intentionally goes in search of “less skilled” opponents for easy victories will never be forced to “push themselves.”

    One more time – no human being’s “purpose” is “beating up on lower skilled opponents.” The “athlete” that INTENTIONALLY seeks out a lower level of competition has once again missed the point or lost their way.

    Lessons learned from competition

    I am always quick to point out that the most valuable thing I learned from “amateur sports” was that “success” is a process.

    Setting a goal, coming up with a plan to achieve that goal, and then following through on the plan are “transferable” life skills.

    Of course OTHER folks doing the same thing will mean that sometimes you get knocked on your duff – however you get the chance to get back up or you can stay “knocked down.”

    “I don’t pity any man who does hard work worth doing. I admire him. I pity the creature who does not work, at whichever end of the social scale he may regard himself as being.” 

    Theodore Roosevelt

    Healthy competition in TEAM sports provides obvious life lessons – with positive socialization, and working together towards a common goal immediately coming to mind.

    BUT remember UNHEALTHY competition involves trying to “win at any cost” and disrespecting the opposition.

    “Winning by cheating” is by definition self-destructive. Unethical competition might work in the “short term” but “being a jerk” will catch up with the cheater eventually …

    I understand there are “well intended” folks that push various flavors of “non competitive” sports. If the goal of the “event” is “socialization” and/or “exercise” then running around on a field for 40 minutes might be useful.

    There is no reason to keep score at such events OR give EVERYONE a trophy at the end of the year. Non-competition means “no winners” NOT “everyone is a winner.”

    I’m not a big fan of “organized youth sports” (whatever age that may be). Organization will always imply competition of some kind. If the lesson learned is “I win by doing nothing but showing up” then “they” are creating self-esteem sinkholes not healthy individuals.

    Of course “youth sports” can be a good or a bad experience for the “youths” – BUT the “youths” should be the focus.

    random thought: From an “athletic standpoint” – the “future professional athlete” is probably exceptional at every level they participate. However that doesn’t mean that they are exceptional BECAUSE they started playing “sport ball” before they could walk …

    ANYWAY

    Sports was/is the original “reality” television – amateur sports have a larger purpose only to the point that they teach a work ethic and social skills. Participating (or NOT participating) in “sport” will never impact the “value” of an individual as a human being.

    The opportunity to compete against peers is “positive” on a grand scale – while claiming that “unfair competition” must be allowed so that a “fraction of society” can feel “good” about themselves is counter-productive on a grand scale …

  • Mr. Shakespeare, marketing, and the “Western”

    A lifetime ago I worked as a “student employee” as an undergrad. I was helping out the “system administration” folks – and ended up doing low level “desktop support” for faculty members.

    random thought: I remember running the big ol’ suitcase-size “VHS video” camera when they gave a presentation about this new “internet” thing that the college was joining. That was “pre – world wide web” and you needed to use “command line” utilities to move around.

    Thinking back to that presentation – the presenter was talking about using FTP and email (again, there was an “Internet” before there was the “world wide web”). One of the sites they talked about was in London (England) and you could download the complete works of Shakespeare!

    Needless to say, I was impressed – but at that time the “general public” didn’t have access to the Internet. Only military bases and academic institutions were granted access – but the network was growing.

    As I remember the debate – the folks running “academic institutions” seemed to think that if the Internet was opened up to the “general public” it would be overrun by advertisers/porn/spam – and of course they were correct. BUT what really caused the Internet to explode was making it “easy to use” for non-computer experts – i.e. the “world wide web.”

    Hamlet and John Wayne

    ANYWAY – one of the “faculty members” whose office computer I visited way-back-when was in the “theater” department. He had pictures of Hamlet AND John Wayne on his wall.

    I had read Hamlet (for the first time) when I was in the Army, and grew up a John Wayne fan – so I asked him about the pictures. Obviously the Prof knew much more about both than I did at the time – as I remember it he said something like “Shakespeare is a lot more ‘rough and tumble’ than you might think” – and also that John Wayne was more complex.

    Fast forward a lifetime of study — and Mr Shakespeare and John Wayne were both working within “frameworks” catering to an audience. Mr Shakespeare wanted folks to buy tickets to performances of his plays, and Mr Wayne wanted folks to buy tickets to watch his movies.

    BOTH were working in “genres.” John Wayne is most remembered for his work in “westerns” but he made a lot of “war” movies and a handful of “detective” movies – e.g. 184 credits listed on IMDB.

    random thought: the joke was that John Wayne played the same character in every movie – i.e. “John Wayne” – which is a little unfair, but “funny because of the truth involved.” Mr Wayne’s Academy Award winning performance was playing a very NOT “John Wayne” role – Rooster Cogburn in “True Grit” (1969)

    random thought part 2: at the moment I can only think of 2 “fictional John Wayne character names” – Ethan Edwards in “The Searchers” (1956) and Rooster Cogburn – illustrating that “John Wayne” was what audiences paid to see … of course he also played Davy Crockett in “The Alamo” (1960) and Genghis Khan in “The Conqueror” (1956) — yes, that was John Wayne as the Great Khan – mid-western drawl and all (not one of his better movies)

    The “genres” Mr Shakespeare was dealing with were PRIMARILY designed to attract an audience. e.g. early on the audience would have gone to a “comedy”/”tragedy” or a “history” play not specifically a play by “William Shakespeare”

    The super short “intro to Shakespeare” class would point out that what distinguished “comedies” and “tragedies” was the ending of the play – a comedy would end at the altar (folks getting married) and the tragedy would end at the crypt (folks dead).

    The “histories” were similar to what we expect from modern “biopics” – they covered “themes” but weren’t always exactly “true.” More “based on a true event” than “actually true.” Again, Mr. Shakespeare was writing for an AUDIENCE – not pushing any agenda (except maybe “sell tickets”).

    Go beyond the “intro” level and Mr Shakespeare’s comedies changed over the course of his career. The “early comedies” might have a “fantasy” aspect (e.g. “A Midsummer Night’s Dream” – the “lovers” go into the forest, things get weird, but are sorted out for a happy resolution in the morning). The “late romances” would have “fantasy” aspects core to the story (e.g. “The Tempest” – Prospero is literally a “wizard” with a “spirit servant” – but things also happily sort themselves out by the end).

    The “entertainment industry” of the Elizabethan era being what it was – Mr Shakespeare wouldn’t have been able to remain a going concern without “patrons” backing his work. i.e. there was no “long tail” market – no “sub rights” to sell.

    I’ve never seen an in depth analysis or a “profit and loss” statement from Shakespeare’s time — I don’t think the “patrons” expected to get a return on their investment OTHER than good seats at play performances. The fact that Mr Shakespeare “retired” at 47 implies the plays were commercially successful (and he died at 52).

    random thought: the cause of death for Mr. Shakespeare is still a mystery. There are theories that he died after a drunken binge, that he had syphilis, or that he might have been murdered! BUT it was 1616, who knows …

    the “Western” …

    ANYWAY – someone (recently) came up with a “greatest western movies” of all time type list. All such lists tend to be a little “arbitrary” – but also tend to be “interesting.” The list itself wasn’t what caught my attention – i.e. just what makes a “western” a “western?”

    When Mr Shakespeare died, “working in the entertainment industry” wasn’t a highly esteemed profession – his funeral was that of a “wealthy local retiree” not a “celebrity.” Literary immortality for Mr Shakespeare happened AFTER his death when his friends and admirers collected his works for publication.

    Remember that “movable type printing” was perfected 150 years or so earlier – so it was an established technology but more importantly there was a growing market for “printed books.”

    What does that have to do with “westerns?” Well, multiple zeitgeists probably collided in the last half of the 19th Century – the industrial revolution increased city populations, gave folks more “free time”, and increased disposable/discretionary income (as opposed to agricultural work).

    Combine that with “public education” – and you have what the corporate types would call a “growing market segment” – i.e. folks with money in their pocket looking for something to buy.

    Random thought: ANOTHER “old prof” back in the day liked to point out that the “printing press” had a lot of unintended consequences. Their theory was that people stopped “sitting around the fire” telling stories because they had “books” that they could go off and read by themselves – I think the point was that “humans are natural storytellers” or something BUT “fear of public speaking” is always high on the list of “common phobias.”

    random thought part 2: I don’t think people fear “public speaking” what they fear is “being embarrassed in public” – e.g. a certain amount of “stage fright” is kinda required, if the speaker isn’t a LITTLE worried then they will be exceptionally boring – as everyone that has had to listen to “boring speaker” drone on, and on, and on understands … BUT “boring” might come from arrogance OR lack of preparation – neither of which is predestined

    SO “lower cost printing” meets “public demand” and the “pulp magazines” were born. The “pulp” part was a reference to the low quality paper used in the printing process – and the content tended to be of similar quality.

    Now, “sex” and “violence” are part of human history — just having “sex and violence” in a book doesn’t make it “low quality”, it obviously depends on how the “sex and violence” is presented.

    If you have some form of “action/consequence” then you MIGHT have a work of “high literary quality” BUT if the work is just “descriptions of explicit sex” polite society might call that “pornography.”

    Same idea with “violence” – and I will wave at the trend of “violence porn” without comment, beyond noting that it might have some sex/nudity, but is just “pointless violence.”

    I seem to remember hearing that Sam Peckinpah got criticized for showing “blood” in “The Wild Bunch” back in 1969 (which really just looks like ketchup on shirts) – umm, slippery slope and all that …

    MEANWHILE …

    “Pulp” magazines needed content and humans have always loved reading/hearing about “exotic locations” so the “American West” after the Civil War was the source of a LOT of “colorful pseudo historical” characters.

    William “Buffalo Bill” Cody and his “Wild West Show” helped create the specific “idea” of the “western” as a distinct genre. But Buffalo Bill serves as an example of the trend – not the source.

    The world’s first “modern celebrity” was Samuel Clemens (Mark Twain) – the quintessential storyteller, both in print and on stage. Mr Clemens was more famous as a “travel writer” during his lifetime than for “Huckleberry Finn” – “Roughing It” (published in 1872) was his semi-autobiographical contribution to “books about the west.”

    Again, Mark Twain is an example not the source. The IDEA of a “frontier” separating “polite society” from the “unknown” is (probably) as old as human beings.

    Even the “idea” of “the west” as being “unknown”/terra incognita goes back to “ancient times.” My pet theory is that this “west” as “frontier” involves the rising of the sun (in the “east”) and the setting of the sun (in the “west”) – but I’m just guessing …

    The specific “western frontier” for the United States is obviously based on the fact that the original “13 Colonies” were on the eastern coast of the continent.

    Expansion “west” was initially a slow process for “American History class” reasons. This is where we start bumping up against the problems defining the “western” genre.

    Stories set in “Colonial Times”, “Pioneer Times” (the initial slow move west), and the Civil War period, PROBABLY don’t fit into a narrow definition of “the western.”

    e.g. at one point Ohio was the “western frontier” – and having grown up and living in Ohio I can say we have a lot of “history” – the story of “Blue Jacket” and the Shawnee people is historically interesting – I’m just not calling it a “western” …

    Pop Culture

    The U.S. Bureau of the Census declared the “frontier” closed in 1890 (as in “no longer a discernible demarcation between frontier and settlement”).

    Not surprisingly, the “western” in pop culture became popular AFTER the frontier closed. Again, folks looking for “entertainment” tend to look to the “unknown”/unusual – i.e. if you were living on the “frontier” you probably didn’t have much interest in reading first hand accounts of “frontier life” – even if they were available.

    The “American Wild West” period is usually dated from “after the Civil War” (1865ish) to the turn of the century.

    Zane Grey published his first novel in 1903. Mr. Grey’s name is synonymous with “western” – but again, SOME of his stories could be more accurately called “frontier”/pioneer stories.

    “Max Brand” however was a pen name for Frederick Schiller Faust. Mr Faust wrote 300+ novels under various pen names – “Max Brand” was pure “western” genre written in a “pulp” fashion.

    Then Louis L’Amour (200 million books sold) started writing when the “western” was a fully formed pop culture concept. Mr L’Amour preferred saying he wrote “western stories” not “westerns” — which brings us back to the initial problem …

    Radio, Movies, and TV …

    All of this talk about “literary genres” is nice – but it is all a precursor to the TRULY mass media of modern times.

    The western quickly found its way to the silver screen. The “B” western being a great example of “pulp western” plots with visuals.

    Radio brought the western into folks’ homes – “Return with us now to those thrilling days of yesteryear …” – e.g. both the Lone Ranger and Gunsmoke started out as “radio shows”

    When sound and pictures came into folks’ homes – so did the western. With the 1950s being the “golden age” of TV westerns — which is another subject …

    Two World Wars and millions of Americans going overseas would change American society, and the “western” changed with it.

    The movies labelled “spaghetti westerns” (in the late 1960’s and 1970s) were truly “multinational” projects – the “man with no name” trilogy being a good example – filmed in Spain, Italian director, American actors. The legend is that the multinational cast members would say their lines in their native language, and then be dubbed over as needed – which gives the films a VERY distinctive look …

random thought: The fact that several of Akira Kurosawa’s samurai movies were made into “westerns” illustrates both “underlying themes” AND the versatility of the “western” as a genre – both “The Magnificent Seven” and “A Fistful of Dollars” are based on Kurosawa movies (though Sergio Leone denied the connection).

    Did the western die?

There was almost a decade gap between “The Outlaw Josey Wales” (1976) and “Pale Rider”/”Silverado”/”Rustlers’ Rhapsody” (all 1985).

    Did the “western” die? Well, if you define “western” as a story with “cowboy hats and horses in a specific time period” then the answer is “maybe.”

From a “movie business” point of view – when a large % of TV shows were westerns and multiple “westerns” were released each year, the “cost of production” for a “western” wasn’t particularly high compared to a “non western.”

i.e. a lot of sets could be reused and “talent” was available – so a “movie company” could “send the crew” out to the “back lot” and make a movie on time and under budget.

    BUT if everything has to be built from scratch and talent selected/hired – well, things get expensive/”unprofitable” fast.

    SO it would be more accurate to say that the “western” fell out of fashion much more than “died.”

Some other movie franchises were also wildly popular at the time (“Star Wars” 1977, “Empire Strikes Back” 1980, and “Return of the Jedi” 1983). “Raiders of the Lost Ark” (1981) has a LOT of “western” elements but isn’t a “western.”

The 1980’s “action movie” isn’t TOO far removed from “pulp western” plots. Clint Eastwood’s career is intertwined with the “western” — I like to point out that “Dirty” Harry Callahan is basically the “man with no name” as a “Police Detective” stuck inside a bureaucracy …

    the stories we tell …

    All of which means the “western” as a genre is a little hard to define – AND that it isn’t going away anytime soon because it is part of the “American myth” and “foundation legend”

    I should point out the difference between “myth” (completely fabricated) and “legend” (there is a “historic source” but stuff has been added over the years).

    e.g. the story of King Arthur and the Knights of the Round Table is the stuff of “legend” – i.e. there PROBABLY was a historic source for “Arthur” but the story as it is told today says more about the people telling the story than it does about that historic figure.

    e.g. there is apparently no historic basis for “Robin Hood and his Merry Men” – but it does help explain how the U.K. became the U.K. so we could call it a “modern myth”

    The “western” is both “myth” AND “legend” —

    The “myth” might sound like “plucky pioneers endured hardship, overcame nature, with the intent of building a nation” — which isn’t totally “false” but if you had interviewed the folks “going west” they were PROBABLY doing it MOSTLY out of their own self-interest not pursuing some grand ideal of a new nation.

    The number of “western legends” is legion – Davy Crockett swinging his rifle (“Betsy”) on the parapets of the Alamo immediately comes to mind.

    ANY “quick draw gun fight” story is pure “legend” (e.g. Wyatt Earp’s advice for a gun fight was: “take your time and hit what you are aiming at” – which is much easier said than done …).

    Billy the Kid as “frontier Robin Hood” had as much truth in it as “Robin Hood.” Henry McCarty was a real person – but more thug than folk hero. fwiw: he pops up in the (I enjoyed it) movie “Old Henry” (2021) –

while I’m at it, Wyatt Earp was an interesting individual – but nothing like the classic TV series “The Life and Legend of Wyatt Earp” — again, THAT story says much more about 1950’s America than the real-life Wyatt Earp …

    I could go on, but won’t 😉

    find me on linkedin

  • genre twists and franchise changes

    Re-watched the original “Mad Max” (1979) – available on various “streaming services.”

    Now, the ORIGINAL “Mad Max” was/is a “low budget” Australian movie. It didn’t get “distributed” in the U.S. “back in the day” – which was why “Mad Max 2” (1981) was released as “The Road Warrior” (1982) in the U.S.

    The “low budget” nature distracted me when I watched “Mad Max” on home video (probably in the late 1980s). I’m guessing that the version I saw had been “edited” somewhere along the way – because (if memory serves) it was shorter than 90 minutes.

There is a section of the movie where they establish the “bad guys” as VERY bad — which I fast-forwarded through this time around (once it was obvious what was going on and that it was going to last a while) – it wasn’t “explicit” so much as “unpleasant.”

    The “low budget” nature of the movie precluded the sort of “makeup” effects common in movies. I was reminded of Oedipus Rex (the Ancient Greek play) – there was plenty of “implied off stage” violence – but they didn’t/couldn’t show it ON stage.

    The often replayed scene from “Mad Max” is the finale – where Max comes across the last “bad guy” (who has obviously just murdered someone and is trying to steal the dead man’s boots). No spoiler – the “bad” guy (who Max had arrested earlier in the movie and then the “courts” released) pleads for his life saying that he is “sick” and that the “court says I’m not responsible for my actions.”

    Yeah, Max gives the guy a choice – and then drives away. Remember “Mad Max” is set in a “dystopian future” but it reflects a “society without the rule of law.” “Max” crosses the “line” but only after he has been driven to it by the (VERY) bad guys.

    good guys vs bad guys

    “Mad Max” unintentionally hit a lot of the “mythic storytelling” points – and then they INTENTIONALLY hit more of those “mythic hero story” elements in “The Road Warrior.”

    In true “vengeance genre” fashion Max is the “good man” pushed “too far” who then takes matters into his own hands.

Charles Bronson made a LOT of movies (161 credits on IMDB) – some of those movies are very good – “The Magnificent Seven”, “The Great Escape”, “The Dirty Dozen”, and “Once Upon a Time in the West.” If Mr Bronson had stopped making movies after the 1960s (all of those mentioned were made in that decade) he would still deserve a place in the “Action movie Hall of Fame”

    (random thought: if there isn’t an “Action movie Hall of Fame” there needs to be …)

    BUT then the 1970s happened – the same decade that would give us “The Godfather”, “Jaws”, and “Star Wars” gave us “Death Wish” (1974).

    I have to admit that I have NOT seen the original “Death Wish.” I saw one of the sequels when it was on cable – but by that time the 1980’s action movie and “horror” films had made the “one man on a vengeance mission” even MORE cliche.

    Vengeance is Mine, and recompense;
    Their foot shall slip in due time;
    For the day of their calamity is at hand,
    And the things to come hasten upon them.

    Deuteronomy 32:35

    BUT again, Mr Bronson played the “good guy pushed too far.”

    fwiw: the Judeo Christian “turn the other cheek” ethic doesn’t mean the “bad guys” get away with anything – e.g. the pull quote … ’nuff said

    random thought: A character in the “Dirty Dozen” THINKS he is the “hand of God” carrying out punishment – but the character is nuts

ANYWAY The fact that there were 5 “Death Wish” movies says more about the business of low-quality exploitation movies than anything else (people kept buying tickets, the movies kept making a profit, they kept making more sequels) – but “human vengeance is never finished” might be the message (if there is a message …)

    Dwayne Johnson (“The Rock”) made a “vengeance genre” flick called “Faster” (2010) which drives home the unending nature of “vengeance” — so the movie becomes a good example of “twisting” a genre a little. All of the “vengeance” elements are there AND they added some “philosophical meat” – Google tells me the movie made a small profit, but wasn’t one of Mr Johnson’s bigger “box office” hits

The MBA in me wants to point out that Faster made an $11 million profit on a $24 million budget so the return on investment (ROI) as a % might have been higher than some of those close-to-$billion box office movies.
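fwiw: the back-of-the-envelope math is simple enough – a minimal sketch in Python (the “Faster” figures are from above, the blockbuster line is purely hypothetical, NOT real studio accounting):

# rough ROI comparison, all figures in $ millions
def roi_percent(profit, budget):
    """Return on investment as a percent of the production budget."""
    return 100 * profit / budget

print(roi_percent(11, 24))    # "Faster": ~45.8%
print(roi_percent(100, 250))  # hypothetical blockbuster: 40.0%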

    random thought: that “low budget” but high ROI % was where “Hollywood schlock” legend Roger Corman made a living – Google tells me he had an estimated net worth of $200 million when he died in May 2024 …

    the repentant gunfighter

    IF the “good guys” act just like the “bad guys” what is the difference between the two?

    Well, that is a good question. No, I’m not going to try to summarize all of human existence/experience.

    From a MOVIE morality point of view the difference is “intent” and “motivation.”

    e.g. Max does what he does BECAUSE of what the “bad guys” did. The bad guys did what THEY did because, well, they are “bad.”

The “psych 101” concept of a “sociopath” involves not feeling remorse. Ever. If a “sociopath” gets caught doing a “bad thing” then they might feel bad about being “caught” but not for what they did.

    This idea is the “psychology” behind the “repentant gunfighter” genre. “Shane” (1953) is a classic example (of course the book is “better” but the movie is good in its own right).

    e.g. it is implied that “Shane” had done a lot of “bad things” until he decided he wouldn’t. Shane “turned away” from being a gun for hire … and “plot happens” … and Shane has to face another “gun for hire” in the climax.

The implied difference between “Shane” and the “bad gunfighter” (played by Jack Palance) is that the “bad guy” enjoys killing, and Shane is a “soldier” doing a required task (and he is just very good at the task).

    The legend of John Henry “Doc” Holliday comes to mind. Ol’ Doc was a dentist until he came down with tuberculosis. Since no one wants to go to a dentist with tuberculosis, Doc became a professional gambler and (sometimes) gunfighter.

His expectation being that one day he would get into a gunfight with someone faster or more accurate than him and the tuberculosis would no longer be a problem. His final words (as he was dying of tuberculosis in a hospital bed) were “This is funny.” c’est la vie

    The important part of the above is that the sociopath (by definition) cannot be “rehabilitated” because they never feel remorse – they can never “repent” because (in their head) they have no reason to “repent.”

There are a lot of “click bait” sociopath tests that might be amusing – but if you want to know if someone is a “sociopath” all you need to do is ask them. They will (probably) gladly tell you that EVERYONE thinks/acts the way they do and if someone doesn’t, well, they are fools.

BUT be careful, “sociopaths” (by definition) are also master manipulators – but it is hard to “hide” sociopathic behavior. Paying more attention to what folks “do” than to what they “say” is always good advice, but it is especially true of “sociopaths”

    … and the “good guy” always understands that (but doesn’t enjoy it)

    “You can’t serve a writ to a rat”

    – Rooster Cogburn

    Oh, and I’ll kind of wave in the direction of “The Outfit” (2022) as another example of the “repentant gunfighter” genre with a “twist” …

    franchises

    The entire concept of a “franchise business” is that customers know what to expect. The “franchise” provides information on “processes” as well as “resources” and (probably) marketing on a large scale.

    e.g. if you go into ANY establishment calling itself a “coffee shop” you expect certain things – obviously a variety of “coffee” and probably some sort of pastry/sandwich selection.

    BUT if you go into a “Starbucks” franchise the expectations will be for specific drinks and food prepared in a uniform manner. The idea being that visiting a “Starbucks” franchise in Los Angeles should be a similar experience to visiting a “Starbucks” franchise in Roanoke (or pick any other location).

    The Starbucks folks might say they are selling an “experience” BUT the true value of being a franchise is probably in the “name recognition.”

    If you try to open a coffee shop that looks just like “Starbucks” but isn’t – if/when they find out about it – the legal department at Starbucks, Inc will send you a nice letter telling you that you are violating various laws and you should cease and desist

The “franchise” problem becomes that just “looking like a Starbucks” does not guarantee the coffee/food will meet expectations. There are around 16,000 Starbucks in the U.S. and (around) 9,000 of those are run by “corporate.” Those 7,000 other locations are “independently owned and operated” – i.e. THEY might do things slightly differently than “corporate” BUT the “core experience” should fall into a certain range of expectations

    SO the same idea holds true for “entertainment franchises.” The problem for “entertainment franchise” is that folks adding to the “franchise” need to understand the “core product.”

    Imagine a group of talented musicians who decide to go on tour with a “Sound of the 1960’s” tour (or pick any decade you like) – folks buying tickets are going to expect what? well, probably music from the 1960s

    Now imagine a group like “1964 The Tribute” – folks buying tickets are going to expect what? Probably music specifically from The Beatles.

    Folks going to a “Tarzan” movie are gonna expect certain “Tarzan” elements – folks going to a “Sherlock Holmes” movie are gonna expect different elements than the Tarzan folks.

    I was trying to think of a “long running” franchise that has stayed true to its “core” and the BEST example I could think of was Scooby-Doo.

    no, seriously – the “core element” of Scooby-Doo has always been a “boy and his dog” — i.e. Shaggy and Scooby are “core elements”, everything else can be added/removed but you always need those two characters — if you try to twist the franchise into “angry girl power show” then, well, you get the “Velma” series – which is only tangentially associated with “Scooby-Doo” as a franchise

    bad product

    I don’t think fans blame “franchise” for “bad product” – again, this is kind of the “franchise” concept we have come to expect.

Fans understand that MANY establishments are independently owned/operated. BUT that doesn’t really matter – if a “location” consistently underperforms, then it will lose customers to other locations.

The job of weeding out the “underperformers” that hurt the franchise brand name belongs to “corporate.”

    If “corporate” isn’t up to the task – well, franchises come and go on a regular basis …

fwiw: yes, “Star Wars” as a franchise abandoned its core audience a few years back. They are selling “feces in a nice box” and seem to think they are defecating gold nuggets. News of developing “Star Wars” projects falls into the same category as those prescription drug commercials where I have no idea what the drug treats (but the guys cuddling and engaging in p.d.a. imply I’m not the target market)

    The “history” lesson is (probably) that “franchises” come and go. Long running franchises are exceptionally rare because “time and fate” happen to us all.

    Now, if “Red Lobster” (first franchise opened in 1968 in Lakeland, Florida) were to disappear I would take notice – but wouldn’t be terribly sad about the franchise demise.

    “Burger Chef” used to be a national chain, then closed their last location in 1996. I’m told a “Burger Chef” like location existed for another couple years due to a long franchise agreement – i.e. it looked like a “Burger Chef”, had a similar sign as “Burger Chef” but called itself something NOT “Burger Chef.”

Southwestern Ohio used to be the “world headquarters” for “Ponderosa Steakhouse, Inc” so we had access to a LOT of locations “back in the day.” “Ponderosa” was always fun – the food quality/quantity to price ratio was always high – but the possibility of “screaming baby” also tended to be high. In 2024 Google tells me there is a Ponderosa around Columbus somewhere (a little too far for me to drive – but next time I’m in Columbus …).

The point (if I had one) being that “franchise death” tends to be a long slow process. The start of the slide is probably barely perceptible – but once it starts it is hard to stop (you know, “slippery slope” and all that) and it accelerates quickly

    The good news for “entertainment franchises” is that “rebooting” the franchise is just a single good project away — e.g. no one will remember “Velma” in a few years, and Scooby-Doo and Shaggy will continue onto new projects.

    The “core elements” of “Star Wars” were NEVER exclusive to “Star Wars” – so Disney, Inc can be “Disney, Inc” all it wants. Fans looking for “steak and potatoes” will just go somewhere else …

  • thoughts on genre

“Genre” found its way into the English language in the late 18th century from Middle French. The French got it from Latin – the “gen-” part tends to refer to “grouping”/category – e.g. “genus” in biology is closely related.

random thought: gender is also “closely related” but “genesis” was derived from the Greek “gignesthai” with a “to be born” meaning – implying beginnings/origins –

Classification systems tend to tell us more about the folks doing the “classifying” than about the things being classified.

    A couple of ancient Greek guys liked to argue about the nature of “things” – and without fun stuff like “DNA testing” it can be hard to determine how closely different critters are related.

“To be is to do” — Socrates
“To do is to be” — Aristotle
“Do be do be do” — Frank Sinatra

    (famous graffiti)

    The pull-quote is PROBABLY a famous “misquote” — Socrates asked a lot of questions and his student Plato started a school where Aristotle did a lot of “observing” and classifying.

    If you go back a couple thousand years an expedient way to classify critters would be by what they eat and observable physical traits: e.g. does it have hoofs? are they split? does it eat grass? does it chew the cud?

SO ol’ Mr Aristotle probably didn’t say “to do is to be” but he said something like “tell me what it does and I’ll tell you what it is” – which is obviously different than “tell me what it is and I’ll tell you what it does” — oh, and Mr Sinatra was singing about “Strangers in the Night”

    She blinded me with …

    meanwhile the fine folks at Merriam-Webster tell me that the Latin scientia (“knowledge, awareness, understanding, branch of knowledge, learning,”) is the root of the English word “science” – which first appeared in the 14th Century.

    “Science” in modern usage tends to imply a systematized body of knowledge gathered using the “scientific method.” The word “scientist” didn’t pop up until 1834 — a new word was needed for classification. e.g. Ben Franklin would have been called a “natural philosopher.”

    Of course the “natural philosopher” was by definition “God” centered. For what it is worth – it is possible to have “religion” without “science” but that doesn’t mean that “science” and “religion” are at odds with each other.

    Is “science” a “religion?” Umm, yes – but you will probably upset your biology professor if you bring up the subject – and we are moving on …

    Science Fiction

Ok, my mind went down this rabbit hole when someone tried to suggest that Lucian’s “A True Story” was the first science fiction (“SF”) story.

    Now, I should say that I don’t feel strongly enough about the question to get into a fight about it – but you kinda need “science” before you can have “science fiction”

    The problem is one of “classification” — i.e. is the work “fiction?” yes. does it involve “science?” no.

    Of course that would also mean that some VERY popular “space based” franchises are not “science fiction” either.

e.g. “Star Wars” is more accurately labelled “space fantasy” than “science fiction.” George Lucas made a movie titled “THX 1138” in 1971 that is closer to “science fiction” but if I’m being REALLY pedantic it is a “futuristic dystopia”

    Yeah, the term “science fiction” lost any real meaning a long time ago – but some famous stories NOT thought of as science fiction could fit my definition (again “science” has to be part of the story) – e.g. Mary Shelley’s “Frankenstein” has “electricity” at its core – if you take away the “electricity”/science portion the story doesn’t happen

    While Star Wars (Episode IV) is about a young farm boy going on an adventure to save a princess and becoming a hero along the way — you could take out the “hyperspace travel” and “space dog-fighting” and you have a somewhat traditional adventure story.

    Again, if I’m being pedantic – “space based” combat wouldn’t look anything like what they do in the “Star Wars” franchise – i.e. you kinda need an atmosphere to do the quick turning acrobatic moves. The Death Star showing up in orbit around an “earth like” planet would cause disaster on the surface – just from being in orbit.

    I’m a fan of “Star Wars” and if you passionately want it to be “science fiction” that is just fine with me. I think it is a great movie – just not “technically” science fiction.

    Sub-Genres

    But “science fiction” can cover a wide range of subjects. Stanley Kubrick’s “2001: A Space Odyssey” (1968) fits my definition – so do “The Time Machine” (1960) and “The Matrix” (1999) –

“Planet of the Apes” (1968) checks off the “science fiction” boxes and so did the “reboot” of the franchise – and notice that the 21st Century “reboot” didn’t have “space ships” or “time travel.”

The point being that science fiction has a lot of “sub-genres.” Just for fun we could classify those sub-genres on a scale of “hardness” e.g. maybe “Star Trek” is “medium hardness” and “Doctor Who” is “softer” and “The Three-Body Problem” is “harder”

    There tends to be a healthy dose of “speculation” involved in science FICTION – so spending too much time explaining the “speculative science” is a good way to convince me to go somewhere else 😉

    Science Fiction without the “science” …

SO what are we left with if you take the “science” out of “science fiction” — well, yeah obviously the “fiction” remains but “a story taking place a long time ago in a place far, far away” is a recipe for “fantasy.”

I’m probably poking another VERY LARGE mammal if I point out that the “X-Men” franchise is “fantasy” trying very hard to pretend to be “science.” Seriously, “they were just born with super human powers” is a great way to avoid having to come up with “origin” stories for a wide range of fantastic characters – but it isn’t “science fiction.”

    Of course the “superhero” sub-genre could fit under either fantasy or SF – “The Incredible Hulk” and “The Fantastic Four” are “SF-ish” – but CLASSIC “Superman” not so much (e.g. he is from “outer space” and magically gets his powers from the sun and can fly because … I’m not really sure …).

Once again, I enjoy “X-Men” and “Superman” – I just don’t consider them “science fiction.”

  • Gifs, dial-up, and Libraries

    I went down the rabbit hole this morning on how to pronounce “gif”

    We always “recognize” more words than we actively use – and if you “learn” a word by reading, then the “correct” pronunciation might seem odd

    English/”American” is particularly bad – because we readily absorb words from other languages. e.g. is the “e” at the end of “cache” silent? (yes, yes it is – even in the original French I’m told the “e” is silent most of the time – but it is French so I have no idea 😉 )

    GIF

    SO there is a techie dispute of how to PROPERLY pronounce the acronym for “graphics interchange format” – is it “hard g” Gif or is it like the peanut butter “jif” – I never had to say “gif” out loud and “back in the era of dial up services” folks didn’t talk about “file extensions” on a regular basis — I’m guessing MY experience isn’t unusual, e.g. the dispute popped up this morning …

    fwiw: the OED suggests “Gif” while Merriam-Webster (in true American style) offers both pronunciations as acceptable (gif) — so if you feel strongly about it one way or the other, you are correct 😉

    I tended to just say the letters g-i-f or maybe “dot g-i-f” if I needed to distinguish the file extension.

fwiw: back in the ol’ “Disk Operating System” (D.O.S.) days we were limited to file names with a maximum of 8 characters, a period, and then a 3 letter extension e.g. “something.txt”

    D.O.S. used the file extension to distinguish between “executable files” and “data files” – if the file was “something.bat” then D.O.S. would try to execute/run the file while “something.txt” would be seen as “text data.”

    “Modern” operating systems still tend to look at the file extension as a clue for the file’s purpose. The file extension can be connected with an application – e.g. a “something.xcf” file was probably created in GIMP, if you double click on the file your OS will probably try to open the file with GIMP …
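fwiw: the whole “name dot extension” idea is simple enough to sketch – a toy version in Python (the pattern is simplified and the handler table is made up – this is the concept, NOT actual D.O.S. or GIMP behavior):

import re

# simplified 8.3 rule: up to 8 name characters, a dot, up to 3 extension
# characters (real D.O.S. allowed a few more symbols than this pattern)
EIGHT_DOT_THREE = re.compile(r"^\w{1,8}\.\w{1,3}$")

# hypothetical extension-to-purpose table
HANDLERS = {
    "bat": "execute as a batch file",
    "exe": "execute as a program",
    "txt": "treat as text data",
    "gif": "open with an image viewer",
}

def classify(filename):
    """Use the file extension as a hint for what to do with the file."""
    if not EIGHT_DOT_THREE.match(filename):
        return "invalid under the old 8.3 rule"
    extension = filename.rsplit(".", 1)[1].lower()
    return HANDLERS.get(extension, "unknown data file")

print(classify("SOMETHNG.TXT"))       # treat as text data
print(classify("PICTURE.GIF"))        # open with an image viewer
print(classify("averylongname.txt"))  # invalid under the old 8.3 rule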

    yeah, we had to use cryptic file names because of those limitations back in the day, but we LIKED it that way! and stay off my lawn you crazy kids!

If memory serves – the 8 character “before the dot” limit actually stuck around for the life of D.O.S., and long file names didn’t arrive until Windows 95. “Modern” operating systems allow for longer file names, but you can still be as cryptic as you like …

Dial-Up

    Before the “internet” became widely available there were various “information services” available over dial-up connections. CompuServe immediately comes to mind (they get credit for “creating” the .gif format). There were multiple large “national services” as well as “bulletin board services” (BBS) “before the interweb”.

    “Dial-up” used a “modem” with speeds measured in “bits per second” – with 56k being a “fast” dial-up modem. Which translates to “slow” and “point to point.” Any large file downloads tended to be “hit or miss” because the connection being broken would (probably) mean you needed to start the download from the beginning.

    This “slow and risky” file download aspect of dial-up was why a lot of Linux distributions sold CD’s/physical media early on – i.e. it might have taken DAYS to download an entire distribution over dial-up … good times 😉
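fwiw: the arithmetic backs that up – a quick sanity check in Python (assuming a 650 MB CD image and an ideal, never-dropped connection, which dial-up rarely was):

# how long does a 650 MB CD image take over dial-up?
cd_image_bits = 650 * 1024 * 1024 * 8  # ~5.45 billion bits

for kbps in (56, 33.6, 28.8):          # common modem speeds
    seconds = cd_image_bits / (kbps * 1000)
    print(f"{kbps} kbit/s -> {seconds / 3600:.1f} hours")

# 56 kbit/s   -> ~27 hours
# 33.6 kbit/s -> ~45 hours
# 28.8 kbit/s -> ~53 hours

… and one dropped connection at hour 26 meant starting over – hence “DAYS.”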

    “Modem” is short for “modulator”/”demodulator” – e.g. the sending computer starts with a digital signal that gets “modulated” to an analog wave that could be sent over the “plain old telephone system” (POTS) by the “modem.” The receiving computer’s modem then “demodulated” the analog signal to a digital signal.

While I’m at it – if you go searching for ancient computer gear you might also come across “baud rates” – which measure the number of “state changes” per second in the signal. The “baud rate” is often lower than the “bit rate” because one state change can carry more than one bit (and data compression pushed the effective throughput even higher).
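a minimal sketch of that relationship (the 2400 baud / 6 bits-per-symbol pairing is roughly the V.32bis era, but treat the numbers as illustrative):

# bits per second = state changes per second * bits carried per state change
def bit_rate(baud, bits_per_symbol):
    return baud * bits_per_symbol

print(bit_rate(2400, 1))  # 2400 bit/s  - one bit per state change
print(bit_rate(2400, 6))  # 14400 bit/s - six bits per state change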

    Ummm, of course none of that is REALLY important in the 21st Century. BUT I like to point out that in the “big picture” telegraph technology (dots and dashes sent as electrical signals over a wire) was the same way “dial-up” worked – and “modern networking” is still sending 1’s and 0’s. Yes, “modern networking” is much faster and reliable, but still just 1’s and 0’s …

    The term “modem” has stuck around as a generic form of “computer communication device” – technically you PROBABLY have a “router” connecting you to the internet – but if you call it a “modem” no one will notice …

    Those “dial-up services” back in the day used to charge per minute – so access was obviously restricted/limited. In the late 1980’s part of a librarian’s job description might have included doing “research” using various dial-up services — e.g. those “card catalog” systems were functionally “analog databases” and the “electronic resources” of the time were not much more sophisticated

    “Google will bring you back, you know, a hundred thousand answers. A librarian will bring you back the right one.”

– Neil Gaiman

Neil Gaiman’s quote illustrates the importance of “context” and the evaluation of “sources”

I’m seeing a lot of “AI” and “machine learning” (ML) as buzzwords in job postings – and folks predicting a “global golden age” because “insert buzzword here” will transform society on a grand scale – and well, the lesson from history is that “access to information” NEVER equals “wise application of knowledge”

    I’m not saying that “buzzword” won’t change the workplace – I’m just pointing out that humanity is great at justifying doing the “wrong” thing – i.e. greedy, self-centered, arrogant humans are not likely to create “supremely benevolent and wise AI”

    but yes, AI and ML are (probably) gonna be important TOOLS but we (as in “humanity in general”) are PROBABLY not gonna use those tools to usher in a “golden era” of universal peace and prosperity for EVERYONE

    Libraries

The “value” of libraries has always come from “information access.” When “books” were expensive and ONLY available in “dead tree” format then “library” was synonymous with “books.”

    “Physical media” still dominated “library holdings” until the late 20th/early 21st Centuries gave us “low cost digital access to information.”

    The value of libraries is STILL “information access” with the caveat that “information curation” is PART of “access.”

    i.e. Including something in a “library” implies that the item has more value than items NOT included in the “library.”

    Obviously just because someone “wrote a book” does NOT mean that the book is “true.” Back in the days of “dead tree book domination” the fact that someone had gone to the expense of PUBLISHING a book implied that SOMEONE thought the book was valuable.

    This is the same idea as the “why” behind “ancient works” being considered “worthy of study” (at least in part) just because they are “ancient” — i.e. the logic being that if someone put the time and effort into making a copy of “work” then it MUST have been highly regarded at the time. Then if there are multiple copies of “work” that logic gets amplified.

    Which again loops back to the importance of “curation” – especially in a time when the barriers to “getting published” are close to nil.

    “Man’s mind, stretched to a new idea, never goes back to its original dimension.”

     – Oliver Wendell Holmes

Of course special attention needs to be paid to the care and feeding of “young minds.” Curation to community standards is NOT the same as “censorship.”

  • random thoughts on “Acres of Diamonds”

Russell Conwell (February 15, 1843 – December 6, 1925) (from wikipedia) “was an American Baptist minister, orator, philanthropist, author, lawyer, and writer. He is best remembered as the founder and first president of Temple University in Philadelphia, as the Pastor of The Baptist Temple, and for his inspirational lecture, ‘Acres of Diamonds’.”

    A link to the full text of Mr. Conwell’s speech is available on the Temple University page

    The story given as inspiration for the lecture (and as the introduction to the longer lecture) is available here

    100 years ago …

Mr. Conwell would give the speech 6,000+ times – which is impressive. The “legend” is that when he arrived in a new town (where he was going to perform the speech) he would find out who the “prominent”/successful folks in town were and work them into the performance.

ONE of the “points” of the speech being that “opportunity” can be found everywhere. The entrepreneur doesn’t (automatically) need to travel far away looking for opportunity – it might be in the backyard.

    A hundred years ago, Mr. Conwell had to argue that “making money” was a worthwhile endeavor. The “common wisdom” of the day being that “extreme wealth” MUST have been achieved by some form of skullduggery.

    Historically, the human “founders” of these United States had come from a culture where land equaled “wealth.” In the “old world” land was in short supply AND passed down by inheritance. Someone born a “peasant” was going to stay a “peasant” because those “to the manor born” controlled the vast majority of land – and therefore “wealth.”

    A rising “merchant class” was in the process of disrupting things when the American Colonies and the U.K. had a disagreement in the late 18th Century — BUT most folks still lived/worked on farms until the early 20th Century.

    It is unfair to call ALL of those born into privilege “parasites.” However, 18th Century England is a good case study of “those in power” using the system to keep themselves in power AND wealthy.

The grand point being that “money”/wealth is not evil. Money is a tool which can be used for good purposes OR for bad/“greed.” 1 Timothy 6:10 tells us that “LOVE of money is the root of all evil.”

    “For the love of money is the root of all evil: which while some coveted after, they have erred from the faith, and pierced themselves through with many sorrows.”

    1 Timothy 6:10

    Note that “greed” is never “good.” Greed implies “getting more” at the expense of others – which is obviously impossible to reconcile with “loving your neighbor as yourself.”

    New World

    It is fun to point out that “technology” has always been a disruptive force. Technology is always about “application” of knowledge. Advances in “farming technology” helped farmers be more productive – while also freeing up “labor” for the factories of the industrial revolution.

    If we could do a survey asking “average farm workers” (back when Mr. Conwell was giving his speech) how they could get “wealthy” they PROBABLY would have said some variation of “striking gold.”

    (… and historians can point at the “gold rushes” in the middle of the 19th Century as helping populate the western United States. Of course more “wealth” was generated from folks helping the “prospectors” than from folks “striking it rich” pulling gold/silver out of the ground …)

    Of course if one of those “average farm workers” that sold everything to go gold prospecting had created a “better plow” they would have been much better off.

    e.g. A Vermont born blacksmith solved a common problem for farmers – and both he AND the farmers prospered. John Deere, Inc is still helping farmers be productive in the 21st Century.

    Transportation

    If you look at the “super wealthy” from the late 19th and early 20th Century, the common theme might be “transportation.”

e.g. Cornelius Vanderbilt built an empire from ferries – from the time when “waterways” were the primary means of transportation in the U.S.

    John Rockefeller built an oil empire – from the time when oil was used for light and heat. When Henry Ford made the horseless carriage affordable, “oil” being refined into gasoline made the Rockefeller clan even more wealthy.

Sandwiched between Mr. Rockefeller and Mr. Ford as “wealthiest American” was Andrew Carnegie – who had worked his way up from “child labor” to “steel magnate” – from a time when “railroads” and the telegraph were the latest and greatest “technology.”

    No, I am NOT holding up ANY of these men as “moral exemplars” – the grand point is that they helped “solve problems” for a large number of folks, and solving those problems was the root of their wealth …

    The musical “Oklahoma!” (1943) has a song where “rural residents” marvel at the advancements of Kansas City (“She went about as fur as she could go!”). By the mid 20th Century things like automobiles and the telephone system were commonplace enough to be a plot point in a musical.

    (“Oklahoma!” is set around the time the territory became a State. Oklahoma was the 46th State admitted to the Union in 1907)

    Again, the grand point being that some folks got wealthy from disrupting the status quo, and MANY more got wealthy by making incremental advances to cars and phones.

    e.g. Thomas Edison’s “diamonds in the backyard” looked like improvements to the telegraph system of his time long before “Edison Electric.”

random thought – I’m sure there is an interesting story behind “cigarette lighter” technology. The actual “cigarette lighter” part isn’t a “standard feature” anymore, but you can find a lot of “accessories” that use the “automobile auxiliary power outlet.”

    Modern Times

    The sad fact is that in a LOT of nations the “economic game” IS stacked against the “average individual.” Which is why we see so many folks willing to risk everything to immigrate to “opportunity.”

    Obviously a complex subject – and someone living in a “warzone” is more concerned with survival than anything.

    For those NOT living in a warzone or an extremely dysfunctional government the big question becomes which “career path” to pursue.

Charlie Chaplin made a movie called “Modern Times” back in 1936. Mr. Chaplin was a world famous “movie star” at the time – the movie sometimes gets held up as an example of “radical political beliefs.” I’m not sure the movie has any agenda except “entertainment” – e.g. Mr Chaplin’s “tramp” character is pursuing “happiness” NOT a political agenda.

    That same idea applies to “modern workers” in the 21st century. “Happiness” probably won’t come from a “job.” Generic advice like “follow your bliss” is nice, but not particularly useful.

There is nothing wrong with “working for a paycheck.” The best case scenario is to “do what you love” for a living. The WORST case scenario is doing a job you hate to survive …

    Education, intelligence, and “degrees”

“Education in the United States” has changed a great deal in the last 100 years. The first “colleges” in North America existed to train “clergy” (e.g. Harvard was founded in 1636) and then “academics.”

    The “Agricultural and Mechanical Colleges” came along later with the “land grant” colleges in the late 19th Century. The GI Bill sent 2.2 million WWII veterans to college AND 5.6 million more to other training programs.

Sputnik I (1957) had the unintended consequence of changing national educational priorities in the U.S. – as well as kickstarting NASA (founded July 29, 1958). Both events helped the U.S. get to the moon in 1969.

World war and cold war politics aside, the 20th Century workplace was probably the historical “anomaly.” At one point in the 20th Century a “young worker” could drop out of high school, go to work at the local “factory,” and make a “good living.”

    Remember that for MOST of human history, folks lived and worked on farms. Cities provided a marketplace for those agricultural products as well as “other” commerce. Before mass media and rapid transportation MOST people would live and die within 20 miles of where they were born.

    Again, maybe interesting BUT I’ll point out that “compulsory” public education PROBABLY doesn’t have a great record of achievement in the U.S. (or anywhere). i.e. if the ONLY reason “student” is in “school” is because they “have to” – then that student isn’t going to learn much.

This has nothing to do with “intelligence” and everything to do with “individual interests” and ability. “Education” is best understood as a lifelong process – not a short term goal.

    “I have never let my schooling interfere with my education.”

    — Mark Twain

    Part of what makes us “human” is (probably) the desire for “mastery” of skills. In the “best case” this is how “education” should look – a journey from “untrained” to “skilled.”

    If an individual’s investment (in time and money) results in them having a valuable “skill set” – then they are “well educated.”

    The contrast being the “academic” that has a lot of “degrees” but no actual “skills” — i.e. having a “doctorate” doesn’t automatically mean anything. “Having” a degree shows “completion” of a set of requirements not “mastery” of those subjects.

    Of course that distinction is why we have “licensing” as well as “degree” requirements for some professions. e.g. The law school graduate that can’t pass the “Bar examination” won’t be allowed to practice law, but might be allowed to teach.

    Nepo babies

    Now, imagine we did a survey of “modern high school students” in the United States asking them “how can you become wealthy?”

    It would be interesting to actually perform the study – i.e. I’m just guessing here from MY personal experience.

We would also have to collect data on the parents’ education and career — i.e. if a child grows up in a family of “fire fighters” then they are (probably) more likely to pursue a career as a “fire fighter” simply because that is what they are familiar with.

    The term “nepo baby” gets used (derisively) for some entertainment industry professionals – but if mom and dad are both “entertainment industry professionals” then a child pursuing an acting/performance career kind of becomes “going into the family business.”

    Now, “having good genetics” (you know “being ridiculously good looking”) is always a positive – so there are certainly “nepo babies” out there.

    I’m not throwing stones at anyone, “hiring” is not an exact science in ANY industry. That “genetic component” probably applies to families of doctors, lawyers, and educators as well — i.e. if mom and dad were both “whatever”, it is possible that “junior” will have those same skills/personality preferences.

    … and it is also possible that “junior” will want to do something completely different.

    BUT if “student” has minimal exposure to “work life” outside of what they see at home and school – MY GUESS is that the majority (of my hypothetical survey of high school students) will say the “path to wealth” involves professional sports or “entertainment industry.”

    umm, both of which may be more likely than “winning the lottery” or speculating on the stock market — but not exactly “career counsellor” advice

    (… oh, and you only hear about the “big rock stars” being told by their “career counsellor” that they couldn’t make a living as a “rock star” AFTER they became “big rock stars” – if someone quits after being told they “can’t do it” or that the chance of success is small, then they PROBABLY didn’t want to do “it” very much …)

    “Keep your feet on the ground and keep reaching for the stars.”

    – Casey Kasem

    Did I have a point?

Well, the “message” in “Acres of Diamonds” is still valid 100+ years later.

    A certain amount of “knowledge” is required to be able to recognize opportunity. e.g. The person to “build a better mousetrap” is someone that has experience catching mice.

    BUT simply inventing a “better” mousetrap is only half of the problem – the mousetrap needs to be produced, marketed, and sold.

    Two BIG things that weren’t around when Mr. Conwell was giving his speech are “venture capital” and “franchising.” Neither of which “negatively” impacts the argument he was making – and if anything make his argument even stronger …

    check out https://curious.iterudio.com for a short (free) class on “success”

    You might also find this book interesting

  • life, humor, Star Wars

    It bothers me a little when a “random comedian” comes out and describes their “theory of humor” as being “pain.”

    Usually it is an “established” entertainer – and they present the idea that “all humor is based on pain” as being a form of received wisdom.

Obviously anytime the word “all” creeps into the discussion the chances of the statement in question being 100% correct are small.

    Along the same path – someone recently tried to argue that “Star Wars” was “woke” from day 1 – and, well, my response is dotted line connected to the above …

    Life

    The idea of “stress” as a negative force in daily life has been around for years. Someone in a “big business marketing department” came up with a slogan about “reducing stress” as a way to sell soap/soup/something else – but “stress” is not inherently positive or negative.

    The human body has a generic “stress response” but our perception of “stress” is relative. The “positive” form of stress (eustress) gets a lot less attention than the “negative” form of stress (distress).

    “Become a possibilitarian. No matter how dark things seem to be or actually are, raise your sights and see possibilities — always see them, for they’re always there.”

    Norman Vincent Peale

    Obviously folks WANT eustress – but that tends to get marketed as “fun” or “happiness.”

It becomes a truism that the only thing we can truly “control” is our attitude towards “stress.” “Life” is gonna happen, all we can really control is how we choose to react.

    Set the “way back machine” to 100 years ago and we would find this “life reaction” automatically influenced by “religion.” “People of the book” might have referenced the “wisdom books” (e.g. Job, Proverbs, Ecclesiastes) – all of which are worthy of study.

    Job tells us that “Man that is born of a woman is of few days and full of trouble.” (Job 14:1) but also “Thou shalt call, and I will answer thee: thou wilt have a desire to the work of thine hands.” (Job 14:15) — which could be examples of reacting to “distress” and then “eustress”

… and then of course this quote from Proverbs:

     A merry heart doeth good like a medicine: but a broken spirit drieth the bones.

    Proverbs 17:22

    Humor

    Today the “four noble truths” of Buddhism are on my mind – with the point being that “all humor is based on pain” sounds a lot like “life is suffering.”

    It is more accurate to say that life is “stress” NOT “life is pain/suffering.”

    I automatically reject the statement “ALL humor is based on pain” – because “ALL humor is based on ‘life’” – which is “stress” NOT “pain”

    Pain and pleasure are also “relative” terms to a certain degree – both are “sensations” but perceiving them as feeling “pleasant” or “unpleasant” requires some context

    If we divide the world between “Optimists” on one side and “Pessimists” on the other and charted the general population on that line – we would (probably) see a classic bell curve. Most people would be in the “middle” and very few would be on the extremes — BUT my guess is that most “comedians” are found in the “extremes” – either “optimist” or “pessimist.”

    The point being that I understand WHY someone might say “all humor is based on pain” – not being a “pessimist” (or Buddhist) I simply disagree …

    Humor has trouble translating between generations in part because we have to “identify” with the subject to appreciate the humor.

e.g. William Shakespeare has a lot of jokes in his plays – that audiences 400 years ago probably thought were hilarious – but they need to be translated for modern audiences. In the 21st century Charlie Chaplin’s movies are still “humorous” but not as funny as they were to early 20th century audiences.

    Any “topical” humor ceases to be humorous when the “topic” is no longer “topical” e.g. Jackie Mason telling jokes about Ted Kennedy and Henry Kissinger – if you have no idea who Ted Kennedy and Henry Kissinger are, Mr Mason’s delivery is still humorous – but if you recognize the impersonation/truth in the joke it is much funnier

    hmm, so maybe all humor is based on truth? The only characters routinely allowed to tell the “truth” in Mr Shakespeare’s plays are the “fools”/court jesters — or maybe Mel Brooks as stand up philosopher is the definitive example …

    Star Wars

    Any “long running” series is subject to the impact of nostalgia.

e.g. If you have a preference/opinion on which actor did “James Bond” (or Batman or Superman or Spider-Man) best – that opinion is influenced (positive or negative) by the actor/movies that were released when you were “maturing”

    SO I was a little surprised when I started hearing folks say that they preferred the “Star Wars prequels” to the original trilogy.

    I don’t dislike the “prequels” but think they are obviously not as good as the original trilogy – which may or may not be “true” BUT is 100% influenced by nostalgia on my part.

    As I have aged – I am willing to admit that “The Empire Strikes Back”/Episode V is a “better movie” (plot, character development, fx) than “Star Wars” 1977/”A New Hope”/Episode IV – BUT I still prefer Episode IV

    With MY bias fully disclosed – I REALLY didn’t like Episodes VIII and IX.

    From a storytelling point of view the “middle chapter” tends to be the “strongest” part of most “trilogies” — but ALL three movies being “equally good” is rare

    Notice that should be read “intentional trilogy” as in a story told in three parts, NOT just a collection of 3 movies starring the same character

e.g. of Episodes I – II – III – my preference goes III (best), II, I (least favorite)

    “Star Wars”/Episode IV stands by itself – mostly because there was no guarantee that the movie would be popular enough to have “sequels” – BUT George Lucas had a general idea for three trilogies, which is why Episodes V and VI become 1 story …

I’ve heard some folks try to argue that Harrison Ford wasn’t happy and that his character’s fate at the end of “Empire” was a way for George Lucas to potentially “write him out of the story” — which is implausible at best.

No, Mr Ford didn’t want his career to be forever linked to “Star Wars” and avoided a lot of the publicity — but he wasn’t “Harrison Ford film legend” in 1980 when Empire was released.

    Mr Lucas was trying to recreate the old “serial movie” cliff-hanger feel with “Empire” – i.e. he knew there would be an “Episode VI” when making “Episode V.”

The Episode V ending was just an example of “expert storytelling” and “good business” at a time when “sequels” were common but tended to be “back for more cash” projects rather than “good storytelling.”

    e.g. did anyone think that Marvel was actually cleaning up the MCU at the end of “Avengers: Infinity War?” No, there was ALWAYS going to be one more movie that would modify the cliff-hanger ending …

    Meanwhile back at the ranch …

    I liked Episode VII — in part because “Star Wars” was slapped on the side of the box – but it was entertaining, and “good enough.”

    No, I didn’t “connect” with any of the new characters introduced – but this is where that generational shift comes into play. The “Disney sequels” made $billions but the “box office” decreased for both Episode VIII AND then Episode IX

(btw if you rank the Star Wars franchise movies by adjusted-for-inflation box office — Episode IV is a $billion ahead of the second place movie, Episode VII)
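(the mechanics of “adjusting for inflation” are simple enough – a minimal sketch in Python, where the index values are only roughly the 1977-vs-2024 ballpark and the gross is a made-up round number:)

# scale an old box office gross by a consumer price index ratio
def adjust_for_inflation(nominal_gross, cpi_then, cpi_now):
    return nominal_gross * (cpi_now / cpi_then)

# hypothetical: a $300M gross from 1977, index ~60.6 then vs ~310 now
print(adjust_for_inflation(300_000_000, 60.6, 310.0))  # ~ $1.53 billion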

    I REALLY wanted to like Episode VIII — but it is just tripe with “Star Wars” slapped on the side. My problem was not with the new characters – it was the ridiculous story full of plot holes. Same with Episode IX – though I went in expecting the movie to be terrible and only saw it in the theater out of a need to “see how they mess up the ending”

BUT was the original trilogy or the prequels “woke”? were the Disney sequels “woke”?

    What do you mean “woke”?

    “Woke” tends to be used as a negative/insult by folks of one political persuasion and a badge of honor by another political persuasion.

To me, “woke” and b.s. (NOT “bachelor of science”) are in the same category — i.e. b.s. isn’t concerned with “truth” so much as convincing an audience that the spreader of b.s. believes something – e.g. the speaker wants the audience to believe that they (speaker and audience) share the same values – though the speaker doesn’t come right out and say what they think/believe.

    “Woke” is about pushing an “agenda” more than actually discussing ideas/concepts — with the implication being that EVERYONE must accept the “agenda” and of course you are wrong/stupid/evil if you don’t blindly accept the “agenda”

    SO did episodes VIII and IX have an “agenda” — well, no. They were just terrible storytelling.

    Notice that “strong female characters” does NOT equal “woke.” Even “strong female characters” combined with “man child idiot fool” male characters is NOT woke – just bad storytelling.

    i.e. “Princess Leia” is obviously a strong leader – but she is archetype “mother”/”elder sister” in Episode IV – which is NOT “woke” by any definition

    I like to point out that Luke’s journey from “innocence” to “experience” is reflected in his clothing – i.e. he is in “all white” (innocent/pure) in Episode IV – kind of “grey” in V, and then in all black in Episode VI (experienced/mature)

    Mr Lucas famously had Carrie Fisher “taped up” to keep her from jiggling in Episode IV – so Leia’s arc is a “maturation”/awakening of a different kind than Luke’s — Leia goes from chaste/all in white/funny hair style in “A New Hope” to “slave girl uniform” in Jedi – and all of the bickering with Han was (probably) supposed to be “suppressed sexual tension” – like an old Howard Hawks movie

    I could go on for another thousand words on what I think is “wrong” with Episodes VIII and IX — part of it is about what “leadership” ACTUALLY looks like (umm, which is NOT – go over there for no good reason, then turn around and come back, all while pretending that being a “strong leader” means NOT communicating the plan to subordinates — that isn’t “leadership” that is incompetence — but I digress)

    The biggest flaw with the Disney Sequels is how they treated the core trio from the original trilogy — i.e. all that bickering wasn’t sexual tension, it was just bickering – and of course Luke sees his nephew have a bad dream and decides to run away and sulk — disappointing/bad storytelling? yes. “woke”? well, no.

    The fact that ALL of the male characters are in “man child” mode waiting for “strong female to tell them what to do” might be an example of incompetent “story by committee” – but PROBABLY not “woke” (unless the agenda was “emasculation”)

    ANYWAY

    While I’m at it – I didn’t make it past the first couple episodes of the Disney+ series “Andor” (apparently “remove all the humor” and/or be dark and depressing == “adult story telling” for someone at Disney) and the “Obi Wan” mini series was another exercise in unwatchable tripe

    … but of course YMMV