I recently made an observation that “cheesy and sympathetic” never go out of style – with the implied punch line being that “cheesy” can never go OUT of style because by definition it is never IN style.
The folks at Merriam-Webster tell me that the non-dairy definition of “cheesy” is “shabby, cheap.” SO something that we call “cheesy” (again in a non-dietary sense) tends to be “low budget” and probably “low quality” – e.g. Plan 9 From Outer Space is a “cheesy movie” – SO cheesy that it is funny.
The path cheesy took to mean “cheap” is almost certainly American slang:
cheesy (adj.) Meaning “cheap, inferior” is attested from 1896, perhaps originally U.S. student slang, along with cheese (n.) “an ignorant, stupid person.”
I’m also told at the time across the pond:
In late 19c. British slang, cheesy was “fine, showy” (1858), probably from cheese
That quote about the United States and the United Kingdom being “separated by a common language” comes to mind …
“England and America are two countries separated by the same language!”
George Bernard Shaw
Not always bad …
It should be pointed out that cheesy doesn’t automatically mean “bad.” Guilty pleasures often have a high “cheesy” content. Why are they “guilty pleasures?” – probably because they are “cheap and underappreciated”
From a “food” point of view – adding cheese/cheese like substance can transform “blah” to “gimme more” — think of the difference between plain nachos vs nachos AND cheese dip.
An “artistic work” that strives for simplicity AND entertainment will almost certainly get labelled “cheesy.” e.g. for MOST of U.S. history “romance novels” have been the best selling genre – and of course “rom-coms” as a movie genre are so popular they have channels dedicated to them – and BOTH are extremely cheesy by design.
Just how MUCH “cheesy” is acceptable can change – but just because it is cheesy doesn’t mean it is worthless.
e.g. Pick up a copy of an Edgar Rice Burroughs adventure story or a Max Brand western and “Cheesy but fun” will be an accurate description 99% of the time.
Giving the audience what they want is always a path to short term profit – but almost never long term respect. e.g. “Max Brand” was a pen name for Frederick Faust to begin with – and is still a brand name today – pick up a Max Brand paperback and the title of the individual book is probably smaller on the cover than “Max Brand”
Edgar Rice Burroughs created Tarzan – and the history of THAT iconic (and cheesy) character is beyond the scope of this article …
Beautiful simplicity
If “simple and entertaining” is done at a high level it might get the “elegant” label.
At first glance elegance and cheesy are polar opposites – but the difference is in the implementation and individual interpretation.
The first Star Wars (A New Hope) comes to mind – I loved the movie as an adventure story when I was 10 years old. By the time I was 20 it had become a little cheesy. When I re-watched it at 40 I noticed the “meat and potatoes” under the cheese.
The story being “implemented” in Star Wars has deep mythological roots – what changed was MY individual interpretation of the movie …
While I’m at it Casablanca (1942) AND Citizen Kane (1941) routinely make the list of “great American movies” and both have a certain amount of “cheesy” in them –
AND don’t get me started on The Great Gatsby – (either the 1925 novel OR the movie interpretations) – The 1974 Robert Redford and Mia Farrow version captures the “feel” of the novel – which is VERY “cheesy and sympathetic”
yes, The Great Gatsby is a great novel. – Baz Luhrmann might deserve the “king of cheesy” title, but you know – different subject …
So yes “classics” can be cheesy. BUT in general noticeable “cheesy-ness” is going to be interpreted as profiteering and maybe exploitation. i.e. a little cheesy goes a long way and TOO much ruins the product.
When I was a child …
1 Corinthians 13:11(“When I was a child, I spoke as a child, I understood as a child, I thought as a child; but when I became a man, I put away childish things.”) drives home the point that “cheesy youthful moral reasoning” is always bad.
Youthful arrogance and prejudice should give way to more mature (and humble) attitudes developed by experience and education. It is a lot easier to “know everything” when your world is relatively small and experience is limited.
In THOSE cases the “cheesy” probably gets consumed with the assumption that it is the norm. Which was kinda the point of Bob Dylan’s “My Back Pages” – but that is a different subject …
Ah, but I was so much older then, I’m younger than that now
Bob Dylan
That youthful ignorance of the “cheesy” should naturally dissipate with time and exposure to the NOT “cheesy” — BUT just because you enjoyed something when you were “a child” doesn’t mean you can’t enjoy it when you “grow up.”
That enjoyment should be re-framed and not glorified by nostalgia – i.e. “I remember loving this when I was smaller” vs “Things today will never be as good as my memory of ‘whatever’”
A little learning is a dangerous thing; Drink deep, or taste not the Pierian spring: There shallow draughts intoxicate the brain, And drinking largely sobers us again.
Alexander Pope
Nostalgia isn’t evil …
There has been “research” done that pinpoints the age at which “musical tastes” get locked in.
As I remember the study – they came to the conclusion that the music we are exposed to under the age of 10 tends to have a watershed type effect – i.e. it can have a positive OR a negative impact on later musical preferences.
My guess is that “parental relationships” become a lurking variable — if music reminds someone of their parents THAT is what they are reacting to, not the music.
e.g. “I LOVE that song – my mother/father used to play it all the time” vs “I hate that song – my mother/father used to play it ALL the time”
Childhood memories aside – the human brain keeps developing into our late 20s – and it is around that time when “band names start sounding the same” and “music just isn’t as good as it used to be” to the average person.
If someone works in the “music industry” in some form – then their tastes may not calcify as much as non-music industry folks. However that is also going to be an exercise in the “expert mind” vs the “amateur mind” – which is also a different subject.
SO if someone hears a song AND it reminds them of being in the 7th grade (13ish) – MY guess is that the song will FEEL “cheesy” to them simply because they are being reminded of that time in their life.
“Beautiful” by Christina Aguilera came up as “cheesy and sympathetic” – yes, it is one of those songs that has a very high perceived “cheesy” content level – but get past the “cheese” and it is about self-acceptance and independence. Scratch the surface and the message is “think for yourself” and/or “be a critical-thinking individual.”
Ms Aguilera was 21ish when she recorded/released the song in 2002 – and I’m gonna guess that at 44ish in 2025 SHE probably has a different view of “Beautiful” – but my point is that there is “meat” under the “cheesy”
umm, but for me I still hear “talented 21 year old” because I’m that guy in the back of the room yelling “Play Freebird!”
A “social media” post had a poll going about who would win between “Professor Albus Dumbledore” (from the “Harry Potter” books) and Gandalf the grey/white (from The Lord of the Rings – LotR).
Well, I didn’t bother voting in the poll – I think Dumbledore was winning – but that isn’t the point.
Polls
The “winner” of ANY poll is going to be based on the survey/poll group. This particular poll is fun because it allows “fans” to be “fans” – i.e. fans of the Harry Potter books are obviously going to choose Dumbledore, and fans of LotR are obviously going to choose Gandalf.
Short answer: my bias is for Gandalf. BUT there are assumptions to be explained.
Movies vs Books
The Harry Potter MOVIES had the luxury of the author still being around. Ms Rowling didn’t write the screenplays – but she provided “assistance”/input to make sure the movies basically kept to the plots of the novels. The point being that “Dumbledore in the books” is pretty much the same as “Dumbledore in the movies.”
J.R.R. Tolkien died in 1973. Professor Tolkien sold the “film, stage and merchandising rights” to United Artists in 1969. The “internet version” of the story is that he sold the rights because of inheritance tax issues. I have no idea what the deal was – but it sounds like he made a good decision – he got £104,000 (adjusted for inflation around £1.2 million) AND secured royalties for any future productions.
The “book to movie” translation always comes with “storytelling issues.” What works in “book” can be hard to bring to the screen. Which means there are major differences between “Gandalf in the books” and “Gandalf in the movies.”
Of course Peter Jackson’s LotR is great – and the “core story” is intact. Both the movies and the books tell an epic story of a battle between good and evil.
The BIG difference between LotR book and movie is “character arcs.” Professor Tolkien was writing an “epic” with “epic heroes” – you know, big, bold, and confident. While Peter Jackson tried to make the characters a little less “big, bold, and confident” – which of course also allows the actors to “act” …
SO Gandalf in the movies is not as “powerful” as Gandalf in the books. Ok, Gandalf is obviously not “weak” in the movies – however the character is thousands of years old, he is NOT human. “Wizards” in the LotR are “created beings”/”agents of the divine.”
Think of the end of The Fellowship of the Ring where Gandalf fights the Balrog. In both movie and book Gandalf emerges victorious. In the movies he dies and is reborn as “Gandalf the White” BUT in the books he doesn’t die. The implication in the books is closer to “leveling up” – he gets promoted, not “reborn.”
Man vs the Divine
For what it is worth: I’m not sure that wizards in LotR can “die.” They have a physical form that can be destroyed – e.g. Saruman at the end of The Return of the King (book) – but is that a “permanent death” or just a temporary inconvenience?
The Iliad (another epic) comes to mind. Hector (the hero of Troy) vs Achilles (Greek hero) isn’t a fair fight in the original version – i.e. Achilles is part human and part “divine.” SO “mortal vs divine being” is never going to end well for the mortal.
The “movie” version of the Iliad (Troy – 2004) includes a great fight scene between Achilles (Brad Pitt) and Hector (Eric Bana) – but when Hector meets Achilles in the original text, Hector runs, and Achilles chases …
What made the ancient Greek “gods” divine was their long life. Which brings up another point – IF “being” is eternal and they get into a disagreement with “mortal” – then all the “eternal” needs to do is wait for the mortal to shuffle off the mortal coil. That is kind of a theme running through the Iliad – but I’m wandering off on a tangent …
It’s Time!
Dumbledore vs Gandalf as a contest between skilled professionals (or chess/checkers/pick a game) might be a toss up. Neither is “all powerful”, Dumbledore is a human with “magic powers” – and Gandalf IS a “magic being”.
Which is probably why the poll caught my attention in the first place.
Of course Dumbledore can and does die – so in a contrived “battle to the death” then Dumbledore doesn’t have a chance. e.g. Gandalf could go away for 500 years, come back to visit Dumbledore’s grave and say “I win!”
Voldemort vs Sauron is also a no-contest for the same reasons. What would happen if Voldemort managed to get hold of the One Ring? Sauron feared someone using THE ring against him, but would even an exceptionally powerful mortal have been able to control the ring, or would he simply become a more powerful Gollum?
The Ring of power might extend life but The Odyssey comes to mind. Odysseus (the man who gave us the Trojan Horse) had the opportunity to stay with Calypso (a nymph/minor goddess) – she even promised him eternal life. Odysseus desperately wanted to get back to his (mortal) wife – but also implied is that he was wise enough to see that unintended consequences are inevitable. i.e. “eternity” in a mortal body that continues to age wouldn’t be any fun (e.g. Tithonus)
Anyway, if Dumbledore and Gandalf actually met they would probably play chess, drink wine, and swap stories about little folk – not have a fight to the death …
Maybe one line sums up “logic 101” and/or “statistics 101”: correlation never equals causality.
The example I used to hear was that there was a positive correlation between ice cream sales and drowning. As ice cream sales increase so does the number of deaths by drowning.
BUT eating ice cream does not CAUSE drowning deaths — i.e. when is more ice cream sold? In the summer. When do more people go swimming? In the summer.
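The ice cream/drowning example can be sketched in a few lines of code. This is a toy simulation – the variable names, coefficients, and noise levels are all invented for illustration – where a lurking variable (temperature) drives BOTH series, so they correlate strongly even though neither causes the other:

```python
import random

def pearson(xs, ys):
    """Plain Pearson correlation coefficient for two equal-length lists."""
    n = len(xs)
    mx = sum(xs) / n
    my = sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    vx = sum((x - mx) ** 2 for x in xs)
    vy = sum((y - my) ** 2 for y in ys)
    return cov / (vx * vy) ** 0.5

random.seed(42)

# "temperature" is the lurking variable: it drives BOTH series below
temps = [random.uniform(30, 100) for _ in range(1000)]

# hotter days -> more ice cream sold, and more people swimming (hence drownings)
ice_cream = [5.0 * t + random.gauss(0, 40) for t in temps]
drownings = [0.1 * t + random.gauss(0, 1.5) for t in temps]

# strong positive correlation, with zero causal link between the two
r = pearson(ice_cream, drownings)
print(round(r, 2))  # strongly positive
```

The correlation comes out strongly positive even though neither list was computed from the other – both were computed from temperature, which is exactly the “lurking variable” point.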
This concept is important – just in general – but also when talking about the rise of “streaming” and “movie theater” attendance.
Movies
When going to the “movies” first became a cultural event 100ish years ago it was a much different experience. Back in that “golden era” of movie theaters folks would go as a WEEKLY “family night out” — there might have been a news reel, a cartoon, and then a feature presentation.
Other “family entertainment” options might have been staying home and listening to the radio. “Live theater”, and musical concerts might have been an option IF they happened to be in town. Back at that time the “Circus” coming to town would have been a much bigger deal.
The primary source of “news” would have still been print newspapers – and “sports” like boxing, horse racing, baseball, college football were popular – again either on the radio or attending live events.
BUT “the movies” were the bread and butter of family entertainment.
Television
The “golden age of radio” was relatively short – from the late 1920s to the 1950’s. Radio and movies might have been in the same general “entertainment” markets but they are much different “experiences.”
“Visuals AND sound” tends to beat “just sound” — BUT “going to the movies” would have been an EVENT, while turning on the radio an everyday experience.
When Television became popular in the 1950s it ended the “golden age” of radio – and also forced the “movie industry” to adapt.
e.g. hunt up some old “B” Westerns and you’ll discover that they tend to be about an hour long – and the “weekly serial” adventure/cliff hanger shorts tend to be 20 to 45 minutes. Which sounds a LOT like “television” program lengths to the “modern audience.”
A lot of those “B” Western stars also had radio shows – and popular shows made the jump from radio to television. There was still a sizable market for both television and radio in the early days. The popular shows probably had a comic book and/or daily newspaper comic strip as well.
The “point” being that folks wanted “entertainment” NOT a specific TYPE of entertainment.
Television ended the “weekly ritual” of going to the movies.
The “movie industry” responded by increasing the “production value” of movies. Movies were “bigger” and “better” than television programming.
The “movie” advantage was still the bigger screen and the EVENT status. The product required to attract the audience into the theaters obviously changed – gimmicks like 3D, “Technicolor”, CinemaScope came and went.
Now, the one 20th Century invention that can rival television for “cultural impact” is the automobile. I would tend to argue that the increased “mobility” automobiles allowed makes them the most influential and/or culturally transformational. BUT the point is arguable.
The “automobile” changed “dating and mating” rituals. PART of that change involved “going to the movies.” At the height, in the 1950s, there were 4,000 “drive in” movie theaters spread across the U.S.
All of those Baby-boomers doing their thing would have found the “drive in” the more economical option. The post-war economic boom created “teenagers” – who would have had “going to the movies” as an option to “get away from parents” and be, well, “teenagers.”
The “movie theater business” was disrupted by a Supreme Court ruling in 1948. United States v. Paramount on May 4, 1948 effectively ended the “studio system” – “studios” would no longer be allowed to own “theaters.”
An unintended consequence of ending the “studio system” was that a lot of “talent” was released from contracts, and studios opened up their film libraries and/or sold them to television stations. The number of “regular moviegoers” decreased from 90 million in 1948 to 46 million in 1958. Television ownership went from 8,000 in 1946 to 46 million in 1960.
SO if you REALLY want to put a date on the START of the death of the “movie theater business” – May 4, 1948
Cable, VCRs, DVDs …
Of course “movie theaters” have had a long slow decline. To coin a phrase: The reports of the “movie theater’s death” have been greatly exaggerated …
Cable TV rolled across the U.S. starting in the 1970’s. HBO came along in 1972.
“You want romance? In Ridgemont? We can’t even get cable TV here, Stacy, and you want romance!”
Fast Times at Ridgemont High 1982
Drive in theaters continued to close – but they haven’t disappeared yet.
By the 1970’s television had replaced “the movies” in terms of “cultural impact” – BUT the “birth of the blockbuster” illustrated that “the movies” weren’t dead yet.
Of course the typical “movie theater” has not made a large % of their profits from SHOWING movies for a long time – i.e. theaters tend to make money at the concession stand NOT from ticket sales.
“Going to the movies” was still a distinct experience from “watching at home.”
Movie studios were gifted a new revenue stream in the 1980s when “VCR” ownership created the “VHS/Video Rental Store.”
Again, “seeing it in the theater” with a crowd on the big screen with “theater quality sound” is still a distinct experience.
DVDs provided superior picture AND sound compared to VHS – and the DVD quickly replaced the VCR. The “Rental Store” just shifted from VHS tapes to DVDs.
BUT the BIG impact of DVDs was their durability and light weight. DVDs could be played multiple times without loss of quality (VHS tapes degraded a little each viewing), AND they could even be safely (cheaply) mailed.
Netflix started in 1997. The “Reed Hastings/Netflix story” is interesting – but not important at the moment.
From a “movie theater” point of view – “The Phantom Menace” being released as a “digital” film in 1999 was a “transitional moment.”
The music industry as a whole bungled their “digital” transition to the point that a couple generations of folks have grown up expecting “music” to be “free.” THAT is a different subject —
I’ll point out that a “digital product” can easily be reproduced without loss of quality. If I have a “digital” copy of “media” I can easily reproduce exact duplicates. No need for “manufacturing” and “shipping” processes – just “copy” from 1 location to the new location. Exact copy. Done.
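That “exact copy” claim is easy to demonstrate. A minimal sketch – the file contents below are just a stand-in for real media – copies a file and compares cryptographic hashes to show the duplicate is bit-for-bit identical, with no generational loss:

```python
import hashlib
import os
import shutil
import tempfile

# a stand-in "media file": just some bytes written to disk
src = tempfile.NamedTemporaryFile(delete=False, suffix=".media")
src.write(b"pretend this is a two-hour movie" * 1000)
src.close()

# "copy from 1 location to the new location"
dst = src.name + ".copy"
shutil.copyfile(src.name, dst)

def sha256(path):
    """Hash a file's full contents."""
    with open(path, "rb") as f:
        return hashlib.sha256(f.read()).hexdigest()

# the digital duplicate is indistinguishable from the original
identical = sha256(src.name) == sha256(dst)
print(identical)  # True

os.remove(src.name)
os.remove(dst)
```

Contrast that with a VHS tape, where every copy of a copy degrades – a digital “copy of a copy of a copy” hashes the same as the original.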
For the “movie industry” in the short term the transition to “digital” helped lower distribution costs. Copies of films didn’t need to be created and shipped from theater to theater in “cans of film” – just copy the new movie to the digital projector’s hard drive and you are all set.
The combination of the “home computer” and “internet access” also deserve the “cultural shift” label – but it was really “more of the same” done “faster and cheaper.”
Streaming
It is trendy to blame “streaming” movies for the death of “theaters” — but hopefully by this point I’ve made the point that “streaming” is not the CAUSE of the decline of theaters. At best the “rise of streaming” and the “decline of theaters” are correlated – BUT (all together now)
Correlation never equals causality.
“Streaming” deserves credit for killing “Movie rental stores” — but the “theater experience” is still the “theater experience”
MY issue with “going to the theater” is that ticket prices have pretty much kept up with inflation. Which kinda means a generic “family of four” has to take out a small loan to “go to the movies.”
I’m placing the recent decline in theater attendance on “inflation” and “bad product.”
Yes, the “movie industry” has been churning out self-righteous garbage NOT “entertainment.”
BUT there is still a demand for “family friendly entertainment” — “Inside Out 2” setting box office records illustrates my point.
Old Theaters …
I like not having to wait in line – but also kinda miss the “old theater” feel. That 20 screen “mega plex” is nice but there is still room for renovated “old theaters” if they can be updated without losing their “charm.”
To be clear the “charm” of old theaters does NOT include “uncomfortable seats” and feet sticking to the floor. If someone tries to “rehab” a theater I’d spend most of the money on the bathrooms and comfortable seating.
Folks need to feel “safe” AND “comfortable” – then if the popcorn is a little stale it doesn’t matter …
“Lying” obviously requires a “lie” to build around – with the definition of “lie” (the third definition from Merriam-Webster: “to make an untrue statement with intent to deceive”) being the relevant point.
Note that INTENT is required. SO it is POSSIBLE for someone to “tell a story” that is not true, and not be “lying.”
“Telling tall tales” has probably been a kind of “sport” to rascals, rogues, and tramps as long as there have been “rascals, rogues, and tramps.” Maybe a form of good-natured “rite of passage” – e.g. think “wide-eyed novice” listening intently to “grizzled veteran” telling “stories” that get more and more “factually challenged.”
IF at SOME point the “grizzled veteran” passes a point where the “wide eyed novice” gets the joke – then everyone laughs. The “novice” isn’t as wide-eyed and is on their way to “veteran” status.
(of course if “wide-eyed novice” DOESN’T get the joke – then, well, that is a different problem)
“Campfire stories” take on a general form. SOMETIMES there is a kernel of truth – i.e. “legends” are born in the “additions” to the TRUE story. It is probably in those “additions” that we can track “cultural value changes.”
Art reflects …
Does life imitate art, or does art imitate life?
And the answer is, well, “yes.”
We can quickly get lost in definitions – e.g. what is “art?” How about if we agree that “art REFLECTS an IDEAL of life.” Art must be “created,” which requires a “creator” — i.e. the “art” reflects the character of the “artist”/creator.
Creativity is allowing oneself to make mistakes. Art is knowing which ones to keep.
Scott Adams
Since the “artist” does not exist in a cultural vacuum the “art” ends up reflecting the society in which the artist lives.
Plot and Story
The difference between “plot” and “story” is that “plot” requires causality.
e.g. “A” happens, then “B” happens, then “C” happens is a “story” but NOT a plot.
If “A” happens, then “B” happens BECAUSE of A, and then “C” happens because of “B” (or “A” or “A&B” – depending on just how complicated you wanna get) – THAT is “plot”
Someone “telling stories” will have a “plot” but there will be intentional “plot holes” testing the listener’s level of gullibility.
e.g. grizzled veteran: “There I was – just me and my horse, supplies running out, horse almost dead. Suddenly, I was attacked by a gang of 40 cut-throats that would kill me just for my boots.
I shot the nearest one in the leg, jumped on my horse and headed up the mountain. Now, those cut-throats were REALLY angry and were threatening to bury me up to my neck and leave me to die. SO I managed to find a small cave where they could only get at me 1 or 2 at a time – let my horse go and waited for them to find me. I was down to just 3 bullets and my knife.
Sure enough, they found me, and then …”
wide eyed novice: ” … and then?”
grizzled veteran: “well, I died of course” (laughter, insults, etc)
(and when that former wide-eyed novice has become “grizzled veteran” they will probably tell the same story to the next batch of wide-eyed novices …)
Stories …
If everyone involved KNOWS the story being told is just a “story” then the audience can willingly engage in “suspension of disbelief” and just enjoy the story.
The required amount of “disbelief” will obviously vary based on genres. The folks “performing” aren’t “intentionally” trying to deceive they are engaging in “storytelling.”
e.g. the audience at a performance of Hamlet doesn’t ACTUALLY believe that they are watching a “Prince of Denmark” wrestling with the fact that his Uncle may or may not have murdered the former King (Hamlet’s father). Hopefully, the audience puts aside “critical thinking” and plays along with the story.
Obviously the folks putting on the performance try their best to be convincing. The highest praise that can be given to a “working actor” MIGHT be that they are ALWAYS “convincing” no matter what role they are playing.
(fwiw: playing “Hamlet” is considered a test of an actor’s acting ability – this is probably why you see so many “famous movie stars” attempt the role. I have seen a LOT of versions of Hamlet – and most of them are “ok.”
If I’m watching “Hamlet” and I think “that is so and so TRYING to do Hamlet” – then that qualifies as an “ok performance” — but if I forget that it is “BIG NAME” playing Hamlet, then that is “VERY good” performance … and moving on)
Random thought: Strange Brew (1983) borrows plot elements from Hamlet – catching the “Hamlet” references elevated the movie from “cute buddy comedy” to “funny at multiple levels” – and yes, INTENTIONAL plot holes-a-plenty …
Star Wars plot holes …
I have been re-examining WHY I loved the original “Star Wars” trilogy. In part this is because of the “fan reaction” to the latest “Star Wars product.”
Apparently others have done this “re-examination” as well. One such re-examination was trying to point out “plot holes” in “Star Wars” (1977)
In particular they didn’t like the fact that if the “Empire” had blown up the “escape pod” at the beginning, the movie ends there – i.e. blow up the escape pod with R2-D2 and C-3PO and the story ends there.
BUT that is NOT a “plot hole” – yes, the movie turns on that point BUT it also helps establish that the “Empire” are the bad guys.
The scene could easily have been taken out – but it serves a “storytelling” purpose. The “Empire” is the “evil authoritarian organization” – notice that the anonymous characters WOULD have blown up the “escape pod” IF they had detected “life forms.” i.e. the anonymous character’s (lack of) action illustrates that “fate”/luck is gonna be part of the story.
“Fate” interferes throughout “Star Wars” – with the Stormtroopers’ marksmanship being another great example (e.g. they are extremely precise when shooting at “not major characters” but can’t hit anything important when a “major character” is involved)
Now, if the movie was trying to be “gritty and realistic” then “fate interfering” might constitute “plot hole.”
I also like to point out that R2-D2 in the “Star Wars universe” is an “agent of fate” or the “finger of the divine” — apparently immortal and all-knowing. Seriously, notice how many times R2 is instrumental in things “working out” for the heroes.
Sure, R2 gets “blown up” a lot – but always returns good as new. If “Star Wars” was hard core science fiction THAT would be a HUGE plot hole – but since it is a space fairy tale set in a galaxy far, far away, just part of the suspension of disbelief.
BUT if you want to talk about REAL plot-holes – I have always been (mildly) bothered by the fact that after the heroes escape the Death Star – and KNOW they are being tracked – that they (apparently) go straight to the Rebel Base.
By this point George Lucas has done a masterful job of storytelling – and the fact that the Empire easily tracks the heroes to the Rebel Base – setting up the climactic battle – is easily overlooked.
Ok, Leia tells Han they are being tracked – Han doesn’t believe her, but even if there is a slight possibility of them being tracked then they should logically have gone ANYWHERE else except the Rebel Base.
THEN when they are far away from danger AND the Rebel Base – they could have easily transferred the data as required. Or maybe find the tracking device – and send it ANYWHERE else than the Rebel Base.
“You’re going to need a bigger boat.”
Chief Brody
The “Battle of Yavin” is kind of like the oxygen tank exploding at the end of Jaws (1975). If the audience has to THINK about it, then it becomes a problem.
If we have been guided along properly then we are probably “all in” on that plot hole. The plot hole goes completely unnoticed and even gets cheered when told by “expert storyteller.”
I suppose “storytelling 101” always starts with some form of “show don’t tell” – if the “plot” requires 120 minutes of talking heads then you are telling a much different type of story than if you have “action”/pause/more action/short pause/etc.
none of this is a secret. Audience expectations about the ratio of “drama” to “relief” are determined by genre — if you are doing a “romantic comedy period piece” then long periods of “talking heads” are expected, BUT if you are doing a “space fairy tale” then keep the “talking heads delivering exposition” to a minimum …
it is the genre, silly …
I’m also fond of pointing out that there is plenty of room for different stories and genres – but trying to fit “agenda” into “genre” is almost always a recipe for commercial failure.
random thought: a famous “hamburger chain” started offering salads back in the late 1980’s. I think they were responding to “market demand” for “healthier” options. They are a world wide operation that regularly introduces new items to their menu – so offering salads wasn’t a “bad” idea
the funny thing was that those “hamburger chain salads” could be LESS healthy than the “regular menu” (with salad it is usually the “dressing” that becomes the problem – which has a lot of fat and calories …)
the same chain sells a “fish sandwich” – that is very popular but definitely NOT the “healthy option”
HOWEVER “hamburger chain” never lost sight of the fact that their core product is “meat and potatoes” – they make $$ selling hamburgers and fries
NOW imagine that the “hamburger chain” powers that be decide to turn the menu over to someone that HATES hamburgers and fries – or thinks that “salads” are why people go to “hamburger chain” – well, things aren’t going to go well
the “new menu maker” might blame the customers for NOT wanting to eat bad salads instead of hamburgers – but that is not gonna change the customers’ preference.
“New menu maker” will almost certainly get bombarded with criticism from lovers of “hamburger and fries” – and sales/profits will plummet.
Of course the folks that hired “new menu maker” will defend their decision – but that just means that THEY are (probably) the franchise’s (REAL) problem, not the “new menu maker” and certainly NOT the fans …
if you want another “movie franchise” example – compare and contrast the first “Matrix” (1999) with “Matrix Resurrections” (2021) – notice the difference in the ratio of “action” to “exposition” …
It bothers me a little when a “random comedian” comes out and describes their “theory of humor” as being “pain.”
Usually it is an “established” entertainer – and they present the idea that “all humor is based on pain” as being a form of received wisdom.
Obviously anytime the word “all” creeps into the discussion the chances of the statement in question being 100% correct is small.
Along the same path – someone recently tried to argue that “Star Wars” was “woke” from day 1 – and, well, my response is dotted line connected to the above …
Life
The idea of “stress” as a negative force in daily life has been around for years. Someone in a “big business marketing department” came up with a slogan about “reducing stress” as a way to sell soap/soup/something else – but “stress” is not inherently positive or negative.
The human body has a generic “stress response” but our perception of “stress” is relative. The “positive” form of stress (eustress) gets a lot less attention than the “negative” form of stress (distress).
“Become a possibilitarian. No matter how dark things seem to be or actually are, raise your sights and see possibilities — always see them, for they’re always there.”
Norman Vincent Peale
Obviously folks WANT eustress – but that tends to get marketed as “fun” or “happiness.”
It becomes a truism that the only thing we can truly “control” is our attitude towards “stress.” “Life” is gonna happen, all we can really control is how we choose to react.
Set the “way back machine” to 100 years ago and we would find this “life reaction” automatically influenced by “religion.” “People of the book” might have referenced the “wisdom books” (e.g. Job, Proverbs, Ecclesiastes) – all of which are worthy of study.
Job tells us that “Man that is born of a woman is of few days and full of trouble.” (Job 14:1) but also “Thou shalt call, and I will answer thee: thou wilt have a desire to the work of thine hands.” (Job 14:15) — which could be examples of reacting to “distress” and then “eustress”
.. and then of course this quote from Proverbs:
A merry heart doeth good like a medicine: but a broken spirit drieth the bones.
Proverbs 17:22
Humor
Today the “four noble truths” of Buddhism are on my mind – with the point being that “all humor is based on pain” sounds a lot like “life is suffering.”
It is more accurate to say that life is “stress” NOT “life is pain/suffering.”
I automatically reject the statement “ALL humor is based on pain” – because “ALL humor is based on ‘life’” – which is “stress” NOT “pain”
Pain and pleasure are also “relative” terms to a certain degree – both are “sensations” but perceiving them as feeling “pleasant” or “unpleasant” requires some context
If we divide the world between “Optimists” on one side and “Pessimists” on the other and charted the general population on that line – we would (probably) see a classic bell curve. Most people would be in the “middle” and very few would be on the extremes — BUT my guess is that most “comedians” are found in the “extremes” – either “optimist” or “pessimist.”
The point being that I understand WHY someone might say “all humor is based on pain” – not being a “pessimist” (or Buddhist) I simply disagree …
Humor has trouble translating between generations in part because we have to “identify” with the subject to appreciate the humor.
e.g. William Shakespeare has a lot of jokes in his plays – that audiences 400 years ago probably thought were hilarious – but they need to be translated for modern audiences. In the 21st century Charlie Chaplin’s movies are still “humorous” but not as funny as they were to early 20th century audiences.
Any “topical” humor ceases to be humorous when the “topic” is no longer “topical” e.g. Jackie Mason telling jokes about Ted Kennedy and Henry Kissinger – if you have no idea who Ted Kennedy and Henry Kissinger are, Mr Mason’s delivery is still humorous – but if you recognize the impersonation/truth in the joke it is much funnier
hmm, so maybe all humor is based on truth? The only characters routinely allowed to tell the “truth” in Mr Shakespeare’s plays are the “fools”/court jesters — or maybe Mel Brooks as stand up philosopher is the definitive example …
Star Wars
Any “long running” series is subject to the impact of nostalgia.
e.g. If you have a preference/opinion on which actor did “James Bond” (or Batman or Superman or Spider-Man) best – that opinion is influenced (positive or negative) by the actor/movies that were released when you were “maturing”
SO I was a little surprised when I started hearing folks say that they preferred the “Star Wars prequels” to the original trilogy.
I don’t dislike the “prequels” but think they are obviously not as good as the original trilogy – which may or may not be “true” BUT is 100% influenced by nostalgia on my part.
As I have aged – I am willing to admit that “The Empire Strikes Back”/Episode V is a “better movie” (plot, character development, fx) than “Star Wars” 1977/”A New Hope”/Episode IV – BUT I still prefer Episode IV
With MY bias fully disclosed – I REALLY didn’t like Episodes VIII and IX.
From a storytelling point of view the “middle chapter” tends to be the “strongest” part of most “trilogies” — but ALL three movies being “equally good” is rare
Notice that should be read as “intentional trilogy” – as in a story told in three parts, NOT just a collection of 3 movies starring the same character
e.g. of Episodes I – II – III – my preference goes III (best), II, I (least favorite)
“Star Wars”/Episode IV stands by itself – mostly because there was no guarantee that the movie would be popular enough to have “sequels” – BUT George Lucas had a general idea for three trilogies, which is why Episodes V and VI become 1 story …
I’ve heard some folks try to argue that Harrison Ford wasn’t happy and that his character’s fate at the end of “Empire” was a way for George Lucas to potentially “write him out of the story” — which is implausible at best.
No, Mr Ford didn’t want his career to be forever linked to “Star Wars” and avoided a lot of publicity — but he wasn’t “Harrison Ford film legend” in 1980 when Empire was released.
Mr Lucas was trying to recreate the old “serial movie” cliff-hanger feel with “Empire” – i.e. he knew there would be an “Episode VI” when making “Episode V.”
The Episode V ending was just an example of “expert storytelling” and “good business” at a time when “sequels” were common but tended to be “back for more cash” projects rather than “good storytelling.”
e.g. did anyone think that Marvel was actually cleaning up the MCU at the end of “Avengers: Infinity War?” No, there was ALWAYS going to be one more movie that would modify the cliff-hanger ending …
Meanwhile back at the ranch …
I liked Episode VII — in part because “Star Wars” was slapped on the side of the box – but it was entertaining, and “good enough.”
No, I didn’t “connect” with any of the new characters introduced – but this is where that generational shift comes into play. The “Disney sequels” made $billions but the “box office” decreased for both Episode VIII AND then Episode IX
(btw if you rank the Star Wars franchise movies by inflation-adjusted box office — Episode IV is a $billion ahead of the second place movie, Episode VII)
I REALLY wanted to like Episode VIII — but it is just tripe with “Star Wars” slapped on the side. My problem was not with the new characters – it was the ridiculous story full of plot holes. Same with Episode IX – though I went in expecting the movie to be terrible and only saw it in the theater out of a need to “see how they mess up the ending”
BUT was the original trilogy or the prequels “woke”? Were the Disney sequels “woke”?
What do you mean “woke”?
“Woke” tends to be used as a negative/insult by folks of one political persuasion and a badge of honor by another political persuasion.
To me “woke” and b.s. (NOT “bachelor of science”) are in the same category — i.e. b.s. isn’t concerned with “truth” so much as with convincing an audience that the spreader of b.s. believes something – e.g. the speaker wants the audience to believe that they (speaker and audience) share the same values, though the speaker never comes right out and says what they actually think/believe.
“Woke” is about pushing an “agenda” more than actually discussing ideas/concepts — with the implication being that EVERYONE must accept the “agenda” and of course you are wrong/stupid/evil if you don’t blindly accept the “agenda”
SO did episodes VIII and IX have an “agenda” — well, no. They were just terrible storytelling.
Notice that “strong female characters” does NOT equal “woke.” Even “strong female characters” combined with “man child idiot fool” male characters is NOT woke – just bad storytelling.
i.e. “Princess Leia” is obviously a strong leader – but she is archetype “mother”/”elder sister” in Episode IV – which is NOT “woke” by any definition
I like to point out that Luke’s journey from “innocence” to “experience” is reflected in his clothing – i.e. he is in “all white” (innocent/pure) in Episode IV – kind of “grey” in V, and then in all black in Episode VI (experienced/mature)
Mr Lucas famously had Carrie Fisher “taped up” to keep her from jiggling in Episode IV – so Leia’s arc is a “maturation”/awakening of a different kind than Luke’s — Leia goes from chaste/all in white/funny hair style in “A New Hope” to “slave girl uniform” in Jedi – and all of the bickering with Han was (probably) supposed to be “suppressed sexual tension” – like an old Howard Hawks movie
I could go on for another thousand words on what I think is “wrong” with Episodes VIII and IX — part of it is about what “leadership” ACTUALLY looks like (umm, which is NOT – go over there for no good reason, then turn around and come back, all while pretending that being a “strong leader” means NOT communicating the plan to subordinates — that isn’t “leadership” that is incompetence — but I digress)
The biggest flaw with the Disney Sequels is how they treated the core trio from the original trilogy — i.e. all that bickering wasn’t sexual tension, it was just bickering – and of course Luke sees his nephew have a bad dream and decides to run away and sulk — disappointing/bad storytelling? yes. “woke”? well, no.
The fact that ALL of the male characters are in “man child” mode waiting for “strong female to tell them what to do” might be an example of incompetent “story by committee” – but PROBABLY not “woke” (unless the agenda was “emasculation”)
ANYWAY
While I’m at it – I didn’t make it past the first couple episodes of the Disney+ series “Andor” (apparently “remove all the humor” and/or be dark and depressing == “adult story telling” for someone at Disney) and the “Obi Wan” mini series was another exercise in unwatchable tripe
True innovation is rare. Ecclesiastes 1:9 is several thousand years old and tells us that “The thing that hath been, it is that which shall be; and that which is done is that which shall be done: and there is no new thing under the sun.”
Of course when we aren’t talking about “big picture life” – innovation on a smaller scale happens every once in a while. Historians can argue about the number of truly “world changing innovations” – things like development of agriculture, domestication of animals, improvements in building materials, etc but that isn’t what I’m concerned with today.
Markets
I enjoyed The Outfit (2022) – which is nominally about a master English tailor who has ended up in a small shop in mid-1950’s Chicago (Mark Rylance’s character describes himself as a “cutter” – it is one of those rare “character driven” gangster movies, the movie has a “tense” energy and we get some “action” – I liked it).
Near the end of the movie a character complains about how her “organization” had been ignored until they started making some “real” money. Which is plot driven exposition as much as anything.
THEN I saw a promo for a new “streaming series” – where the main character makes the same complaint – something like “no one paid attention till we started making money, now everyone wants to take over.”
Ok, both of those examples are “plot driven” but it is important to recognize that the complaint both (fictional characters) are making is that they “innovated”, created a “new” market segment, and then when that market segment became increasingly popular – competitors entered the marketplace.
“Imitation is the sincerest form of flattery that mediocrity can pay to greatness”
– Oscar Wilde
This is the same concept found in the “innovation acceptance curve.” The “innovation acceptance curve” looks like the classic “normal distribution” bell curve – with “innovators” and “early adopters” on one side and “laggards” on the other – and “early” and “late” majority in the middle.
My point is that there is probably a similar “number of competitors” curve that mirrors the “innovation acceptance curve.”
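The familiar percentages attached to that curve come from standard-deviation cutoffs on a normal distribution – a minimal sketch (standard library only) shows where they come from:

```python
from math import erf, sqrt

def normal_cdf(z):
    """Cumulative probability of a standard normal up to z."""
    return 0.5 * (1 + erf(z / sqrt(2)))

# Rogers-style adopter categories, defined by standard-deviation
# cutoffs on a normal "time of adoption" curve:
cutoffs = {
    "innovators":     normal_cdf(-2),                   # left of -2 sd
    "early adopters": normal_cdf(-1) - normal_cdf(-2),  # -2 sd to -1 sd
    "early majority": normal_cdf(0) - normal_cdf(-1),   # -1 sd to the mean
    "late majority":  normal_cdf(1) - normal_cdf(0),    # mean to +1 sd
    "laggards":       1 - normal_cdf(1),                # right of +1 sd
}

for name, share in cutoffs.items():
    print(f"{name:>15}: {share:.1%}")
```

Running it prints roughly 2.3% innovators, 13.6% early adopters, 34.1% in each “majority,” and 15.9% laggards – the classic bell curve split.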
Cell Phones
Think “cell phones” – the first “cell phone” was invented in 1973. In the 1980’s cell phones were extremely rare – and if someone had one it probably looked like a World War 2 “walkie talkie.” By the end of the 1990s cell phones were common. The first iPhone was released in 2007 – which sparked another “innovation acceptance curve” for “smart phones.”
Look at the “cell phone market” – Nokia dominated the early stages of the acceptance curve – but back in the 1990s the “cell phone service providers” tended to “give away” the phone in exchange for the monthly service fee.
I’m sure there were a LOT of other companies making “cell phones” in those “early adopter”/”early Majority” days – and there were obviously other innovators (BlackBerry comes to mind).
If we set the “way back machine” to 2005 (2 years before the iPhone) and asked a random sampling of cell phone users if they would ever think of paying $500 for a cell phone – the response would have been an overwhelming “no,” simply because the average user only used their cell phone to take the occasional low resolution picture and make phone calls.
(fwiw: I used to leave my flip phone in the car 99% of the time – because that was where I would need it, and the battery retained a charge for weeks at a time)
Of course in 2005 folks might have also carried around a laptop, a “personal digital assistant”, and/or a dedicated MP3 player (the first Apple iPod was released in 2001 – but “portable personal music players” had been around for years).
The point here is that “innovation” is NOT always “market driven.” Successful innovation that results in “market disruption” is about providing something the “masses” didn’t realize they needed.
“If I had asked people what they wanted, they would have said faster horses.”
-Henry Ford
Legendary Apple founder Steve Jobs once said that he didn’t rely on “market research” when developing new products. I’m not questioning Mr. Jobs – but (my opinion) his “genius” was in seeing what people “needed” which was often different than what they “thought they wanted.”
Apple, Inc under Mr. Jobs was also known for making superior quality products that fell into the “elegant” category – i.e. achieving “product elegance” required a lot of “product testing” and development. SO Steve Jobs didn’t come back to Apple from the “wilderness” in 1997 and hand down from on high the “iMac”, and then the “iPod”, and finally the “iPhone” – but he did create the innovation environment that made them possible.
Market Leaders and Innovation
After Apple disrupted the cell phone market by introducing the “iPhone” – Google, Inc (which had acquired the Android operating system back in 2005) responded, and the HTC Dream became the first Android “smart phone” (September 2008)
In 2023 Android OS is the most popular operating system in the world with 70% of the market share. Apple iOS has 28% of the market.
From a “device” point of view – Samsung is the largest “Android” device manufacturer. Apple iOS is “closed source”/proprietary so obviously all “legal” iPhones are running iOS.
From a “profitability” point of view – Apple, Inc is making a good living off of selling iPhones for $1,000, and the “App Store” brings in $billions a year. So at the moment they are happily perched atop the “market profitability leader” stack – i.e. they don’t have the largest number of “devices” but they dominate the “top end” of the market and are far and away the most profitable.
i.e. you can buy a $50 Android smartphone and you can probably find a $100 iPhone, but it will be several “generations” old …
If you are curious about that other 2% of the mobile market (Android 70%, Apple 28%, other 2%?) – well, in 2023 I’m not sure –
Microsoft tried to have a “mobile” version of Windows for a long time. Microsoft announced “end of life” for Windows Mobile back in 2017, which means 2022 was when Microsoft support ended.
BlackBerry is also still around – so that 2% is mostly old Microsoft Mobile and BlackBerry devices.
The “modern business” cliche is that companies must “innovate or die” – but any “market” will tend to be irrational/unpredictable at a basic level because, well, “people” are involved.
“Innovation” for the sake of “innovation” is a bad idea – hey, if it ain’t broke, don’t perform radical surgery trying to “fix” it. “Intelligent innovation” with an eye on shifting market demands is always a good “long term” plan.
Just like our fictional “market creators” at the start of this article – Nokia was an innovator and dominated the early mobile industry, then the market got big and profitable and then what happened …
Well, Nokia is a case study for why “market share dominance” does not always equal “profitability” – but the answer to what “happened” to Nokia is that Microsoft acquired their phone business in 2013.
You can still buy a “Nokia” phone – they even have the classic “flip phone” – but the Finnish telecom company “Nokia” doesn’t make phones in 2023.
I’m not giving anything away by pointing out that the “old Nokia” employees blamed the “fall of Nokia” on the Microsoft acquisition – i.e. there is a LOT of “Microsoft as evil American corporation” bashing in the documentary – and for-what-it-is-worth they are probably right in their criticism of the contrasting corporate cultures.
BUT “Microsoft/Nokia” isn’t at the top of “worst mergers” of all time by any measure (hey, someone is gonna have to do something SPECTACULARLY stupid on a “Biblical” scale to be worse than AOL/Time Warner).
With 20/20 hindsight – “Nokia mobile” might be in exactly the same spot they are NOW if the Microsoft deal hadn’t happened – i.e. making mid-range Android phones. They certainly didn’t have the resources to compete with Apple and Google for users – so at some point they would (probably) have stopped trying to develop their own mobile OS and thrown in with Google/Android and be exactly where they are today.
Competition
Healthy competition drives intelligent innovation. At a “nation state” level this means that “protectionism” is usually a bad idea.
The “usually” qualifier sneaks in there because of “national security.” Outside of a “national security” concern the best thing for “politicians” to do in regards to “market competition” is “as close to nothing as possible.”
Yes, rules need to be enforced. Criminal activity should be dealt with as “criminal activity” NOT as an excuse for politicians to “wet their beak” meddling in market regulation. e.g. politicians are great at throwing money at bad ideas and extremely bad at encouraging actual “market innovation.”
(just in general the most cost effective thing the “gov’ment” can do is “have a contest” and then encourage the free market to solve the problem and win the contest)
Of course “cronyism” is ALWAYS bad at any level. The Venezuelan oil industry under Hugo Chavez becomes the cautionary tale of “cronyism” disguised as “nationalization.” e.g. no, a “centrally controlled economy” run by “human experts” won’t work on a national scale – and only the greedy and ignorant will try to tell the “masses” that you can get “something for nothing.”
Acres of Diamonds
Russell Conwell (February 15, 1843 – December 6, 1925) is remembered for giving a speech called “Acres of Diamonds” (feel free to read the lecture at your leisure)
One of the lessons that could be taken from Acres of Diamonds is that the best “market” for someone looking to “innovate” and “compete” is the market that they know best.
I say, beware of all enterprises that require new clothes, and not rather a new wearer of clothes.
– Henry David Thoreau
Just because someone else is doing something similar doesn’t mean that there isn’t room in the marketplace for your idea. e.g. Everyone told Dave Thomas that the United States didn’t need ANOTHER hamburger chain – but in 1969 he started “Wendy’s Old Fashioned Hamburgers” in Columbus, Ohio.
Mr Thomas had worked for the real Colonel Sanders and Kentucky Fried Chicken before starting Wendy’s – so he didn’t need “new clothes”, he understood fast food franchising and customer service. btw – Dave Thomas at Wendy’s deserves credit for perfecting the “pick up window” and the “salad bar” among other things.
When Jack Welch was running G.E. they encouraged suggestions/feedback from “ordinary” workers – the idea being that the person that knows how to do the job “better” is probably the person doing the job.
Yes, for every suggestion “introduced into production” G.E. probably had hundreds of “impractical” suggestions – but that is like saying that most rocks in the diamond mine are not diamonds; you don’t stop mining for diamonds because of the “not diamonds”
(any organization that encourages suggestions should also have a way of quickly evaluating those suggestions – I’d be happy to take a big consulting fee to figure out a way, but with modern I.T. there are a lot of easily implemented solutions).
Textbooks will throw out terms like “unique selling proposition” (USP) – which boils down to “just because other folks are doing it doesn’t mean your slightly different idea won’t work.”
Ideally your idea will do “something” different/better/cheaper — but the fact that a LOT of other folks are doing “whatever” just means that there is a DEMAND for “whatever.” i.e. if you think that you have a truly unique/innovative idea that no one else has thought of – you might be wrong.
It is POSSIBLE that your idea has been tried (and failed) OR that there simply isn’t a profitable market for “whatever.” This is where doing a “competitor analysis” becomes informative – if you can’t find ANY competitors then I’d be worried …
e.g. not surprisingly McDonald’s sells the most hamburgers in the United States but there are 91,989 other “hamburger restaurant businesses” in the U.S. and the number continues to grow.
I don’t know if I would suggest starting a “hamburger restaurant” if you have 20 years of completely unrelated experience – but this is where “franchising” tries to fill in the knowledge/experience gaps for prospective entrepreneurs.
Probably having a good location is just as important as having a recognizable brand – e.g. if I have been driving for 8 hours, I’m hungry, and I have to use the restroom, and “anonymous greasy spoon truck stop” is the only place in sight – it is gonna look REAL attractive …
Unlimited Demand
Usually when doing a competition assessment you will factor in the impact that changes in price will have on “market demand” as well as the cost of “switching.”
Specifics aren’t important – this is where the textbooks will talk about “elasticity” – but the core idea is that changes in price can have a large impact on “demand.”
i.e. say you are selling “product x” for $1, something happens, and you need to start charging $2 to stay in business. There are 3 possibilities – you could lose customers, you could retain the same number of customers, or you might gain customers (in rare situations).
Your customer reaction to the price change will probably revolve around the “cost” of switching. e.g. how much do competitors charge and how much trouble is it to switch to one of those competitors?
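The textbook “elasticity” calculation mentioned above can be sketched in a few lines – all the prices and customer counts below are made up purely for illustration:

```python
def price_elasticity(p0, p1, q0, q1):
    """Midpoint (arc) elasticity: % change in quantity demanded
    divided by % change in price, using midpoints as the base."""
    pct_q = (q1 - q0) / ((q0 + q1) / 2)
    pct_p = (p1 - p0) / ((p0 + p1) / 2)
    return pct_q / pct_p

# Hypothetical "product x" doubling from $1 to $2:
elastic = price_elasticity(1.00, 2.00, 1000, 400)    # 60% of customers leave
inelastic = price_elasticity(1.00, 2.00, 1000, 900)  # only 10% leave

# |elasticity| > 1 means demand is "elastic" (customers flee the price hike);
# |elasticity| < 1 means "inelastic" (switching is too much trouble).
print(f"elastic: {elastic:.2f}, inelastic: {inelastic:.2f}")
```

The “switching cost” intuition shows up directly in the numbers – the harder it is to switch, the closer the elasticity sits to zero.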
To make up a story – imagine “local gas station” increases their prices. Some folks won’t notice because it is inconvenient to go somewhere else, and some will rearrange their lives so that they never have to buy gas at that location again – and probably the only way a “local gas station” INCREASES customers is if traffic patterns change.
Of course if a competitor is charging 10¢ less and is just across the street – well, that competitor will have long lines and probably put the first station out of business. btw – the cost of gas at “big chain” stations tends to reflect local taxation just as much as the cost of the gasoline – but that is another subject.
BUT if the price of gas gets too high – folks will buy more gas efficient vehicles and cut down on their driving – so gasoline does not have UNLIMITED demand.
The number of items with “unlimited” demand are kind of small – “air” comes to mind, but even then “no longer breathing” is a drastic option, and when “basic necessities” become scarce the breakdown of civil society is gonna happen (riots/war/anarchy).
On a less apocalyptic level – “entertainment” tends to have unlimited demand and also zero switching costs. This is (probably) obvious – the challenge for “creators” becomes not just “making an entertaining video” but finding an audience.
A tiny audience could equal “profitability” – if production costs are controlled and enough “sponsors”/subscribers found. A large audience could equal “huge losses” – if production costs are high and “advertising”/subscribers are not “large enough.”
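That audience/cost math is simple enough to sketch – every dollar figure here is hypothetical, just to show the break-even logic:

```python
from math import ceil

def break_even_audience(production_cost, revenue_per_viewer):
    """Viewers/subscribers needed before revenue covers production cost."""
    return ceil(production_cost / revenue_per_viewer)

# Hypothetical small creator: $2,000/month in costs, $5/month subscribers
small = break_even_audience(2_000, 5)     # needs 400 subscribers
# Hypothetical big production: $500,000/month in costs, same $5 subscribers
big = break_even_audience(500_000, 5)     # needs 100,000 subscribers

print(small, big)
```

Same formula either way – which is why a “tiny audience” can be profitable and a “large audience” can still lose money.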
The same math applies to podcasts, broadcast/cable stations, and motion pictures. When “Superman Returns” was released in 2006 it had a $200 million budget. When it made almost $400 million worldwide it returned a profit, but not enough – e.g. a planned sequel was cancelled
At the same time “The Devil Wears Prada” was released with a $35 million budget. It would make $327 million in box office – AND be considered a huge success making back 8x its budget.
(umm, it isn’t important that I’ve seen one of those movies and it isn’t the one about the fashion industry – the point is one that Disney, Inc is relearning in 2023, i.e. heavily marketing a polished piece of tripe doesn’t make the tripe into a hamburger)
The good folks at Merriam-Webster give us 14 definitions for “time” as a noun, another 5 as a verb, and then 3 more as an adjective.
A quick peek at the etymology tells us that “time” came into the English language by way of Old English and (Old Norse) words for “tide.”
That “time” and “tide” are related shouldn’t surprise anyone — after all “time and tide wait for no man” is one of those “proverb” things. e.g. If you make your living next to/on a large body of water then the tide going out and coming in probably greatly influences your day to day activities as much as the sun rising and setting.
From an “exploration” point of view “precision time keeping” was essential for sailors because they could use it to determine their longitude. Not being a sailor or even mildly comfortable on a boat that doesn’t have a motor – I’m told you can use a sextant to determine your latitude using the moon and stars.
Obviously in 2023 GPS is used for most voyages. Some high up officials in the U.S. Navy pointed out that we should still teach “basic seamanship.”
I’ve had a career that revolves around “fixing” things because, well, things break — so teaching basic navigation without GPS sounds obvious. e.g. the U.S. Army initial entry training (“basic training”) used to spend a little bit of time teaching the POGs (“persons other than grunts”) how to read a map and use a compass.
“Way back when” I was trained as a medic – which used to mean nine weeks of “basic” and then another period of “AIT” (Advanced Individual Training) — all of which I seem to remember took 6 months in real time. In 2023 Google tells me that the “11B Infantry” training is “One Station Unit Training” lasting 22 weeks.
The “Distance” Problem
Before the “industrial revolution” in the 18th century gave us things like trains, and eventually planes, and automobiles – the fastest human beings could travel on land was on the back of a horse.
Which basically meant that the “average human being” would live and die within 20 miles of where they were born. Since MOST people were “subsistence farmers” they probably didn’t have a pressing need to travel exceptionally far.
Of course “ancient peoples” probably formed the first “cities” as equal parts “areas of mutual protection” AND “areas of commerce” — so the “local farmers market” today might be described as an example of the foundation of modern society – “people gotta eat” and “people like to socialize” …
Those ancient subsistence farmers no doubt figured out the cycles of the moon as well as the yearly seasons so they could optimize the output of their farms. Those folks not concerned with the tides still had to “plant” and “harvest” – so “time management” was a consideration even if precise time keeping wasn’t an issue.
Those Ancient Greeks even went so far as to create the idea of a “decisive battle” so they could decide conflicts and get back to their farms with minimal disruption (i.e. if you don’t plant, you can’t expect to harvest) – but that is another story.
The point being that “time” was a constant – how we “redeem the time” is up to the individual – but part of being human is dealing with the inevitability of “time passing.”
The relationship between “distance” (d), “speed” (s), and “time” (t) is probably still a “middle school” math exercise (d = s × t) which I won’t go into – but it is hard to overstate how much “fast and safe” high speed travel changed human society.
My favorite example is “transcontinental” travel in North America. Before the U.S. completed the first transcontinental railroad in 1869 the trip from “coast to coast” took about 6 months – e.g. you could probably take a train to Nebraska in a couple days, but then the trip from Nebraska to the west coast would take several months. Or you could sail around South America (Cape Horn) which would also take 6 months (it was probably safer but much more expensive).
btw: Canada’s “transcontinental railroad” was completed in 1885 – and is still in operation. Parts of what was the U.S. Transcontinental railroad are still around – but the rise of “automobiles” and the Interstate highway system made “interstate railway passenger travel” unprofitable.
AFTER the transcontinental railroads you could travel coast to coast in about a week. The original “intent” of the U.S. transcontinental railroad was that it would open up trade with Asia – i.e. goods shipped in from the “far east” could be shipped across the U.S. — the bigger impact ended up being allowing immigrants from Europe to settle “out west” – which is, again, another story.
It is safe to say that the “problem” of distance for “human travel” was solved by the industrial revolution. e.g. Google tells me I can DRIVE from southwestern Ohio to California in 2 days – although I could hop on a plane and travel from CVG to LAX in about 6 hours if I was pressed for time.
If I wanted to go to Chicago (298 miles from Cincinnati) the drive is about 6 hours – the plane trip would still take about 4 hours door to door, but (if I schedule far enough in advance, and with the cost of gas) probably cheaper than driving.
The point being that “Travelling around the world” in ANY amount of time USED to be an unthinkable adventure because of the distances involved and the lack of safe/speedy travel options – now it is about time management and deciding how comfortable you wanna be while you travel (and of course whether you want to be shot at when you get where you are going) — and THAT is another story …
Faster Than Light
Back when I was teaching the “Network+” class multiple times a term – the textbook we used would start out comparing/contrasting common “networking media.” The three “common” media covered were 1. coaxial – one relatively large copper cable, 2. unshielded twisted pair (UTP) – 8 smaller copper wires twisted together in pairs, and then 3. “fiberoptic” cable – thin “optical fiber” strands (“glass”).
SO I would lecture a couple hours on the costs/benefits/convenience of the three “media types” – spoiler alert: most “local area networks” are using some flavor of UTP because it still hits that sweet spot between cost/speed/convenience. The takeaway from that “intro to networking class” about “fiberoptic cabling” was that it was exceptionally fast, but more expensive, and harder to install than the other two.
The “exceptionally fast” part of fiberoptic cabling is because we are dealing with the speed of light. Yes, there are other factors in network “speed” but physics 101 tells us that it is not possible to go faster than the speed of light (which is 300,000 kilometers per second, or 186,000 miles per second)
(oh, and the “slow” part of most “computing systems/networks” is the human beings involved in the process – so UTP is just fine for 99% of the LAN implementations out there – but once again, that is another story)
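A quick back-of-the-envelope shows why the speed of light still matters for networks: even at light speed, distance adds measurable latency. The “roughly 2/3 of c” figure for light in glass fiber is a common approximation, not a spec value:

```python
C_KM_PER_S = 300_000  # speed of light in a vacuum, ~300,000 km/s

def propagation_delay_ms(distance_km, velocity_factor):
    """One-way signal delay in milliseconds; velocity_factor is the
    fraction of c the signal actually travels at in the medium."""
    return distance_km / (C_KM_PER_S * velocity_factor) * 1000

# Light in glass fiber travels at roughly 2/3 of c (refractive index ~1.5)
metro = propagation_delay_ms(100, 0.67)    # ~0.5 ms across a 100 km run
transcon = propagation_delay_ms(4000, 0.67)  # ~20 ms "coast to coast", one way

print(f"100 km: {metro:.2f} ms, 4000 km: {transcon:.1f} ms")
```

In other words, no amount of better cabling gets a coast-to-coast round trip much under ~40 ms – physics sets the floor, everything else (switches, routers, humans) just adds to it.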
I’m not a physicist but saying that the speed of “light” is the speed of energy without mass is accurate enough for today. The point being that unless you can “change the rules of the universe” as we understand them today – it is NOT possible to go faster than light (FTL).
There was a lot of optimism that “science” would solve the “interstellar distance” problem during the “space race” period of human history. But “interstellar distance” is mindbogglingly huge compared to terrestrial travel – AND we keep hitting that hard barrier of the speed of light.
Of course neither “subsistence farmers” OR “trained thinkers” 2,000 years ago comprehended the size of the earth in relation to the rest of the universe – “educated types” probably thought it was round, and might have had a good idea at the earth’s circumference – but travelling “around the world” would have been the stuff of fantasy.
Some well meaning folks were predicting “moon tourism” by the end of the 20th century – and I suppose the distance isn’t the problem with “moon tourism” so much as “outer space” being VERY non-conducive to human life (read that as “actively hostile” to human life).
Gene Roddenberry (probably) came up with the idea for “Star Trek” as a direct result of the “space race” mania of the 1960’s. Yes, “Star Trek” was conceived of as a “space western” so it was never a “hard” science fiction program – so the “Star Trek” universe tends to get a pass on the FTL issue.
After all humanity had created jet engines that allowed us to break the speed of sound, wouldn’t it be natural to assume that someone would come up with FTL engines? With that in mind “dilithium crystals” fueling warp drive engines that allow our adventurers to go multiples of the speed of light doesn’t sound that far-fetched.
Folks were using “Mach 2” to signify multiples of the speed of sound – why not use “Warp speed” for multiples of the speed of light?
It is easy to forget that “the original series” (TOS) was “cancelled” each year it was produced – after seasons 1 and 2 a fan letter writing campaign convinced the network folks to bring the show back. TOS was always best when it concentrated on the characters and stayed away from the “hard science” as much as possible.
BUT I’m not picking on “Star Trek” – just pointing out the physics …
Time Travel
Mr Einstein’s theory sometimes involves a “thought experiment” where we have two newborn babies (or feel free to think of newborn kittens/puppies/hamsters/whatever if the “baby” example gets in the way) AND we put one of the newborns on a “spaceship” and accelerate that ship “close to the speed of light” (we can’t actually go the speed of light – we are just getting as close as possible).
When our imaginary thought experiment ship returns – the newborn on the ship doesn’t appear to have aged but the newborn that stayed behind is now extremely old. This is the “twin paradox” and a lot of folks smarter than me have spent considerable time examining the question –
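The arithmetic behind the twin paradox comes down to the Lorentz factor from special relativity. A sketch of just that factor (not a full relativistic treatment – the real twin paradox also involves the acceleration/turnaround, which is what resolves the “paradox”):

```python
import math

# Lorentz time-dilation factor: gamma = 1 / sqrt(1 - (v/c)^2).
# One year of ship time corresponds to gamma years back home.
def dilation_factor(v_fraction_of_c):
    """Return gamma for a velocity given as a fraction of c (0 <= v < 1)."""
    return 1 / math.sqrt(1 - v_fraction_of_c ** 2)

# At 99.5% of the speed of light, roughly ten years pass at home
# for every year experienced on the ship.
print(round(dilation_factor(0.995), 1))  # → 10.0
```

Note that the formula blows up as v approaches c – which is the mathematical face of that “hard barrier” mentioned above.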
The point is that Mr Einstein’s theory does not allow for “travelling backwards” in time.
Again, “Star Trek” (TOS) became famous for slingshotting the Enterprise around the sun, and going faster than the speed of light (“light speed break-away factor“) to travel backwards in time.
Of course, if you have “suspended disbelief” and accepted that the warp drive engines can routinely achieve multiples of the speed of light – then the “Star Trek” writers are just engaged in good storytelling – and again, interesting characters and good stories have always been the best part of the “Star Trek” universe.
btw: the most plausible “time travel” in a TOS episode was “The City on the Edge of Forever” – that is the one with Joan Collins for casual fans (season 1 episode 28). It tends to be listed near the top of “best episode” lists for TOS.
I seem to remember someone asking Stephen Hawking about the possibility of time travel “way back when.” (btw: Mr Hawking was a Star Trek fan and has the distinction of being the only “celebrity guest star” to play themselves – TNG Season 6, episode 26 – Data on the holodeck playing poker with Albert Einstein, Isaac Newton, and Stephen Hawking) – as I remember it, Mr Hawking’s response was something along the lines of “if you could travel faster than the speed of light, then time travel might be possible”
Of course that is probably the same as him saying “… and it is also possible that monkeys might fly out of my butt …” – but you know, it is entertainment not “hard science.”
While I’m at it
The “time traveler” in H.G. Wells’ “The Time Machine” explains traveling in time as travelling in another “dimension” – since humanity had created machines to let us travel in the other dimensions (up, down, side to side – i.e. length, width, height) then travel through “time” would just require a new machine.
That “time travel device” just becomes an element of good storytelling – i.e. best practice is to tell what it does and NOT spend a lot of time explaining HOW it works.
Doctor Who and the TARDIS (“Time And Relative Dimensions In Space”) get a short explanation when required – and they added the ability to travel instantaneously through time AND space, probably both as storytelling device and as a nod to Mr Einstein’s “space-time” concept.
“The Planet of the Apes” (1968) used the basic “twin paradox” idea – but then “something happened” and rather than landing on a distant planet they end up back on earth.
In the 1970 sequel “Beneath the Planet of the Apes” the “rescue team” has followed the first group – and this time they say they were caught in a “tear in the fabric of space time” or something. Of course they conveniently land in the same general area as the first crew and everyone speaks English.
There were three more “Planet of the Apes” sequels – they travel back in time in “Escape from the Planet of the Apes” (1971) – I don’t think they bother to explain how they got back, but I haven’t been able to sit through “Escape from the Planet of the Apes” recently.
I think “Planet of the Apes” (2001) was a victim of a writer’s strike – it isn’t particularly re-watchable for any number of reasons – not least of which is that they jump through illogical hoops to have Mark Wahlberg end up back in the present with a monkey Lincoln memorial.
The Andy Serkis as Caesar “Planet of the Apes” trilogy doesn’t bother with the “time travel” trope – substituting an “engineered virus” that (unintentionally) kills most of humanity and makes the surviving humans less intelligent.
“The Final Countdown” (1980) has an aircraft carrier go back in time to 1941 just before the attack on Pearl Harbor. The movie revolves around the “can/should they change history by intercepting the Japanese attack on Pearl Harbor” question – you can watch it for free on Tubi.com if interested.
This time around the time travel is a “finger of God” sort of thing – as I remember it a mysterious storm just appears and the 1980’s era aircraft carrier ends up in 1941. I’ll just point out that it is “plausible” but won’t spoil the ending …
Fred Ward, who died in 2022, had a long career as a “character actor.” He tried to make the move from “grizzled nice guy co-star/sidekick” to “leading man” multiple times in the 1980’s. He appears destined to be remembered as Kevin Bacon’s co-star in “Tremors” (1990) – he was one of those “instantly recognizable face but you might not be able to recall his name” actors.
Mr Ward starred in several movies that qualify as “cult classics” (i.e. well made movies that didn’t find a mass audience at the time of release but continue to be popular years later). Mr Ward’s “time travel” movie was 1982’s “Timerider: The Adventure of Lyle Swann” – which isn’t available for streaming, but has a blu-ray release, which probably illustrates the “cult classic” concept better than anything.
As I remember it (I haven’t seen the movie in years) – Mr Ward is a dirt bike rider who accidentally gets sent back in time (1870s American West) by a “secret government experiment” of some kind which he accidentally stumbles into — the memorable part is that they manage to slip in a version of the classic time-travel “grandfather” paradox.
Normally the “grandfather” paradox is similar to “Back to the Future” – where the time traveler does something to keep their ancestors from meeting/reproducing/whatever. “Timerider” takes the other option – he ends up being his own great-great-grandfather. Enjoying the movie doesn’t revolve around that point, and it looks like the movie is still being sold on blu-ray in Italy and Spain, so …
The whole “time travel machine” trope got called out for its inherent silliness with “Bill and Ted’s Excellent Adventure” (1989) – the movie is funny on multiple levels, and it is safe to say it skewered the whole “travel in time and change events” movie genre — “Bill and Ted’s Bogus Journey” (1991) takes the joke even further but it suffers a little from “sequel-itis” …
I’ll finish with a nod toward “Land of the Lost” both the 1974-1977 kids tv show and the 2009 Will Ferrell movie – where they “slip through” rips in time or something.
I suppose the “science” behind the movie/series is similar to “Indiana Jones and the Dial of Destiny” where it is implied that there are “rips in time” or something that can be predicted and then travelled through.
Yes, I am ignoring the various “multiverse” shows out there – simply because they are just modern “deus ex machina” plots. Worth noting because they reflect humanity’s desire to be able to go back and “fix” the past, but they quickly wore out their novelty …
well, the obvious problem with the title is “how do you define ‘great’?”
of course everyone that has answered the question has been “correct” – “greatness” is determined by individual tastes. Consider that the credit for creating the “modern summer blockbuster” belongs to “Jaws” (1975) – which was the “greatest box office success” of all time until “Star Wars” (1977) – but if we did a survey of “movie critics” my guess is that neither movie would be in the top 10 if the question is “Name the greatest movie of all time”.
Box Office
Using “raw box office” as a measure of greatness has obvious problems. Most obvious is that “ticket prices” have increased greatly – e.g. in 1940 you could buy a movie ticket for $0.25 – a quarter of a dollar, in 2023 it is considerably more.
If you want to use “ticket sales” as a measure of “greatness” OTHER problems pop up. In this case “modern movies” expect to make MOST of their ticket sales in the first two weeks of release, will probably not be in wide theatrical release after four weeks, and will probably be available for “home consumption” (in the form of a digital download) within a few months of release.
Before the mid 1980’s “home consumption” of a “major movie” would have been to show it on network television. There were “annual events” for some traditional favorites – “The Wizard of Oz” (1939) was shown annually from 1959 to 1991, “The Ten Commandments” (1956) is still shown annually around “Easter” Time.
Once upon a time “Gone With the Wind” (1939) had been shown in the same theater for decades – so it is the hands down, never gonna be beat “ticket sales” champion movie of ALL TIME.
Awards
Remember that ANY “awards show” is inherently biased. The “Academy Awards” in particular are an “industry insider” group that – for the most part – gives out awards to other “industry insiders.”
SO I notice the Academy Awards when they come out – but I do not consider them a “measure of greatness.” I’m not saying the awards are “not important” – certainly they are important to the folks that get nominated and/or win. I’m just pointing out that the awards are “voted on” by some group and are NOT useful for comparative purposes – e.g. if “movie A” won an Oscar but “movie B” did not win any awards does it automatically mean that “movie A” is BETTER than “movie B”? Nope.
Categories
Is being “ground breaking” the measure of “greatness?” “Birth of a Nation” (1915) helped create the “cinematic vocabulary” we take for granted (but the ending is obviously ‘problematic’) – “Citizen Kane” (1941) also broke ground on “camera movement and special effects” (which is why the ‘movie critics’ tend to love Orson Welles in general and “Citizen Kane” in particular) – “Casablanca” (1942) is in a category all its own but I’ll hold it up as an example of “script greatness.”
to be fair (and for convenience) – there need to be multiple categories, “maybe greatest movie BEFORE ‘television’” (because the “studio movie” standards had to be raised when folks could get “basic entertainment” for free over the air – e.g. a lot of those “old movies” from the 30’s and 40’s feel like “television productions” in terms of length and content – e.g. “Frankenstein” (1931) and “Bride of Frankenstein” (1935) are around 1 hour each – watching them back to back tells a complete story)
then we need to have a “greatest movie under the ‘studio’ system” AND “production code” category – if you are thinking “production code? what is that?” – well, there was a time when ALL movies were “general admission” – the MPAA didn’t come up with the “rating” system until 1968, BEFORE 1968 the “Production Code” was a form of self-censorship that put restrictions on “language and behavior” (e.g. try finding a “major U.S. movie” from before 1968 with profanity or nudity – I always love to point out “The Dirty Dozen” (1967) as working very hard to not use profanity)
oh, and then there are the “not in English movies” – “Breathless” (1960) is a great movie (French crime drama). Akira Kurosawa’s work (Japanese director) had a HUGE influence on American cinema – e.g. even casual “western fans” have probably heard that “The Magnificent Seven” was based on Kurosawa’s “Seven Samurai”
Personal Bias
Since I was young and impressionable in the 1970’s the work of Steven Spielberg, George Lucas, and Francis Ford Coppola has a special place in the “nostalgia chest” – intellectually I can say that “Schindler’s List” (1993) is Mr Spielberg’s “greatest artistic achievement” while still saying I love “Jaws” and “Close Encounters of the Third Kind”.
The Godfather and The Godfather part II are great movies – but my personal favorite “Coppola” movie is “Apocalypse Now.”
As for Mr Lucas – “American Graffiti” (1973) is still a lot of fun to watch (and it foreshadows the “story telling” techniques used in the “Star Wars” franchise – at one level you can say that Mr Lucas was exploring the relationship between “man and machine” in both movies). “The Empire Strikes Back” is arguably a “better” movie than “Star Wars” or “Return of the Jedi”, but c’mon they didn’t even blow up a Death Star!
No discussion on “big budget blockbusters” would be complete without mentioning James Cameron – I was blown away by the 3D effects in “Avatar” (2009) and “Titanic” (1997) was so full of special effects that people don’t think of it as being full of “special effects” (e.g. no, they did not sail a full replica of the Titanic – they built a near-full-scale partial set and filled in the rest with “computer generated images” (CGI) – and that CGI work was part of why it was the “most expensive movie” of all time back in the 20th Century).
BUT my favorite “James Cameron” movies are “The Terminator” (1984) and “Aliens” (1986) – as always YMMV
There have been a couple documentaries about the 1975 blockbuster “Jaws” — which probably illustrates the long term impact of the original movie.
Any “major” movie made in the era of “DVD extras” is going to have an obligatory “making of” documentary – so the fact that “Jaws: The Inside Story” aired on A&E back in 2009 (and is available for free on Kanopy.com) says something. It was surprisingly entertaining – both as a “movie making” documentary and as “cultural history.”
This came to mind because the “Jaws movies” have been available on Tubi.com for the last couple months.
full disclosure: I was a little too young to see “Jaws” in the theater — the “edited for tv” version of “Jaws” was my first exposure to the movie, when the movie got a theatrical re-release and ABC aired it on network tv in 1979.
I probably saw the “un-edited” version of “Jaws” on HBO at some point – and I have a DVD of the original “Jaws.” All of which means I’ve seen “Jaws – 1975” a LOT. Nostalgia aside, it still holds up as an entertaining movie.
Yes, the mechanical shark is cringeworthy in 2022 – but the fact that the shark DIDN’T work as well as Spielberg et al wanted probably contributes to the continued “watchability” of the movie. i.e. Mr Spielberg had to use “storytelling” techniques to “imply” the shark – which ends up being much scarier than actually showing the shark.
i.e. what made the original “Jaws” a great movie had very little to do with the mechanical shark/”special effects.” The movie holds up as a case study on “visual storytelling.” Is it Steven Spielberg’s “best movie”? No. But it does showcase his style/technique.
At one point “Jaws” was the highest grossing movie in history. It gets credit for creating the “summer blockbuster” concept – i.e. I think it was supposed to be released as a “winter movie” – but got pushed to a summer release because of production problems.
Source material
The problem with the “Jaws” franchise was that it was never intended to be a multiple-movie franchise. The movie was based on Peter Benchley’s (hugely successful) 1974 novel (btw: Peter Benchley plays the “reporter on the beach” in “Jaws – 1975”).
I was too young to see “Jaws” in the theater, and probably couldn’t even read yet when the novel was spending 44 weeks on the bestseller lists.
“Movie novelizations” tended to be a given back in the 1970’s/80’s – but when the movie is “based on a novel” USUALLY the book is “better” than the movie. “Jaws” is one of the handful of “books made into movies” where the movie is better than the book (obviously just my opinion).
The basic plot is obviously the same – the two major differences are that (in the book) Hooper dies and the shark doesn’t explode.
Part of the legend of the movie is that “experts” told Mr. Spielberg that oxygen tanks don’t explode like that and that the audience wouldn’t believe the ending. Mr Spielberg replied (something like) “Give me the audience for 2 hours and they will stand up and cheer when the shark explodes” — and audiences did cheer at the exploding shark …
(btw: one of those “reality shows” tried to replicate the “exploding oxygen tank” and no, oxygen tanks do NOT explode like it does at the end of Jaws – so the experts were right, but so was Mr Spielberg …)
Sequels
It is estimated that “Jaws – 1975” sold 128 million tickets. Adjust for inflation and it is in the $billion movie club.
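The inflation-adjustment arithmetic behind that claim is simple enough to sketch. The 128 million figure comes from the estimate above; the 2020s average ticket price is my assumption for illustration, not an official number:

```python
# Sanity-check the "billion-dollar club" claim: multiply estimated
# ticket sales by a modern average ticket price.
JAWS_TICKETS_SOLD = 128_000_000   # estimate cited above
AVG_TICKET_PRICE_2020S = 10.50    # assumed average U.S. price, USD

adjusted_gross = JAWS_TICKETS_SOLD * AVG_TICKET_PRICE_2020S
print(f"${adjusted_gross / 1e9:.2f} billion")  # → $1.34 billion
```

Even with a conservative ticket price, the total lands comfortably over a billion dollars.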
SO of course there would be sequels.
Steven Spielberg very wisely stayed far away from all of the sequels. Again, the existential issue with MOST “sequels” is that they tend to just be attempts to get more money out of the popularity of the original – rather than telling their own story.
Yes, there are exceptions – but none of the Jaws sequels comes anywhere close to the quality of the original.
“Jaws 2” was released in summer 1978. Roy Scheider probably got a nice paycheck to reprise his starring role as Chief Martin Brody – Richard Dreyfuss stayed away (his character is supposed to be on a trip to Antarctica or something). Most of the supporting cast came back – so the movie tries very hard to “feel” like the original.
Again – I didn’t see “Jaws 2” in the theater. I remembered not liking the movie when I did see it on HBO – but I (probably) hadn’t seen it for 30 years when I re-watched it on Tubi the other day.
Well, the mechanical shark worked better in “Jaws 2” – but it doesn’t help the movie. Yes, the directing is questionable, the “teenagers” mostly unlikeable, and the plot contrived – but other than that …
How could “Jaws 2” have been better? Well, fewer screeching teenagers (or better directed teenagers). It felt like they had a contest to be in the movie – and that was how they selected most of the “teenagers.”
Then the plot makes the cardinal sin of trying to explain “why” another huge shark is attacking the same little beach community. Overly contrived.
If you want, you can find subtext in “Jaws – 1975.” i.e. the shark can symbolize “nature” or “fate” or maybe even “divine retribution” take your pick. Maybe it isn’t there – but that becomes the genius in the storytelling – i.e. don’t explain too much, let the audience interpret as they like
BUT if you have another huge shark, seemingly targeting the same community – well, then the plot quickly becomes overly contrived.
The shark death scene in “Jaws 2” just comes across as laughably stupid – but by that time I was just happy that the movie was over.
SO “Jaws 2” tried very hard – and it did exactly what a “back for more cash” sequel is supposed to do – i.e. it made money.
“Jaws 3” (officially “Jaws 3-D”) was released in summer 1983 and tried to capitalize on a brief resurgence of the “3-D” fad. This time the movie was a solid “B.” The only connection to the first two movies is the grown up Brody brothers – and the mechanical shark of course.
The plot for “Jaws 3” might feel familiar to audiences in 2022. Not being a “horror” movie aficionado, I’m not sure how much “prior” art was involved with the plot — i.e. the basic “theme park” disaster plot had probably become a staple for “horror” movies by 1983 (“Westworld” released in 1973 comes to mind).
Finally the third sequel came out in 1987 (“Jaws: The Revenge”) – I have not seen the movie. Wikipedia tells me that this movie ignores “Jaws 3” and becomes a direct sequel to “Jaws 2” (tagline: “This time it’s personal”).
The whole “big white shark is back for revenge against the Brody clan” plot is a deal breaker for me – e.g. when Michael Caine was asked if he had watched “Jaws 4” (which received terrible reviews) – his response was ‘No. But I’ve seen the house it bought for my mum. It’s fantastic!’
Thankfully, there isn’t likely to be another direct “Jaws” sequel (God willing).
Humans have probably told stories about “sea monsters” for as long as there have been humans living next to large bodies of water. From that perspective “Jaws” was not an “original story” (of course those are hard to find) but an updated version of very old stories – and of course “shark”/sea monster movies continue to be popular in 2022.
Mr Spielberg
Steven Spielberg was mostly an “unknown” director before “Jaws.” Under ordinary circumstances an “unknown” director would have jumped at the chance to be involved in the sequel to a “big hit movie.”
Mr Spielberg explained he stayed away from the “Jaws sequels” because making the original movie was a “nightmare” (again, multiple documentaries have been made).
“Jaws 2” PROBABLY would have been better if he had been involved – but his follow up was another classic — “Close Encounters of the Third Kind” (1977).
It is slightly interesting to speculate on what would have happened to Steven Spielberg’s career if “Jaws” had “flopped” at the box office. My guess is he would have gone back to directing television and would obviously have EVENTUALLY had another shot at directing “Hollywood movies.”
Speculative history aside – “Jaws” was nominated for “Best Picture” (but lost to “One Flew Over the Cuckoo’s Nest”) and won Oscars for Best Film Editing, Best Music (John Williams), and Best Sound.
The “Best Director” category in 1976 reads like a “Director Hall of Fame” list – Stanley Kubrick, Robert Altman, Sidney Lumet, Federico Fellini, and then Milos Forman won for directing “One Flew Over the Cuckoo’s Nest.” SO it is understandable why Mr Spielberg had to wait until 1978 to get his first “Best Director” nomination for “Close Encounters of the Third Kind” …
(btw: the source novel for “One Flew Over the Cuckoo’s Nest” is fantastic – I didn’t care for the movie PROBABLY because I read the book first … )
Best vs favorite
ANYWAY – I have a lot of Steven Spielberg movies in my “movie library” – what is probably his “best movie” (if you have to choose one – as in “artistic achievement”) is hands down “Schindler’s List” (1993), which won 7 Oscars – including “Best Director” for Mr Spielberg.
However, if I had to choose a “favorite” then it is hard to beat “Raiders of the Lost Ark” (but there is probably nostalgia involved) …
Full disclosure: “Star Wars” was released in 1977 – when I was 8ish years old. This post started as a “reply” to something else – and grew – so I apologize for the lack of real structure – kind of a work in progress …
I am still a “George Lucas” fan – no, I didn’t think episodes I, II, and III were as good as the original trilogy but I didn’t hate them either.
George Lucas obviously didn’t have all of the “backstory” for the “Jedi” training fully formed when he was making “Star Wars” back in the late 1970’s
in fact the “mystery” of the Jedi Knights was (probably) part of the visceral appeal of the original trilogy (Episodes IV, V, and VI – for those playing along)
As always when you start trying to explain the “how” and “why” behind successful “science fantasy” you run into the fact that these are all just made up stories and NOT an organized religion handed down by a supreme intelligence
if you want to start looking at “source material” for the “Jedi” – the first stop is obvious – i.e. they are “Jedi KNIGHTS” – which should obviously bring to mind the King Arthur legend et al
in the real world a “knight in training” started as a “Page” (age 7 to 13), then became a “Squire” (age 14 to 18-21), and then would become a “Knight”
of course the whole point of being a “Knight” was (probably) to be of service and get granted some land somewhere so they could get married and have little ones
since Mr Lucas was making it all up – he also made his Jedi “keepers of the faith” combining the idea of “protectors of the Republic” with “priestly celibacy” — then the whole “no attachments”/possessions thing comes straight from Buddhism
btw: all this is not criticism of George Lucas – in fact his genius (again in Episodes IV, V, VI) was in blending them together and telling an entertaining story without beating the audience over the head with minutiae
ANYWAY “back in the 20th century” describing something as the “Disney version” used to mean that it was “nuclear family friendly” — feel free to psychoanalyze Walt Disney if you want, i.e. he wasn’t handing down “truth from the mountain” either — yes, he had a concept of an “idealized” childhood that wasn’t real – but that was the point
just like “Jedi Knights” were George Lucas’ idealized “Knights Templar” – the real point is that they are IDEALIZED for a target audience of “10 year olds” – and when you start trying to explain too much the whole thing falls apart
e.g. the “Jedi training” as it has been expanded/over explained would much more likely create sociopaths than “wise warrior priests” — which for the record is my same reaction to Plato’s “Republic” – i.e. that the system described would much more likely create sociopaths that only care about themselves rather than “philosopher kings” capable of ruling with wisdom