Category: computers

  • geeks, nerds, and reboots

    memes

    Merriam-Webster tells me that the word “meme” dates all the way back to 1976 (coined by Richard Dawkins in The Selfish Gene) with the meaning of “unit of cultural transmission.”

    Apparently the “-eme” suffix “indicates a distinctive unit of language structure.” Dr Dawkins combined the Greek root “mim-” (meaning “mime” or “mimic”) with “-eme” to create “meme.”

    Then this interweb thing hit full stride and minimally edited images with captions and (maybe) humorous intent became “memes.”

    Humor is always subjective, and with brevity (still) being the soul of wit – most “memes” work on a form of “revelatory” humor. The humor comes in “discovering” the connection between the original image and the edited/altered image.

    We aren’t dealing in high level moral reasoning or intense logic – just “Picture A”, “Picture B”, brain makes connection – grin/groan and then move on. By definition “memes” are short/trivial.

    Which makes commenting on “memes” even more trivial. Recently on “social media platform” I made an off the cuff comment about a meme that amounted to (in my head) “A=B”, “B=C”, “A+B+C<D.”

    Now, I readily admit that my comment was not logical – but from a certain perspective “true” – if not directly provable from the supplied evidence. It was a trivial response meant to be humorous – not a “grand unifying theory of everything.”

    … and of course I (apparently) offended someone to the point that they commented on my comment – accusing me of “not understanding the meme.”

    Notice the “apparently” qualifier – it is POSSIBLE that they were trying to be funny. My first reaction was to explain my comment (because obviously no one would intentionally be mean or rude on the interweb – i.e. the commenter on my comment must have simply misunderstood my comment 😉 ) – BUT that would have been a fourth level of trivialness …

    HOWEVER the incident got me thinking …

    geeks and nerds

    Another doctor gets credit for creating the term “nerd” (Dr Seuss – If I Ran the Zoo, 1950). It didn’t take on the modern meaning implying “enthusiasm or expertise” about a subject until later years. The term “dork” came about in the 1960’s (… probably as a variation on “dick” – and quickly moving on …) – meaning “odd, socially awkward, unstylish person.”

    “Geek” as a carnival performer biting the heads off chickens/snakes, eating weird things, and generally “grossing out” the audience goes back to 1912. Combined with another term it becomes synonymous with “nerd” – e.g. “computer geek” and “computer nerd.”

    random thought: if you remember Harry Anderson – or have seen re-runs of Night Court – his standup act consisted of “magic” and some “carnival geek” bits; he didn’t bite the heads off any live animals, though (which would have gotten him in trouble – even in the 1980’s). Of course YouTube has some video that I won’t link to – search for “Harry Anderson geek” if curious …


    I think a nerd is a person who uses the telephone to talk to other people about telephones. And a computer nerd therefore is somebody who uses a computer in order to use a computer.

    Douglas Adams

    ANYWAY – to extend the Douglas Adams quote – a “nerd” might think that arguing about a meme is a good use of their time …

    Which brings up the difference between “geeks” and “nerds” – I have (occasionally) used Sheldon and Leonard from the Big Bang Theory to illustrate the difference – with Sheldon being the “geek” and Leonard the “nerd”. Both of their lives revolve around technology/intellectual pursuits but they “feel” differently about that fact – i.e. Sheldon embraces the concept and is happily eccentric (“geek”) while Leonard feels self-conscious and awkward (“nerd”).

    SO when I call myself a “computer geek” it is meant as a positive descriptive statement 😉 – yes, I am aware that the terms aren’t AS negative as they once were, I’m just pointing out that my life has ended up revolving around “computers” (using them/repairing them) and it doesn’t bother me …

    Though I suppose “not being able to use a computer” in 2022 is in the same category as “not able to ride a horse” or “can’t shoot a rifle” would have been a couple hundred years ago … in a time when being “adorkable” is an accepted concept – calling yourself a “geek” or “nerd” isn’t as bad as it used to be – umm, in any case when I say “geek” I’ve never bitten the head off anything (alive or dead). I did perfect biting into and tearing a piece off an aluminum can back in high school – but that is another story …

    reboots

    While I did NOT comment on the comment about my comment – I did use the criticism of my comment as an opportunity for self-examination.

    Background material: The meme in question revolved around “movie franchise reboots.” (again, trivial trivialness)

    In 2022 when we talk about “movie franchise reboots” the first thing that is required is a “movie franchise.”

    e.g. very obviously “Star Trek” got a JJ Abrams reboot. Those “Star Wars” movies were “sequels” not “reboots” but the less said about JJ Abrams and that franchise the better

    the big “super hero” franchises have also obviously been rebooted –

    • Batman in the 1990’s played itself out – then we got the “Batman reboot” trilogy directed by Christopher Nolan,
    • Superman in the 1970’s/80’s didn’t get a movie franchise reboot until after Christopher Reeve died
    • Spider-Man BECAME a movie franchise in the 2000’s, then got a reboot in 2012, and another in 2016/2017

    SO the issue becomes counting the reboots – i.e. Batman in the 1990’s (well, “Batman” was released in 1989) had a four movie run with three different actors as Batman. I’m not a fan of those movies – so I admit my negative bias – but they did get progressively worse …

    Oh, and if we are counting “reboots” do you count Batman (1966) with Adam West? Probably not – it exists as a completely separate entity – but if you want to count it I won’t argue – the relevant point is that just “changing actors” doesn’t equal a “reboot” – restarting/retelling the story from a set point makes a “reboot.”

    However, counting Superman “reboots” is just a matter of counting actor changes – e.g. Christopher Reeve made 4 Superman movies (which also got progressively worse) – “Superman Returns” (2006) isn’t a terrible movie – but it exists in its own little space because it stomped all over the Lois Lane/Superman relationship – then we have the Henry Cavill movies that were central to DC Comics’ attempt at a “cinematic universe.”

    We can also determine “reboots” by counting actors with Spider-Man. Of course the Spider-Man franchise very much illustrates that the purpose of the “movie industry” is to make money – not tell inspiring stories, raise awareness, or educate the masses – make money. If an actor becomes a liability – they can be replaced – it doesn’t matter if you set up another movie or not 😉

    There are other not so recent franchises – “Tarzan” was a franchise, maybe we are stretching to call Wyatt Earp a franchise, how about Sherlock Holmes?

    The Wyatt Earp/OK corral story is an example of a “recurring story/theme” that isn’t a franchise. Consider that “McDonald’s” is a franchise but “hamburger joint” is not …

    Then we have the James Bond franchise.

    The problem with the “Bond franchise” is that we have multiple “actor changes” and multiple “reboots.” i.e. Assuming we don’t count Peter Sellers’ 1967 “Casino Royale” there have been 6 “James Bond” actors. Each actor change wasn’t a “reboot” but just because they kept making sequels doesn’t mean they had continuity.

    The “Sean Connery” movies tell a longform story of sorts – with Blofeld as the leader of Spectre. The “James Bond” novels were very much products of the post WWII/Cold War environment – but the USSR was never directly the villain in any of the movies, the role of villain was usually Spectre in some form.

    The easy part: The “Daniel Craig” Bond movies were very obviously a reboot of the Blofeld/Spectre storyline.

    The problem is all of those movies between “On Her Majesty’s Secret Service” (1969) and “Casino Royale” (2006).

    “Diamonds are Forever” (1971) was intended to finish the story started in “On Her Majesty’s Secret Service” – i.e. Bond gets married (and retires?), then Blofeld kills Bond’s wife as they leave the wedding – then bad guys drive away – Bond holds his dead wife while saying “We have all the time in the world.” – roll credits.

    (fwiw: Except for the obviously depressing ending, “On Her Majesty’s Secret Service” is actually one of the better Bond movies)

    Then George Lazenby (who had replaced Sean Connery as Bond) asked for more money than the studio was willing to pay – and they brought back Sean Connery for a much more light hearted/cartoonish Bond in “Diamonds are Forever.” (did I mention the profit making motive?)

    Of course “Diamonds are Forever” starts out with Bond hunting down and killing Blofeld – but that is really the only reference we get to the previous movie – SO reboot? this particular movie maybe, maybe not – but it did signify a “formula change” if nothing else.

    Any attempt at “long form storytelling” was abandoned in favor of a much more “cartoony” James Bond. MOST of the “Roger Moore” Bond movies have a tongue-in-cheek feeling to them.

    The “70’s Bond movies” became progressively more cartoonish – relying more on gadgets, girls, violence than storytelling (e.g. two of the movies “The Spy Who Loved Me” and “Moonraker” are basically the same plot). There are a few references to Bond having been married but nothing that would be recognized as “character development” or continuity – it could be argued that each movie did a “soft reboot” to the time after “Diamonds are Forever”, but simply saying that the “continuity” was that there was no “continuity” is more accurate.

    Then we got the “80’s Bond” – “For Your Eyes Only” intentionally backed off the gadgets and promiscuity – Bond visits his wife’s grave and Blofeld makes a (comic) appearance in the “Bond intro action sequence” – so I would call this one a “soft reboot” but not a complete relaunch.

    The same goes for Timothy Dalton’s Bond movies – not a full blown restart, but a continuation of the “upgrading” process – still no memorable continuity between movies – (he only did two Bond movies).

    Pierce Brosnan as Bond in “GoldenEye” (1995) qualifies as another actor change and “soft reboot” – Bond is promiscuous and self-destructive but it is supposed to be a reaction to his job, not because being promiscuous and self-destructive is cool – but we were back to the tongue-in-cheek, gadget-fueled Bond (two words: “invisible car”).

    The Daniel Craig Bond movies certainly fit ANY definition of a reboot. “No Time to Die” (2021) was the last Bond movie for Mr Craig – but what direction the “franchise” is going is all just speculation at the moment …

    ANYWAY – comparing 27 Bond movies over 58ish years to the modern “Super hero” reboots – was the gist of my trivial answer to a trivial meme (which only took 1,700+ words to explain 😉 )

  • radio.iterudio.com – now radio.clancameron.us

    … this was kind of a “web design” exercise/proof of concept

    functionally this is a “web radio” front end – there is an “on/off” button that toggles playing the stream
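
    (for the technically curious – a minimal sketch of how this kind of on/off toggle can be wired up in the browser; the stream URL, element IDs, and “now playing” endpoint below are placeholders, NOT the ones the site actually uses)

    ```typescript
    // Minimal web-radio toggle sketch (browser TypeScript).
    // STREAM_URL, the element IDs, and the "now playing" URL are made-up placeholders.
    const STREAM_URL = "https://example.com/stream.mp3";

    const audio = new Audio(STREAM_URL);
    const toggle = document.getElementById("onoff") as HTMLButtonElement;
    const nowPlaying = document.getElementById("now-playing") as HTMLElement;

    let playing = false;

    toggle.addEventListener("click", async () => {
      if (playing) {
        audio.pause();              // stop the stream
        toggle.textContent = "on";
      } else {
        await audio.play();         // browsers only allow this after a user gesture
        toggle.textContent = "off";
      }
      playing = !playing;
    });

    // "current song" info would be polled from whatever the stream server
    // exposes (hypothetical endpoint here) and written into the page:
    async function updateNowPlaying(): Promise<void> {
      const res = await fetch("https://example.com/nowplaying.json");
      const info = await res.json();
      nowPlaying.textContent = info.title ?? "unknown";
    }
    setInterval(updateNowPlaying, 15000);
    ```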

    I’m told that the Safari web browser might have issues playing the stream – so this is kind of a “beta test” request

    the music is all “public domain” – the pictures of paintings are examples of the “Hudson River School” movement from the mid-19th century – i.e. they aren’t related to the music, just some files I had available

    I’m always quick to point out that I am NOT a graphic designer – e.g. as I was cobbling the front end together it occurred to me that having the “on/off” toggle at the top (and bigger) might be a better option

    SO if you could take a look, honest feedback would be appreciated – MOSTLY I’m curious if it works – the pictures should rotate through 10 images, “current song” info should update, there should be music, is the on/off toggle obvious enough …

    https://radio.clancameron.us/

  • buzzword bingo, pedantic-ism, and the internet

    just ranting

    One of the identifying characteristics of “expert knowledge” is understanding how everything “fits” together. True “mastery” of any field with a substantial “body of knowledge” takes time and effort. Which means there are always more people that “know enough to be dangerous” than there are “real experts.”

    Which is really just recognition of the human condition – i.e. if we had unlimited time and energy then there would be a lot more “true experts” in every field.

    There is a diminishing return on “additional knowledge” after a certain point. e.g. Does anyone really need to understand how IBM designed Token Ring networks? Well, it might be useful for historical reasons and theoretical discussion – but I’ll go out on a limb and say that if you are studying “networking” today, becoming an expert on Token Ring is not worth the time or effort.

    There are also a lot of subjects where a slightly “incorrect” understanding is part of the learning process. e.g. Remember that high school chemistry class where you learned about electrons orbiting the nucleus at various discrete “energy levels” like tiny moons orbiting a planet? Then remember that college chemistry class where they told you that isn’t the way it actually is – but don’t worry about it, everyone learns it that way.

    (random thought – just because we can’t be sure where something is, doesn’t mean it can be in two spots at the same time – just like that cat in a box – it isn’t half alive and half dead, it is one or the other, we just can’t know which one – and moving on …)

    buzzwords vs jargon vs actual understanding

    Dilbert’s “pointy haired boss” is routinely held up for ridicule for “buzzword spouting” – which – in the most negative sense of the concept – implies that the person using “buzzwords” about “subject” has a very minimal understanding of the “subject.”

    Of course the “Dilbert principle” was/is that the competent people in a company are too valuable at their current job – and so cannot be promoted to “management”. Which implies that all managers are incompetent by default/design. It was a joke. It is funny. The reality is that “management” is a different skillset – but the joke is still funny 😉

    The next step up are the folks that can use the industry “jargon” correctly. Which simply illustrates that “education” is a process. In “ordinary speech” we all recognize and understand more words than we actively use – the same concept applies to acquiring and using the specific vocabulary/”jargon” of a new field of study (whatever that field happens to be).

    However if you stay at the “jargon speaking” level you have not achieved the goal of “actual understanding” and “applied knowledge.” Yes, a lot of real research has gone into describing different “levels”/stages in the process – which isn’t particularly useful. The concept that there ARE stages is much more important than the definition of specific points in the process.

    pedants

    No one wants a teacher/instructor that is a “pedant” – you know, that teacher that knows a LOT about a subject and thinks that it is their job to display just how much they know – imagine the high school teacher that insists on correcting EVERYONE’S grammar ALL THE TIME.

    There is an old joke that claims that the answer to EVERY accounting question is “it depends.” I’m fond of applying that concept to any field where “expert knowledge” is possible – i.e. the answer to EVERY question is “it depends.”

    (… oh, and pedants will talk endlessly about how much they know – but tend to have problems applying that knowledge in the real world. Being “pedantic” is boring/bad/counter productive – and ’nuff said)

    Of course if you are the expert being asked the question, what you get paid for is understanding the factors that it “depends on.” If you actually understand the factors AND can explain it to someone that isn’t an expert – then you are a rara avis.

    In “I.T.” you usually have three choices – e.g. “fast”, “cheap” (as in “low cost”/inexpensive), “good” (as in durable/well built/“if it is heavy, it is expensive”) – but you only get to choose two. e.g. “fast and cheap” isn’t going to be “good”, and “fast and good” isn’t going to be “inexpensive.”

    Is “Cheap and good” possible? – well, in I.T. that probably implies using open source technologies and taking the time to train developers on the system – so an understanding of “total cost of ownership” probably shoots down a lot of “cheap and good” proposals – but it might be the only option if the budget is “we have no budget” – i.e. the proposal might APPEAR “low cost” when the cost is just being pushed onto another area — but that isn’t important at the moment.

    internet, aye?

    There is an episode of the Simpsons where Homer starts a “dot com” company called Compu-Global-Hyper-Mega-Net – in classic Simpsons fashion they catch the cultural zeitgeist – I’ll have to re-watch the episode later – the point for mentioning it is that Homer obviously knew nothing about “technology” in general.

    Homer’s “business plan” was something like saying “aye” after every word he didn’t understand – which made him appear like he knew what he was talking about (at the end of the episode Bill Gates “buys him out” even though he isn’t sure what the company does – 1998 was when Microsoft was in full “antitrust defense by means of raised middle finger” – so, yes it was funny)

    (random thought: Microsoft is facing the same sort of accusations with their “OneDrive” product as they did with “Internet Explorer” – there are some important differences – but my guess is THIS lawsuit gets settled out of court 😉 )

    ANYWAY – anytime a new technology comes along, things need to settle down before you can really get past the “buzzword” phase. (“buzzword, aye?”) – so, while trying not to be pedantic, an overview of the weather on the internet in 2021 …

    virtualization/cloud/fog/edge/IoT

    Some (hopefully painless) definitions:

    first – what is the “internet” – the Merriam-Webster definition is nice, slightly more accurate might be to say that the internet is the “Merriam-Webster def” plus “that speaks TCP/IP.” i.e. the underlying “language” of the internet is something called TCP/IP

    This collection of worldwide TCP/IP connected networks is “the internet” – think of this network as “roads”

    Now “the internet” has been around for a while – but it didn’t become easy to use until Tim Berners-Lee came up with the idea for a “world wide web” circa 1989.

    While rapidly approaching pedantic levels – this means there is a difference between the “internet” and the “world wide web.” If the internet is the roads, then the web is traffic on those roads.
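
    To make the “roads vs traffic” distinction concrete – a web page request is really just agreed-upon text pushed over a plain TCP connection. A rough sketch (Node.js-flavored TypeScript; example.com is just a stand-in):

    ```typescript
    // The "road": a raw TCP connection. The "traffic": an HTTP request,
    // which is just text both ends agree on. (Uses Node's built-in 'net' module.)
    import * as net from "net";

    const socket = net.createConnection({ host: "example.com", port: 80 }, () => {
      // Speak HTTP (one kind of "web traffic") over the TCP "road".
      socket.write("GET / HTTP/1.1\r\nHost: example.com\r\nConnection: close\r\n\r\n");
    });

    socket.on("data", (chunk) => process.stdout.write(chunk.toString()));
    socket.on("end", () => console.log("\n-- connection closed --"));
    ```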

    It is “true” to say that the underlying internet hasn’t really changed since the 1980’s – but maybe a little misleading.

    Saying that we have the “same internet” today is a little like saying we have the same road system today as we did when Henry Ford introduced the Model T. A lot of $$ has gone into upgrading the “internet” infrastructure since the 1980’s – just like countless $$ have gone into building “infrastructure” for modern automobiles …

    Picking up speed – Marc Andreessen gets credit for writing the first “modern” web browser in the early 1990s. Which kinda makes “web browsers” the “vehicles” running on the “web”

    Britannica via Google tells me that the first use of the term “cyberspace” goes back to 1982 – for convenience we will refer to the “internet/www/browser” as “cyberspace” – I’m not a fan of the term, but it is convenient.

    Now imagine that you had a wonderful idea for a service existing in “cyberspace” – back in the mid-1990’s maybe that was like Americans heading west in the mid 19th century. If you wanted to go west in 1850, there were people already there, but you would probably have to clear off land and build your own house, provide basic needs for yourself etc.

    The cyberspace equivalent in 1995 was that you had to buy your own computers and connect them to the internet. This was the time when sites like “Yahoo!” and/or “eBay” kind of ruled cyberspace. You can probably find a lot of stories of teenagers starting websites – that attracted a lot of traffic, and then sold them off for big $$ without too much effort. The point being that there weren’t a lot of barriers/rules on the web – but you had to do it yourself.

    e.g. A couple of nice young men (both named “Sean”) met in a thing called “IRC” and started a little file sharing project called Napster in 1999 – which is a great story, but also illustrates that there is “other traffic” on the internet besides the “web” (i.e. Napster connected users with each other – they didn’t actually host files for sharing)

    Napster did some cool stuff on the technical side – but had a business model that was functionally based on copyright infringement at some level (no they were not evil masterminds – they were young men that liked music and computers).

    ANYWAY – the point being that the Napster guys had to buy computers/configure the computers/and connect them to the internet …

    Startup stories aside – the next big leap forward was a concept called “virtualization”. The short version is that hardware processing power grew much faster than “software processing” required – SO 1 physical machine would be extremely underutilized and inefficient – then “cool tech advancements” happened and we could “host” multiple “servers” on 1 physical machine.

    Extending the “journey west” analogy – virtualization allowed for “multi-tenant occupation” – at this point the roads were safe to travel/dependable/you didn’t HAVE to do everything yourself. When you got to your destination you could stay at the local bed and breakfast while you looked for a permanent place to stay (or moved on).

    … The story so far: we went from slow connections between big time-sharing computers in the 1970’s to fast connections between small personal computers in the 1990’s to “you need a computer to get on the web” and the “web infrastructure” consists mostly of virtualized machines in the early 2000s …

    Google happened in there somewhere, which was a huge leap forward in real access to information on the web – another great story, just not important for my story today 😉

    they were an online bookstore once …

    Next stop 2006. Jeff Bezos and Amazon.com are (probably) one of the greatest business success stories in recorded history. They had a LONG time where they emphasized “growth” over profit – e.g. when you see comic strips from the 1990’s about folks investing in “new economy” companies that had never earned a profit, Amazon is the success story.

    (fwiw: of course there were also a LOT of companies that found out that the “new economy” still requires you to make a profit at some point – the dot.com boom and bust/”bubble” has been the subject of many books – so moving on …)

    Of course in the mid-2000’s Amazon was still primarily a “retail shopping site.” The problem facing ANY “retail” establishment is meeting customer service/sales with employee staffing/scheduling.

    If you happen to be a “shopping website” then your way of dealing with “increased customer traffic” is to implement fault tolerance and load balancing techniques – the goal is “fast customer transactions” which equals “available computing resources” but could also mean “inefficient/expensive.”

    Real world restaurant example: I’m told that the best estimate for how busy any restaurant will be on any given day is to look at how busy they were last year on the same date (adjusting for weekends and holidays). SO if a restaurant expects to be very busy on certain days – they can schedule more staff for those days. If they don’t expect to be busy, then they will schedule fewer employees.

    Makes sense? Cool. The point is that Amazon had the same problem – they had the data on “expected customer volume” and had gone about the process of coming up with a system that would allow for automatic adjustment of computing resources based on variable workloads.

    I imagine the original goal might have been to save money by optimizing the workloads – but then someone pointed out that if they designed it correctly then they could “rent out” the service to other companies/individuals.

    Back to our “westward expansion” analogy – maybe this would be the creation of the first “hotel chains.” The real story of “big hotel chains” probably follows along with the westward expansion of the railroad – i.e. the railroads needed depots, and those depots became natural “access” points for travelers – so towns grew up around the depots and inns/”hotels” developed as part of the town – all of which is speculation on my part – but you get the idea

    The point being that in 2006 the “cloud” came into being. To be clear the “cloud” isn’t just renting out a virtual machine in someone else’s data center – the distinct part of “cloud services” is the idea of “variable costs for variable workloads.”

    Think of the electrical grid – if you use more electricity then you pay for what you use, if you use less electricity then your electrical expenses go down.

    The “cloud” is the same idea – if you need more resources because you are hosting an eSports tournament – then you can use more resources – build out/up – and then when the tournament is over scale back down.
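
    That “pay for what you use” idea is usually automated – here is a toy sketch of the kind of scale-up/scale-down decision a cloud autoscaler makes. The thresholds and names are invented for illustration, not any real provider’s API:

    ```typescript
    // Toy autoscaling decision – made-up thresholds; the point is
    // "more load, rent more machines; less load, give them back (and stop paying)."
    interface LoadSample {
      requestsPerSecond: number;
      cpuPercent: number;
    }

    function desiredInstances(current: number, load: LoadSample): number {
      if (load.cpuPercent > 75 || load.requestsPerSecond > 1000 * current) {
        return current + 1;   // tournament night: scale out
      }
      if (load.cpuPercent < 25 && current > 1) {
        return current - 1;   // quiet night: scale back in
      }
      return current;         // steady state
    }

    // Billing follows the same curve: you pay per instance-hour actually used.
    console.log(desiredInstances(2, { requestsPerSecond: 2500, cpuPercent: 80 })); // 3
    console.log(desiredInstances(3, { requestsPerSecond: 100, cpuPercent: 10 }));  // 2
    ```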

    Or if you are researching ‘whatever’ and need to “process” a lot of data – before the cloud you might have had to invest in building your own “super computer” which would run for a couple weeks and then be looking for something to do. Now you can utilize one of the “public cloud” offerings and get your data ‘processed’ at a much lower cost (and probably faster – so again, you are getting “fast” and “inexpensive” but you are using “virtual”/on demand/cloud resources).

    If you are interested in the space exploration business – an example from NASA –

    Fog/Edge/IoT?

    The next problem becomes efficiently collecting data while also controlling cost. Remember with the “cloud” you pay for what you use. Saying that you have “options” for your public/private cloud infrastructure is an understatement.

    However, we are back to the old “it depends” answer when we get into concepts like “Fog computing” and the “Internet of things”

    What is the “Internet of Things”? Well, NIST has an opinion – if you read the definition and say “that is nice but a little vague” – well, what is the IoT? It depends on what you are trying to do.

    The problem is that the how of “data collection” is obviously dependent on the data being collected. So the term becomes so broad that it is essentially meaningless.

    Maybe “Fog” computing is doing fast and cheap processing of small amounts of data captured by IoT devices – as opposed to having the data go all the way out to “the cloud” – we are probably talking about “computing on a stick” type devices that plug into the LAN.
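
    A hedged sketch of that “fog” idea – summarize/filter readings locally and only ship the interesting bits out to the cloud. The endpoint and threshold below are made up for illustration:

    ```typescript
    // "Fog" sketch: a small box on the LAN summarizes raw sensor readings and
    // only forwards anomalies to the cloud. Endpoint/threshold are hypothetical.
    interface Reading {
      sensorId: string;
      tempC: number;
      takenAt: Date;
    }

    const CLOUD_ENDPOINT = "https://example.com/ingest"; // placeholder
    const ALERT_THRESHOLD_C = 60;

    async function handleLocally(batch: Reading[]): Promise<void> {
      if (batch.length === 0) return;

      const avg = batch.reduce((sum, r) => sum + r.tempC, 0) / batch.length;
      const anomalies = batch.filter((r) => r.tempC > ALERT_THRESHOLD_C);

      // Cheap local decision: most readings never leave the building.
      if (anomalies.length === 0) {
        console.log(`all normal, avg ${avg.toFixed(1)}C – nothing sent`);
        return;
      }

      // Only the interesting (and therefore billable) data goes to the cloud.
      await fetch(CLOUD_ENDPOINT, {
        method: "POST",
        headers: { "Content-Type": "application/json" },
        body: JSON.stringify({ avg, anomalies }),
      });
    }
    ```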

    Meanwhile “Edge computing” is one for the salespeople – e.g. it is some combination of cloud/fog/IoT – at this point it reminds me of the “Corinthian Leather” Ricardo Montalban was talking about in car commercials way back when 😉

    Ok, I’m done – I feel better

    SO if you are teaching an online class of substantial length – an entire class only about IoT might be a little pointless. You can talk about various data collecting sensors and chips/whatever – but simply “collecting” data isn’t the point, you need to DO SOMETHING with the data afterwards.

    Of course I can always be wrong – my REAL point is that IoT is a buzzword that gets misused on a regular basis. If we are veering off into marketing and you want to call the class “IoT electric boogaloo” because it increases enrollment – and then talk about the entire cloud/fog/IoT framework – that would probably be worthwhile.

    it only took 2400+ words to get that out of my system 😉

  • What makes a game a “game”?

    Movie
    “Free Guy” was “cute” and fun. First thought: they are examining a very old question. Maybe at the root of the movie is the old “unexamined life is not worth living” thing.

    Games
    ANYWAY – the movie deals with ‘gaming’ in general so the “secondary thought” becomes just what makes something a “game?”

    Merriam Webster tells me that the word “game” dates back to the 12th century with roots (eventually) in the Old Norse “gamen” which meant “sport, amusement.”

    So there is that feeling of a “game” being both a “contest/competition” but also having a sense of “joy/fun/entertainment.”

    It might sound obvious but IDEALLY “games” should be “fun” for all of the people involved. If one side is “having fun” and the other side isn’t – then (arguably) they aren’t “playing a game” but engaging in some other activity.

    Competition AND Fun
    I’ll point out that the #1 reason young athletes stop participating in “sports” (in general) is that they aren’t having “fun.”

    The same idea probably applies to “games” in general – i.e. if you aren’t having fun, you will probably stop playing.

    Which is why we see online games constantly releasing “new content” to keep players interested. However, if the game is no longer “fun” participation will dwindle.

    Maybe a “game” has to be “competitive” and “fun.” There is a lot of wiggle room in calling something “competitive” – e.g. the game has to be “challenging,” as in not too hard but also not too easy.

    It is a common “game designer” tactic to make the “lower levels” a tutorial on how to play the game. Then as players master those skills, the level of difficulty rises. In essence EVERY game is a “learning experience” – but usually what you are learning is specific to the game.

    Final Answer
    It is PROBABLY accurate to say that “play” is an indicator of intelligence – i.e. the animals that engage in “playful activity” are illustrating the ability to learn and master activities.

    With humans the types of games a person plays PROBABLY tells you something about that person. But that sounds like a two drink discussion for another time …

    SO what makes a “game” a “game”? A combination of competition (remember it is possible to “compete” against yourself), fun, and the potential for “mastery.”

    If one of those three elements is missing – you are probably engaged in “non game” activity.

    Also important to point out is that what is “fun and challenging” for one person may be “boring busywork” for someone else. As I mentioned above – the games we choose to play say something about us as individuals.

    This was something of a plot hole in “Free Guy” – and is what motivated this post. The movie was entertaining – but “playing a game” implies interaction at some level.

    ok, no spoilers BUT if all someone does is “observe” then they aren’t “playing.”

    Imagine if someone tried to make an “aquarium game” (it has probably been done – I haven’t checked) – for it to be a “game” the player should have to select fish/occupants of aquarium, buy food, feed the fish. Maybe have the ability to sell fish and earn money to buy more/different fish, etc. THAT would be a game.
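
    (a toy sketch of what that “interaction” part might look like – entirely hypothetical, just to show that a game needs verbs like buy/feed/sell, not just something to look at)

    ```typescript
    // Hypothetical aquarium-game state – the point is the verbs, not the details.
    interface Fish {
      name: string;
      hunger: number; // 0 = full, 10 = starving
    }

    class Aquarium {
      private money = 20;
      private fish: Fish[] = [];

      buyFish(name: string, price: number): void {
        if (this.money < price) return;      // challenge: limited resources
        this.money -= price;
        this.fish.push({ name, hunger: 5 });
      }

      feedAll(costPerFish: number): void {
        for (const f of this.fish) {
          if (this.money >= costPerFish) {
            this.money -= costPerFish;
            f.hunger = 0;
          }
        }
      }

      sellFish(name: string, price: number): void {
        const i = this.fish.findIndex((f) => f.name === name);
        if (i >= 0) {
          this.fish.splice(i, 1);
          this.money += price;               // mastery: trade for profit
        }
      }
    }

    // Call none of these and just render the tank, and you are "watching," not playing.
    ```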

    BUT if all you do is WATCH the aquarium with zero interaction – well, you aren’t “playing a game” you are WATCHING.

    Again, the interaction is essential – and probably illustrates why the video game industry is bigger than the movie industry …

    Sports
    “Games” can also mean “athletic competition” – e.g. the “Olympic Games”, the “Pan American Games”, the “Commonwealth Games.”

    I “cut the cable” a few years back – so it was surprisingly hard to watch much “live” Olympic coverage. HOWEVER it was also very hard to avoid hearing about the Games.

    To compete at an “Olympic” level the athletes have to put in a large amount of work – no one “accidentally” becomes an Olympic athlete.

    Just for fun I’ll argue that the most successful competitors still get “joy” out of playing their chosen sport. It may be cliche to say they play “for the love of the game” – but it is true 99% of the time.

    I have an informed opinion on “youth sports” in general – but that is another post 😉


  • “Movies”, “Records”, and me – part 1

    All the cool kids are doing it…
    I imagine that most people exist on a sort of “sliding scale” of “fashionableness.” At one extreme end is “hip/cool/fashionable/in style/trendy” near the middle is “not as young – but capable of understanding ‘what the kids are saying’” then the other extreme end is “What is everyone talking about? Get off my lawn!”

    Obviously “chronological age” is NOT directly tied to your position on the imaginary “trendiness” scale (just called TS from here on) – but in general “young folks” as a group will be clustered near one end, the parents of those “younger folks” will cluster near the middle, and then the parents of the parents will tend to be near the other extreme.

    There is still “nothing new under the sun” so we see “fashions” repeating. Of course the “fashion” industry is built on the idea that styles will come and go – so I’m not talking about “physical clothes” so much as “styles” — and the difference between “clothes” and “style” probably deserves its own post —

    Now, a handful of things NEVER go out of style – e.g. “good manners” come immediately to mind, but what you think will never go out of style is probably determined by your current location on the TS.

    Wannabes/Posers/Pretenders
    The tricky concept becomes the fact that having “style” and “BEING in style” are not dependent variables – i.e. you can have one without the other …

    I will quickly say that I am NOT passing judgement on anyone – I am being very “theoretical” – talking about “forms” as it were.

    With that said – we all know (or have been) the person that “tries too hard” and “just doesn’t get it.” I suppose this is where the concept of “coolness” comes into play – i.e. if you are TRYING to be “cool” then by definition you aren’t.

    … and of course being worried about how “cool” you are is another sure sign that you aren’t cool – but then being certain that you are “cool” also probably means you aren’t. AND we are moving on …

    Vocabulary/Jargon
    ANYWAY – not to sound like a “self-help book” but a person’s vocabulary advertises who they are. In and of itself this isn’t good or bad – i.e. most professions have some “profession specific vocabulary” and if you can “talk the talk” (in general) people will give you the benefit of the doubt that you can “walk the walk.”

    Examples abound – there is even a word for it – however this diatribe (intended in the archaic “prolonged discourse” sense – as I feel myself sliding further to one side of the TS scale) was motivated by the word “movies.”

    Movie
    The word “movie” in English dates back to 1909 as a shortened/slang version of “motion picture.” In 2021 common usage “movie” has almost completely replaced “motion picture.”

    e.g. no one says “I watched a motion picture last night”

    The same can be said for the word “cinema” which is a shortened version of cinematograph – which came to us through the French “cinématographe” which was from the Greek for “motion” and “writing” (though “cinema” is still more popular than “motion picture”)

    Cinema
    then “cinematography” probably falls into the “movie industry jargon” category – the person in charge of a movie’s “cinematography” may or may not be operating a camera.

    As any amateur photographer will tell you, getting consistently good “pictures” doesn’t happen by accident – there are multiple factors involved. Being able to manipulate those factors to achieve a desired “look” is (probably) what distinguishes the “professional” from the “amateur” photographer/cinematographer.

    btw: The additional problem for cinematography is that people are moving around (both in front of and behind the camera).

    for what it is worth: I’m not going to do a blanket recommendation for ANY director’s “body of work” – but in general Stanley Kubrick, John Ford, David Lean, Steven Spielberg, and Ridley Scott always tend to have great “cinematography” in their movies (which didn’t happen by accident).

    Of course George Lucas always had a “good eye” – but not always the biggest budget. Comments by Mr Lucas led me to watch a lot of Akira Kurosawa movies – most of which hold up very well (if you don’t mind subtitles). I’ll just mention that the movie that Kurosawa-san is most known for in the U.S. (Seven Samurai – known as the inspiration for “The Magnificent Seven”) – is my least favorite (it bogs down in the middle)

    … I’m still in full “ramblin’ mode” but also well into TL;DR space – more tomorrow on “records”

  • Nokia-Microsoft, Compaq-HP, thoughts on company size and culture

    Just watched a documentary “The Rise and Fall of Nokia Mobile” – which is available online from various sources (the link is to Tubi).

    From a “history of tech” point of view it was interesting. Nokia is one of the companies that “invented” the mobile phone – i.e. they tell the story of “mobile communication” from Nokia’s perspective.

    That distinction is important – simply because a lot of “co-invention” is always going on. This tendency for multiple companies/people to be working to solve the same problems, and therefore working on competing technological solutions to those problems – is why we have “patents”/copyrights and intellectual property laws in general.

    SO just like (from a business view) who actually wrote a hit song is not as important as whose names are on the copyright filing – who actually invented a technology isn’t nearly as important as what company owns the patent.

    Now, I am not saying Nokia wasn’t a special place to work or that Nokia engineers didn’t do incredible things – but the “rise and fall” of Nokia tells a very old story.

    you know “It’s still the same old story / A fight for love and glory” – best told over a cold beverage, with a piano playing in the background: “small innovative company with an intimate company culture grows from ‘gimmick company’ to market dominance, fortunes are made and lost, outsiders come in and take control – and ultimately the company is relegated to history”

    That is also the “Compaq computers” story as well as the Commodore story. Of course Compaq was on the decline when HP consumed them (“Compaq” still exists as an HP brand.)

    The Nokia documentary – which was made (in part) to help celebrate Finland’s 100 years of independence – takes the view that the “profit hungry Americans came in and ruined Nokia.”

    Ok, sure, that IS what happened – but my point is that what happened is an example of the problems with organizational growth not an example of “what is good about Finland and what is bad about the United States.”

    e.g. small companies can have a very “team oriented” culture – the competition in these environments tends to be focused outward at “the market” in general or maybe a specific large competitor.

    Meanwhile large companies tend to become inefficient bureaucracies with competition being directed internally against other divisions/sections/whatever.

    Slightly funny is that Steve Jobs apparently did to Nokia and the cell phone market what he did to Xerox and the personal computer GUI – i.e. he saw they had a superior product and “appropriated” the idea.

    No, the iPhone is NOT the reason Nokia “fell” – but it certainly hastened the demise of Nokia’s phone business as an independent concern (Microsoft “acquired” Nokia’s mobile phone division in 2014 and shut down the last vestiges of it by 2016).

    ANYWAY – in a “free market” the small and the quick usually end up beating the big and slow – maybe file that under “business cycles 101” – on the plus side many “former Nokia employees” have started companies, where they will try to replicate what was good about Nokia.

    The reality is that MOST companies DON’T last – and the process from “vibrant startup” to “old company mentioned in documentaries if remembered at all” looks a lot like the rise and fall of Nokia.

    HOWEVER I will say that Finland looks beautiful in the documentary – I’m not going there in the winter, but my desire to visit has increased since watching it.

  • Security, “cyber security”, system administration

    “Enough”
    During the “lost decade” of my “20’s” I had a LOT of different jobs. PC repair, high school wrestling coach, security guard, and a lot of “student” time in general.

    The (pre 9/11) “security guard” time was nice because I was usually left alone all night. I was there to be visible and act as a deterrent – not perform heroic acts – which since I was looking for a paycheck and not an adrenaline rush was exactly what I wanted.

    Of course there are also “private security personnel” that are highly trained professionals. Obviously the “highly trained professional” is going to demand a larger paycheck than the employee that did the “1 day orientation/computer training.”

    From an “organizational” point of view – security is similar to “insurance.” Both deal with “risk management” – as in “you can’t eliminate ‘risks’ but you can minimize your vulnerability/exposure”.

    SO the best practice with security and insurance is to have “enough” to cover your needs.

    “Sales”
    Then the question becomes “just how much is enough?”

    Have you seen the commercials for “home security system” where a masked intruder breaks a window (in what looks like a nice suburban home) in the middle of the day?

    Then cut to a frightened child and woman clutching each other in a state of panic – followed by the phone ringing and a reassuring voice saying “We have detected a break in at your premises. Authorities have been notified. Do you require assistance?”

    The relieved/grateful woman picks up the phone and says something like “Thank you security system! I don’t know what we would have done without you.” — and then you get the sales pitch from the security monitoring company (e.g. for less than $ a day you can protect your family…)

    Now, I’m not dismissing the need for/utility of these systems – I’m pointing out that the scenario used is “unlikely” at best and designed to manipulate your emotions. After all – can you put a price on “protecting your family?”

    On a less emotionally charged front – the answer to “can you put a price on the security of your business” is “yes.” In a nutshell – you don’t want to pay more for security than the value of the object being secured.

    SO that storage facility housing spare parts for your “commodity widget” making factory PROBABLY doesn’t need as much security as the distribution center that processes orders from customers for your “commodity widget.”

    Now that sales person working on commission might try to convince the “commodity widget maker upper management” that they need the top end security everywhere – and maybe they do – but obviously the “sales person” is biased.

    SO when the widget making enterprise gets past a certain size – they will probably hire a “director of security” or something to evaluate the needs of the company.

    “Cyber”
    That same process/concept applies to “computer network security.” Q. How much “cyber security” do you need? A. “enough”

    As a long time “I.T. professional” my view of “cyber security” is that it is a marketing term. Obviously I am NOT saying that “computer network security” is irrelevant – just that “good system administration” has ALWAYS included “network security.”

    Consider “automobile security” – how much should someone spend to “secure” their car?

    Well, if you have a beat-up Ford Pinto with 500,000 miles on it that starts shaking if you go over 65 miles per hour and you only keep it to haul garbage to the landfill – then maybe you are comfortable leaving the keys on the dashboard with the windows rolled down. If someone steals the car they might be doing you a favor.

    BUT if you have “new luxury SUV” you might invest in a car alarm, and some form of remote monitoring. If you live in “big city” you might pay for “off street” parking. In any case you certainly aren’t leaving the keys on the dashboard with the windows rolled down.

    Getting back to “computer network security” – MOST networks probably fall into the “nice four door sedan” category. They need to be secured – and they will be compromised if left un-secured – but they aren’t a specific target.

    e.g. roll up the windows, lock the doors, don’t leave valuables in plain sight – and your “family sedan” is probably secure enough. Adhere to “good system administration practices” and your computer network is “probably” secure enough.

    I also like the idea of a Magnificent Seven approach to security – NOT that you need to hire hackers to protect yourself from hackers, but that you need to secure your network enough to make the “casual attacker” go somewhere else.

    IF someone is intentionally targeting your network AND they are willing to spend money and time THEN they will probably be able to compromise your network. Your goals should be to not “make it easy” for them and also to detect and respond to the intrusion when it happens.

    For individuals, your small home network is probably more valuable to the bad actors as a resource for “zombie”/spam activity – but still, don’t make it easy on them.

    If you REALLY want to worry about something – more important than the network itself is the data moving on that network – so the biggest threat to the “average network” is the people using the network. Which is a slightly different subject …

    TL;DR
    Yes, there are needs for “security specific” computer professionals – things like penetration testing and security auditing come immediately to mind. The concept of a “security baked in”/“security first” approach to application development is also obvious. I’m just tired of hearing “cyber security” presented as something new and novel …

    e.g. A combination of good backups, sensible user management, and applying encryption to both file storage and network traffic probably protects 90% of “computer networks”

  • The “regulation” thing

    First Principles
    At a “first principles” point of view – USUALLY the best thing for the U.S. Congress to do is “nothing.”

    Arguably the “Founders” believed the same thing. Which is why the U.S. has the system of gov’ment that we have. Just from a practical “organizational behavior” standpoint – the larger the group of people, the less likely you are to agree on anything.

    The “wheels of gov’ment” are supposed to be slow moving and inefficient – remember the Founders’ goal was the preservation of individual liberty via the limiting of “gov’ment.”

    SO anything that the gov’ment does do, shouldn’t be fast or drastic – again, just in general “government is best which governs least”

    But then …
    Of course the gov’ment isn’t just window dressing – they are supposed to do SOMETHING. e.g. In times of war trying to rule by committee is a recipe for disaster – which is why the POTUS is “Commander in Chief.”

    In the Ancient Roman Republic – traditionally two “Consuls” were elected to “run” the Republic. The Consuls had full executive power, but each also had veto power over the other. This meant legislative stalemate was the norm and preservation of the status quo was achieved – which was kinda what the “powers that be” wanted.

    BUT in times of emergency/war – i.e. when things needed to “get done” – a single person would be put in charge.

    Fans of Ancient History will be familiar with the story of Cincinnatus – but I’ll move on – after pointing out that in peacetime it is always worth taking the time to examine the issue and “get it right” as opposed to “doing it fast.”

    If it moves regulate it …
    Ronald Reagan described the “government’s” view of the economy as: “If it moves, tax it. If it keeps moving, regulate it. And if it stops moving, subsidize it.”

    Which is a great line – but hits on several big issues. The purpose of taxation tends to get “complicated” but isn’t important right now. However, REGULATION should be easier to agree about as far as “purpose” goes. (and subsidizing something is probably just another form of taxation)

    In general “gov’ment regulation” should NEVER be punitive – i.e. protecting the consumer/individual SHOULD always be the purpose of regulation – NOT simply punishing a specific company/industry because it is politically expedient to do so.

    From a practical point of view – that means government should do what individuals can’t do for themselves. SO things like controlling access to a limited resource (e.g. the old radio/television broadcast spectrum), or ensuring that drivers are not a danger to themselves or others (e.g. physical checkup requirements for CDL, or vision tests for driver licenses) are obvious candidates for regulation.

    The form that regulation takes is up for debate – but regulating interstate commerce is obviously one of those functions the Feds are supposed to handle — but one more time “regulation” should not be “political punishment”, it should always serve the consumer.

    Internet/Web/Walled gardens/Facebook…
    The story of the “internet” can be told numerous ways – some of which are interesting, but not important right now.

    (no matter how you choose to tell the “Internet story” – it didn’t just spontaneously appear, but it also wasn’t created by government bureaucrats.)

    If we accept that “sharing information” (to one degree or another) is the point of ANY “network” then the “Internet” has been a great success story.

    BUT you needed affordable personal computers AND the “world wide web” to make it useful to the average person.

    (btw if the “internet” is the “highway” then the “web” is one type of vehicle using that highway – but not the ONLY type of vehicle)

    For a lot of folks back in the early 1990’s “America Online” (AOL) was their first “information service.” AOL charged a monthly fee for access to their network – and bombarded the nation with 3.5″ disks and then CD-ROMs offering monthly free trials.

    Then when the “web” happened – AOL started offering unlimited “Internet” access through their network. AOL still had a lot of “AOL network” content, so people might login to AOL and never leave AOL – this was kind of the “Walled Garden” internet (or if you logged into a local ISP – that homepage might have been your concept of the “web” in general).

    (btw the AOL merger with Time Warner is probably one of the worst mergers of all time from a “combined value” view – i.e. the perception of AOL’s value was much greater than what they actually possessed)

    Maybe the 1990’s could be called the “era of the portal.” The web may have offered access to vast amounts of knowledge – but finding anything was difficult. So “web directories” (like Yahoo!) ruled the day. Then Google happened in late 1998 – which is also another story …

    Facebook is just another version of the “walled garden” – and they continue to add services trying to keep users on their platform. Of course the more users accept Facebook as a “walled garden” the more Facebook can earn in advertising $$ – which once again, is neither good nor bad, just good business

    The important thing to remember is that Facebook is NOT the “Internet.”

    Regulation, but how?
    I’ve seen Facebook running ads (on Facebook) advocating changing the “Internet regulations.” This always comes across as self-serving as well as slightly pernicious – just because the regulations are older than Facebook doesn’t mean they need to be changed JUST for the sake of changing them.

    The problem with “regulation for the sake of regulation” is that it tends to be counter productive. “Big business” actually benefits from “increased regulatory requirements” because it tends to cut down on innovation/disruptive competition.

    “Bad regulation” simply reinforces the status quo and/or cements things in place preventing change. “Good regulation” will protect the consumer while encouraging growth/competition.

    My personal preference would be to apply “newspaper”/media conglomerate standards to Facebook. Hold them accountable for “censorship”, create a truly independent “arbitration board” (not one bought and paid for by Facebook) – but don’t cement them in place as the status quo.

    Regulation should not stifle whatever the “next big thing” may be …

  • User interfaces

    Making a product “easy to use” is never “easy.”

    “Elegant” products are few and far between. Merriam-Webster tells us that “elegant” means “marked by elegance” – which then requires another click for “elegance” and we get “dignified gracefulness or restrained beauty of style”

    An “elegant product” becomes an example of “beautiful simplicity.”

    Under Steve Jobs’ leadership Apple was known for “striving for elegance.” When he was alive Mr Jobs liked to say that they (i.e. Apple) didn’t do a lot of “product research” – which I believe, BUT we have to distinguish between “product research” as in “asking users what new products they want” and “product testing” as in “testing and improving the user experience with existing products.”

    e.g. Apple did not invent the “mobile music player” but they perfected it with the iPod. The first couple generations of the iPod became a case study in the “search for elegance.”

    I have had several “iPods” – but I distinctly remember not being able to figure out how to change the volume of an “earlier” release. The product had a “rocker dial” which I assumed if I held down on one side the volume would go up, and if I held down on the other side the volume would go down.

    ANYWAY – It turned out that the volume was controlled by “sliding” and not “rocking” – and once I was shown how it worked it was obvious (and I admit “better”) – so early iPods were beautiful and easy to use, but not “elegant”

    Of course the first step in designing an “elegant” product is that the product does what it is supposed to do (i.e. form still follows function) – this tends to require “high end components”. SO Apple has never sold “cheap” products.

    The number of products that exhibit “pure elegance” is probably zero – i.e. “pure elegance” is (probably) unattainable.

    This becomes an interesting thought experiment: e.g. There are a great number of products that are “easy to use” once you have been shown how to use them. However the number of products that “announce how they work through their design” is very small if not zero.

    Remember that we have to start with a “user” that has no exposure to the product – e.g. if you’ve seen “Demolition Man” (1993) (a “not bad” Sylvester Stallone/Wesley Snipes vehicle) you might remember the “three seashells” joke.

    If you haven’t seen the movie (it is fun, you can probably find it with little effort) – Sylvester Stallone gets brought out of “suspended animation prison” to catch super villain Wesley Snipes – but the plot isn’t important. Mr. Stallone plays the comedic “fish out of water” that doesn’t understand the simplest aspects of “modern civilization” one of which is the “modern” bathroom facilities that consists of “three seashells.”

    The point (if I have one) is that in the movie the “three seashells” are a great example of “un-elegance” (which was used for comedic effect – and no, they never explain how the seashells are used, BUT they make it clear that EVERYONE knows how to use the seashells).

    SO in “modern times” the best we can hope for are products that are obvious to use for those that have experience using similar products.

    The “web design” gold standard has been (some form of) “don’t make the user think” (probably) as long as there have been “web design suggestions.”

    From a “software design” point of view “elegant user interfaces” are also few and far between. “Functional” interfaces are a dime a dozen – but systems that are actually “pleasant to use” are numbered in single digits.

    Combining “functionality” and “ease of use” is never easy BUT if you get it right and have a little bit of luck – you might be the next Google or Facebook …

    This song (“Something” by the Beatles) came to mind as I was composing this post. Beatles fans will recognize this as a “George” song – the song would peak at #3 on the Billboard Hot 100 in November 1969.

    George Harrison was the youngest of the Beatles – which really doesn’t mean anything in the “big picture” (i.e. it isn’t like the age difference was a big deal – they were all within three years of each other) – but becomes significant when we talk about “song writing development”.

    e.g. three years difference is like the difference between “high school seniors” and “high school sophomores” – fwiw: Mr Harrison admitted that he always “looked up” to John Lennon.

    SO “George Harrison songwriter” had the benefit of seeing two of the all time greats become two of the all time greats (“Lennon and McCartney”) but also developed his own distinct “elegant” style.

    (the disadvantage to being a Beatle for “developing song writer” Mr Harrison was that some of his “early” work ends up being compared to “Lennon and McCartney” unfavorably – not that his early work was “bad” so much as “not as good”)

    “Something” becomes a compact “mature love story” – Mr Harrison was in his late 20’s when he wrote the lyrics, so he is writing about the experience of “falling in love” with the realization that what he is feeling might not last.

    Compare that with the “more mature” view in “What is Life” from George Harrison’s first solo album (1970) – and we see why “George was the spiritual one”