Just got off the phone with The Wichita Eagle for an interview about how gaming has changed into something “cool” (basic premise – there’s more to it than that). One thing I realized when he asked me about it, and I told him this, was that it’s actually a cyclic thing. Games have been cool and social before – it’s just that most people who weren’t part of the first era of video games don’t remember. The basement troll video gamer stereotype doesn’t really start until the late ’80s and early ’90s. In the Atari 2600 / ColecoVision / Intellivision era, games were new, exciting, and social, be it the arcades where people hung out or the home systems. If you had a 2600 or another console at home, people came over to play with you (or your brothers and sisters played). You had fun, and if you knew someone who had a system, they sometimes even acted as a social hub of sorts, with everyone gathering to take turns playing. What changed?
The home computer invasion, and the NES. See, home games emulated the concepts found in the arcade: short gameplay, high repetition (Pac-Man’s one playfield, or Donkey Kong’s four repeated playfields), and two-player modes (either at the same time, or taking turns). It’s not that longer, primarily single-player experiences were completely unheard of – Adventure on the 2600 is a good example – they were just extremely rare. And since arcades were about how many quarters per hour a machine could make for the operator, the gameplay rarely had much depth. It focused primarily on the action.
Some of that existed because video games were new and experimental. But a large chunk of it existed because of hardware limitations – memory was expensive, and CPU power was low. With the Commodore 64 (for instance) or the NES, there was a lot more room to work with – suddenly single-player games became more common. There was more depth available. Even if the game allowed two players taking turns, you might be waiting quite a while if you were player two – look at Super Mario Bros. on the NES for a good example of that.
Games quit being quick-fix, high-adrenaline experiences at home, and slowly evolved into their own worlds for players to explore. They were the new books and TV shows (not that they eliminated more passive forms of entertainment – we still have books, TV shows, and movies) that occupied people’s time. You didn’t want to put the controller down until the wee hours of the morning – particularly if the game didn’t have save points.
Gaming quit being social, and became a single-player experience. And the minute it quit being something done in view of all of your friends was the moment the social stigma of “gamers” started to take hold. It was slow at first, but it gained momentum, and became a part of a brand new culture. The basement troll video game player was born. And, as the length and depth of games continued to expand, so did things like obsessively playing a game until the wee hours of the morning. I can remember playing Sabre Wulf until sunrise, and realizing “Oh, hell. I’ve gotta be up for school soon.” I didn’t bother to sleep that day – well, I may have napped in Ms. Papke’s English class. Some of that basement troll gamer stereotype was caused by the gamers themselves – in some ways, they acted like drug addicts, trying to get the most out of their high and spending as little time in the real world as possible. The gaming worlds were just so much more attractive.
So, how did we get to where we’re at now? I mean, gaming is a huge thing now. It’s social now. And it’s not just accepted, it’s almost expected. Two big reasons: marketing, and changes in technology.
I’m going to go after the technology part first. I’m sure if you asked people from “back in the day” what the turning point for gaming becoming cool again was – if they were going after the idea of when it became a socially connected phenomenon again – they’d probably say EverQuest or World of Warcraft. In fact, when answering Matt Riedl, I mentioned WoW as a turning point. But, thinking about it a while, I think the turning point actually began with Doom, for two reasons.
Socially playing video games never entirely went away – it just went underground. And, of course, the history of online games is longer than most people realize – it didn’t start with EQ or WoW, it dates back to the Bulletin Board System (BBS) era. I was a huge BBSer at one time (Elysian Fields here in Wichita, Kansas, among others, was one of my online hangouts), and played a lot of TradeWars, a multiplayer space-based strategy game. So, it never died. But it wasn’t becoming big, either. And it was definitely not a face-to-face sort of social thing, since modems ruled that era.
Doom struck a chord on multiple levels when it came out. It’s not hard to believe that 20 million people played the game in the first two years of release – anyone who had a computer better than a 386 seemed to have a copy. Even TV shows made references to it – usually with the basement troll gamer stigma attached – but it was there.
1993’s first release of Doom had something unusual in it – a multiplayer mode. Seriously, that was some ground-breaking shit right there – there really weren’t any major games with multiplayer baked in. This was at a time when networked computers in offices had become commonplace, rather than having to use sneakernet to move files from machine to machine (sneakernet: the process of copying a file onto a disk, walking it over to another machine, and loading the file there). So, it was possible to have access to a room full of people, computers with enough power, and a lunch break to play Doom together.
I’m going to take a quick side branch here, away from my love of the original Doom. Not only did computer and console games with greater depth nearly kill off short gameplay experiences, they also killed off the founder of those experiences: the arcades. Arcades were also hit by the video game crash of the mid-’80s, but had returned in the early ’90s with the release of games like Street Fighter II, which brought a whole new intensity level to multiplayer games in the arcade. But the reinvigoration of arcades ended up being short-lived. Cities like New York that had hundreds of arcades slowly dwindled to the point where an arcade was hard to find by the time the 2000s rolled in. Heck, Wichita at one time seemed to have one every two square miles or so, with Greg Stevens, Copper Cue, Golden Cue, Silver Cue, Le Mons, etc. It was just too easy for the home experience to be as good as the arcade experience, without having to pay as much as $0.50 each time you played. Sure, games like Dance Dance Revolution helped prop up the industry for a bit, but even then cheap home dance mats took the novelty out of that. Even big experiences like Galaxian 3, a small-room-sized video game machine for multiple players, did little to prevent the decline of arcades. Home consoles and computers couldn’t quite replicate the graphics and sound of the arcade experience, but they produced something with so much more gameplay. May arcades rest in peace… until they rise again (and they will, I think, but not as the old ’80s classic concept).
Now, a new concept existed: LAN parties. Bring friends, bring beer, and spend an evening killing each other, hooting and hollering the whole time. I mean, that’s friggin’ awesome! Multiplayer Doom didn’t have the depth of later games like World of Warcraft, but for a multiplayer experience, there had been nothing like it. There was just enough variety and depth for players to adjust strategies easily, slowly improve their mouse-twitching skills, and keep coming back for more.
For those who weren’t into Doom, there was Warcraft. No, not World of Warcraft. Warcraft & Warcraft II, the games that eventually led to World of Warcraft. Bring your computer to your friend’s house, plug in, and play against all your friends.
And, yeah, I was one of those dorks who hosted LAN parties, particularly once Unreal came out – I was even one of those nerds who got into developing his own levels for the game. (Mall of the Skaarj was an Unreal version of the Towne East Mall here in Wichita, with the added twist of having a lot of big honkin’ enemies in it. Players usually scrambled to control the flow of bad guys, and once that was done, turned their guns on each other for some Deathmatch. The moment when someone decided the other players were finally a larger threat than the bad guys was always followed by cursing from the recipient of that first betrayal. I loved it. 🙂 )
Now, games had the opportunity to be social again. And even if you were the stay-at-home type, you could play Doom or Warcraft online (it took a lot more steps back then to play online, though). It was a niche market for a while, but the tech kept moving forward. Soon, dial-up modems became commonplace (for you kids, that’s back when computers screamed at each other loudly and shrilly to decide what speed to speak at, then eventually quietly allowed you to communicate with other computers over a telephone line. A telephone line was like your cell phone, but with a cable attached to the wall that gave you the opportunity to make a fool of yourself if you walked too far). Dial-up Internet was quickly becoming a part of every home. Then it was replaced by cable modems and high-speed access. And, of course, we now live in a hybrid environment of cellular communication and cable modems (or similar).
Games have benefited from that advance in technology the whole time, and at times drove it. You’d be hard-pressed to find a computer or cell phone without some sort of 3D accelerator in it. Thank video games for that. The advent of 3D gaming experiences rendered on the CPU gave way to the idea of using specialized 3D processors (GPUs) to handle the load, slowly making games more lifelike.
But, the tech wasn’t enough to get us where we are now.
If you look at the version of Doom being shipped now on PC, Xbox One, and PS4, there’s a major difference from its original release – it’s being pushed pretty hard on the marketing side. Advertisements for games aren’t a completely new thing. There were ads for video games way back in the Atari 2600 era, when it was at its peak. But they weren’t nearly as common. Sales push marketing, and marketing pushes sales. Oh, and the original video game TV ads. Wow. Just… wow:
Oh, so they must have improved, right? Well, here are some ’90s and early 2000s commercials. Spoiler: they didn’t improve.
Something to notice between those two sets of commercials: game advertising had evolved from selling a shared experience to primarily selling a single-player experience. GoldenEye is considered one of the classics of multiplayer FPS on a console – but no mention of multiplayer is made. The Atari commercials showed kids playing together. The ’90s stuff was either more focused on a representation of the action, or on a single-player experience.
I’m not going to do the same for our current era – I literally can’t find a video highlighting some of the most played TV advertisements that doesn’t include someone talking over top of all of them. It’s also a little more complicated by the number of ads that run on Facebook and other outlets vs. the number of ads that were TV-only in the ’80s and ’90s. But some quick highlights: Rock Band with four players, Call of Duty, any Madden football game from EA, and a long list of other ads that feature people playing together or against each other. Not every game ad is multiplayer. Halo, in its incarnations, isn’t pushed as a major multiplayer game – it’s pushed for its epic single-player experience most of the time. Advertising now is a blend of both.
Marketers figured out that portraying competitive or cooperative experiences was good.
But even that wasn’t enough: as games grew in scope, the number of people required to purchase them to fund the next game (and to reach profitability) increased. So the amount of marketing increased dramatically over the years. You can’t just market to one demographic – you’ve got to start producing products that match quite a few demographics (see EA’s purchase of PopCap Games as an example of that) and market like crazy to those groups. PopCap sold for $1.3 billion. That’s a lot of copies to have to sell to make up that purchase price.
Here’s the thing: you can market to “gamers” (I’ll come back to those quote marks in a moment) as if they were indeed basement trolls, as an inside joke. You can market to “hard core” gamers straight up with hard-hitting, high-action games (or heavy strategy, etc.). But eventually, your market is limited. Those people talk their own language when it comes to games. Want to make real money? You need to talk to the masses. And to do that, games have to be cool and fun.
Nintendo almost always got “fun” right in their advertising, even if it was goofy as hell sometimes. Microsoft and PlayStation both try to take the more serious route, relying on an epic feel behind the games. I think that’s a chunk of why Nintendo managed to survive the Wii & Wii U era so well (an era which is in its twilight days now) – their console wasn’t particularly awesome, and graphically their games were far inferior to Xbox 360 / PS3 games of the era (and WAY behind the Xbox One and PS4). Instead of trying to reach hard-core gamers, they went after everyone else. Not just that demographic over there (insert me waving my hand vaguely), but everyone. Of course, having a handheld video game system helped Nintendo a lot, too – though that era will probably come to an end within the next 5 years.
The Entertainment Software Association (ESA) released a cool report on the 2016 Essential Facts of the Computer and Video Game Industry, and here’s an important takeaway from it:
Neat breakdown. It’s interesting to see that, overall, “casual games” (puzzle games, etc.) represent only 0.9% of the console market (per the Video Game Super Genres breakdown) while they represent 25.8% of the computer gaming market. What happened? (Oh, and don’t worry – it may seem like I’m drifting off course, but we’ll be back to the main gist of the article here shortly.)
Great question. And one I’ll tackle another day. Seriously, it’s much, much too long to add to this particular 2,600-word (at this point) commentary on the history of the social views of video games. But the important part to note is that there’s a giant spread of gamers to market to. So much so, that it’s time to address why I said “gamers” earlier.
See, in the old days, “gamers” were considered to be a certain type of person: male, white, probably teenage or 20-something. That same report shows that 63% of US households have someone who plays games for three or more hours a week, the average gamer age is 35 (and if you hunt down the graphic in that report, you’ll see that’s because gamers are ALL ages, in a really good spread), and gamers are 59% male. The ESA’s report didn’t have anything about racial statistics to prove or disprove white as the primary game players – but Pew Research did. Which basically dispels “white.” Basically, if you’ve got enough digits to pick up a controller, smartphone, or keyboard and mouse, you may be a gamer – pretty much equal chances overall.
In fact, if you’ve got a few minutes (hey, you’ve read this far, you probably have more than a few minutes to spare), dig through both of those reports a bit. There’s a lot of information in there presented fairly quickly (there are only 10 pages of actual data), and if you happen to be a game developer or marketer, a lot of it is incredibly valuable. Oh, another great tidbit from the ESA report? 48% of gamers play social games. Too bad “social games” is ill-defined here – games that are multiplayer? Single-player with a social component? Facebook games? I’d love to dig into that part deeper as a developer.
But, back to the point: “Gamers” aren’t “gamers” anymore. 63% of households isn’t a minority – it’s the majority.
Which brings us back to the main point – the rebirth of the coolness of games. As marketing got more pervasive, video games became ingrained in our culture. Marketing kept shoving games in our faces, and whether you were a gamer or not, eventually you were going to get curious about one of those games, be it Halo or The Sims. Something was going to finally trip your trigger as they marketed to larger and larger audiences, trying to motivate you into becoming a gamer.
At this point, making fun of people who play games is like making fun of someone who likes Nickelback – you may not know it, but the majority of people in the room probably like Nickelback. They’re the 11th most popular band in the world. Heck, even I make fun of Nickelback, as long as no one knows I’ve got a couple of their tunes on my playlist.
Marketing might, coupled with the slow growth of the market over three generations and the technical advances that went with them, changed the perception of gaming. It was cool, new, and semi-social.
Then, basement troll gamers became the perception. Now, it’s slowly becoming ubiquitous, with a device capable of playing games in everyone’s pocket. (Aside: think about that – right now, children and teens – Gen Z – are the third generation of kids to grow up with video games. How wild is that? Even wilder is when I think about the fact that I was part of the first generation – Gen X – to grow up with video games, and I’ve gotten to see it all happen.)
And even with where we are, there’s still room for growth – and even room for a resurgence of the arcade concept, though it would be a bit different this time around. Fireshark Gaming here in Wichita is a great example of that – while not perfect, the first time I played in the prototype environment was pretty cool, standing right beside the mech I was controlling in a room-sized environment with other players walking around, controlling their projected mechs too. Kent’s concept has a lot of room to grow, but it’s got a cool premise that brings back some of the social aspects of arcade-based gaming, including an environment that is extremely hard to replicate at home at the moment. No, I’m not saying Fireshark Gaming is the future of video games and arcades, just that it’s an example of the innovations that could occur in that realm – arcades are dead, but I have heavy doubts that they’ll stay dead forever.
Will games become uncool again, repeating the previous cycle? Sort of. I see VR / AR games being the next “uncool” target – because, let’s face it, right now you look silly as hell with an Oculus or Vive helmet on. Hands waving in the air trying to touch imaginary objects (or swinging an interface controller around, which is even weirder looking), reacting to stimuli that don’t exist for anyone else in the room? Mumbling about stuff no one else can hear? Speaking your own language about latency and focal points? Yeah, it’s going to be uncool for a while. Oddly, it’s probably not going to be uncool on the same timescale that video games were uncool – tech and marketing evolutions move faster now, and heck, it may be on contact lenses in 5–7 years, rather than a helmet or special glasses you have to put on.
But games as a whole will continue to be mainstream for a very long time. We’ll see evolutions of what’s “cool” and what’s “uncool”, and cycles of popularity both in the industry as a whole and in the various genres, but they’re mostly here to stay. Though – and this is food for thought for game developers – I do think we’re toward the edge of the next game industry collapse: not because of social pressure or technology, but because we’re starting to see the marketplace get too crowded. But it won’t be anything like the 1980s collapse – somewhere between collapse and slump would be a better way of looking at it, and it’s going to affect the biggest players in the market with the most money (and biggest development budgets per title) at stake. That’s an article for another day…