Is the World of Warcraft Team Afraid of Legacy Servers?

By now you've probably heard that Blizzard recently forced a World of Warcraft private server, Nostalrius, to shut down via cease and desist. Nostalrius was very popular, mostly because it was fairly well-scripted and 100% "blizz-like" (meaning it aimed to behave exactly like the official World of Warcraft servers did back in 2006). As a result of its popularity, the shutdown of this private server has generated a tempest of controversy on the Internet—what doesn't, these days?—and a voice has started crying out louder than ever for Blizzard to provide official Legacy/Classic servers for World of Warcraft. It's clear that there is a non-trivial segment of the WoW player base that craves a Legacy server. I am among that crowd.

But Blizzard remains silent. The company was well within its legal rights to force the operators of Nostalrius to cease and desist, but the move has generated a large amount of negative PR and fan backlash—larger, I assume, than they expected. Yet there has been no official statement addressing it one way or another. As far as I am aware, there haven't even been any unofficial statements on social media (e.g. on Twitter).

To anyone who has followed Blizzard over the past couple of decades, this silence probably comes as no surprise. Communication has never been their strong suit; it's an issue they have supposedly been working to correct for the past few years. In fact, when it comes to Legacy servers especially, Blizzard has said very little at all. The only real example I could find is the famous video of a Blizzcon Q&A session, where a guy asks about it and receives—in my opinion—a very passive-aggressive answer. Even the longer version (starting around 30 minutes in), where the answer expands into the drawbacks of old WoW, comes off as dismissive and almost reductive.

We all know that old World of Warcraft had a ton of problems. Class/spec balance was all sorts of screwed up, the leveling experience was a grind (to put it nicely), and PvP was horribly broken. The game had imperfections, and Blizzard must believe that the modern World of Warcraft is, in every way, an upgrade on the old one. However, something they are failing to consider is that many of us loved classic WoW in spite of those imperfections and, in some cases, because of them. Classic WoW is like a beloved, loyal dog that sometimes pees on the floor and tears up your shoes. It might be inconvenient, but it's also kind of endearing.

The old World of Warcraft was a game that felt like it was made by MMO fans, for MMO fans. It had a clear designer's touch to it (a concept I hope to cover in greater detail some time in the future). It felt like it was designed the way it was because the designers wanted to play that game. By contrast, much of modern WoW feels designed by committee. It feels like, in so many places, the designers asked, "what's going to please the most people most of the time?"

The result of the dissatisfaction is obvious. Players flocked to Nostalrius and other private servers in numbers that dwarf many AAA MMOs still operating today. The desire to play WoW as it once was is undeniable. In the aftermath of Nostalrius' shutdown, many people are making a strong effort to be clear: we are willing to pay. So why, then, is Blizzard not tapping into this market?

For one, I think Blizzard is afraid of Legacy servers. As a company, as a team, and as a juggernaut of the games industry, Blizzard is deathly afraid that legacy servers would be too popular. It would invalidate the hard work of hundreds of dedicated employees over years and years of continued development. It would announce to everyone that yes, they were right: Cataclysm/Mists of Pandaria/Warlords of Draenor did all lack something that the first three iterations of the game (Classic/Burning Crusade/Wrath of the Lich King) had—some intangible missing attribute that made the experience feel hollow when it should have felt epic.

Let's take a step back from that accusation for a minute. Let's assume that Blizzard isn't afraid of Legacy servers, but instead would rather do them "right" as opposed to just slapping a server up and charging extra to play on it. They are probably concerned that the old software has no way of fitting into their current infrastructure. The security team likely knows about several exploitable flaws in the old client that have since been patched, so Blizzard would have to update the old code to, at a bare minimum, back-port the critical security fixes and infrastructure changes. Suddenly you've got a fork of the WoW source that must continue to be supported, even if only at some minimal level, on top of the continued development of modern WoW. Worse, it's possible that they don't even have the old code anymore. Maybe they could find an archive of their old server software, but what if their scripts (the in-game logic code) have only ever moved forward?

Hell, Blizzard being Blizzard, they would probably want to go the extra mile and simulate a staggered release of the expansions. Beginning with Burning Crusade after six months or so, they could unlock the expansions in order for six to eight months each. Eventually, they reset the server again and everyone starts fresh, at level 1, in Classic WoW. It's kind of like how seasons work in Diablo 3, but phased over a much longer period of time. Setting it up like this would require a significant effort to create a hybrid version of the WoW client/server software that could work in this way.

These are all issues that a company with the vast resources and talent at Blizzard's disposal could easily conquer. Each of them introduces a cost, however, both in up-front development and long-term maintenance. Combined with the fact that they very well may be afraid to see just how well the Legacy server would do... well, now you can see why they haven't invested the resources. Would I like to see a Legacy server? Absolutely, especially if they took my "staggered expansion release" idea. Will we ever see one? Probably not, and that makes me a very sad Pandaren.

On the quality problem of today's software engineers

Over the past few months (pretty much since I started a new job in October) I've been conducting a lot of interviews. This new company I'm working for is growing rapidly and has a desperate need for more developers. Normally, this is pretty straightforward: you gather some resumes, do some interviews, and narrow them down until you find a good candidate for the job. However, this company has a very refreshing point of view on hiring. It's a point of view I wish more companies could follow: we will not hire anyone unless we feel 110% confident that they'll be a good fit.

Now, that might seem obvious. "Who hires people that can't do the job?", you ask with a smarmy little smirk. "That's why you're doing interviews." 

But you're wrong. You—the imaginary you that I've invented as a straw man for my argument—are an idiot if you think that most companies aren't just hiring the least stupid person that walks in the door, because they are. In many cases, they're not even bothering to make people write code in their interviews.

I could write an entire article about how a company ends up hiring people that are obviously not qualified for a job, but that's not what I'm most worried about. What I'm worried about is that in the nearly 6 months we've been doing interviews, we haven't been able to hire a single person.

The State of Things

Since November we've interviewed at least 10 people for a Senior Software Engineer job. This person is expected to be very comfortable in C#, know ASP.NET MVC and Entity Framework pretty well, have experience in web development (preferably using Angular), and have a working understanding of SQL. That description applies to both our Level 3 position and our Senior position; the difference between the two is that a Senior is expected to have more experience and a set of intangible qualities that would make them a good leader.

Of those 10 people, only two even met the requirements for our Level 1 position. Let me stress that again: of the ten people who applied for our Senior Software Engineer position, only two were even qualified for an entry-level position. What the hell?!

Some of these applicants were leads or seniors at their current/previous jobs. Some of them were barely-out-of-college kids who thought that playing around with a technology for a few hours meant that they had a firm grasp on its complexities. Yet eight of those applicants couldn't answer basic C# questions. Here's an example question from our interview checklist:

"Tell me the difference between a reference type and a value type, and provide some examples of each. Why would you use a value type over a reference type?"

Only one of those eight could provide any kind of intelligible answer to that question. How does this happen? Who is going around applying for senior-level positions when they're barely capable of producing "Hello World"? Lots of people, apparently. The worst part is that most of these folks, I'd wager, have no clue just how much they suck.
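
For context, here's a minimal sketch (in C#) of the kind of ground a decent answer should cover; the Point and Person types below are made up purely for illustration:

    // Value types (structs, enums, and primitives like int, double, bool) are
    // copied on assignment; each variable holds its own independent copy.
    struct Point            // made-up example type
    {
        public int X;
        public int Y;
    }

    // Reference types (classes, interfaces, delegates, arrays, strings) live on
    // the managed heap; variables hold references, so assignment copies the
    // reference and two variables can end up pointing at the same object.
    class Person            // made-up example type
    {
        public string Name;
    }

    static class Demo
    {
        static void Main()
        {
            var p1 = new Point { X = 1, Y = 2 };
            var p2 = p1;                      // full copy: p2 is independent of p1
            p2.X = 99;
            System.Console.WriteLine(p1.X);   // prints 1; the original is untouched

            var a = new Person { Name = "Alice" };
            var b = a;                        // copies the reference, not the object
            b.Name = "Bob";
            System.Console.WriteLine(a.Name); // prints "Bob"; same underlying object
        }
    }

    // Why prefer a value type? For small, short-lived, ideally immutable data
    // (coordinates, money amounts), copy semantics are easy to reason about and
    // you avoid heap allocations and the garbage-collection pressure that comes
    // with them.

A candidate doesn't need to recite the runtime internals, but they should be able to explain copy-versus-reference semantics and name a situation where a struct is the better choice.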

The Formula for Suck

Getting back to the "companies are hiring the least-stupid candidate" idea, imagine this for a second: you're a middle-tier manager at a company (let's call them Golden Persisting) and you need to fill a Level 3 Software Engineer position. You've only got a shoestring budget, because upper management keeps telling you that your department is an expense and keeps chopping it in half. Instead of the market salary (X), you can only afford to pay a fraction of that (0.75X or less).

Naturally, all of the skilled candidates know that they can make X instead of 0.75X. Instantly you lose those folks. What you've got left are fresh-faced newbies, who are a gamble but will take lower pay, and people blowing smoke up your ass. Both of these types will overestimate their skill level, but dammit, you have to hire someone or else you're going to miss your deadlines! To top it all off, if you wait too long to fill the position, upper management will cut it out of the budget, because you clearly didn't need it, right?

As a result of your need to make a hire and the lacking budget, you end up hiring the least stupid of those newbies or smoke-blowers. Now, that person has a higher-level position on their resume. They are suddenly "worth" that position. Once they decide they're ready to move up, they'll find that Golden Persisting has no opportunity for advancement. Suddenly, our under-qualified Level 3 developer is on the prowl for a Senior-level position.

That's how it happens. Well, that or any of the hundreds of similar scenarios that corporate development produces. Those fresh-faced newbies are likely to end up as the next wave of smoke-blowers as they find themselves in jobs they're not quite prepared for. The old smoke-blowers end up in management, usually due to their tenure in their lead positions. It's a vicious cycle of suck.

The Fateful Eight

I wish I could sit here and tell you that this rampant quality problem has a simple solution. The issues are myriad, however, and there's little to no chance that most of them will ever be fixed. Let's look at a few of our fateful eight Senior Engineer candidates for some less-than-elegant examples of why this problem is endemic:

  • One of the candidates attended a school infamous for its poor IT program and literally admitted he had never hand-written a line of code in his life; he had only copy/pasted.
  • Another candidate spat out all the buzzwords, but when pressed to elaborate came up empty. He ended up being a Java-only developer who had never touched C# before.
  • The most promising of the eight developers told us a story about how he made a "bold" move, ripping out half the codebase and replacing it overnight without telling anyone. He mentioned that "it caused problems and bugs for months". He told us this story in response to a prompt asking for a time he had taken charge and led the team.

At this point, you're probably asking a very relevant question: why didn't you filter these guys out before they even made it to the interview stage?

The Problem With 110%

We didn't filter these guys out because in each and every case, their resumes and phone interviews were just fine. We tend not to give a lot of weight to a resume. After all, as we've seen a hundred times, a resume is usually more filler than thriller and its accuracy is dubious at best. Instead, we rely on a phone interview conducted by our in-house recruiter as the first line of defense. God knows how many potential candidates she's eliminated using the phone interview; I haven't asked and I don't want to know. The ten folks that we've interviewed in person were people who passed the phone interview filter.

Our phone interview is no joke, either. We ask about 12 questions that run the gamut from basic C# to advanced topics, but there's only so much you can do over the phone. The questions have to be designed to prevent long, rambling responses (the recruiter is smart, but she's no techie; she would have a hard time distinguishing a correct rambling answer from a wrong one), while also giving a quick gauge of which position the candidate should be recommended for. The Senior-level recommendations answer at least 8 of the questions correctly and (importantly) do not try to BS the ones they miss.

The only explanation I have for some of these folks is that they were googling the questions as the recruiter asked them. Honestly, though, some bad apples were always going to slip through. Phone interviews are not an amazing way to filter candidates, and we might as well not even look at their resumes for all the good they've done us. Ever since I started doing interviews I assumed that I would end up face-to-face with some stinkers occasionally... but an 80% stinker rate shocks me.

Now, finally, you can see the downside of only hiring candidates who fit in 110%. Our team is still woefully understaffed and overworked with no end in sight. The idea of taking the plunge and hiring some of the most promising of the candidates is looking better and better. Of course, if you take "most promising" and put a negative spin on it, you get "least stupid".

How We Can Do Better

I still wouldn't change the hiring policy for anything, though. The value of employing people who are smart and get things done far outstrips the cost of being understaffed. The question is, how can we do better? What can we—employers, educators, current and prospective engineers—do to improve the overall quality in our field?

If you're an aspiring engineer, allow me to pass some advice on to you. Be good at what you do and take pride in it, without being prideful. Teach yourself new things for the sake of it, because you have a passion for programming. Learn how to sell yourself without lying or exaggerating. And most importantly, realize that everyone has to start somewhere, and you're not ready to start at the top. No one has anything to gain from hiring a disingenuous developer into a position they're not ready for, not even you.

If you're an educator teaching programming to those fresh-faced newbies, don't just have students memorizing syntax rules for regurgitation on the test. Teach them how to apply what they've learned. Engage them in the power of code. Make them think like coders. There will always be students that show up just to get the degree so they can go earn a paycheck; don't spend too much time on them. Learn to spot the promising ones and get them involved, so that if they didn't already have a passion for programming, they might develop one.

And, for the employers out there hiring under-qualified dunces into positions way over their heads, let me tell you: I feel for you. I've interviewed them, I've worked with them, and I've been a victim of the smoke-blowers. I understand the corporate world; I know our departments tend to be the first to get cuts and the last to get appreciation. I encourage you to read The Mythical Man-Month, a classic tome of wisdom that too few in our industry actually take to heart. Remember that one smart engineer is worth three average ones, and... well, have fun explaining to your bosses that you can't just hire a million monkeys to reproduce the works of Shakespeare without producing a million monkeys' worth of feces while you're at it.

The Legend of Notch

Over the past few days, the media has gone into a mild frenzy covering some recent tweets by notch (the guy who originally created Minecraft, if you didn't know). Here's an example of the tone and content of these tweets:

The problem with getting everything is you run out of reasons to keep trying, and human interaction becomes impossible due to imbalance.
Hanging out in ibiza with a bunch of friends and partying with famous people, able to do whatever I want, and I’ve never felt more isolated.

Mr. Persson walked away from Mojang with over a billion dollars as part of the Microsoft deal. Most of us would trip over ourselves for the opportunity to make even 1/1000th of that, and everyone likes to grandstand about what they would do and how they would spend it. But most of us have never been in this situation.

Notch (Official GDC, via Wikimedia Commons)

My impression of notch—and this lines up with other impressions of him from various con-goers and media personalities—is that he is your stereotypical "basement programmer" type. The kind of guy who might have turned to programming and computers when real social interactions were more difficult than learning code. The sort of man who strives to prove himself based on his own ideas and merits. When you thrust an unimaginable amount of money into the lap of a person like this... well, I can see the makings of a storm of existential crises.

See, guys like notch are always working on new ideas because they like working on new things. The idea of making something that people enjoy is thrilling; it's the driving force in these people's lives. When you give them essentially unlimited money, that drive dies. No longer do you have to work for anything in your entire life. You can be content to just sit back and "enjoy yourself" until you die. Why do anything that looks like "work"?

All of that, combined with the "first-world problems" of being a rich person, has probably taken a toll on notch. I'm sure he will adjust, and I predict that he will get bored of having everything in the world just a credit-card swipe away. The question is, what does a guy like notch, with over a billion dollars in the bank, do once he gets bored? He returns to his roots.

So, I believe that the Legend of Notch does not end on a somber note. Rather, I'm betting that his legend is ongoing. He might never have an idea as good as Minecraft again, but I doubt we've seen the last of notch in the game development space.

Satoru Iwata's legacy and the death of fun in games

Photo by Official GDC [CC BY 2.0], via Wikimedia Commons

As anyone with even a passing interest in games knows by now, legendary developer and CEO of Nintendo of Japan Satoru Iwata passed away last month. His influence on Nintendo's direction is unmistakable, and his nature as a fun-loving "games first" kind of guy was unquestionable. The loss of such a juggernaut (still in his prime) is one that has resonated throughout the industry.

Truly, Iwata was unique; he was not just another spoke in the ever-grinding wheel of the games industry. The outpouring of respect for him in the wake of his passing goes to show that even though he may not have been as well-known as Miyamoto, people have an appreciation for his work and what he stood for.

Now that he is gone, the question looms over our heads—that ominous specter of "Where will Nintendo go now?" Iwata had a very clear vision, and was the steel wall of defense against the shareholders' push toward quick, cash-in mobile development. Without his guidance, can Nintendo stay on course? Everyone is asking this question; it's adequately covered in a thousand other places, some of them more respectfully than others.

More pressing to me, however, is the loss of the ideals that he championed. Nintendo, with Iwata behind the wheel, was the "fun" company. You go to Nintendo's console when you just want to have fun. Whether it's the Wii or the Wii U, no one can dispute that Nintendo's games have an intense, laser focus on pure "fun" without any of the cruft that weighs down modern games. I'm willing to bet that Mr. Iwata was a driving force behind that focus. 

That spirit of "fun" is so sorely lacking in the industry right now. Games are more serious than ever. Every time I hear the word "cinematic" in relation to a video game, I cringe. I don't need every game to be a Hollywood-esque drama, slamming me in the face with heavy topics and overdone cutscenes. Sometimes I just want to pick up a controller and have fun for 20-30 minutes. Guess what? Every time I find myself wanting to do that, I turn on the Wii U.

Without Iwata, will Nintendo continue to be the "fun" company? Probably. That's their niche now and most of the games they announced at E3 this year fit into it well enough. Indie games also frequently concentrate on raw fun factor. With the games industry in Japan waning, however, and the western studios busy working on their $100 million Hollywood-style blockbuster games, how long can Nintendo alone carry the torch?

What we need is more studios willing to sit down with a game and really chisel it down until nothing is left but pure fun. All of the "pet features" of the designers and developers need to be on the chopping block. If it's not fun or essential to facilitating the fun, cut it. We need more studios willing to say, "Video games are meant to be just one thing. Fun. Fun for everyone."

Anyway, that's my two cents on the issue. I hope no one thinks that I'm piggybacking on Iwata's death to push my own viewpoints. Rather, I think these issues are very core to who he was as a developer and CEO of Nintendo. I believe they deserve more discussion than they get, and I think he would have agreed.