
Everybody makes mistakes | 10 Years Ago This Month

The power of the cloud, second-screen gaming, and more promised futures that never came to pass

The games industry moves pretty fast, and there's a tendency for all involved to look constantly to what's next without so much worrying about what came before. That said, even an industry so entrenched in the now can learn from its past. So to refresh our collective memory and perhaps offer some perspective on our field's history, GamesIndustry.biz runs this monthly feature highlighting happenings in gaming from exactly a decade ago.

This column has plenty of recurring themes, but one of the most obvious ones is big companies and renowned developers saying things and placing bets that would turn out to be disastrously ill-advised.

Sometimes it's because these companies are run by fools and these big names are in fact Pagliacci-level clowns. But sometimes it's because even people who know what they're doing are just wrong a whole lot, about a bunch of things. October of 2013 gave us plenty of examples of these points, and I'll do my diplomatic best to avoid saying which ones belong in which category.

The power of the cloud

Let's start with the console warriors, as Microsoft and Sony were both ramping up the launch push for their new consoles, the Xbox One and the PlayStation 4, and touting what they saw as differentiating factors between their similar-seeming capabilities.

For Microsoft, that meant Xbox Live lead program manager John Bruno was doing the rounds promoting Xbox Live Compute and the power of the cloud, the much-touted idea of having remote Microsoft servers doing some of the CPU heavy lifting to allow the actual console in the user's home to punch above its weight.

"We believe that there's going to be higher fidelity experiences over time, because of having that ability to offload those tasks that they often have to trade off with local resource," Bruno told us. "So we do expect higher fidelity games over time, we do expect that the cloud will just be better from a pure computing point of view."

The power of the cloud example given at the time was Forza Motorsport 5 changing up the way the series' Drivatar system trained a player's AI racer profile, using a bit of cloud-based machine learning to crunch numbers while a player was racing online. The end result was that the AI racing profiles players could play against offline would theoretically be better representations of how their friends and rivals would actually race, but it wasn't exactly an iron-clad example of cloud computing fundamentally changing how games would be enjoyed in the new generation.
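For the record, the underlying idea is simple enough to sketch. Below is a minimal, purely hypothetical Python illustration of the pattern being pitched: hand an expensive AI-training job to a remote server when it can answer in time, and fall back to the console's local CPU when it can't. Every name in it is invented for illustration; none of this reflects Microsoft's actual Xbox Live Compute API or how Drivatar really worked.

```python
# Hypothetical sketch of the offload-with-fallback pattern Microsoft was
# pitching. All names here are illustrative inventions, not the real
# Xbox Live Compute API or Turn 10's Drivatar code.
from concurrent.futures import ThreadPoolExecutor
from concurrent.futures import TimeoutError as CloudTimeout


def train_profile_locally(lap_times: list[float]) -> float:
    """Cheap fallback the console can afford: a plain average lap time."""
    return sum(lap_times) / len(lap_times)


def train_profile_in_cloud(lap_times: list[float]) -> float:
    """Stand-in for a heavier remote job, e.g. a model weighting recent laps."""
    weights = range(1, len(lap_times) + 1)
    return sum(w * t for w, t in zip(weights, lap_times)) / sum(weights)


def updated_driver_profile(lap_times: list[float], timeout_s: float = 0.5) -> float:
    """Ask the 'cloud' first; if it can't answer in time, compute locally."""
    with ThreadPoolExecutor(max_workers=1) as pool:
        job = pool.submit(train_profile_in_cloud, lap_times)
        try:
            return job.result(timeout=timeout_s)
        except CloudTimeout:
            job.cancel()  # give up on the remote result
            return train_profile_locally(lap_times)


print(updated_driver_profile([92.3, 90.1, 89.8, 88.7]))
```

The catch, as the rest of this section shows, is that the fallback path has to produce an acceptable experience on its own, at which point players start asking what the cloud was for.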

Just think, if not for the power of the cloud, we would not have had that one multiplayer mode in Crackdown 3

As for the higher-fidelity experiences that players might actually recognize as something that could not have been done before, it would be nearly two years before Microsoft showed off anything of the sort, and that would be an admittedly really cool tech demo for Crackdown 3 from Gamescom 2015.

It would be almost four more years before Crackdown 3 actually came out, and by that time the "power of the cloud" aspects had not only been relegated to a single Wrecking Zone multiplayer mode, but scaled back dramatically as well; Digital Foundry's 2019 assessment of the mode wondered whether the cloud computing aspect was even necessary given what the non-cloud-powered Xbox One remaster of Red Faction: Guerrilla had shown the year before.

The power of the cloud remains elusive, but not for lack of trying.

Around the same time we saw GameStop and Square Enix dabble in cloud tech before giving up. Google set up a AAA first-party Stadia studio system to take advantage of it and shut it down in less time than the dev cycle for a non-cloud-powered AAA game. Start-up Improbable used promises of cloud-powered gaming to build itself to a valuation of over $1 billion, but games using its cloud tech had a habit of not making it to launch and the company sold off its first-party studios to enthusiastically chase the metaverse bubble.

It's too soon to point and laugh at Scalar, the "power of the cloud" initiative Ubisoft announced last year, but the initial pitch sounded very much like every other cloud pitch so far: lots of hype about a "virtually infinite amount of computing power" and nothing about how exactly that would be noticeable to users in the finished game, how it would translate to more fun, or how to build an economically feasible business around a "virtually infinite" level of power consumption for each user.

In summary, Microsoft using the power of the cloud as an argument for people to invest in an Xbox One was a broken promise and a terrible idea. Like, solidly within the company's ten worst ideas around the system launch. (We've discussed a number of the others in previous columns this year.)

Wait just a second-screen…

A woman sits on a couch smiling. A bunch of Sony imagery floats around her with the words "GAME ON" in neon in one corner. It's unclear what's happening.
Sony's second-screen tech tried to address our society's dire understimulation problem

Not content to let others hog all of the generation's worst ideas, Sony borrowed one of them for itself. Having seen Nintendo and Microsoft embrace "second screen" gaming with the Wii U GamePad and the SmartGlass app, respectively, Sony was eagerly rolling out its own second-screen solution. If you thought that was a set-up for something about Vita, I'll just say, "lol no."

Nope, Sony was instead pushing its new PlayStation app, which combined the basic friends list and messaging functionality one would have expected from a PlayStation Network app with support for second-screen gameplay for PS4 titles. A version of the app is actually still available, somewhat surprisingly.


So what did all that buzz around second-screen gaming amount to? Not a tremendous amount, honestly. Here's a handy forum thread from 2015 with the most thorough list of games with second-screen functions that I could find, even including GameCube titles with GBA connectivity and some Dreamcast titles that used the screen on the system's VMU memory card for gameplay functions. (It's still not comprehensive; for instance, it only lists VMU playcalling in NFL 2K2, but that could be done in the previous two versions of the game as well. And it's obviously only updated through 2015.)

There are some really big games on that list, but there are also some I played and quite enjoyed without ever realizing they had any kind of second-screen support at all. Even for the games on the list that are fondly remembered, the second-screen functionality seems utterly incidental to their appeal or commercial success.

As our own Rob Fahey argued last year, second-screen gaming did eventually take off, but not so much in the form of companion apps or anything of the sort. Instead, the version of second-screen gaming that thrives is the kind where someone plays a game like Zelda: Tears of the Kingdom or Elden Ring, pausing as needed to visit bog-standard websites for answers to their many questions as they play through.

That version of second-screen gaming has thrived, but the one embraced so eagerly by companies like Sony, Microsoft, Nintendo, EA, and Ubisoft was practically dead on arrival.

A potpourri of poor predictions

So what else were people excited about that didn't really come to pass?

When asked what would define the coming generation of games, former Epic Games designer Cliff Bleszinski had a few ideas, none of which were centered on the new consoles about to launch. Bleszinski wasn't giving one of the "consoles are doomed" predictions we love so much – he explicitly said they would do fine – but he believed the industry would instead be defined by "things like the Steam Box and the Oculus Rift, honestly."

"Things like the Steam Box and the Oculus Rift"Cliff Bleszinski on what would define the next generation of gaming

Steam Boxes launched as Steam Machines in 2015, and Valve all but buried them in 2018 by removing them from the Steam storefront navigation. What Valve learned from the exercise has been carried forward into the Steam Deck, so it wasn't a complete loss, but it's fair to say they didn't define the next generation of games.

Oculus was only five months away from being acquired by Facebook for $2 billion, and the company has lost billions more chasing its VR and AR ambitions since then. VR remains a thing, but it's still a niche and hasn't defined much of anything in the past decade.

Meanwhile, Mad Catz must have heard that Ouya was going to be as big as the iPhone because it threw its ratty old hat into the Android-powered microconsole market with the announcement of the $250 M.O.J.O.

The GamesIndustry.biz style guide says to avoid excessive punctuation in acronyms for the sake of readability, but I can't help but insist on capitalizing M.O.J.O. and putting those periods in between every letter to emphasize how absurd the name was, because it's not actually an acronym.

A product shot of the Mad Catz MOJO microconsole, with a game pad on the left and a base unit on the right
What could M.O.J.O. stand for? The best we came up with was "Me olvido jugar Ouya" ("I forget to play Ouya").

So how did M.O.J.O. do? Well, in a feature about the company's bankruptcy in 2017 – that's your first clue – ex-Mad Catz product manager Aaron Smith told us, "There were a lot of bad bets made. There was a large investment in that M.O.J.O. gaming console concept, which really didn't pan out."

Finally, research firm SuperData scratched that "consoles are doomed" itch I was talking about with a new twist: "Consoles are doomed and they're going to take the industry down with them."

The firm warned the industry might be headed for a crash along the lines of the 1983 collapse of the US home gaming market, because it believed a higher-than-ever console installed base meant the market might be saturated and people would be reluctant to add more hardware to their living rooms.

I've got a few problems with that idea, starting with SuperData's focus on hardware saturation in the 1983 crash with no regard for the relative quality of the software on offer, and continuing through the decades of evidence we have that gamers are generally quite willing to kick their old systems to the closet (or the curb) in favor of the latest and greatest.

But did that crash really happen? Well, it didn't look anything like the 1980s crash in the US, but the number of new consoles sold did take a hit of about 80 million systems in the generation. That's less a reflection of SuperData's prognostic abilities than of Nintendo's inability to recapture lightning in a bottle.

The overall console market was fine, as suggested by the PS4 and Xbox One combining for about 180 million systems sold, which is almost ten million clear of the previous generation's tally for the PS3 and Xbox 360.

On the other hand, Nintendo followed up the cultural phenomenon of the Wii (almost 102 million sold lifetime) with the Wii U (fewer than 14 million sold).
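As it happens, the back-of-envelope arithmetic lands on roughly that 80 million figure using only the numbers cited in this column. A quick sketch (all figures approximate lifetime sales, in millions of units):

```python
# Back-of-envelope check of the generation-over-generation math above.
# All figures are approximate and taken from this column, not any
# official source; PS3/360 is derived from the "almost ten million" gap.
wii, wii_u = 102, 14           # Wii vs. Wii U
ps4_plus_xbone = 180           # PS4 and Xbox One combined
ps3_plus_x360 = 171            # "almost ten million" behind, per the column

nintendo_drop = wii_u - wii                     # roughly -88M at Nintendo alone
sony_ms_gain = ps4_plus_xbone - ps3_plus_x360   # roughly +9M for Sony/Microsoft
net_change = nintendo_drop + sony_ms_gain

print(f"Net change across the console market: {net_change}M")  # about -79M
```

In other words, Nintendo's collapse more than swallowed the modest gains Sony and Microsoft made between generations.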


So you see? Pretty much everyone screws up. And that's fine. Better than fine, in fact. If people only did something when they were sure of the results, they wouldn't do much at all. So really, we need people to be wrong. (Me especially, because these columns would be pretty dull if everyone was always right about what would happen.)

We need people to be wrong because being wrong means they were taking a risk, and taking risks (within reason) can pay tremendous dividends, monetarily and creatively.

That's why even coming off the clearly disappointing launches of the Wii U and 3DS, Nintendo president Satoru Iwata was at a startup conference in Japan ten years ago this month, not touting the latest buzzy innovation the company was chasing but instead proselytizing a willingness to faceplant in public.

"When we talk about Nintendo we cannot ignore [former president] Hiroshi Yamauchi who just recently passed away," Iwata said. "He always said that if you have failure, you don't need to be too concerned. You always have good things and bad, and this reflects the history of Nintendo. If you do the same thing as others, it will wear you out. Nintendo is not good at competing so we always have to challenge [the status quo] by making something new, rather than competing in an existing market."

Thankfully, the Switch is entering its dotage with 130 million systems sold and this nugget about the Nintendo approach gives us a nice feel-good ending to this segment. Just don't come back and read this in a few months when Nintendo announces its next hardware and we're all furious because it's something other than the obvious Switch 2 with backward compatibility for Switch games and cartridges.

What else happened in October of 2013?

● Layoffs were in fashion, as we saw cuts to Sega, Capcom Europe, Gree, Kabam, and TinyCo. At the same time, there were complete closures for EA's Victory Games (briefly known as the BioWare studio making a Command & Conquer game), The Playforge, and BioShock 2 developer 2K Marin.

The 2K Marin closure was particularly odd as Take-Two wouldn't even confirm the news, and months later Take-Two CEO Strauss Zelnick was still telling people 2K Marin was responsible for the future of the BioShock series. (Take-Two would eventually establish the new studio Cloud Chamber to make new BioShock games. It announced the studio in 2019 and we have yet to see its debut game.)

● Crytek had a Bad Tweet promoting the Xbox One launch-exclusive Ryse: Son of Rome by telling people the developers had eaten more than 11,500 dinners at work because of crunch. There was all kinds of angry pushback and we did a round-up of thoughts on crunch from established studio leadership with Warren Spector, Feargus Urquhart, Jason Rubin, and Jordan Thomas weighing in on the subject.

Disappointingly, not only were they resigned to the necessity of crunch, but they cited positive aspects of it as well, like bonding a team into a family, the reward of long hours of collaboration with close friends, and improved morale. Then again, these are all people who thrived in a world where crunch was common and more accepted, so perhaps that shouldn't come as too much of a surprise.

● Grand Theft Auto Online got off to a rough start, with Rockstar first warning people that the servers were buckling under player demand, and then freezing sales of the game's virtual currency until it could sort out its problems. (It would eventually be sorted out.)

● Remember that revolutionary handheld gaming system from an OG hardware maker that could also be slipped into a dock to put the video on your TV? Remember that it was pulled from sale in October of 2013 over licensing issues? Huh? I'm talking about the Neo Geo X, what kind of cheap knock-off are you thinking of?

● Activision Blizzard received the legal go-ahead for a controversial change in ownership. I know that sounds more like 5 Days Ago This Month material, but this was actually about a Delaware court overturning an injunction preventing an investment group led by Activision Blizzard CEO Bobby Kotick from buying a controlling interest in the company back from then-parent Vivendi.

That deal valued the company at a little over $13 billion. The price went up considerably in the decade since, with Microsoft forking over almost $69 billion to own Activision Blizzard.

● Military thriller author Tom Clancy died at the age of 66, which we bring up here mostly to remind you that while we're all worried about the dystopian future, it wasn't too long ago in our dystopian past that Ubisoft acquired the IP rights to Clancy's name for use in video games and related products. Yet somehow, even a decade after he's not around to cause a fuss anymore, we still don't have a single Tom Clancy's Just Dance game. What was even the point?

Good Call, Bad Call

GOOD CALL: Having just seen EA Sports head Andrew Wilson sell all his stock in Electronic Arts and then be named CEO, EA head of mobile development Frank Gibeau tried a similar negging tactic with Zynga, panning the company in the pages of the New York Times.

"They're not a mobile business," Gibeau said. "We're six or seven times their size in mobile. Zynga fell into a hole because they were completely focused on one platform, which is Facebook."

It took a little while for Gibeau's gambit to pay off, but Zynga would name him its CEO in 2016.

GOOD CALL: In a presentation for developers looking for funding, former IGDA head Jason Della Rocca told them, "The best time to take someone else's money is never. If you can bootstrap, if you have your personal savings, if you don't absolutely have to take external funds, you're always better off."

We've seen plenty of evidence supporting that idea since then, but it's been particularly apparent in recent months with the wave of restructuring, layoffs, and studio shutdowns as companies big and small are suddenly concerned about running sustainable businesses.

As Hooded Horse president and CFO Snow Rui said at the GamesIndustry.biz Investment Summit in August, "One of the saddest things that happened during the period of loose money was that investors encouraged – sometimes even pushed – a lot of previously self-sufficient studios into expanding their team and budget too much, to the point where budget outpaced market potential and the studios were no longer sustainable without investor support.

"When money became tight again, investors pulled their support, and the studios had to downsize or even shut down. This damage could have been avoided with the more cautious approach people are starting to adopt now."

BAD CALL: When asked about his favorite games, Zynga CEO and founder Mark Pincus told a gathering of entrepreneurs, "Right now, I'm pretty bored with all games." His questioner suggested a few possible mobile and social games, including some of Zynga's own, that might be exceptions, but Pincus doubled down, and complained that even FarmVille and CityVille weren't as appealing to him as they used to be.

"I want that addiction again," he said, reminding all of us how readily the mobile and social industry used to toss the spectre of compulsive gaming around until governments and doctors started to notice how that "addiction" wasn't just a cute way of saying you like a game but met a more clinical definition of the word.

BAD CALL: In an interview conducted by SimCity creator Will Wright, Wargaming CEO Victor Kislyi was even less subtle about it when talking about the company's free-to-play hit World of Tanks and dismissing concerns about mobile undermining the health of the PC as a gaming platform.

"It's not about the platform," Kislyi said. "Two years from now, there'll be something cool we don't know about today. It doesn't matter. It's not about the platform, it's about the experience. We're drug dealers of experiences. How people feel, the culture, the gameplay experience."

Super gross, right? But I think we can still top it.

BAD CALL: Quantic Dream made an obvious Bad Call by incorporating a debug camera mode in Beyond: Two Souls that would show people the completely nude model it made using the likeness of the celebrity it paid to appear in the game.

That required a very specific mélange of crappy behavior, laziness in covering your tracks, and an incredible lack of foresight in thinking it wouldn't be discovered. In other words, the exact same set of qualities that would later lead the studio to plagiarize its statement responding to accusations of a toxic, homophobic, and sexist workplace, copying it almost word-for-word from another studio's statement released just three months earlier.

I know I said I would try to avoid specifying where the Pagliacci-level clowns were in this column, but it's here. Here is where the Pagliacci-level clowns are.


Author

Brendan Sinclair

Managing Editor

Brendan joined GamesIndustry.biz in 2012. Based in Toronto, Ontario, he was previously senior news editor at GameSpot in the US.