I’m not really a multiplayer gamer. I used to be. I spent some time on World of Warcraft (about six months with the original vanilla version way back in the day). I spent probably way too much of my youth playing Phantasy Star Online and Power Stone for the Dreamcast. I played a ton of Halo and Halo 2. But I think the last game whose multiplayer I really got into was the original Call of Duty: Modern Warfare. I played several hours of Mass Effect 3’s multiplayer, and even though I enjoyed it, I really just played enough to get the most optimal ending and then never touched it again.
It just seems my priorities have changed. Not only do I value a dedicated single-player experience more, but my schedule is also random and very packed. I can’t really sit down for several hours every night to dedicate to a multiplayer game, and that basically means I get left in the dust by the people who do (or who play only that one game for months on end). That, in turn, makes multiplayer games not much fun. So essentially a game needs a single-player experience for me to give it a serious look.
This is not a new debate, but a lot of recent games have brought it back into the spotlight. Call of Duty: Ghosts and Battlefield 4 were critically derided for their poor single-player experiences. Tomb Raider’s multiplayer was virtually ignored. BioShock Infinite was criticized for not having multiplayer, and Titanfall has had some detractors for not having any single-player content. Sony’s The Order: 1886 also raised some eyebrows when it was announced as a story-driven, single-player title.
I’ve pretty much explained why I value my single-player games. But I also certainly understand the value of multiplayer. It extends the life of a game and stretches your gaming dollar further, because theoretically you can be playing those games for weeks, months or even a year or more before moving on to your next game. But we’ve seen the results when games try to serve both audiences: one mode ultimately suffers, either through bad design or because barely anybody plays it.
Let’s look at Destiny. It’s arguably the most high-profile game coming out in the next six months or so, and it’s clearly an open-world MMO shooter. But there are a lot of questions about its single-player content, and aside from a vague “yes, there is single-player content” response, we don’t know much about it. Is it any good? Is it just the multiplayer content played alone? Is the multiplayer going to suffer because the game is trying to pander to both audiences?
In my opinion, Bungie should probably just have responded with something along the lines of “if you don’t play multiplayer, this game isn’t for you.” And coming from someone who generally prefers a dedicated single-player experience, that’s fine. Not every game needs to be made for everybody. I’d say the gaming audience these days is large enough that you can create a game that is only multiplayer (or even online only, as PC games have been doing for years) and still have a success. I’d much rather see a game dedicated to one or the other, because, much like developing a game for multiple generations of consoles, trying to serve both limits you by default. Separate dedicated single-player and multiplayer experiences can and should exist without ever trying to meet in the middle. The last generation was full of games that tried to be both and largely failed. Let’s hope publishers and developers are a little smarter this time around.
What do you think, GameSided readers? Do you prefer single-player or multiplayer experiences? Can they exist separate but equal? Or do you want all your games to offer both? Let us know in the comments below!
The views expressed in this article explicitly belong to the author, and do not necessarily reflect the views of, nor should be attributed to, GameSided as an organization.