- Stu - PharaohCreator
Is it OK for games to "get there in the end?"
This current generation of console gaming has seen a number of new trends appear - from day one patches to microtransactions and DLC season passes that are sold before their contents are revealed. Some of these are good things (anyone complaining about day one patches should remind themselves of how games used to be, when a broken game remained broken for eternity...) and some are slightly less good. Those are probably a conversation for another day. There's one trend, though, that I'd like to talk about for a while, if I may.
Over the weekend, following the announcement that The Division is now free via Microsoft's Game Pass service, I decided to download it and give it another go. After installing it and firing it up, I was confronted by the avatar I created when I played it shortly following its release all the way back in March 2016. I'd been excited to play The Division - like a lot of gamers I'd been wowed by a couple of impressive E3 showings. It was very much being talked about as a 'Destiny killer' and I was intrigued. And it was post-apocalyptic - which is pretty much guaranteed to grab my attention.
So, I'd picked it up and played it. After two or three hours I was beginning to doubt my decision; the gameplay wasn't clicking with me. The menus and UI were horrible. It didn't look as good as I'd expected, but more importantly I was experiencing bouts of appalling slowdown and lag. After 6.5 hours I gave up, and actually traded it in a week or two later. I felt really burned by the game, and also wondered if it was somehow my fault that I "didn't get it" when so many people around me were speaking of it so highly. I returned to Destiny, and continued merrily indulging in space magic and alien murder, chuntering to myself, every time I heard its praises being sung, that I'd not be going back to it. During April and May of 2016, I heard those praises being sung a lot.
And... here we are. Fast forward two and a bit years, and I've not only downloaded it again but also have played a second character well beyond the 6.5 hours I put into my first one. This time around, I'm having a lot of fun. Certainly far more fun than I remember having the first time. Everything about it feels better than I recall - and in 4K it's eye-meltingly pretty. I'm not sure it really has its claws into me, but I plan on at least finishing the campaign and, critically, I get it. I understand now what people love about it - even if they did have to wait a couple of years for the game to emerge from its chrysalis.
This, then, is the trend of this generation that maybe interests me most - and it's the trend of games that get there in the end.
I'm not a PC gamer, but the master race members I know always take great pleasure in telling me that this isn't new, and that the PC space is littered with games like this and has been since forever, and that all us console noobs have no idea. That's fine. As a console noob, this whole 'continuous development' for games is a new thing to me - and it's one of the most interesting trends of this generation not only because of how many games it seems to have affected, but also because of what it's prepping us, the console gamers, for. It's colouring our expectations - and it's happened so gradually that I'm not sure some of us have even noticed.
I can think of three games so far this generation that have been a part of this trend. There are probably many more, but when I sat back to think about this, about ten minutes before starting to write this, I could think of three. The Division was the first one. The others were Destiny and The Elder Scrolls Online. Both of those games launched with some problems. Both of them went through extensive post-launch development and patches. And all three attracted the ire of the usual internet critics.
However, looking at them now with a couple of years' additional development under their belts, all three of them have become the games they (arguably) should have been at the start. Why "arguably"? I hear you ask... well, because of how they were sold. They were all basically service games, but they all started off their lives being sold as boxed AAA games with DLC plans. All of them launched deeply flawed in one way or another. And all of them got there in the end.
There's no argument that this generation has extended the tails of games. They're bigger and they last longer than ever before - with more and more of them surrounded by noisy, active communities that demand more from the development teams than ever before in terms of content delivery, roadmap visibility, and process transparency. All of these are side-effects, I think, of games that get there in the end. This generation really cemented their position in gaming culture, and as this generation heads toward its end there are more and more of them appearing.
Every time a game launches, there's a bunch of people complaining about the state of it. Content presumed to have been cut from the core game and pushed into chargeable DLC is a particularly popular subject for the complainers, as is any form of microtransaction that can be interpreted as a manifestation of the much-despised 'pay to win' mechanic that's reared its ugly head in recent years. There's a load of people who play the game anyway, either not reading forums or games media, or doing either/both and just not caring. And finally, there's often a group of people who seem to be able to see a bigger picture - who are able to look at a game, see an entertaining core behind the curious and flawed systems that surround it, look beyond the bugs and the problems, and see what a game could become. For my part, I could manage that with Destiny - but I couldn't do it with The Division. The developers had far more of a clue of what their games could become... which probably has something to do with why they make games for a living and I write about them for shits and giggles.
So, having established that games are developing a habit of becoming good eventually, at the start of the article I posed a question: is it OK? The honest answer is that I don't know. I'm aware that posing questions and then failing to answer them is becoming a theme for these pieces, and even knowing that still doesn't make me capable of finding an answer.
I know that it's changing how I consume games, and changing what I expect of them. If I'm playing a single player game, with little to no online component, my expectations are normally very high - in that I expect the game that I've installed to run smoothly, and not need to be patched into another dimension for me to be able to get fun out of it. However, I also expect that experience to be extremely finite - possibly only lasting a few hours. I'm not one of those people that attaches value to the length of a game, necessarily, but I like knowing that I've spent my money on something of quality - and contributed to a studio who have produced something that I can see the value in, in the hopes that they can continue to produce more of those things.
If I buy an online game, my expectations are now for the game to change and evolve over time. I expect to play it for a period and then walk away from it, returning an indeterminate amount of time later to a game that is considerably different to the one that I left behind - with more content, changed gameplay, and modified systems. My expectation is for the community to be listened to (for better or for worse), and for developers to let us know what they have in mind and what they're working on - as long as they don't spoil anything.
Whether it's OK or not almost doesn't matter. In changing how I consume games, the publishers have done what they needed to do to get me into the mindset to be prepared to change how I pay for them. Damn, they're a cunning bunch.