Micro-transactions have always been a hot topic, from the days of buying small chunks of cosmetic DLC to pay-to-win options that grant a competitive advantage. Even the Loot Box phenomenon sweeping most competitive first-person shooters is a form of micro-transaction. But how bad is it? Have gamers grown accustomed to buying everything they want or need rather than unlocking it? Or are micro-transactions simply a peripheral investment alongside the base game?
GamingBolt spoke to deltaDNA CEO and co-founder Mark Robinson, whose company provides deep-data analysis and a real-time player marketing platform. We asked Robinson whether micro-transactions were good or bad, even in the absence of pay-to-win elements. Did he feel there was any merit to them, whether in the free-to-play sphere or in a fully commercial game?
Robinson said that, “The thing is, you’re always going to offend some players whatever you do, which is why player segments are so important to ensure that only those who are receptive to micro-transactions are exposed to them.
“In the commercial game space it’s the same. However, I think it all comes down to how reliant the developer is on the revenues from DLC. Where DLC revenues are vital, then I think micro-transactions force developers to create better player experiences, with more personalized targeting.”
There’s no denying that the aforementioned Loot Box system has offered some benefit in the form of free DLC, post-launch updates and the like. As much as Overwatch or Halo 5: Guardians could be faulted for popularizing this phenomenon, the content they offer as a result has been well-received. How other developers follow and expand upon this model remains to be seen.