Editorial

Microtransactions | Aggravating Necessities


When video games initiated their plan for world domination, they began as entertaining time sinks that allowed consumers to immerse themselves in something other than television or reality. From simple experiences like Pong to more complex titles such as the Mass Effect series or Metal Gear Solid, video games have transformed into something more enrapturing than mere amusement. This popular form of entertainment has become a lifestyle, a profession, and even a sport. However, with every positive evolutionary step come controversial elements, and one such point of contention in the industry is the implementation of microtransactions.

As video games have evolved over almost six decades, microtransactions have become increasingly prominent and, for many companies, necessary to sustain development as consumers' demands grow. Originally, the industry's primary business model revolved around the single purchase of a full title, supplemented by the coins players dropped into arcade machines. Perhaps in part due to the decline of arcades in the Western world, microtransactions have become one of the industry's most controversial elements, often requiring players to shell out more money to experience the entirety of a title's content. While many microtransactions are purely cosmetic, some place a gate between players and gameplay. Downloadable content (DLC) is one such gate, granting access to new stories and quests (e.g., Ghost Recon: Wildlands and Dragon Age II) or new playable characters and weapons (e.g., Injustice 2 and 2015's Star Wars: Battlefront).

To many, microtransactions are irritating hassles that prevent players from experiencing a title in its entirety in a single playthrough, unless they wait until all content has been released before beginning, or restarting, their adventure. Unfortunately, the truth is that microtransactions are a necessary evil in today's industry, given the ever-rising costs of developing AAA games; consumers' insatiable appetite for more titles; and the harsh, competitive climate developers and publishers face with each release.

Production costs are one of the main forces driving microtransactions. Development expenses such as licensing, voice acting, salaries, software, and soundtracks are essential to production, particularly for AAA titles. During and after a title's completion, marketing costs (television, print, and online advertisements, as well as events and launch parties) add substantially to the financial strain a release places on a company. Manufacturing must also be considered: arcade machine production, disc pressing, and even digital distribution all factor into a game's costs. Fortunately, the availability of digital copies has trimmed companies' expenditures somewhat compared to the past, when physical copies were the only form of distribution.


In the past, when a title was released, players could experience all of its content without worrying about paying for more down the road. In the modern era, players purchase a game and are essentially paying for a fraction of the total content. Yet in the last 10 to 15 years, the prices of new releases have barely changed once inflation is taken into account. Consider 2006's PlayStation 3: when the PS4's predecessor hit the market, the retail price was USD$599.99. By 2013, roughly 16% cumulative inflation had pushed the equivalent cost of the console to approximately USD$695. With sticker prices holding flat while real costs climb, the amount of content a standard purchase covers has shrunk: buying a title is more like paying for a teaser and then purchasing the rest of the experience in intervals as content is released. If a player owns a Season Pass, they are paying for that additional content upfront at a small discount. Though the modern era is deceptively less expensive than the 1990s and earlier, the current model siphons increasing amounts of income from loyal gamers.
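
For anyone checking the arithmetic, that adjustment is a single multiplication; the approximately USD$695 figure implies a cumulative 2006–2013 inflation rate just under the rounded 16% cited above:

$599.99 × 1.158 ≈ $695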

Those loyal gamers, however, are unwavering in their thirst for more. By generous estimates, the primary content of many games can be completed in a matter of hours or days. If the replay value is lacking, consumers may find themselves bored with a title early on, left waiting for more content, a sequel, or a different game entirely. And given the widespread belief that attention spans are shrinking, it is no surprise that producers feel the need to constantly update their content or release new features. With developers driven to perpetually produce something more, microtransactions are necessary to keep recouping expenses and turning a profit; video games are a business, after all.

Business concerns aside, if companies found a way to provide all of a title's content upon initial release, perhaps players could be satiated for a longer period. Doing so, however, would likely drive the cost of standard editions higher, making the industry even more expensive for consumers. With attention spans a concern, more content upfront might also grip players for longer, giving companies more time to work on their next full-scale project. Such an upfront model would work best for single-player games, though, and would be impractical for MMOs. Sadly, the elements mentioned above (rising expenses and insatiable customers) have also created a highly competitive climate for developers.

Competition between companies is more intense now than ever before. That heated competitiveness has spread to consumers, particularly in the rivalry between Xbox and PlayStation fans, not to mention PC gamers with their own passionate viewpoints. Competition, however, does not always show itself as players battling each other online. Frequently, it worms its way into the industry itself, with many companies searching for ways to make their releases more addictive and keep fans absorbed. While developers and publishers earn money from the initial purchase of a title (more so if customers buy deluxe or gold editions), retaining those consumers long enough for them to buy additional content is a difficult task. Holiday seasons are always rife with high-end games releasing within a small window of one another, and with how expensive a single title can be, many consumers have to pick and choose which games they want, creating a greater incentive for companies to edge out the competition.


In this competitive environment, microtransactions sustain developers as they produce more content to gain an edge over their rivals. This temporary solution gives developers time to adapt to an ever-changing business that persistently advances technologically and socially. Without competition, companies would have little incentive to advance, and the industry might still be stuck in an outdated era. While microtransactions are an aggravation for customers, they are also a saving grace for many developers.

The topic of microtransactions could be debated for months without either side finding common ground. The video game industry is always expanding, fans continuously salivate for the newest content, business expenses rise and fall with technological advancements, and fierce competition keeps driving innovation, all of which makes the implementation of microtransactions a muddled area of discussion. Nevertheless, at the rate titles are currently released across all platforms, microtransactions are unlikely to disappear altogether. Companies rely on those smaller purchases to stay afloat and turn a profit, however infuriating the constant cash siphons are for players.
