In the video game industry, a first-party developer is part of a company which manufactures a video game console and develops exclusively for it. First-party developers may use the name of the company itself (such as Nintendo), have a specific division name (such as Sony's Polyphony Digital) or have been an independent studio before being acquired by the console manufacturer (such as Rare or Naughty Dog).[6] Whether by purchasing an independent studio or by founding a new team, the acquisition of a first-party developer involves a huge financial investment on the part of the console manufacturer, which is wasted if the developer fails to produce a hit game in a timely manner.[7] However, using first-party developers saves the cost of having to make royalty payments on a game's profits.[7]

Second-party developer is a colloquial term often used by gaming enthusiasts and media to describe game studios that take development contracts from platform holders and produce games exclusive to that platform.[8] To balance the inability to release their games for other platforms, second-party developers are usually offered higher royalty rates than third-party developers.[7] These studios may have exclusive publishing agreements (or other business relationships) with the platform holder, but maintain independence, so that upon completion or termination of their contracts they are able to continue developing games. Examples include Insomniac Games (originally a second party for Sony), Bungie (originally a second party for Microsoft) and Rareware (originally a second party for Nintendo).

In 1979, Activision became the first third-party video game developer[9] (the term "second party" originally referred to the consumers). A third-party developer may also publish games, or work for a video game publisher to develop a title. Both publisher and developer have considerable input in the game's design and content. However, the publisher's wishes generally override those of the developer.

The business arrangement between the developer and publisher is governed by a contract, which specifies a list of milestones intended to be delivered over a period of time. Through these milestones, the publisher verifies that work is progressing quickly enough to meet its deadline, and can direct the developer if the game is not meeting expectations. When each milestone is completed (and accepted), the publisher pays the developer an advance on royalties. Successful developers may maintain several teams working on different games for different publishers. Generally, however, third-party developers tend to be small, close-knit teams. Third-party game development is a volatile sector, since small developers may be dependent on income from a single publisher; one canceled game may be devastating to a small developer. Because of this, many small development companies are short-lived.

A common exit strategy for a successful video-game developer is to sell the company to a publisher, becoming an in-house developer. In-house development teams tend to have more freedom in the design and content of a game compared to third-party developers. One reason is that since the developers are employees of the publisher, their interests are aligned with those of the publisher; the publisher may spend less effort ensuring that the developer's decisions do not enrich the developer at the publisher's expense.

In recent years, larger publishers have acquired several third-party developers. While these development teams are now technically "in-house", they often continue to operate in an autonomous manner (with their own culture and work practices). For example, Activision acquired Raven (1997); Neversoft (1999), which merged with Infinity Ward in 2014; Z-Axis (2001); Treyarch (2001); Luxoflux (2002); Shaba (2002); Infinity Ward (2003) and Vicarious Visions (2005). All these developers continue operating much as they did before acquisition, the primary differences being exclusivity and financial details. Publishers tend to be more forgiving of their own development teams going over budget (or missing deadlines) than third-party developers.

A developer may not be the primary entity creating a piece of software, usually providing an external software tool which helps organize (or use) information for the primary software product. Such tools may be a database, Voice over IP, or add-in interface software; this is also known as middleware. Examples of this include SpeedTree and Havok.

Independents are software developers which are not owned by (or dependent on) a single publisher. Some of these developers self-publish their games, relying on the Internet and word of mouth for publicity. Without the large marketing budgets of mainstream publishers, their products may receive less recognition than those of larger publishers such as Sony, Microsoft or Nintendo. With the advent of digital distribution of inexpensive games on game consoles, it is now possible for indie game developers to forge agreements with console manufacturers for broad distribution of their games.

Other indie game developers create game software for a number of video-game publishers on several gaming platforms.[citation needed] In recent years this model has been in decline; larger publishers, such as Electronic Arts and Activision, increasingly turn to internal studios (usually former independent developers acquired for their development needs).[citation needed]

Video-game development is usually conducted in a casual business environment, with T-shirts and sandals common work attire. Many workers find this type of environment rewarding and pleasant professionally and personally.[10] However, the industry also requires long working hours from its employees (sometimes to an extent seen as unsustainable).[11] Employee burnout is not uncommon.[10]

An entry-level programmer can make, on average, over $66,000 annually, provided they secure a position at a medium-to-large video game company.[12] An experienced game-development employee, depending on his or her expertise and experience, averaged roughly $73,000 in 2007.[13] Indie game developers may earn only between $10,000 and $50,000 a year, depending on how financially successful their titles are.[14]

In addition to being part of the software industry,[citation needed] game development is also within the entertainment industry; most sectors of the entertainment industry (such as films and television) require long working hours and dedication from their employees, such as willingness to relocate and to develop games that do not appeal to their personal taste. The creative rewards of work in the entertainment business attract labor to the industry, creating a competitive labor market which demands a high level of commitment and performance from employees. Industry communities, such as the International Game Developers Association (IGDA), are conducting increasing discussions about the problem; they are concerned that working conditions in the industry cause significant deterioration in its employees' quality of life.[15][16]

Some video game developers (such as Electronic Arts) have been accused of the excessive invocation of "crunch time".[17] "Crunch time" is the point at which the team is thought to be failing to achieve milestones needed to launch a game on schedule. The complexity of work flow and the intangibles of artistic and aesthetic demands in video-game creation create difficulty in predicting milestones.

Most game-development engineers and artists in the United States are considered salaried employees; as "exempt non-hourly-paid professionals", they are not subject to state laws governing overtime.[18] An exception is California, where software developers must be paid a specified minimum hourly wage to be classified as exempt.[19] In 2008, following the amendment of California Labor Code Section 515.5 by Bill SB 929,[20] that minimum wage was $36 per hour (or $74,880 per year).

Attention to "crunching" was drawn by a 2004 blog post entitled ea_spouse.[21] The protest against crunch time was posted by Erin Hoffman (fiancée of Electronic Arts developer Leander Hasty), who contended that her life was being indirectly destroyed by the company's work policy. This led to debate in the industry but no visible changes until March 2005, when Electronic Arts announced internally that it was planning to extend overtime pay to some employees not currently eligible. As senior game developers age and family responsibilities become more important, many companies are moderating the worst crunch-time practices to attract better-quality staff.[22]

A similar situation was brought to light in January 2010, when a collective group of "Rockstar Spouses", the spouses of developers at Rockstar San Diego, posted an open letter criticizing the management of the studio for deteriorating working conditions for their significant others since March 2009, which included excessive crunch time. This was followed by several former Rockstar employees posting similar complaints about their time there.[23][24] The International Game Developers Association considered that Rockstar's working conditions were exploitative and harmful.[25] A similar concern of crunch time at the same studio arose near the release of Red Dead Redemption 2 in October 2018.[26]

Anonymous Epic Games employees speaking to Polygon described crunch time of 70- to 100-hour weeks for some staff ever since the release of Fortnite Battle Royale, which has drawn a player base of millions. While these employees were getting overtime pay, there remained issues of health concerns and an inability to take time off without it reflecting negatively on their performance.[27]

Similar to other tech industries, video game developers are typically not unionized. This is a result of the industry being driven more by creativity and innovation than by production, the lack of distinction between management and employees in the white-collar area, and the pace at which the industry moves, which makes union actions difficult to plan out.[35] However, when situations related to crunch time become prevalent in the news, there have typically been follow-up discussions about the potential to form a union.[35] A survey performed by the International Game Developers Association in 2014 found that more than half of the 2,200 developers surveyed favored unionization.[36]

In 2016, voice actors in the Screen Actors Guild–American Federation of Television and Radio Artists (SAG-AFTRA) union doing work for video games went on strike against several major publishers, demanding better royalty payments and provisions related to the safety of their vocal performances, when their union's standard contract was up for renewal. The voice actor strike lasted for over 300 days into 2017 before a new deal was made between SAG-AFTRA and the publishers. While this had some effects on a few games within the industry, it brought to the forefront the question of whether video game developers should unionize.[35][37][38]

A grassroots movement, Game Workers Unite, was established around 2017 to discuss and debate issues related to unionization of game developers. The group came to the forefront during the March 2018 Game Developers Conference by holding a roundtable discussion with the International Game Developers Association (IGDA), the professional association for developers. Statements made by the IGDA's then-executive director Jen MacLean relating to IGDA's activities had been seen as anti-union, and Game Workers Unite desired to start a conversation to lay out the need for developers to unionize.[39] In the wake of the sudden near-closure of Telltale Games in September 2018, the movement again called on the industry to unionize. The movement argued that Telltale had given no warning to its 250 laid-off employees, having hired additional staff as recently as a week prior, and left them without pensions or health-care options; it was further argued that the studio considered this a closure rather than layoffs, so as to get around the advance-notice requirement of the Worker Adjustment and Retraining Notification Act of 1988 preceding layoffs.[40] The situation was argued to be "exploitative", as Telltale had been known to force its employees to frequently work under "crunch time" to deliver its games.[41] By the end of 2018, a United Kingdom trade union, Game Workers Unite UK, an affiliate of the Game Workers Unite movement, had been legally established.[42]

A survey of over 4,000 game developers run by the Game Developers Conference in early 2019 found that 47% of respondents felt the video game industry should unionize.[43]

Following Activision Blizzard's financial report for the previous quarter in February 2019, the company said that they would be laying off around 775 employees (about 8% of their workforce) despite having record profits for that quarter. Further calls for unionization came from this news, including the AFL-CIO writing an open letter to video game developers encouraging them to unionize.[44]

In 1989, according to Variety, women constituted only 3% of the gaming industry.[45] In 2013, Gary Carr (the creative director of Lionhead Studios) predicted that within the next 5 to 10 years, the games development workforce would be 50% female.[45] According to Gamasutra's Game Developer Salary Survey 2014, women in the United States made 86 cents for every dollar men made. Women in game design had the closest equity, making 96 cents for every dollar men made in the same job, while women audio professionals had the largest gap, making 68% of what men in the same position made.[46]