Procedural Instruments Enable Powerful Ways of Making and Seeing Playable Systems

“No Man’s Sky is so big, the developers built space probes to explore it for them.” That’s from a Polygon report on what is probably the most hyped videogame of the moment. The main thing that seems to fascinate people about No Man’s Sky is its extensive use of procedural content generation (PCG). Put simply, PCG involves using software to generate game content instead of creating it by hand.
The game content created in this way can be anything. Visuals are the most common, but it can also include things players interact with, such as the artificial intelligence of a computer-controlled opponent or the placement of items in a level.
A few weeks ago I attended a symposium organised by the Amsterdam University of Applied Sciences (HvA) on “automated game design”. Over the course of the day various researchers and practitioners presented their efforts related to this topic.
Anders Bouwer of the HvA opened the symposium by talking about how the aim of game design automation is to speed things up. This can be achieved by accelerating the transition from design to software development, and by accelerating the flow of feedback from playtests back to design. The main way to do this is to create tools that sit between design and software development.
Two approaches to game design automation became apparent to me over the course of the day. The first and most obvious approach is to use software to automate work that a designer would otherwise have to do manually. This is part of the common story told about No Man’s Sky. The game’s developer is a small independent company which does not have the resources to create the game’s huge galaxy by hand. So instead, they have crafted software tools which generate planets, vegetation, animals and so on.
The second approach is to provide a designer with what are essentially tools for inspiration. Instead of automating things a human could also do by hand, a designer is enabled to do things she simply could not do without those tools. So it is not about speed and volume, but about quality. It is focused on process instead of product. Such tools can potentially surprise the designer. By contrast, the content produced by No Man’s Sky’s tools must adhere to rules which have been predetermined by its designers.
In one of the symposium’s first talks Joris Dormans argued for the latter approach. He argued for the use of procedural content generation tools in the service of improving the game design process. He wants them to be tools to think with.
Thinking with a tool implies a kind of partnership. Instead of being the slave or master of a technology, we become collaborators. In procedural content generation research, this approach is explored through mixed-initiative tools. “Mixed-initiative” refers to the fact that such tools allow for a continuous dialogue between designer and software. One example is Tanagra, a level design tool for 2D platformers. It generates levels in real time while the designer manipulates geometry or a more abstract representation of the level’s pacing.
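To make that dialogue a bit more concrete, here is a minimal sketch in Python of a mixed-initiative loop. It is my own illustration, not Tanagra’s actual code: the designer pins a few platform heights, and the generator fills in everything around them, so there is a complete level after every edit.

```python
import random

# Minimal sketch of a mixed-initiative loop (hypothetical, not Tanagra's code).
# A "level" is a list of platform heights; the designer pins some positions,
# and the generator fills in whatever the designer has not specified.

LEVEL_LENGTH = 20

def generate(pinned):
    """Fill unpinned positions with random heights, keeping pinned ones intact."""
    return [pinned.get(i, random.randint(0, 5)) for i in range(LEVEL_LENGTH)]

# The designer's side of the dialogue: pin a starting platform and a mid-level peak.
pinned = {0: 1, 1: 1, 10: 5}

# The tool's side: every time the designer changes a pin, regenerate the rest,
# so there is always a complete level on screen.
for edit in [{2: 1}, {15: 0}]:
    pinned.update(edit)          # designer manipulates the geometry
    level = generate(pinned)     # tool responds with a full level
    print(level)
```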
Mixed-initiative tools such as Tanagra are exciting because they augment a designer’s capabilities beyond speed and volume. Because of their fluid nature they become something like a musical instrument. A designer can perform with these tools. They allow for something similar to sketching. There is a real potential for surprise here, and for discovery. When making such tools, the question is not what outcomes they should reliably produce, but what processes they should reliably support.
In his talk, Joris described his ideal tool as a thing which gives him a lot of variations. He should then be able to tell it what he wants to see more of. In this way, a designer can more easily scan through a game’s possibility space. But this way of working does not enable her to see the full range of things a tool might generate. The designer in this case is a bit like the Hello Games probe, scanning the possibility space of No Man’s Sky, one animated gif at a time.
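A minimal sketch of that workflow might look like the following. This is my own illustration of the idea, not Joris’s actual tool: the generator produces a batch of variations, the designer marks a few favourites, and the next batch is mutated from those favourites.

```python
import random

# Hypothetical sketch of "show me variations, I'll tell you what I want more of".
# A candidate is just a vector of design parameters (e.g. enemy density, gap size).

def random_candidate():
    return [random.uniform(0, 1) for _ in range(4)]

def mutate(candidate, amount=0.1):
    """Produce a small variation on an existing candidate."""
    return [min(1.0, max(0.0, x + random.uniform(-amount, amount))) for x in candidate]

population = [random_candidate() for _ in range(8)]

for generation in range(3):
    # In a real tool the designer would pick favourites by eye;
    # an arbitrary scoring function stands in for her here.
    favourites = sorted(population, key=sum, reverse=True)[:2]
    # Next batch: variations on what the designer wanted to see more of.
    population = [mutate(random.choice(favourites)) for _ in range(8)]

print(population[0])
```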
What if we could zoom out, though? At this year’s Game Developers Conference, Tanagra creator Gillian Smith, accompanied by Julian Togelius, talked about “the power and peril of PCG”. Towards the end of this talk, they showed work on understanding the range of outcomes afforded by procedural content generation tools.
The approach is simple: first, criteria are determined by which outcomes are scored. In the case of Tanagra, a number of levels are generated and scored on how hard they are, and on how linear they are. Then, each level is plotted on a heat map. The result allows us to see the shape of Tanagra’s possibility space. In this way, the biases in a particular configuration are more easily uncovered.
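As a rough sketch of the method (using a toy level generator rather than Tanagra itself, and invented scoring metrics): generate many levels, score each on two axes such as difficulty and linearity, and bin the scores into a two-dimensional histogram. The dense cells of that grid show where the generator’s output actually clusters.

```python
import random
from collections import Counter

# Toy expressive-range analysis: not Tanagra, just an illustration of the method.

def generate_level():
    """A 'level' is a list of platform heights."""
    return [random.randint(0, 5) for _ in range(30)]

def difficulty(level):
    """Stand-in metric: fraction of large jumps between neighbouring platforms."""
    jumps = [abs(a - b) for a, b in zip(level, level[1:])]
    return sum(j >= 3 for j in jumps) / len(jumps)

def linearity(level):
    """Stand-in metric: how little the height varies across the whole level."""
    return 1.0 - (max(level) - min(level)) / 5.0

BINS = 10
heatmap = Counter()
for _ in range(10_000):
    level = generate_level()
    cell = (int(difficulty(level) * (BINS - 1)), int(linearity(level) * (BINS - 1)))
    heatmap[cell] += 1

# Each cell counts how many generated levels land in that (difficulty, linearity)
# region; the dense cells reveal the biases of this particular configuration.
print(heatmap.most_common(5))
```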
Equipped with such visualisations of possibility space, procedural content generation tools become instruments in a second sense, namely that of scientific instruments. They can be used like microscopes or macroscopes. We can use them to “see inside of” games and the tools used to make games. They afford powerful new ways of seeing.
It is this promise of new ways of seeing that I find most exciting about procedural content generation tools of the mixed-initiative type, or “procedural instruments” as I propose we call them from now on.
Games are just one kind of algorithmic culture, and more and more kinds of algorithms are used to generate media. However, in media criticism the term “algorithm” is often used rather naively. What the study of procedural content generation tools can teach us is that there is no such thing as a singular algorithm that generates a piece of media. These generators are assemblages of different approaches to computation, combined with different design practices.
Attending this symposium on automated game design has made me excited about procedural content generation tools aimed at augmenting the capabilities of designers. The big challenge ahead is getting such tools out of the research labs and into the hands of practitioners. This is a non-trivial task. Many of these tools are quite complicated and expensive to get right.
Such tools will only spread if we recognise the power they afford us. If we want to become better at making games, and playable systems more broadly, we need tools with which we can perform better, and with which we can see better. We need procedural instruments.
Addendum: Cases Presented During the Symposium
- Loren Roosendaal (IC3D Media) talked about how they made earthquake disaster relief training software for the Indonesian government. They were on a tight budget, so they created a tool which collapses buildings. These collapsed buildings were then used as a starting point for level design. He also talked about Cultura, negotiation training software developed for the Dutch Ministry of Defence. It measures player performance, and IC3D Media and the MoD use these measurements as input for better level design. They might in future do something like A/B testing of dialogue options.
- Thomas Buijtenweg (NHTV) demonstrated a generator he developed for collectible card game (CCG) cards. The generator provides a designer with a set of card options to select from, and balances those options using a formula for the card cost (a rough sketch of this idea follows the list).
- Daniel Karavolos (HvA) provided several examples of how he used a tool called Ludoscope to generate videogame levels. It is based on graphs, grids and transformation rules. The approach focuses on modeling the process of creating game content. (PDF)
- Rafael Bidarra (TU Delft) showed two projects. The first demonstrated generation of a meadow in real time based on a vegetation model. The second showed how they used grammar-based population generation to connect generated game geography with generated game stories. They generate settlements in the geography and relationships between those settlements based on resources and needs. These in turn give rise to “stories” (interactions between individuals in the settlements). The placement of settlements is done in a mixed-initiative way.
- Stefan Leijnen (HvA) and Paul Brinkkemper (Firebrush Studios) talked about MoneyMaker Deluxe, a game about fractional reserve banking. They used Machinations to describe models, which were then used as a blueprint for the generators in the game. (PDF)
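To illustrate the card generator mentioned above: the actual cost formula was not shared, so the weights and the formula below are invented for the sake of the sketch. The idea is simply to roll random stat combinations and only offer the designer those whose computed cost matches the mana cost she asked for.

```python
import random

# Hypothetical sketch of a balanced card generator; the cost formula and its
# weights are invented for illustration, not the formula shown at the symposium.

def card_cost(attack, health, abilities):
    """Toy linear cost model: attack, health and abilities each add to the cost."""
    return 0.5 * attack + 0.5 * health + 1.0 * len(abilities)

def generate_options(target_cost, count=5):
    """Offer the designer several stat combinations that all price out to the target."""
    options = []
    while len(options) < count:
        attack = random.randint(0, 8)
        health = random.randint(1, 8)
        abilities = random.sample(["flying", "taunt", "charge"], k=random.randint(0, 2))
        if round(card_cost(attack, health, abilities)) == target_cost:
            options.append((attack, health, abilities))
    return options

for option in generate_options(target_cost=4):
    print(option)
```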