In this talk, computational creativity researcher Michael Cook shows how cutting-edge ideas from computational creativity can have a huge impact on how games are made and who makes them: by building smart software that can work alongside people as equals; by building tools that let us hold vast generative spaces in the palm of our hand; and by building an AI that can make games as an independent member of the games community. He'll discuss his latest research in automated game design and procedural content generation, paint a picture of a future games industry changed by these ideas, and show how these ideas can be used by game developers today.
Mike Cook is a Senior Research Fellow at the University of Falmouth's Metamakers Institute, where he researches the intersection of automated game design, computational creativity and procedural content generation. He is best known as the creator of ANGELINA, an AI that designs games. ANGELINA has had its work exhibited at galleries and game expos, has released a Top 500 Android game, has been commissioned by New Scientist, and was the first AI to enter a game jam. He is also the designer of Danesh, a tool for analysing and experimenting with procedural content generators. Mike also founded the Procedural Generation Jam, PROCJAM, which brings together experts and newcomers to learn about and experiment with generative software, organising annual events and talk days and producing free resources to make generative software accessible to all. He also writes about AI, ethics and creativity for outlets such as The Guardian and Rock, Paper, Shotgun, and develops games under the label Cut Garnet Games.
As games move beyond the screen and into mixed reality, our bodies are becoming increasingly involved in the game design process. How can we use machine learning to discover and learn from the ways our players move? This talk explores how machine learning can enable physical games that use sensors and controllers in novel ways. Part dive into the history of game controllers, part forward exploration, this talk inspires designers and developers to think beyond the joystick to embodied experiences that leverage the whole body in the game's play space.
Phoenix Perry creates physical games and embodied experiences. Her work looks for opportunities to bring people together to raise awareness of our collective interconnectivity. Her current research at Goldsmiths, University of London looks at leveraging our other senses, with a particular focus on sound and skin-based feedback to trigger affective responses. A consummate advocate for women in game development, she founded the Code Liberation Foundation, an organisation that teaches women to program games for free. Since starting in 2012, the project has reached over 3,000 women between the ages of 16 and 60 in the New York and London areas. By fostering professional growth and mentoring new leaders in the field, she strives to infuse the industry with new voices. Currently, she is a Lecturer in Physical Computing at Goldsmiths, University of London and the program leader of the Independent Games and Playable Experience MA, which she authored. Before that she was a Senior Lecturer at HKU in the Netherlands and an Adjunct Professor at NYU Tandon School of Engineering, the NYU Game Center and NYU ITP. She also founded Dozen Eyes Games, a studio she ran for three years focused on games and installations for social change; a highlight of this project was the creation of a game with the US State Department to help newly arrived refugees in the US. Concurrently, she owned and ran Devotion Gallery in Williamsburg, Brooklyn from 2009 to 2014, a vital cultural venue that produced over 200 classes, exhibitions and events. She is currently doing a PhD at Goldsmiths, University of London and has an MS from NYU Tandon School of Engineering.
In this talk, Alan will introduce the challenges involved in bringing expressive virtual characters to life in games and entertainment media. He will talk through a number of technological challenges his team faced and techniques they found useful in text generation, language understanding, driving character animation and voice, and integrating with third-party technologies. He will describe some techniques used to extract interesting character performances from dialogue in order to maximise authored content. Finally, he will discuss some areas of future development being considered and where he thinks the state of the art is going.
Alan started as a programmer for developer/publisher IncaGold in 1999, where he was involved behind the scenes in a number of game releases. He took a break from video games in 2008 for a Senior Engineer role at sports tracking company Venatrack. Tasked with tracking player and ball movement in Premier League football matches through a system of HD cameras, he tackled problems in image processing, computer vision, visualization, real-time systems, and parallel processing. He was Technical Lead on city/story builder Little Invasion Tales, helping to build the technology foundation for dynamic storytelling and agent behaviour. He has been with Spirit AI since 2016, working on their core Character Engine technology as Lead Engineer and managing technology direction for generative text, machine learning, and virtual characters.
to be added
Joseph DeLappe is Professor of Games and Tactical Media at Abertay University in Dundee, Scotland, where he relocated in early 2017 after 23 years directing the Digital Media program at the University of Nevada, Reno. A native San Franciscan, he has been working with electronic and new media since 1983. His work in online gaming performance, sculpture and electromechanical installation has been shown throughout the United States and abroad, including exhibitions and performances in Australia, the United Kingdom, China, Germany, Spain, Belgium, the Netherlands, Mexico, Italy, Peru, Sweden and Canada. In 2006 he began the project dead-in-iraq, typing, consecutively, all of the names of America's military casualties from the war in Iraq into America's Army, the US Army's online first-person shooter recruiting game. In 2013, he rode a specially equipped bicycle to draw a 460-mile-long chalk line around the Nellis Air Force Range, surrounding an area large enough for a solar farm that could power the entire United States. More recently he developed the concept behind "Killbox", an interactive computer game about drone warfare created with the Biome Collective in Scotland. Killbox was recently nominated for a BAFTA Scotland (British Academy of Film and Television Arts) award as "Best Computer Game".
He has lectured throughout the world regarding his work, including at the Museum of Modern Art in New York City. He has been interviewed on CNN, NPR, the CBC, the Australian Broadcasting Corporation and The Rachel Maddow Show on Air America Radio. His works have been featured in the New York Times, The Sydney Morning Herald, Artweek, Art in America and the 2010 Routledge book "Joystick Soldiers: The Politics of Play in Military Video Games", among many others. He has authored two book chapters: "The Gandhi Complex: The Mahatma in Second Life" in Net Works: Case Studies in Web Art and Design (New York, Routledge, 2011) and "Playing Politics: Machinima as Live Performance and Document" in Understanding Machinima: Essays on Filmmaking in Virtual Worlds (London, UK, Continuum, 2012).