Mixed Reality Laboratory

Interactive Works

The Mixed Reality Laboratory (MRL), sometimes in partnership with local and international artists, has produced a number of renowned and provocative prototype designs, interactive exhibits, and engaging artistic experiences. Below is a selection of the works we have produced over the years. We are also committed to transferring knowledge to academia and industry, and routinely publish the outcomes of our research at internationally recognised conferences and in journals.


 
Interactive works from: today — 2010 | 2009 — 1999


 

2016

Carolan Guitar

Every guitar tells a story, from the tonewoods that form it, to the craft of its making, to the players that own it, to the places it visits, and to the many songs that it sings. This blog tells the story of a unique guitar: one created with the express purpose of capturing and telling its own life history.

Publications

Steve Benford, Adrian Hazzard, Alan Chamberlain, Kevin Glover, Chris Greenhalgh, Liming Xu, Michaela Hoare, and Dimitrios Darzentas. 2016. Accountable Artefacts: The Case of the Carolan Guitar. In Proceedings of the 2016 CHI Conference on Human Factors in Computing Systems (CHI '16). ACM, New York, NY, USA, 1163-1175. https://doi.org/10.1145/2858036.2858306

Steve Benford, Adrian Hazzard, Alan Chamberlain, Kevin Glover, Chris Greenhalgh, Liming Xu, Michaela Hoare, and Dimitrios Darzentas. 2016. Experiencing the Carolan Guitar. In Proceedings of the 2016 CHI Conference Extended Abstracts on Human Factors in Computing Systems (CHI EA '16). ACM, New York, NY, USA, 3651-3654. https://doi.org/10.1145/2851581.2890264


Digitopia

The Digitopia project is a collaboration with the Tom Dale Dance Company exploring audience engagement with graphics and generative music. The company developed Digitopia, a show – integrating contemporary dance, electronic music and digital technology – which is touring 16 UK venues between February and April 2016.

The show is about Hex, a simple two-dimensional line who can make angles, but only up to six! His favourite shape is a hexagon. He’s happy with this, but one day he really wants to make a curve. With a lot of effort he learns to bend, then multiply, and suddenly he can create all kinds of shapes, eventually popping into three dimensions, discovering he can turn into anything he wants to. 


Horse Automated Behaviour Identification Tool (HABIT)

HABIT is an interdisciplinary animal-computer interaction research project that could help us understand what animals are thinking and feeling. The aim of the software is to identify horse behaviour in unconstrained (amateur) video so that we humans can interpret those reactions and understand why they are happening.

By bringing together experts in animal-computer interaction, equitation science, ethology, animal behaviour and biomedical engineering, HABIT aims to develop a software program that will automatically identify the behaviour horses are exhibiting and tell us whether the horse is stressed, sick or suffering.

Publications

Steve North. 2016. Do Androids dream of electric steeds?: the allure of horse-computer interaction. interactions 23, 2 (February 2016), 50-53. https://doi.org/10.1145/2882529

Carol Hall and Amanda Roshier. 2016. Getting the measure of behavior … is seeing believing?. interactions 23, 4 (June 2016), 42-46. https://doi.org/10.1145/2944164

Steve North and Clara Mancini. 2016. Introduction. interactions 23, 4 (June 2016), 34-36. https://doi.org/10.1145/2946043

Steve North, Carol Hall, Amanda Roshier and Clara Mancini. 2015. Habit (Horse Automated Behaviour Identification Tool) Video Poster. In The Second International Congress on Animal Computer Interaction (ACI2015). Proceedings of the 2015 International Workshops on Advances in Computer Entertainment Conference (ACE2015). https://doi.org/10.13140/RG.2.1.4924.6480

Steve North, Carol Hall, Amanda Roshier and Clara Mancini. 2015. Habit: Horse Automated Behaviour Identification Tool – a Position Paper. In Proceedings of ACI@BHCI (Animal Computer Interaction Workshop), British HCI 2015. https://doi.org/10.13140/RG.2.1.3395.0881


2015

Imagine Digital

Roma Patel facilitated a series of creative workshops over 13 weeks with care home residents from Nottinghamshire Hospice, The Firs, Sycamore House and Kenyon Lodge. A series of digital technologies were introduced to older adults, with some playful results. Roma drew her inspiration from the self-taught artist Joseph Cornell, who produced work in the 1950s and 60s. Cornell's most characteristic artworks were boxed assemblages created from found objects: simple shadow boxes, usually fronted with a glass pane, in which he arranged eclectic fragments of photographs or Victorian bric-a-brac. Many of his boxes, such as the famous Medici Slot Machine boxes, are interactive and meant to be handled.

Initial workshops engaged residents with pre-made boxes containing a range of objects, a camera and drawing materials as a way of stimulating ideas and getting to know individuals better. This was the starting point at which feelings about and experiences of the arts, technology and cultural activities were gleaned, providing a better understanding of people's preferences and shaping the work that followed.


2014

MOVE

MOVE is an architectural prototype and research platform for exploring the relationship between body movements and movement in adaptive architecture. Using a Kinect motion sensor, MOVE tracks a person's gross body movements and allows them to be flexibly mapped to the movement of building components. In this way, a person inside MOVE can immediately explore the spatial configurations created around them through the body. This can be done live, by recording and replaying body movements, or through manual choreography of the building elements. Trial feedback has shaped our four-stage iterative design and development process. The video shows Tetsudo performers Hamish Elliott and Natalie Heaton exploring interaction with MOVE.
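The flexible mapping idea above can be sketched in a few lines. This is a minimal illustration only, not the MRL codebase: all names (`make_mapping`, `apply_mappings`, the joint and panel labels) are invented, and it assumes joint positions already normalised to the 0–1 range.

```python
# Sketch: flexibly mapping tracked body movement to building components,
# in the spirit of MOVE. Each mapping links one tracked joint axis to one
# actuator, with a gain and offset so one gesture can drive different elements.

def make_mapping(joint, axis, actuator, gain=1.0, offset=0.0):
    """Describe how one body measurement drives one building component."""
    return {"joint": joint, "axis": axis, "actuator": actuator,
            "gain": gain, "offset": offset}

def apply_mappings(skeleton, mappings, lo=0.0, hi=1.0):
    """skeleton: {joint: {axis: value}} with values normalised to 0..1.
    Returns {actuator: target position}, clamped to the actuator's range."""
    targets = {}
    for m in mappings:
        value = skeleton[m["joint"]][m["axis"]]
        target = m["gain"] * value + m["offset"]
        targets[m["actuator"]] = max(lo, min(hi, target))
    return targets

# Example: raising the right hand raises one panel; leaning drives a second.
mappings = [
    make_mapping("right_hand", "y", "panel_1", gain=1.0),
    make_mapping("spine", "x", "panel_2", gain=0.5, offset=0.25),
]
skeleton = {"right_hand": {"y": 0.8}, "spine": {"x": 0.5}}
print(apply_mappings(skeleton, mappings))  # panel_1 -> 0.8, panel_2 -> 0.5
```

Recording and replaying body movements then amounts to storing a sequence of skeleton frames and feeding them back through the same mapping function.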


The Prediction Machine

The Prediction Machine is an interactive artwork, based on end-of-the-pier fortune telling machines. The machine marks 'moments of climate change' in our everyday lives and prints out 'climate fortunes' for 30 years in the future, which visitors to the machine can take away with them. These predictions use live weather data captured at a local weather station, matched with projected climate data from future climate models provided by scientists at the UK Met Office, and observations by local people. The machine links to an interactive website that combines narrative and visual representations of the data with more traditional science communication.

The Prediction Machine has been developed in collaboration with local people in the East Midlands (UK) and Rio State (Brazil), engineers, computer scientists, climate scientists and researchers.
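The pairing of a live reading with a projected future value can be illustrated with a toy example. Everything here is invented for illustration: the function name, the projected temperatures and the fortune wording are assumptions, not the artwork's actual data or logic.

```python
# Illustrative sketch only: pairing a live weather reading with a projected
# value for the same month 30 years ahead, in the spirit of The Prediction
# Machine's 'climate fortunes'. The projection table below is made up.

# Assumed projected mean temperatures (deg C) for one site, 30 years ahead.
PROJECTED_30Y = {"jan": 6.1, "apr": 11.9, "jul": 20.4, "oct": 13.2}

def climate_fortune(month, live_temp_c):
    """Compare a live reading against the projection and print a 'fortune'."""
    projected = PROJECTED_30Y[month]
    delta = projected - live_temp_c
    trend = "warmer" if delta > 0 else "cooler"
    return (f"Today ({month}) it is {live_temp_c:.1f} C here. "
            f"In 30 years a day like this may be {abs(delta):.1f} C {trend}.")

print(climate_fortune("jul", 18.0))
```

In the real installation the live side of this comparison came from a weather station feed rather than a hand-typed value.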

Publications
Rachel Jacobs, Steve Benford, Ewa Luger, and Candice Howarth. 2016. The Prediction Machine: Performing Scientific and Artistic Process. In Proceedings of the 2016 ACM Conference on Designing Interactive Systems (DIS '16). ACM, New York, NY, USA, 497-508. https://doi.org/10.1145/2901790.2901825

2013

Ministry of Provenance

Developed with the Nottingham-based artist group Urban Angel, the game The Apocalypse of the Ministry of Provenance mixes theatre, art, gaming and research: players untangle events in a complex mystery thriller, based on the activities of a sinister government organisation, by understanding the history of ownership and modification of objects.

The game involves players answering questions and completing tasks relating to the digital provenance of objects. These decisions will then be used to assess how people process information and how important provenance — the origin of information — is in decision-making.

Publications

Khaled Bachour, Richard Wetzel, Martin Flintham, Trung Dong Huynh, Tom Rodden, and Luc Moreau. 2015. Provenance for the People: An HCI Perspective on the W3C PROV Standard through an Online Game. In Proceedings of the 33rd Annual ACM Conference on Human Factors in Computing Systems (CHI '15). ACM, New York, NY, USA, 2437-2446. https://doi.org/10.1145/2702123.2702455


2012

Cargo

Cargo is a mixed reality street game with elements of treasure hunt and tag. A team of six to eight players tries to help one member (called Cargo) escape the city before being caught by police. Players race against the clock to gather enough credits to win the game, which they do by visiting, or checking in at, a number of game stations scattered around the city. When a player checks in, the station either rewards them with credit or, if the station is "dead", wipes out their credit total. Players need to figure out which stations are dead and which are live. They are aided in this by a software agent we call the Instructor, who calls players on their phones with relevant in-game information.
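The check-in rule described above is simple enough to sketch. This is a hedged illustration, not the game's implementation: the station structure and reward values are invented.

```python
# Sketch of Cargo's check-in mechanic: a live station awards credits,
# while a dead station wipes out the team's credit total, so players
# must deduce which stations are safe to visit.

def check_in(credits, station):
    """station: {'dead': bool, 'reward': int}. Returns the new credit total."""
    if station["dead"]:
        return 0                     # a dead station wipes the total
    return credits + station["reward"]

stations = [{"dead": False, "reward": 10},
            {"dead": False, "reward": 15},
            {"dead": True, "reward": 0}]

total = 0
for s in stations[:2]:
    total = check_in(total, s)       # 25 after two live stations
total = check_in(total, stations[2]) # dead station: back to 0
```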

Publications

Stuart Moran, Nadia Pantidi, Khaled Bachour, Joel E. Fischer, Martin Flintham, Tom Rodden, Simon Evans, and Simon Johnson. 2013. Team reactions to voiced agent instructions in a pervasive game. In Proceedings of the 2013 international conference on Intelligent user interfaces (IUI '13). ACM, New York, NY, USA, 371-382. https://doi.org/10.1145/2449396.2449445


I'd Hide You

I'd Hide You is an online game of stealth, cunning and adventure. Jump on board with a team of illuminated runners, live from the streets, as they roam the city trying to film each other.

In I'd Hide You, see the world through the runners' eyes as they stream video: ducking and diving, chatting to passers-by, taking you down the back alleys to their secret hiding places. And play against your friends online at the same time. Use your wits to choose which runner to ride with. Get a snap of another runner on screen without getting snapped and you score a point; get snapped by someone else and you lose a life.

Publications
Stuart Reeves, Christian Greiffenhagen, Martin Flintham, Steve Benford, Matt Adams, Ju Row Farr, and Nicholas Tandavantij. 2015. I'd Hide You: Performing Live Broadcasting in Public. In Proceedings of the 33rd Annual ACM Conference on Human Factors in Computing Systems (CHI '15). ACM, New York, NY, USA, 2573-2582. https://doi.org/10.1145/2702123.2702257

The Malthusian Paradox

The Malthusian Paradox is an interactive transmedia narrative you can experience online and in real life. The main narrative is supported by a series of short films that give you clues and move the story along. You become part of the story. It's like being right inside a thriller where you are one of the characters.

Publications
Elizabeth Evans, Martin Flintham, and Sarah Martindale. 2014. The Malthusian Paradox: performance in an alternate reality game. Personal and Ubiquitous Computing 18, 7. Springer London, 1567-1582. https://doi.org/10.1007/s00779-014-0762-7

Screens in the Wild

Screens in the Wild is a collaborative research project initiated by researchers from the Space Group at University College London. It investigates how media screens located in urban space can be designed to benefit public life, rather than merely transmit commercial content. Screens in the Wild is funded by Research Councils UK through the Digital Economy Programme.

Publications

Nemanja Memarovic, Ava Fatah gen. Schieck, Holger Schnädelbach, Efstathia Kostopoulou, Steve North, and Lei Ye. 2016. Longitudinal, cross-site and "in the Wild": a study of public displays user communities' situated snapshots. In Proceedings of the 3rd Conference on Media Architecture Biennale (MAB). ACM, New York, NY, USA, Article 1, 10 pages. https://doi.org/10.1145/2946803.2946804

Ava Fatah gen. Schieck, Holger Schnädelbach, Wallis Motta, Moritz Behrens, Steve North, Lei Ye, and Efstathia Kostopoulou. 2014. Screens in the Wild: Exploring the Potential of Networked Urban Screens for Communities and Culture. In Proceedings of The International Symposium on Pervasive Displays (PerDis '14), Sven Gehring (Ed.). ACM, New York, NY, USA, 166-168. https://doi.org/10.1145/2611009.2617199

Steve North, Holger Schnädelbach, Ava Fatah gen. Schieck, Wallis Motta, Lei Ye, Moritz Behrens, and Efstathia Kostopoulou. 2013. Tension space analysis: exploring community requirements for networked urban screens. In Human-Computer Interaction – INTERACT 2013: 14th IFIP TC 13 International Conference. Springer, Berlin, 81-98. https://doi.org/10.1007/978-3-642-40480-1_6


2010

Bucking Bronco 

A bucking bronco provides an intense physical experience for a single rider (usually sufficient to throw them off), controlled in real time by a human operator. Bronco rides clearly push back on the rider, throwing them around and demanding considerable physical exertion as well as concentration. This one is different, though: it's controlled by your breathing.

Publications
Joe Marshall, Duncan Rowland, Stefan Rennick Egglestone, Steve Benford, Brendan Walker, and Derek McAuley. 2011. Breath control of amusement rides. In Proceedings of the SIGCHI Conference on Human Factors in Computing Systems (CHI '11). ACM, New York, NY, USA, 73-82. https://doi.org/10.1145/1978942.1978955

Exploding Places

Exploding Places is a prototype of a real-world SimCity or Monopoly, played live on the streets of Woolwich and developed in partnership with the artists Active Ingredient and the Horizon Digital Economy Research Institute. You play on the phone screen and through headphones as you walk the town's real streets. The game encourages players to explore and discover Woolwich in a new way, to meet other players in both the real and fictional worlds of the game, and to experience the town's history and geography playfully.

Publications
Martin Flintham, Chris Greenhalgh, Andrew Greenman, Tom Lodge, Richard Mortier, Rachel Jacobs, Matt Watkins, and Robin Shackford. 2010. Towards a Platform for Urban Games. In Proceedings of Digital Futures '10.

View older interactive works from between 1999 and 2009

Mixed Reality Laboratory

The University of Nottingham
School of Computer Science
Nottingham, NG8 1BB


telephone: +44 (0) 115 846 6780
email: mrl@cs.nott.ac.uk