Mixed Reality Laboratory

Interactive Works

The Mixed Reality Laboratory (MRL), sometimes in partnership with local and international artists, has produced a number of renowned and provocative prototype designs, interactive exhibits, and engaging artistic experiences. Below is a selection of the works we have produced over the years. We are also committed to transferring knowledge to academia and industry, so we routinely publish the outcomes of our research at internationally recognised conferences and in journals.


 

Climb!

Climb! is a non-linear composition written for Disklavier in which the pianist undertakes a metaphorical journey up a mountain, playing musical codes hidden within the score to control their path and trigger musical and visual effects, including the piano engaging them in an unusual physical duet. This research/performance was supported through the EPSRC project Fusing Semantic and Audio Technologies for Intelligent Music Production and Consumption (EP/L019981/1).
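As an illustration only, and not Climb!’s actual implementation, the sketch below shows one way a stream of notes arriving from the piano might be matched against a hidden code motif so that the performance can branch and trigger an effect. The motif, the effect and section names, and the two callbacks are hypothetical assumptions.

```python
# Illustrative sketch only: watch incoming MIDI note numbers for a hidden "code"
# motif and branch the performance when it appears. The motif, effect name,
# section name and callbacks are hypothetical, not Climb!'s actual score logic.
from collections import deque

CODE_MOTIF = (60, 64, 67, 72)                    # hypothetical motif: C4-E4-G4-C5
recent_notes = deque(maxlen=len(CODE_MOTIF))

def on_note_on(note_number, trigger_effect, choose_path):
    """Call for every note-on event received from the piano."""
    recent_notes.append(note_number)
    if tuple(recent_notes) == CODE_MOTIF:
        trigger_effect("storm_visuals")          # e.g. start a musical/visual effect
        choose_path("north_face")                # e.g. branch to the next section
        recent_notes.clear()
```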

 

  

Data Journeys Archway

The Data Journeys Archway was a first experiment in the development of a larger archways project, an outcome of and follow-on from Andrew Wilson’s shared research residency with the Mixed Reality Laboratory and Sustrans, the national active travel charity, funded by Horizon. The original inspiration for the archways project came from a member of Sustrans’ staff, who felt that part of the organisation’s activity was to get people to “notice their journeys”. “Noticing journeys” seemed a productive phrase, one that could apply not just to travel journeys but to personal journeys, personal histories and perhaps even personal health and wellbeing.

 

 

Thresholds

Artist Mat Collishaw’s recreation of the world’s first photographic exhibition as an immersive multi-user VR experience. Six visitors at a time explore Mat’s beautiful virtual world while also walking around a shared physical set populated with physical props that align with the virtual furniture. A unique experience that pushes the boundaries of visual, audio, haptic and tactile sensory alignment. The work opens at Somerset House next Wednesday, where it runs for three weeks before touring over the summer.

 

 

VR Playground

Artist Brendan Walker’s exploration of sensory misalignment. Riders don VR headsets and ride playground swings to navigate four virtual worlds that play with different re-couplings of the visual and the kinaesthetic. Opens at the Norfolk & Norwich Festival this week before touring.
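Purely as a hedged illustration of what “re-coupling” the visual and the kinaesthetic could mean in code (this is not the artwork’s actual software), the sketch below maps a sensed swing angle to a virtual camera offset under a few hypothetical coupling modes; the mode names and scale factors are assumptions.

```python
# Minimal sketch: re-map a measured swing angle (radians) to a virtual camera
# offset (x, y, z) in metres under different, hypothetical coupling modes.
import math

def virtual_offset(swing_angle, mode, arm_length=2.0):
    """Return a camera offset for the given swing angle and coupling mode."""
    forward = arm_length * math.sin(swing_angle)
    lift = arm_length * (1.0 - math.cos(swing_angle))
    if mode == "mirror":       # ordinary 1:1 coupling of swing and view
        return (0.0, lift, forward)
    if mode == "amplified":    # exaggerate the arc of the swing
        return (0.0, 3.0 * lift, 3.0 * forward)
    if mode == "sideways":     # re-couple forward/back motion to lateral motion
        return (forward, lift, 0.0)
    raise ValueError(f"unknown coupling mode: {mode}")
```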

 

 
 

 

Carolan Guitar

Every guitar tells a story, from the tonewoods that form it, to the craft of its making, to the players that own it, to the places it visits, and to the many songs that it sings. This blog tells the story of a unique guitar; one that has been created with the express purpose of capturing and telling its own life history. 

Publications

Steve Benford, Adrian Hazzard, Alan Chamberlain, Kevin Glover, Chris Greenhalgh, Liming Xu, Michaela Hoare, and Dimitrios Darzentas. 2016. Accountable Artefacts: The Case of the Carolan Guitar. In Proceedings of the 2016 CHI Conference on Human Factors in Computing Systems (CHI '16). ACM, New York, NY, USA, 1163-1175. https://doi.org/10.1145/2858036.2858306

Steve Benford, Adrian Hazzard, Alan Chamberlain, Kevin Glover, Chris Greenhalgh, Liming Xu, Michaela Hoare, and Dimitrios Darzentas. 2016. Experiencing the Carolan Guitar. In Proceedings of the 2016 CHI Conference Extended Abstracts on Human Factors in Computing Systems (CHI EA '16). ACM, New York, NY, USA, 3651-3654. https://doi.org/10.1145/2851581.2890264

 
 

 

Digitopia

The Digitopia project is a collaboration with the Tom Dale Dance Company to explore audience engagement with graphics and generative music. The company developed Digitopia, a show integrating contemporary dance, electronic music and digital technology, which is touring 16 UK venues between February and April 2016.

The show is about Hex, a simple two-dimensional line who can make angles, but only up to six! His favourite shape is a hexagon. He’s happy with this, but one day he really wants to make a curve. With a lot of effort he learns to bend, then multiply, and suddenly he can create all kinds of shapes, eventually popping into three dimensions, discovering he can turn into anything he wants to. 


 

  

Horse Automated Behaviour Identification Tool (HABIT)

HABIT is an interdisciplinary animal-computer interaction research project that could help us understand what animals are thinking and feeling. The aim of the software is to identify horse behaviour from unconstrained (amateur) video so that we humans can interpret that behaviour and understand why it is happening.

By bringing together experts in animal-computer interaction, equitation science, ethology, animal behaviour and biomedical engineering, HABIT aims to develop software that will automatically identify the behaviour horses are exhibiting and tell us whether the horse is stressed, sick or suffering.
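As a minimal sketch under stated assumptions (OpenCV for video decoding and a placeholder classifier, neither confirmed as what HABIT actually uses), the loop below steps through an unconstrained video frame by frame and hands each frame to a behaviour classifier.

```python
# Minimal sketch, not HABIT's actual pipeline: label every frame of an
# unconstrained (amateur) video with a behaviour classifier supplied by the caller.
import cv2  # OpenCV, used here only to decode the video

def label_behaviours(video_path, classify_frame):
    """Yield (frame_index, behaviour_label) for each frame.

    classify_frame is any callable mapping a BGR image array to a label
    such as "grazing", "pawing" or "weaving" (labels are illustrative).
    """
    capture = cv2.VideoCapture(video_path)
    frame_index = 0
    while True:
        ok, frame = capture.read()
        if not ok:
            break
        yield frame_index, classify_frame(frame)
        frame_index += 1
    capture.release()
```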

Publications

Steve North. 2016. Do Androids dream of electric steeds?: the allure of horse-computer interaction. interactions 23, 2 (February 2016), 50-53. https://doi.org/10.1145/2882529

Carol Hall and Amanda Roshier. 2016. Getting the measure of behavior … is seeing believing?. interactions 23, 4 (June 2016), 42-46. https://doi.org/10.1145/2944164

Steve North and Clara Mancini. 2016. Introduction. interactions 23, 4 (June 2016), 34-36. https://doi.org/10.1145/2946043

Steve North, Carol Hall, Amanda Roshier, and Clara Mancini. 2015. Habit (Horse Automated Behaviour Identification Tool) Video Poster. In The Second International Congress on Animal Computer Interaction (ACI2015). Proceedings of the 2015 International Workshops on Advances in Computer Entertainment Conference (ACE2015). https://doi.org/10.13140/RG.2.1.4924.6480

Steve North, Carol Hall, Amanda Roshier, and Clara Mancini. 2015. Habit: Horse Automated Behaviour Identification Tool – a Position Paper. In Proceedings of ACI@BHCI (Animal Computer Interaction Workshop), British HCI 2015. https://doi.org/10.13140/RG.2.1.3395.0881

 
 
 

 

Augmented Bird Table

The Augmented Bird Table (ABT) was developed in partnership with Disaster Response charity Rescue Global on the EPSRC ORCHID project. ABT is a projection-vision system that augments map-based planning with digital capabilities, while preserving established, physical pen-and-paper-based work practices.

The two key objectives for the Augmented Bird Table are:

  1. to support the collaborative creation, display, and dissemination of the commonly recognised information picture (CRIP) to enable better decision making, including recording the actions that make up this collaborative process (i.e., the provenance; a minimal sketch of such recording follows this list),
  2. to explore intelligent (machine) reasoning on that information to further guide human decision making, using artificial intelligence and machine learning techniques, for example to predict the outcomes of courses of action and to provide recommendations for task allocation and path planning.
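A minimal sketch, assuming a simple append-only log with illustrative field and action names rather than the ABT’s actual data model, of how the collaborative actions behind the CRIP might be recorded as provenance:

```python
# Minimal sketch of provenance recording for a collaboratively built CRIP.
# Field names, actor names and action names are illustrative assumptions.
from dataclasses import dataclass, field
from datetime import datetime, timezone

@dataclass
class ProvenanceRecord:
    actor: str        # who acted, e.g. "ops_planner_1"
    action: str       # what they did, e.g. "place_marker", "draw_route"
    target: str       # what it was done to, e.g. "marker:field_hospital"
    timestamp: datetime = field(default_factory=lambda: datetime.now(timezone.utc))

class ProvenanceLog:
    """Append-only record of the actions that make up the shared picture."""

    def __init__(self):
        self._records = []

    def record(self, actor, action, target):
        entry = ProvenanceRecord(actor, action, target)
        self._records.append(entry)
        return entry

    def history_for(self, target):
        """Replay how a given map element came to be in its current state."""
        return [r for r in self._records if r.target == target]
```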

Publications

Joel E. Fischer, Stuart Reeves, Tom Rodden, Steve Reece, Sarvapali D. Ramchurn, and David Jones. 2015. Building a Birds Eye View: Collaborative Work in Disaster Response. In Proceedings of the 33rd Annual ACM Conference on Human Factors in Computing Systems (CHI '15). ACM, New York, NY, USA, 4103-4112. https://doi.org/10.1145/2702123.2702313

 
 

  

Imagine Digital

Roma Patel facilitated a series of creative workshops over 13 weeks with care home residents from Nottinghamshire Hospice, The Firs, Sycamore House and Kenyon Lodge. A series of digital technologies was introduced to the older adults, producing some playful results. Roma drew her inspiration from the work that self-taught artist Joseph Cornell produced in the 1950s and 60s. Cornell’s most characteristic art works were boxed assemblages created from found objects: simple shadow boxes, usually fronted with a glass pane, in which he arranged eclectic fragments of photographs or Victorian bric-a-brac. Many of his boxes, such as the famous Medici Slot Machine boxes, are interactive and are meant to be handled.

Initial workshops engaged residents with pre-made boxes containing a range of objects, a camera and drawing materials as a way of stimulating ideas and getting to know individuals better. This was the starting point at which feelings about and experiences of the arts, technology and cultural activities were gleaned, providing a better understanding of people’s preferences and shaping the work that followed.

 

 

Karen

Over the next week you have calls with Karen once or twice a day. It soon becomes clear that Karen, a freelancer working from home, is slightly chaotic, with few boundaries between her personal and professional lives. She contacts you late at night from her bedroom. She overshares and in return is very nosy about you. Before long she is becoming very friendly and wants to speak often. She gets hurt if you don’t call her.

She’s fun and funny, always cheekily pushing her friendliness into new areas. You can decide how open to be and how to handle her inquisitive nature. 

 

 

Storm in a Teacup

 A sensor on Hastings Pier is monitoring waves and transmitting live data over the World Wide Web. Teacups in The Hub connected to this data are exhibiting some rather unusual behaviour in response. Hastings has experienced twelve major storms over the past twelve months, and another could soon be on its way...
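An illustrative sketch, not the installation’s real code: it assumes a hypothetical JSON feed URL and field name, polls the live wave data, and maps wave height to a 0–1 agitation level for a teacup.

```python
# Illustrative sketch: poll a live wave-data feed and drive a teacup's agitation.
# The feed URL, JSON field name and 0-1 agitation scale are assumptions.
import json
import time
from urllib.request import urlopen

FEED_URL = "https://example.org/hastings-pier/waves.json"  # hypothetical endpoint

def agitation_from_wave_height(height_m, max_height_m=4.0):
    """Map wave height in metres to an agitation level between 0.0 and 1.0."""
    return max(0.0, min(1.0, height_m / max_height_m))

def run(set_teacup_agitation, poll_seconds=10):
    """Poll the feed and pass an agitation level to the teacup controller."""
    while True:
        with urlopen(FEED_URL) as response:
            reading = json.load(response)
        set_teacup_agitation(agitation_from_wave_height(reading["wave_height_m"]))
        time.sleep(poll_seconds)
```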


 
 

 


 

Mixed Reality Laboratory

University of Nottingham
School of Computer Science
Nottingham, NG8 1BB


telephone: +44 (0) 115 846 6780
email: mrl@cs.nott.ac.uk