Mixed Reality Laboratory

Talk by Tommy Nilsson and Richard Ramchurn

Location
Mixed Reality Lab Meeting Space
Date(s)
Friday 4th May 2018 (12:00-13:00)
Description

Tommy and Richard will give short talks to the lab this week.

Tommy Nilsson – Contra-vision video scenarios in user research

Scenarios are stories about people and technologies. In the context of usability research they can be thought of as "soft" prototypes that expose not only the functionality of a proposed system but also specific claims about the user experience, while making these available to potential end-users for assessment. A vivid scenario of a novel concept can in this sense promote innovative thinking and help raise relevant questions early on in a design process.

Although generally popular, scenario-based methods have also faced their fair share of criticism. Most notably, scenarios have been accused of imposing the designer's own vision on prospective users without adequately taking the concerns of the general public into account. This has led some scholars to suggest that "socially legitimate" scenarios can only be achieved through a bottom-up approach involving broader user engagement.

To address this problem, we have developed a pair of contradicting video scenarios, each envisioning a radically different approach to system design. Neither of these "contra-visions" is right, nor is it meant to be. Rather, both represent polar extremes designed to provoke users' reactions and reflection. Through a series of focus groups, we have asked members of the general public to discuss these scenarios and ultimately to converge on a middle ground by developing a scenario of their own.

In my lab talk I will give a brief overview of this methodology and cover some of the challenges and opportunities we encountered while using contra-vision video scenarios in user research.

Richard Ramchurn – The MOMENT: A New Brain-Controlled Movie

While many still consider interactive movies an unrealistic idea, current delivery platforms like Netflix, commercial VR, and the proliferation of wearable sensors mean that adaptive and responsive entertainment experiences are an immediate reality. Our prior work demonstrated a brain-responsive movie that showed different views of scenes depending on levels of attention and meditation measured by a commercially available home-entertainment brain sensor. Building on lessons learned, this talk presents the new interactions designed for our new brain-controlled movie, The MOMENT, being released in 2018.

Mixed Reality Laboratory

University of Nottingham
School of Computer Science
Nottingham, NG8 1BB


email: mrl@cs.nott.ac.uk