Nupura Ketkale

Interactive Musical Museum: A UX research project

This project: 

This is a thesis project at Newcastle University, carried out in collaboration with the Great North Museum: Hancock. The museum's musical section has been developing engaging ways to connect visitors with its objects. The musical instruments, however, are kept behind glass cases, which is a major turn-off for visitors because they cannot interact with them, even though playability is the primary aspect of any musical instrument. This project engages visitors in the musical section of the museum by allowing them to interact with the instruments through digital displays.
 
Duration: 3 months

Methods: Observational study, interviews, thematic analysis, bingo selection activity, think-aloud testing, Wizard of Oz technique, feedback capture grid.

Tools used: Figma, Adobe Illustrator, Miro
 
Design Process: 
I followed the double-diamond design process for this project.
Finding the right problem:

An observational study at the Great North Museum: Hancock helped me understand how visitors interact with museum objects and how they behave around the digital screens. I recorded visitors' movements using an observational scale based on Judy Diamond's chapter on observational tools, with four categories of visitor engagement:
 
Ignore (I): Visitors who pass within two meters of a digital display without looking at it
Skim (S): Visitors who look briefly at a digital display but do not stop
Attend (A): Visitors who stop briefly, with both feet, for about five seconds
Engage (E): Visitors who stop and actively read information or interact with the digital displays.
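To give a sense of how such coded observations can be tallied per display, here is a minimal sketch; the codes follow the scale above, while the sample observation data is purely hypothetical.

```typescript
// Minimal sketch of tallying engagement codes for one digital display.
// The codes follow the four-category scale above; the sample
// observations are hypothetical, not the actual study data.
type EngagementCode = 'I' | 'S' | 'A' | 'E';

function tally(observations: EngagementCode[]): Record<EngagementCode, number> {
  const counts: Record<EngagementCode, number> = { I: 0, S: 0, A: 0, E: 0 };
  for (const code of observations) {
    counts[code] += 1;
  }
  return counts;
}

// One hypothetical observation session at a single display
console.log(tally(['I', 'S', 'I', 'A', 'E', 'I', 'S']));
// -> { I: 3, S: 2, A: 1, E: 1 }
```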
Thematic analysis:
To make sense of all the data captured from the observational study and interviews, I derived themes using the thematic analysis method. Thematic analysis means studying the raw data (in this case, interview transcripts), finding recurring patterns, and clustering them together into themes. Each theme represents a separate problem or an aspect of the current state of interaction in the museum.
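As a purely illustrative sketch of this clustering step, the structure below groups coded excerpts under themes; the codes and theme names are hypothetical examples, not my actual findings.

```typescript
// Illustrative clustering of interview codes into themes.
// The codes and theme names are hypothetical examples of the kinds of
// patterns that can emerge, not the actual findings of this study.
const themes: Record<string, string[]> = {
  'Desire for hands-on interaction': [
    'wants to touch the instruments',
    'glass cases feel distant',
  ],
  'Digital displays go unnoticed': [
    'walked past the screen',
    'unsure what the screen is for',
  ],
};

for (const [theme, codes] of Object.entries(themes)) {
  console.log(`${theme}: ${codes.length} supporting codes`);
}
```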
Finding the right solution:

In the second phase, finding the right solution, I created personas and scenarios based on the data gathered in the research phase. I made two personas: the right-hand image represents the perspective of a musician who wants to play the instrument and needs an engaging experience, while the left-hand image focuses on the non-musician visitor who is interested in listening to the music of the instruments.

Personas:
Scenarios:

With this user group and their goals and frustrations in mind, I created two scenarios for my product in the museum environment. The first scenario (right-hand side) shows visitors' interaction in the museum without my design intervention, while the second scenario (left-hand side) focuses on visitors' interaction with my design intervention.
Low-fidelity prototype:

Drawing on the personas and scenarios, together with the data gathered from the observational study, thematic analysis, and interviews with museum designers, I designed low-fidelity prototypes.
Based on the findings from the research phase, the three main features of the product are hearing the selected instrument, watching a performance of it, and playing the instrument yourself.

Starting from these features, I created paper prototypes, adding and removing secondary features to explore different levels of functionality in the product.



Bingo selection activity:

Mid-fidelity prototype:

Based on these findings, I then created mid-fidelity prototypes in Figma.

The three main features remained the same: seeing the instrument being played in a performance, hearing the instrument's sound, and playing the instrument yourself. Additionally, I added shared video and soundtrack functionality.

Based on the bingo selection activity, the highest-voted elements from the prototypes were used in combination.
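To illustrate how such a selection can be tallied, here is a minimal sketch; the element names and vote counts are hypothetical, not the actual results of the activity.

```typescript
// Sketch of picking the highest-voted elements from the bingo selection
// activity. Element names and vote counts are hypothetical placeholders.
const votes: Record<string, number> = {
  'Watch a performance': 7,
  'Hear the instrument': 6,
  'Play it yourself': 8,
  'Share your soundtrack': 4,
};

// Keep the top three elements to combine in the mid-fidelity prototype.
const topElements = Object.entries(votes)
  .sort(([, a], [, b]) => b - a)
  .slice(0, 3)
  .map(([name]) => name);

console.log(topElements);
// -> ['Play it yourself', 'Watch a performance', 'Hear the instrument']
```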
Wizard of Oz method:

I used the Wizard of Oz method to test my product with users. One of the important things while playing an instrument is to "feel it", meaning feeling the vibrations. To generate the vibrations I ran some trial experiments, such as putting a Bluetooth speaker under or behind the tablet and using Web APIs such as Navigator.vibrate(). However, they all had limitations that made them unsuitable for this project, because I needed the vibration to be generated at the exact moment the user touches an instrument string. So I went for the simplest solution of using two phones: one to send a message, and one to receive the message and generate the vibration. I used another phone to play the corresponding sound notes when the strings were touched.

But vibrations alone were not enough to give the feeling of touching the instrument, so I also animated the vibrating strings in Figma. This was the setup for the Wizard of Oz method.
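For reference, the Web API approach I experimented with would look roughly like the sketch below. This is a minimal illustration rather than the final setup: the data attributes, sound file paths, and 60 ms pulse are hypothetical, and navigator.vibrate() is only supported on some Android browsers (not on iOS), which was one of its limitations.

```typescript
// Minimal sketch of the rejected Web API approach: vibrate and play a note
// the moment a string element is touched. The data attributes, sound file
// paths, and 60 ms pulse are hypothetical. navigator.vibrate() is only
// supported on some Android browsers (not on iOS), which was one reason
// this approach was dropped in favour of the two-phone setup.
const strings = document.querySelectorAll<HTMLElement>('[data-string]');

strings.forEach((el) => {
  el.addEventListener('touchstart', () => {
    // Short vibration pulse, if the browser supports it.
    if ('vibrate' in navigator) {
      navigator.vibrate(60);
    }
    // Play the note associated with this string (hypothetical attribute).
    const note = el.dataset.note ?? 'c4';
    void new Audio(`/sounds/${note}.mp3`).play();
  });
});
```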
Testing phase:

For testing, I decided to conduct two tests: one for usability testing and the other for user experience testing. For the usability testing, I used the think-aloud protocol, in which users are asked to say out loud what they are doing and thinking while interacting with the prototype. The participants were given a task sheet (left-hand image) with four tasks to complete in any order they liked. This method let me ask more questions about the interaction and gain more insights from users as they spoke about what they were doing.
I also performed A/B testing on the product to check whether users preferred the interaction with or without the vibrations.

After the tasks were completed, the users were asked some questions about their experience with the product (right-hand image). The post-test questionnaire helped me gain insights into their true experience with the product.

To gather all the data captured during the testing phase, I created a feedback capture grid.

Feedback capture grid:
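As an illustration of how this grid organises feedback, here is a minimal sketch of its four quadrants as a data structure; the field names follow the standard grid, and the entries are hypothetical placeholders, not the actual feedback from my sessions.

```typescript
// Minimal sketch of a feedback capture grid as a data structure.
// The four quadrants follow the standard grid; the example entries are
// hypothetical placeholders, not the actual feedback from my sessions.
interface FeedbackCaptureGrid {
  likes: string[];      // what participants liked
  criticisms: string[]; // what they would change
  questions: string[];  // questions raised during testing
  ideas: string[];      // new ideas suggested
}

const grid: FeedbackCaptureGrid = {
  likes: ['The string animation makes the screen feel alive'],
  criticisms: ['The sound lagged slightly behind the touch'],
  questions: ['Can two visitors play at the same time?'],
  ideas: ['Add a mode that teaches a short melody'],
};

console.log(grid);
```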
Final prototype:
Limitations & future work:

1. A wider audience age range should be considered.

2. Software limitations in integrating vibration and sound generation need to be addressed.

3. Further expert evaluation and iterations are needed.
 
4. Exploring VR and AR technology rather than relying only on flat digital interfaces.