PianoBoard | Learning Piano in AR

Project in Progress...

This project was done in collaboration with Serge Shack (concept development and the technical side of the project).

___

[PianoBoard] is a mobile AR experience that teaches users how to play the piano. Wearing a VR headset with a passthrough camera view, the player sees a sequence of AR keys ‘falling’ onto the keyboard in a set order and rhythm, letting them practice along with the melody.

I was tasked with conducting usability testing for the app and improving its UI. The testing confirmed strong interest in the concept; however, it also highlighted a number of problems in both the on-boarding and in-game experiences.

The main goals of the testing were to identify users’ main pain points, along with their expectations for how to use the app and where to find content. A few technical requirements were known to be inconvenient for users but had to stay in the app – such as printing out the marker images.

On-boarding / Pre-Game Experience

Original solution:

Problems:
1. Unclear order of actions.
2. Unclear which parts of Home screen UI are interactive.
3. Poor UI organisation.
4. There are two different scenarios for onboarding new users, plus a scenario for returning users, none of which are reflected in the original UI.

To solve these problems, three scenarios were proposed: one for a new user playing an acoustic piano, one for a new user playing a digital piano with a MIDI output, and one for a returning user. Each scenario provides a clear path and clear instructions on what the user needs to do to start practicing. The interactive prototype for the on-boarding experience can be viewed [here].

In-Game Experience

After on-boarding, the in-game experience prompts the user to place their phone in the VR headset with the passthrough camera view. The user is then asked to calibrate the AR piano and pick a melody to practice, after which the sequence of AR keys ‘falls’ onto the keyboard in a certain order and rhythm so they can play along.
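
To make the mechanic concrete, the sketch below models one hypothetical way a falling AR key’s position could be derived from the melody’s note timings and the current playback time. The fall speed, spawn height, and note data here are illustrative assumptions, not the app’s actual implementation.

```python
from dataclasses import dataclass
from typing import List, Optional

# Hypothetical tuning values: how fast AR keys fall and how far above
# the physical keyboard they first appear (metres).
FALL_SPEED_M_PER_S = 0.25
SPAWN_HEIGHT_M = 0.5

@dataclass
class Note:
    pitch: str      # e.g. "C4" – which key the AR block falls onto
    onset_s: float  # when the note should be played, seconds from song start

def key_height(note: Note, now_s: float) -> Optional[float]:
    """Height of a falling AR key above the keyboard at playback time now_s.

    The key reaches the keyboard surface (height 0) exactly at note.onset_s,
    so the player knows to press the real key at that moment. Returns None
    if the key has not spawned yet or has already landed.
    """
    height = (note.onset_s - now_s) * FALL_SPEED_M_PER_S
    if height > SPAWN_HEIGHT_M or height < 0:
        return None
    return height

# Example: one second into playback, the C4 key is 0.25 m above the keyboard,
# the E4 key has just spawned, and the G4 key is not visible yet.
melody: List[Note] = [Note("C4", 2.0), Note("E4", 3.0), Note("G4", 4.0)]
print([key_height(n, now_s=1.0) for n in melody])  # [0.25, 0.5, None]
```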

After playtesting the experience with a number of users, a few common problems were highlighted:
– Information architecture seemed rather complex to navigate.
– Labeling wasn’t clear enough.
– In some instances, labels were too small to read.
– Users had to move their heads closer and further, and up and down, to see buttons and their labels.
– While playing, users would accidentally trigger buttons with their gaze.
– The experience would switch between the VR view and a 2D view depending on which buttons were pressed.
– The Menu icon was too small, and it was surrounded by four blocks that were neither meaningful nor interactive.

Challenges:
1. Let users play the piano without Menu items being triggered by gaze.
2. Make Menu items visible, readable, and of an accessible size, shape and tilt.
3. Simplify interactions for controls such as speed, timeline, and volume adjustment.

Based on the new Information Architecture map and these challenges, a 2D interactive prototype was created:

Usability testing also suggested including success-rate stats after each song, showing the number of mistakes made, along with a history of played songs and their scores.
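
As a rough illustration of what that could track, the snippet below sketches a per-song result with a success rate derived from mistakes and a simple scored history. The field names and scoring formula are assumptions made for illustration, not a specification of the app.

```python
from dataclasses import dataclass, field
from typing import List

@dataclass
class SongResult:
    song_title: str
    total_notes: int
    mistakes: int

    @property
    def success_rate(self) -> float:
        # Hypothetical scoring: share of notes played without a mistake.
        if self.total_notes == 0:
            return 0.0
        return (self.total_notes - self.mistakes) / self.total_notes

@dataclass
class PlayHistory:
    results: List[SongResult] = field(default_factory=list)

    def add(self, result: SongResult) -> None:
        self.results.append(result)

    def best_score(self, song_title: str) -> float:
        scores = [r.success_rate for r in self.results if r.song_title == song_title]
        return max(scores, default=0.0)

history = PlayHistory()
history.add(SongResult("Für Elise", total_notes=120, mistakes=12))
print(f"{history.best_score('Für Elise'):.0%}")  # -> 90%
```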

Voice Activated Controls

The pre-game and in-game experiences were validated as intuitive enough; however, gaze controls slowed down the experience significantly.

To simplify interaction within the app, we suggested implementing voice-activated controls. As with [Voice Activated Chat], we would use a template-matching recognition method for the trigger word and the commands listed in the Menu.

The trigger word would simply be “PianoBoard”, while the list of commands should include:
Open Menu
Close Menu
Speed Up
Speed Down
Volume Up
Volume Down
Calibration
Depth Up
Depth Down
Maximum Depth
Minimum Depth
Height Up
Height Down
Maximum Height
Minimum Height
Song Library
Next
Previous
Play
Pause
etc.
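
As a rough sketch of how template matching could drive these commands: each command (and the trigger word) would have one or more pre-recorded audio templates, and an incoming utterance’s feature sequence (for example, MFCC frames) would be compared against every template, accepting the closest match if it falls under a distance threshold. The snippet below illustrates that idea with a hand-rolled DTW distance; the command set, feature representation, and threshold are assumptions for illustration only.

```python
from typing import Dict, Optional
import numpy as np

# Hypothetical subset of the Menu commands listed above; the trigger word
# "PianoBoard" would be matched against its own templates the same way.
COMMANDS = ["open menu", "close menu", "speed up", "speed down",
            "volume up", "volume down", "play", "pause", "next", "previous"]

def dtw_distance(a: np.ndarray, b: np.ndarray) -> float:
    """Dynamic time warping distance between two feature sequences,
    each of shape [frames, coefficients] (e.g. MFCC frames)."""
    n, m = len(a), len(b)
    cost = np.full((n + 1, m + 1), np.inf)
    cost[0, 0] = 0.0
    for i in range(1, n + 1):
        for j in range(1, m + 1):
            d = np.linalg.norm(a[i - 1] - b[j - 1])
            cost[i, j] = d + min(cost[i - 1, j], cost[i, j - 1], cost[i - 1, j - 1])
    return float(cost[n, m])

def match_command(utterance: np.ndarray,
                  templates: Dict[str, np.ndarray],
                  threshold: float) -> Optional[str]:
    """Return the command whose template is closest to the utterance,
    or None if nothing is similar enough to count as a match."""
    best_cmd, best_dist = None, np.inf
    for cmd, template in templates.items():
        dist = dtw_distance(utterance, template)
        if dist < best_dist:
            best_cmd, best_dist = cmd, dist
    return best_cmd if best_dist <= threshold else None
```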

___

Project in Progress...

Date

February 13, 2018

Category

VR Design