VWG approached Somo to help digitise their internal processes. We found that although they excelled at data gathering, little was being done with the collected information. The initial brief was to create an Alexa skill for accessing the sales and registration figures of each car centre.
Based on insights from observing and testing home assistant voice controls, I designed a mocked-up visual interface to accompany the voice prototype, guiding users through the journey by displaying the available options and their progress.
Context of use
This is an initial prototype for a voice input system to be used during meetings at the VWG office.
As this is an early-stage prototype, we assume that users will be focused on the screen at all times and that the meeting room screens have no touch input.
A Hybrid Voice Assistant Graphical Interface (GVUI): Early Prototype
Voice input is faster and more intuitive when the interaction between the user and the system is linear, short and binary, such as turning on the lights or asking for a currency exchange rate.
However, once the journey branches out or if the interaction has many steps, a voice-only UI will not suffice, even when machine learning is used to predict user intent.
This is a prototype of a GVUI, a hybrid voice and graphical UI, designed to help users navigate more complex, non-linear journeys.