Overview

The Voice Command feature (internally called VUI) is an integral part of dbam.ai’s SQA product (SQA.TW). SQA.TW lets users upload documents in various formats, which AI then converts into data represented as a knowledge graph. This structured data helps clients detect scope gaps and provides a foundation for generating test cases and automated testing.

Graph visualization reference: https://neo4j.com/graph-visualization-neo4j/

What is a Knowledge Graph?

In knowledge representation and reasoning, a knowledge graph is a knowledge base that uses a graph-structured data model or topology to represent and manage data. It stores interconnected descriptions of entities (e.g., objects, events, abstract concepts), capturing the underlying relationships between these entities.
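To make this concrete, the sketch below models a knowledge graph as typed entities and relationships. It is only an illustration; the entity types, labels, and relationship names are hypothetical and not SQA.TW’s actual schema.

```typescript
// A minimal sketch of a knowledge graph as typed nodes and edges.
// Entity and relationship names are illustrative, not SQA.TW's actual data model.

interface Entity {
  id: string;
  type: string;   // e.g. "Actor", "Feature", "Requirement"
  label: string;
}

interface Relationship {
  from: string;   // id of the source entity
  to: string;     // id of the target entity
  type: string;   // e.g. "USES", "DEPENDS_ON"
}

interface KnowledgeGraph {
  entities: Entity[];
  relationships: Relationship[];
}

// "The Admin role uses the Login feature, which depends on the Password Reset requirement."
const graph: KnowledgeGraph = {
  entities: [
    { id: "e1", type: "Actor", label: "Admin" },
    { id: "e2", type: "Feature", label: "Login" },
    { id: "e3", type: "Requirement", label: "Password Reset" },
  ],
  relationships: [
    { from: "e1", to: "e2", type: "USES" },
    { from: "e2", to: "e3", type: "DEPENDS_ON" },
  ],
};
```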

Skills

UX Design
User Research
Wireframing
Information Architecture
Interaction Design

Outcomes

User Flows
Wireframes

My Role

Product Designer

Timeline

Jun 2024

Objective

Because AI-generated knowledge graphs can be complex and may contain misclassifications, user validation remains essential. Given the limitations of navigating intricate knowledge graphs with a mouse alone, our team opted to enable complementary voice interactions alongside traditional navigation. The goal of this project was to simplify the task of locating and adjusting entities within the knowledge graph using voice commands.

Design Challenges

Multimodal Device Interaction

SQA.TW differs from typical AI chat products in that it involves interaction across multiple input methods, such as mice and trackpads, rather than relying solely on voice or text.

Multilayered Feedback and Detailed Interaction

In contrast to basic voice commands, SQA.TW required targeted interaction with graph nodes and edges while providing contextual responses. The challenge was to guide users subtly without detracting from the visual focus on the knowledge graph.

Desk Research

Since there were no direct competitors or examples of multimodal interaction in a similar environment, I researched foundational VUI principles, referencing guidelines from Amazon Alexa, Google Assistant, and Siri. Below are design principles extracted from general VUI user behavior and tailored to the specific needs of SQA.TW’s professional users.

General Users

Typical VUI users interact with smart devices using short, context-specific voice commands (e.g., weather updates, music selection, or lighting control).

Core Principles
SQA.TW Users

SQA.TW’s target users, such as PMs and QA engineers, are tech-savvy and accustomed to handling complex data. They prefer fast, efficient, and precise interactions across multiple modalities (voice, text, and direct interface manipulation).

Core Principles for SQA.TW

Key Findings

Research and observations of QA and developer workflows reveal that while general VUI design principles are broadly applicable, SQA.TW users require a more sophisticated focus on visual hierarchy and efficient voice command responsiveness. Effective voice interactions in this context rely on both: a clear visual hierarchy that keeps the knowledge graph in focus, and fast, precise responses to voice commands.

Design Highlight

Concept: Adaptive and Lightweight Widget

The interface consists of modular components that adapt to varied screen sizes, with visual feedback and animations that keep users focused on the main task. Compact widgets allow users to navigate effectively without pulling visual attention away from the knowledge graph.

Voice Command Widget

Rather than traditional overlays, we used a compact, fixed-position widget that prompts user actions through color changes and brief feedback messages, adaptable across both desktop and mobile.
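As a rough illustration of how such a widget could be wired up (a sketch under assumptions, not the production implementation), the snippet below uses the browser’s Web Speech API to listen for a command and drive the widget’s color state and brief feedback message. The element id, widget states, and command phrases are hypothetical.

```typescript
// Illustrative only: connecting a compact voice-command widget to the
// browser's Web Speech API. Widget states and phrases are hypothetical.

type WidgetState = "idle" | "listening" | "success" | "error";

function setWidgetState(state: WidgetState, message?: string): void {
  const widget = document.getElementById("voice-widget");
  if (!widget) return;
  widget.dataset.state = state;        // drives the color change via CSS
  widget.textContent = message ?? "";  // brief feedback message
}

function startListening(onCommand: (phrase: string) => void): void {
  // SpeechRecognition is prefixed in Chromium-based browsers.
  const Recognition =
    (window as any).SpeechRecognition || (window as any).webkitSpeechRecognition;
  if (!Recognition) {
    setWidgetState("error", "Voice input is not supported in this browser");
    return;
  }

  const recognition = new Recognition();
  recognition.lang = "en-US";
  recognition.interimResults = false;

  recognition.onresult = (event: any) => {
    const phrase = event.results[0][0].transcript.trim().toLowerCase();
    setWidgetState("success", `Heard: "${phrase}"`);
    onCommand(phrase);                 // e.g. "select node login"
  };
  recognition.onerror = () => setWidgetState("error", "Sorry, try again");

  setWidgetState("listening", "Listening…");
  recognition.start();
}
```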

Command Reference

An on-screen button enables quick reference to available commands, preserving the prominence of the knowledge graph.

Streamlined Feedback and Messaging

A compact message box offers targeted guidance without distracting from the knowledge graph.

Extended Interactions

Complex commands are supported with additional modal elements, enabling users to proceed smoothly through multi-step tasks.

Next Step Prompts

The message box dynamically displays prompts for the user’s next possible actions, helping maintain interaction flow.
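One way to drive these next-step prompts (a minimal sketch, assuming a simple lookup-based flow rather than the product’s actual dialogue logic) is to pair each recognized command with a confirmation message and a list of suggested follow-up commands. The command names below are hypothetical.

```typescript
// Illustrative only: a simple flow map pairing each command with the
// next-step prompts shown in the message box. Commands are hypothetical.

interface FlowStep {
  feedback: string;        // confirmation shown after the command is recognized
  nextPrompts: string[];   // suggested follow-up commands
}

const commandFlow: Record<string, FlowStep> = {
  "select node": {
    feedback: "Node selected.",
    nextPrompts: ['Say "rename node" or "show connections"'],
  },
  "rename node": {
    feedback: "What should the new name be?",
    nextPrompts: ['Say the new name, or "cancel"'],
  },
  "show connections": {
    feedback: "Highlighting connected nodes.",
    nextPrompts: ['Say "select node" to continue editing'],
  },
};

// After each recognized command, the message box shows the user's next possible actions.
function promptsFor(command: string): string[] {
  return commandFlow[command]?.nextPrompts ?? ['Say "help" to see available commands'];
}
```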

Design Validation

After defining these interaction patterns, I conducted an initial cognitive walkthrough within the team to validate the task flow against user expectations. Feedback from three test users was positive, and the design approach was approved for MVP development.

Reflection

Designing for VUI demands a unique focus on interaction flow over interface visuals. Because the product was at an early MVP stage, flexibility was prioritized to accommodate potential future needs. This project was both challenging and rewarding. Given another opportunity, I would conduct more rigorous user research, including external interviews, to capture deeper user insights.