vEvi: Remote control for a variably-configured paired device.
As a freelance designer, I helped the vEvi team move from initial internal designs to a prototyped, ready-to-develop concept.
A polished and prototyped final iteration, and two preceding iterative rounds.
Prototyping, UX Design, UI Design, Visual Design
The project started with a hands-on visual exploration alongside the project stakeholders. We took an hour to review existing work and explore basic visual treatments, with minimal UX updates. Beyond visual work, we replaced the Bluetooth device’s battery indicator with a numeric battery level in the app’s header, so the Bluetooth device’s battery would not be confused with the iPhone’s. Finally, within this minimal scope, we replaced the drawer navigation with a tab bar, added a navigation bar button to open Settings, moved preset selection from the slider knobs to its own tab, and adopted standard tap areas. We agreed to maintain the color motifs: teal for clockwise motion, red for counterclockwise, and increasingly gray as speed approached zero.
Following this initial round, we received updates regarding the paired device. It would be sold in multiple configurations with up to five motors, and the app needed to control anywhere from two to five, with the potential to extend further later. Motors did not need to be controlled simultaneously (while one is being actively modified, the others hold their previously set values), but because the user’s attention would be on the paired device, the app needed to allow motors to be controlled with minimal visual attention, without looking at the phone.
I proposed two solutions. Both maintained the slider-like interaction of the internal versions, but used the entire screen as a gestural area. Additionally, rather than indicating motor state with numeric values or small patches of color, the entire screen was colored so the user could confirm motor settings at a glance, or, in low-light conditions, by the light cast from their phone's screen.
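The full-screen feedback above can be sketched as a simple color blend: gray at zero speed, shifting toward teal for clockwise motion or red for counterclockwise as speed grows. This is an illustrative assumption only; the function name, RGB values, and linear interpolation are mine, not the shipped implementation.

```python
# Illustrative sketch of full-screen speed feedback. All names and values
# are assumptions: gray at rest, blending toward teal (clockwise) or
# red (counterclockwise) as |speed| approaches 1.
GRAY = (128, 128, 128)
TEAL = (0, 128, 128)
RED = (200, 30, 30)

def screen_color(speed: float) -> tuple:
    """speed in [-1, 1]: negative = counterclockwise, positive = clockwise."""
    target = TEAL if speed >= 0 else RED
    t = min(abs(speed), 1.0)  # blend factor: 0 = fully gray, 1 = full hue
    return tuple(round(g + (c - g) * t) for g, c in zip(GRAY, target))
```

A glance at the screen's overall hue then tells the user both direction (teal vs. red) and rough magnitude (how far from gray), matching the color motifs carried over from the internal designs.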
This solution paged between motors, triggered by a screen-edge gesture. Motors were controlled by a touch starting within the bounds of the control wheel, excluding the large "lock" control, while allowing panning across the diameter. Taps at the control's horizontal extremes triggered the highest acceleration, while taps and pans within the diameter had a proportionately lower, directional response. Tapping the inner circle locked the motor at its current value, and a second tap was required to unlock it.
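The wheel's touch mapping described above can be sketched as a function from horizontal touch position to a signed response: extremes saturate at maximum acceleration, positions nearer the center respond proportionately less, and the sign of the offset sets direction. The function name and the linear response curve are assumptions for illustration, not the actual app logic.

```python
# Hypothetical sketch of the wheel's touch-to-acceleration mapping:
# horizontal offset from the wheel's center, clamped to the radius,
# scaled to a signed response in [-1, 1]. Positive = clockwise.
def wheel_response(touch_x: float, center_x: float, radius: float) -> float:
    """Map a touch's horizontal position to a signed acceleration in [-1, 1]."""
    offset = touch_x - center_x
    # Clamp so panning past the wheel's edge saturates at maximum.
    offset = max(-radius, min(radius, offset))
    return offset / radius  # -1 = full counterclockwise, +1 = full clockwise
```

Because the response depends only on horizontal position within a full-screen gestural area, the mapping can be operated by feel, supporting the low-visual-attention requirement.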
While this offered the simplest control and the pager’s behavior was ideal for switching motors, it did not offer an easily visible cue as to which motor was selected. We later learned that this was of lesser value, given feedback would first be observed through the paired device's behavior.
This option, though ultimately a throwaway, helped prove the efficacy of the pager and wheel control. It controlled motors through concentric dials with a limited range of motion, required visual acuity to discern a motor's value, and could not reasonably scale beyond a couple of motors.
A month after reviewing the potential solutions, the client decided to proceed with the pager model. After producing an interactive prototype for further testing, I delivered the materials the client's internal team needed to continue in this new direction.
We went ahead with a concept from Round 2: motors are paged between, with acceleration and direction manipulated through the clickwheel-like interface. Speed is communicated, redundantly with the physical device, by the screen's color.