Reporting experience for presenting results
Role: Lead Product Designer for Analyze
Project: Analyze Reporting Experience
Ultimately, people want to share the analysis of their survey results. We want our customers to be able to customize the data and present their results however they like, whether that's adding images or text, or controlling the placement of the data.
We have had a shared results web page, but it is limited in that the user cannot modify it. We heard loud and clear that even something as simple as adding a logo, adding a text annotation, or applying brand colors would go a long way for our customers.
I adopted the data-analysis-specific modes of The Explorer and The Presenter to build empathy for this user experience. In addition to these proto-personas, I leveraged our company-wide personas, ranging from advanced users who are comfortable with data to users who are less comfortable with data but use the app regularly to prepare survey results.
User testing part I
I started by developing hypotheses around what users' expectations might be once they landed in the reporting experience. I created three low-fidelity prototypes in Axure for a first round of user testing, to gather feedback on customer preferences around reports.
Design Studio workshops
Next I led a series of workshops, based on the Design Studio method of diverge/converge, to generate ideas that would inform a mid-fidelity design. A notable result was an interaction pattern that would allow the user to drill down into a chart for a more focused experience with the data.
User testing part II
Next I designed a mid-fidelity prototype, using Sketch and InVision, as a 'Hero Flow' that walked us through the user journey: from entry points into the reporting experience, through creating and publishing the report, to receiving a shared report.
My User Research partner, Nick Inchausti, set up a moderated, remote user test with the goal of understanding user expectations for creating reports, adding data to a report, customizing a report, accessing a shared report, and the naming of the experience.
Shown here is the flow diagram on which the Hero Flow was based.
Designing for V1
The results of this user test informed my next design iteration. I then mapped out an approach covering all aspects of the reporting experience and its interactions, which became the structure of my design specifications. These included, but were not limited to:
- Top level flows for entering, exiting and editing a report
- Grid behavior with motion animation within the report creator
- Adaptive charts within a responsive grid
- A drill down interaction model with motion animation (based on our builder design principles)
- Report creation 'micro' interactions with toolbars (scaling functionality across multiple versions)
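To make the "adaptive charts within a responsive grid" idea concrete, here is a minimal sketch of how a chart might pick a rendering variant based on the width of its grid cell. The breakpoint values and variant names are illustrative assumptions, not the actual product implementation.

```typescript
// Hypothetical chart variants: as a report cell shrinks, the chart
// progressively drops detail rather than squeezing a full chart into
// too little space.
type ChartVariant = "full" | "compact" | "summary";

function chartVariantFor(cellWidthPx: number): ChartVariant {
  if (cellWidthPx >= 600) return "full";    // axes, legend, data labels
  if (cellWidthPx >= 300) return "compact"; // axes only, no legend
  return "summary";                         // headline number only
}
```

In a real layout engine this decision would typically be driven by observed container size (e.g. on resize), so each chart adapts independently of the viewport.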
Shown below are some general design artifacts:
Higher-fidelity detail can be provided in person; this design is currently being built, so I have kept this case study intentionally general for now.
I continue to lead the design for this reporting experience, and some key areas I'm currently working through are:
- Scaling the functionality of tools and toolbars, designing for ‘micro’ interactions when creating a report
- Placing objects on the builder canvas as both free form and within a grid
- Adaptive data visualization on a responsive grid
- Multiple access points for navigating back to the analyze area, i.e. moving back and forth between the Presenter and Explorer personas
- A survey-agnostic experience, and an alternative approach
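The free-form vs. grid placement item above can be sketched as a simple snapping rule: when grid snapping is on, a dropped object's position rounds to the nearest grid line; otherwise the free-form position is kept as-is. This is a hypothetical illustration of the concept, not the builder's actual behavior.

```typescript
// Hypothetical snap-to-grid rule for the report builder canvas.
interface Point {
  x: number;
  y: number;
}

function placeObject(drop: Point, gridSize: number, snapToGrid: boolean): Point {
  if (!snapToGrid) {
    return drop; // free-form placement: keep the exact drop position
  }
  // Grid placement: round each coordinate to the nearest grid line.
  return {
    x: Math.round(drop.x / gridSize) * gridSize,
    y: Math.round(drop.y / gridSize) * gridSize,
  };
}
```

A design choice worth noting: rounding (rather than flooring) keeps the snapped position as close as possible to where the user actually dropped the object.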
To date, this has been one of the most fulfilling projects I have designed for. I have the honor of working with some of the best people of my career, and I'm grateful to my product team, to the fellow designers contributing visual and motion designs, and to our user researcher, who works tirelessly to help me validate (or invalidate) my design assumptions.