
Design Thinking User Group Feedback


Findings from redesigning, prototyping, testing and interviewing users of our Engagement Analytic Software Dashboard. What we learned and how we can move forward.

Published in: Design

Design Thinking User Group Feedback

  1. Envisioning Better Action Planning (Taking): The Manager Experience
  2. Who, what, where? Next Generation Software. We've been exploring the future of the Modern Survey platform, interacting with prototypes for new dashboards and action planning concepts that will influence the design and direction. Designing for the Future of Employee Listening. Attendees: HP, Swiss Re, Merck, Walgreens, McKesson, Marriott + multiple Aon managers. Aon Client User Group In-Person Session, Thursday, September 14th, New York, NY.
  3. Why? As much as we think our decisions matter, it's our users' opinions and needs that really count. You are not your user!
  4. How? Sketch → Wireframe → Prototype. Getting the design right, and the right design. Anyone can design a pretty app, but if it doesn't solve a real need in an enjoyable way, nobody's going to use it. Involving our clients in the design process will lead to better applications built to solve their needs, give them a sense of ownership, and hopefully make them feel like valued partners. The value of prototyping: test, test, test.
  5. The Manager Experience with Action Planning. Usability testing lets the design and development teams identify problems before they are coded. The earlier issues are identified and fixed, the less expensive the fixes will be in terms of both dev hours and possible impact to the roadmap.
  6. What were key insights and takeaways from the meeting?
     • Overall the feedback was positive; we are headed in the right direction.
     • Recommending actions based on our best practices was appreciated.
     • Better, more consistent language is needed: icons, labeling, and explanations.
     • Users really seemed to like the idea of the COACH, as long as it focuses on insights into their scores and guides them to smart actions.
     • Users liked adding actions from the dashboard and the insights provided.
     • The dashboard needs to be even lighter.
     • More handholding is needed in selecting actions and understanding Engagement.
     • Other managers' recommendations on actions are appreciated.
     • Users just want to know what their priorities are.
     • Find the right balance of just enough info to select the right action.
     • The platform must be clean, informative, intuitive, and able to answer questions.
     • Show the right information to the right people at the right time.
     • We should test much more, and more often!
  7. What worked?
     • Directly adding actions from the dashboard.
     • Engagement front and center.
     • Liked the journey from insight into action.
     • Soft and inviting design; generally intuitive.
     • Liked the info in the coach; it felt personal, not just another KPI.
     • Liked the section titles posed as questions, such as "Focus my energy".
     • Inclined to share because of the positive verbiage.
     • Amazon-like ratings: a visual language users already know.
     • Comfortable scrolling through the screen.
     • Liked the coach walkthrough; the Coach wizard is recognizable as a feature.
     • Feels like a next generation app.
  8. What didn't work?
     • COACH not immediately recognizable.
     • A lot displayed; overwhelming. Going to take a lot of time. Too much reading.
     • "I would stop if my score improved; I'd scroll only if I did poorly."
     • Icon and text language not consistent.
     • Would need to stop, download, and talk to the team before proceeding.
     • Actions: not ready to start choosing.
     • Empty action space was confusing / intimidating.
     • Afraid of "commitment" to actions.
     • Felt constrained by picking just 3 actions.
     • Actions: still feel complicated overall.
     • Too many clicks.
  9. Questions?
     • Can I get more info on actions?
     • How do I tie an action to a driver?
     • Where is the priority coming from?
     • Why did my scores go up?
     • Coach: a lot of translation effort.
     • How do I find last year's focus area?
     • Engagement levels? Where is my low-hanging fruit (passives)?
     • What does "committed" mean?
     • If I assign an action, what if the person doesn't qualify to access the site?
     • Where can I add my own action?
     • What does "recommended" mean?
     • Actual actions: how do we validate them?
     • How do we help managers celebrate the positives?
     • Not sure if the engagement score is good, bad, or otherwise.
     • Are these my team's results or mine?
  10. Ideas?
     • Link to external or Aon thought leadership.
     • "Share" should crowdsource value, not create anxiety and exposure.
     • The coach should pop up on login.
     • Filters: can these float with you down the page as you scroll?
     • Combine insights with actions on the same screen for reference.
     • Weight one action over another: "If you work on x, the results will be tenfold compared to other actions."
     • Go lighter on the dashboard.
     • Categories of actions for different target groups.
     • Engagement as part of performance management goals; may involve two systems, admins, logins, etc.
     • Balance of just enough.
     • Language should be more observational than judgmental: "this is why the score went up" rather than "good job".
     • The system should tell me when to get out and talk to my team.
     • Make the process more open to team feedback.
