Formative Usability Testing in Agile: Piloting New Techniques at Autodesk
 

UX experts from Autodesk discuss new techniques of formative usability testing piloted by the AutoCAD UX group in their agile user-centered design process.
You can view the entire webinar here: http://goo.gl/C4uT9

  • Alfonso speaks.
  • For those who aren’t familiar with Autodesk and AutoCAD, I’ll begin by giving a brief overview of what it is that we do, and how we moved from a waterfall-based development approach to Agile last year. You’ll learn how the AutoCAD User Research team adapted to this move by forming feature-specific Customer Councils that were leveraged at different stages of the Agile development process: at the start of a Sprint, when testing the design before implementation, and at the end of a Sprint, when testing features that have been implemented. You’ll also learn how to recruit and plan for a Customer Council, how to get the most out of a Council, and some of our other learnings around what success might look like.
  • I’ll go ahead and get started by setting the context for our discussion. Olivia and I work in the AutoCAD group at Autodesk, on the User Experience team led by JoAnna Cook. Autodesk is a multinational software company that focuses on 2D and 3D design software for use in a variety of industries like architecture, construction, manufacturing, and entertainment. One of Autodesk’s flagship products is AutoCAD, which is a software application for computer-aided design (CAD) and drafting. I believe the first version came out in 1982, and it has generally been released on an annual cycle. There are other groups at Autodesk that have adopted Agile, but today we’re talking specifically about the AutoCAD group. We have plans to transition most of our software into cloud services over the next few years, and in addition to desktop AutoCAD products we also have web and mobile applications. But today, we’ll talk mostly about how we used Customer Councils to get user feedback on features in the desktop application.
  • The introduction of Agile into the AutoCAD development process represented a really big change for our organization, because the annual release cycle had trained us to do work sequentially, where we moved from planning to development, then to deployment and testing. The challenge with this approach is that the main focus of user testing comes at the completion of development, when it’s often too late to make any significant changes to the design or implementation of a given feature. When the AutoCAD group moved to Agile last year, our research team re-structured the customer feedback process to be more responsive and flexible to customer needs and user feedback. By validating the quality and value of our releases on a per-Sprint rather than yearly basis, we were able to receive feedback earlier, in the context of users’ own environments and drawings, and make course corrections after each Sprint based on customer feedback.
  • Here’s a quick look at the process overview. At the bottom of this timeline, you’ll see that the Agile process starts at the initiative and release planning phases, where the AutoCAD team might identify three or four new features that we’d like to roll out in the next version. Some examples might be new Point Cloud functionality or new ways to share and collaborate on drawings saved in the Cloud. It ends at the release beta, when we are primarily testing for bugs. In between these two phases, we conduct iterative concept and usability testing. Before each Sprint, we want to test the design to understand whether we are building the right thing, and at the end of the Sprint we want to test the working product to see if it works as expected.  In order to understand the scope of this research, I want to briefly explain how we developed the minimum viable experience that we would be testing in these two phases of research.
  • The AutoCAD group defines minimum viable experience – or MVX – as the smallest usable and sellable set of functionality that provides value to the user. We use a story-mapping process to understand and prioritize user needs for each feature area, and draw cut-lines across scenarios to form meaningful chunks. This is done separately for each feature, such as Point Clouds or the Design Feed feature (which is a social feed that allows users to communicate and collaborate on designs shared in the Cloud). One example of an MVX for the Point Clouds feature might be the ability to create and attach a point cloud to an AutoCAD drawing. You might not be able to edit it just yet, but the MVX is just that you can create and attach it.
  • The AutoCAD UX team set experience goals that define the Minimum Viable Experience (MVX) for the MVS. To get user feedback on the MVX for AutoCAD features, we developed feature-specific Customer Councils. We used a product called Centercode to develop online communities for each of these Councils – and this is where we posted announcements, feature videos, tasks, and satisfaction surveys. Customer Councils were like mini-betas for each feature area, and each Council had about 50-75 users in it to start. The idea was that users would be highly engaged, because they were recruited for each Council based specifically on their needs and interests. Participants committed to testing a new build every 1-2 sprints, and completing key tasks and satisfaction surveys. They were not compensated for their participation in the Councils, but if they participated in 1:1 feedback sessions, then we did offer our standard research compensation. The reason we chose NOT to compensate for Council participation is that we wanted to emphasize the relationship and partnership rather than one-off, transaction-based interactions.
  • Most of the AutoCAD research we do is remote, and we recruited users from all over the world to participate in the Customer Councils. In order to build trust and develop a collaborative relationship with users, we posted pictures and bios of our team on the online communities, along with those of the Council participants. Putting a face to a name really helped.
  • Once we move past the planning stage, we begin to work on designs – getting customer feedback as we develop and evolve the designs is critical.
  • Testing the design is an iterative process. We use a wide array of tools and techniques to get customer feedback, depending on the stage of the design and what kinds of customer feedback we need.
  • The critical method we use is one-on-one conversations with customers, usually members of our customer councils. These are less formal than structured usability testing sessions, and may touch on several topics. I maintain a list of topics – both immediate design feedback, and items that may be more strategic in nature. For each conversation, I’ll introduce a subset of topics that are immediately relevant.
  • Current environment and needs – we ask people to show us what kinds of work they are doing, and to discuss their needs for things we could be doing better.
  • We often introduce concepts for discussion – in this example, we’re looking at two approaches for a cloud feature. We might use a simple sketch like this to illustrate aspects, then ask the customer to discuss the pros and cons of each approach given their needs.
  • In this example, we developed some more complete concept sketches – representing actual UI, but still sketches – and asked people to discuss the different features they illustrated. By seeing what people said they would use, and how they discussed extending the concepts, we were able to bring their feedback to later feature prioritization.
  • Then, we will get to actual designs that we want to refine based on user feedback. This was a design for a fairly complex filtering system that we wanted to get just right, so we brought design sketches to the conversation and had the customers walk through the design.
  • Eventually, we get some kind of real prototype for customers to evaluate. In this example, people were giving us feedback on some new UI chrome elements. They pointed out that there’s a lot of unused space in this example, and for them efficient use of screen space was critical.
  • And finally we may use the time for traditional usability testing. Here, we were looking at this slider – customers found it confusing, so we needed to go back and refine the design.
  • In summary, we learn a lot from these informal conversations. From specific design feedback to longer-term needs and concerns. We also build better working relationships with our customers. Through these conversations, they become more invested in our success as they see their comments reflected in our design changes. Also, our team members learn from the real-world problems of our customers, and they can begin thinking in terms of real people who might be using a feature.
  • At the end of each Sprint, we want to be able to test the code that was just built. We want to know whether the product works as the users expected.
  • To validate the product, we give participants a build of AutoCAD to use on their own machines. These are like mini 1-2 week betas with Customer Council participants. At the end of each Sprint, we post a series of tasks that ask Council members to try out new functionality – then they complete satisfaction surveys and engage in discussion forums about potential problems or ideas. If task success for most users and user satisfaction meet our standard goal metrics, then the MVX is validated for a given scenario and it’s considered “done” from the customer’s point of view. If they do not meet our goals, then we redesign to address usability issues or feature gaps, and re-evaluate in the next Sprint cycle.
  • As I mentioned previously, we used an online product called Centercode to develop online communities for each Customer Council. This is where we posted announcements, feature videos, tasks, and satisfaction surveys. In this example, the Point Cloud customer council started their first set of tasks by asking participants to upload their scan files and try navigating around their Point Cloud scans. Within hours of posting these tasks, users were uploading their own scans and discussing use cases and improvement ideas on the community forums.
  • Another example of a Customer Council task is the Connected Desktop Council, which was focused on a feature called the Design Feed. The Design Feed is a social feed that is attached to a drawing file stored in the cloud, and everyone who has access to that shared drawing can comment in the feed, reply to comments, and even attach photos that are displayed in the feed. The Design Feed assumes that there is collaboration on a shared drawing, and in order to effectively get feedback on the feature, we needed to simulate this type of work situation. Customer Councils are just a vehicle to get feedback, but they are most effective when the tasks are designed with the specific feature in mind. So we ended up dividing users into 11 small groups, and assigning an AutoCAD team member to be a captain for each team. We gave them a shared base drawing of a house and backyard, and challenged them to modify the drawing by collaborating with their team members through the Design Feed. The participants got very creative with this challenge, and many teams built out intricate features like a golf range, an alligator pit, and a mother-in-law’s house surrounded by a moat. More importantly though, we were able to get really useful usability feedback on specific aspects of the Design Feed because people were actually using it much like they would on a drawing they were sharing with others.
  • So we’ve been doing this for about a year – what have we learned?
  • In recruiting for the Customer Councils, we were looking to recruit people both for current initiatives and for a pool for future ones, so we cast our net broadly. We sent an initial email to our existing pools – previous usability participants, user groups, attendees of our annual conference – asking them to respond to a survey if they were interested in helping us with our products. We then figured out, based on product usage, industry, interests, and so forth, who would be a good match for our targeted Customer Councils, and sent invitations to those selected customers. Response rates varied a lot, depending on how closely we were able to profile-match to the feature. Our Point Cloud customers are working in an emerging area, and they are really interested and engaged with the topic. We got a 75% response rate! But the Connected Desktop Customer Council was much more general, and it was hard to find a precise profile. We got a more typical recruiting response rate – 7%.
  • Once you have identified your Customer Council members, you need to plan the council environment carefully. Know and use the tools you plan to work with – for us, this included our Centercode beta platform for community features, screen-sharing tools like GoToMeeting, and any software we were using for remote testing. Plan to have your customers engage immediately: post your intros and photos, and ask them to post theirs; ask them to share samples of their work and start to interact with each other. And make expectations clear up front about how often you will be contacting them and how much time you expect them to spend. We asked for a commitment of 8-10 hours over 2 weeks when evaluating a build, for instance.
  • When you are releasing a build, plan and introduce it with care. Identify key task flows and spell out the steps required. With each build release, we held a webinar where we demoed the release and task steps, had the PM talk about where the product was going, and took general questions. This got people interested in downloading and seeing it right away. Find ways to get people engaged with each other – discussion topics can be useful, as can healthy competition, like the Backyard Challenge for the Connected Desktop council. And use your tools to track feedback and comments closely once your release is out there. By monitoring the early comments, you can catch problems and issues early and help mitigate them in order to get useful feedback.
  • To get the most out of a Customer Council overall, try to be creative about the ways in which you get design feedback – look at informal methods and early concept testing. Also, make sure you focus on very targeted features and workflows. What will users really care about? Make sure you ask questions in between build releases. This gets you additional data, and helps the council members stay engaged. And be realistic about how frequently you can post code releases. It can be a big commitment for you and for them, and you want to make sure you don’t burn out your council members by asking for too much time from them at once.
  • How do you know when you are successful? And that’s all!
  • Any questions?

Formative Usability Testing in Agile: Piloting New Techniques at Autodesk (Presentation Transcript)

  • Webinar: Formative Usability Testing in Agile: Piloting New Techniques at Autodesk Eunice Chang, Autodesk Olivia Williamson, Autodesk #uzwebinar
  • Speakers: Eunice Chang, Senior Principal User Researcher, Autodesk (Speaker); Olivia Williamson, Principal User Experience Designer, Autodesk (Speaker); Alfonso de la Nuez, Co-Founder and Co-CEO, UserZoom (Moderator)
  • Quick Housekeeping • Chat box is available if you have any questions • There will be time for Q&A at the end • We will be recording the webinar for future viewing • All attendees will receive a copy of the slides/recording • Twitter hashtag: #uzwebinar www.userzoom.com
  • About UserZoom  All-in-one enterprise software solution that helps businesses cost-effectively test, measure, and improve UX across websites & mobile apps  Offers online and remote user research & testing solutions, saving UXers time, money, and effort, and yielding a lot of actionable insights  UX Consultants since ’01, SaaS since ’09  Offices in Sunnyvale (CA), Manchester (UK), Munich (DE), and Barcelona (Spain)  90% renewal rate, 50% revenue growth rate in the last 3 years  Product Suite: Unmoderated Remote Usability Testing, Remote Mobile Usability Testing, Online Surveys (web & mobile), Online Card Sorting, Tree Testing, Screenshot Click Testing, Screenshot Timeout Testing (5-sec test), Web VOC, Mobile VOC www.userzoom.com
  • Agenda • Brief introduction to Autodesk and AutoCAD • AutoCAD’s move from waterfall to Agile development • User feedback through Customer Councils • Testing the design • Testing the product • Learnings www.userzoom.com
  • © 2013 Autodesk
  • Introduction of Agile into the AutoCAD development process  Re-structured customer feedback process to meet Agile objectives:  Become more responsive and flexible to changing customer needs & market demands  Create higher quality software with equally high levels of customer delight © 2013 Autodesk
  • Customer Feedback / Process Overview © 2013 Autodesk
  • Minimum viable solution (MVS) Story-mapping process Smallest usable and sellable set of functionality that provides value to the user
  • What are customer councils? Feature-based Customer Councils  UX team sets experience goals that define minimum viable experience (MVX)  “Mini betas” for each specific initiative (reality capture, web and mobile, etc.)  50+ customers  Committed to testing new build every 1-2 sprints and completing key tasks and satisfaction surveys  No monetary compensation © 2013 Autodesk
  • What are customer councils? Building a community © 2013 Autodesk
  • Testing the design: Are we building the right thing? © 2013 Autodesk
  • Customer Feedback / Design Validation – Testing the design: Customer feedback. When: one or two sprints before the design is scheduled to be implemented. How: one-on-one conversations, remote user testing, card sorting, concept sketches, prototypes. Outcomes: refine designs to address usability issues; update or add new user stories and scenarios; add additional detail to future feature planning. © 2013 Autodesk
  • Testing the design: Conversations  Less formal approach  Ongoing customer engagement  Maintain a topic list  Balance of strategic and immediate  Choose items that are relevant at the moment  Use a variety of tools and techniques © 2013 Autodesk
  • What we talk about … current environment and needs … can’t be exported to Excel © 2013 Autodesk
  • What we talk about … concepts … © 2013 Autodesk
  • What we talk about … feature prioritization … © 2013 Autodesk
  • What we talk about … design sketches … © 2013 Autodesk
  • What we talk about … prototypes … © 2013 Autodesk
  • What we talk about … usability … © 2013 Autodesk
  • Testing the design: What we gain (Benefits of Design Validation)  Immediate: specific input that affects current design decisions  Ongoing: more detail on the customer’s environment and application needs  Long-term: working relationships with customers  get customers invested in our success  help our team gain a deeper understanding of the customer’s world © 2013 Autodesk
  • Testing the product: Does this work as expected? © 2013 Autodesk
  • Testing the product: Customer feedback – Evaluation of MVX of scenario(s) with customer councils. When: after each sprint (or most sprints; product owner to decide readiness). How: beta-style user feedback with the customer council, which uses the build product with their own environment and drawings; customers validate the Minimum Viable Experience (MVX) by completing tasks and satisfaction surveys. Outcomes: validated scenarios deemed “CustomerDone” or not; prioritized usability issues for the feature backlog to guide feature direction. © 2013 Autodesk
  • Immediate enthusiasm: Point Cloud Council  Within days of launching the council, customers are: posting their own Point Cloud scans, completing tasks, and discussing use cases and improvement ideas © 2013 Autodesk
  • Case study: Connected Desktop Council  11 teams participated in the “backyard challenge” © 2013 Autodesk
  • Learnings © 2013 Autodesk
  • Recruiting for Customer Councils  Draw on existing pools of customers  Recruit broadly  Define councils by features and topics of interest  Invite targeted customers  Your response rates will vary! © 2013 Autodesk
  • Planning your Customer Council  Know and use your tools  Engage immediately  Make expectations clear © 2013 Autodesk
  • Planning your MVX testing cycle  Identify task flows  Introduce the release with care  Get people engaged with each other  Track task completion and feedback closely © 2013 Autodesk
  • Getting the most out of a Customer Council  Look beyond traditional usability testing  Target specific applications and workflows  Ask questions in between builds!  Be realistic about length of customer feedback cycles © 2013 Autodesk
  • Customer Councils: What success looks like  Customers see the results of their design and testing feedback  Design input is gathered early and often  Built features are evaluated in a realistic environment  Design and usability issues treated as seriously as code defects  MVX validated through real customer usage and feedback  Features included in the release only when they meet MVX targets © 2013 Autodesk
  • Q&A