STPCon Fall 2012
Conference Brochure

Transcript

  • 1. October 15-18, Miami, Florida
    CONFERENCE FALL 2012 – The Leading Conference on Software Testing
    www.STPCon.com | 877.257.9531
    Conference Tracks: Leadership Perspectives for Testers | Test Strategy, Process and Design | Agile Testing | Performance Testing | Test Automation | Mobile Application Testing | Strategic Business Alignment | Hands-On Practicals
  • 2. CONFERENCE & EXPO 2012 – Registration Information

    Main Conference Package
    Dates: Tuesday, October 16 – Thursday, October 18
    Price: $1,295.00 on or before September 7 OR $1,695.00 after September 7
    Includes:
    - 4 Exceptional Keynotes
    - 50 Conference Breakout Sessions in 8 Comprehensive Tracks
    - Comprehensive Online Web Access to All Conference Sessions and Keynotes
    - Networking Activities Including: Speed Geeking Sessions; Wrap-Up Roundtables and Sponsored Topic Roundtable Discussions
    - Top Industry Vendors and Participation in the Sponsor Prize Giveaway
    - Three Breakfasts, Two Lunches and Happy Hour Welcome Reception

    Main Conference Package PLUS 1-Day
    Dates: Monday, October 15 – Thursday, October 18
    Price: $1,695.00 on or before September 7 OR $2,095.00 after September 7
    - Includes ALL of the Main Conference Package offerings
    - Includes choice of a 1-day pre-conference workshop
    - Includes a pre-conference breakfast and lunch

    Discounts
    Early Bird Discount: Register on or before September 7 to receive $400.00 off any full conference package.

    Team Discounts**
    Number of Attendees | Rate Before Early Bird (9/7) | Rate After Early Bird (9/7)
    1-2   | $1,295 | $1,695
    3-5   | $1,195 | $1,395
    6-9   | $1,095 | $1,295
    10-14 | $995   | $1,195
    15+   | $895   | $1,095
    *Prices above are for access to the Main Conference Package. Please add $400 to your registration if you are registering for the Main Conference Package PLUS 1-Day.
    **Team discounts are not combinable with any other discounts/offers. Teams must be from the same company and should be submitted into the online registration system on the same day.

    Register: www.STPCon.com or call 877.257.9531
    Page 2
  • 3. Coast to Coast Conferences Package (For 6 or More Registrations Only – Book Now and Choose Between 2 Conferences)
    Dates: Software Test Professionals Conference Fall 2012 (October 16-18, 2012) AND Software Test Professionals Conference Spring 2013 (April 22-24, 2013)
    Price: Take advantage of the team discounts listed on the Team Discount Chart. Register your entire team at the discounted rates, and then decide which conference each team member should attend:
    - Software Test Professionals Conference Fall 2012 in Miami, FL on October 16-18, 2012, OR
    - Software Test Professionals Conference Spring 2013 in San Diego, CA on April 22-24, 2013

    Table of Contents
    Registration ...................... 2-3
    Keynote Presentations ............. 4-5
    Schedule at a Glance .............. 6-7
    Conference Tracks ................. 8-9
    Pre-Conference Workshops .......... 10-11
    Session Block 1 ................... 12-13
    Session Block 2 ................... 14-15
    Session Block 3 ................... 16-17
    Session Block 4 ................... 18-19
    Session Block 5 ................... 20-21
    Session Block 6 ................... 22-23
    Session Block 7 ................... 24-25
    Session Block 8 ................... 26-27
    Session Block 9 ................... 28-29
    Session Block 10 .................. 30-31

    Conference Hotel
    Hilton Miami Downtown, 1601 Biscayne Boulevard | Miami, FL 33132
    A discounted conference rate of $129.00 is available for conference attendees. Call 1-800-HILTONS and reference GROUP CODE: STP2 / Software Test Professionals Conference rate, or visit the event website at www.stpcon.com/hotel to reserve your room online. Rooms are available until September 25 or until they sell out. Don't miss your opportunity to stay at the conference hotel.

    Cancellation Policy: All cancellations must be made in writing. You may cancel without penalty until September 14, 2012, after which a $150 cancellation fee will be charged. No-shows and cancellations after September 28, 2012 will be charged the full conference rate. Cancellation policies apply to both conference and pre-conference workshop registrations.
    Platinum Sponsor
    Page 3
  • 4. Keynotes – 9:00am – 10:15am

    Tuesday, 16 October
    Lee Henson, Chief Agile Enthusiast, AgileDad
    Testing Agility Leads to Business Fragility
    It comes as no surprise that with the downfall of the US economy, many organizations have made testing an afterthought. Some have scaled testing efforts back so far that the end product has suffered more than one can imagine, resulting in major consumer frustration and a great lack of organizational understanding of how best to tackle the issue. Agile does not mean doing more with less. As we journey into a new frontier where smaller is better, less is more, and faster is always the right answer, traditional testing efforts have been morphed and picked apart to include only the parts that people want to see and hear. Effective testing can be done efficiently without sacrificing quality at the end of the day. Learn, based on real-world scenarios, how others have understood and conquered this issue. Embark on a journey with "Ely Executive" as he realizes the value and importance of quality to both his internal and external customers. Discover how a host of internal and external players help Ely reach his conclusion, and bask in the "I can relate to that" syndrome this scenario presents. This epic adventure will surely prove memorable and is one you do not want to miss!
    Lee Henson is currently one of just over 100 Certified Scrum Trainers (CST) worldwide. He is also a Project Management Professional (PMP) and a PMI Agile Certified Practitioner, and has worked as a GUI web developer, quality assurance analyst, automated test engineer, QA manager, product manager, project manager, ScrumMaster, agile coach, consultant, and training professional. Lee is a graduate of the Disney Management Institute, is the author of the Definitive Agile Checklist, and publishes the Agile Mentor Newsletter.
    Wednesday, 17 October
    Jeff Havens, Speaker | Trainer | Author
    Uncrapify Your Life!
    This award-winning keynote is a study in exactly what not to do. Promising to give his audiences permission rather than advice, Jeff will "encourage" your team to criticize others and outsource blame before bringing it all home with a serious discussion about proper communication, customer service, and accountability practices. Tired of being told what to do? Sick of attending sessions that tell you how to become a better communicator or a more effective leader? Definitely don't want to hear any more garbage about effective change management? Experience something different. You won't be told what to do. Instead, you will be given permission to do all of the things you've always wanted to do – to become the worst person you can possibly be – to be sure that you are conveniently left off of company emails inviting you to social functions. And if you're not careful, you might actually learn something.
    Session Takeaways:
    - How to avoid negative and unproductive conversations
    - The power of sincere, straightforward communication
    - How to approach change in order to achieve seamless integration
    A Phi Beta Kappa graduate of Vanderbilt University, Jeff Havens began his career as a high school English teacher before branching into the world of stand-up comedy, where he worked with some of the brightest lights in American comedy and honed the art of engaging audiences through laughter. But his impulse to teach never faded, and soon he began looking for an avenue to combine both of his passions into entertaining and meaningful presentations.
    Page 4
  • 5. Wednesday, 17 October (This keynote will be offered immediately following lunch, from 2:00pm – 2:45pm)
    Matt Johnston, CMO, uTest
    Mobile Market Metrics: Breaking Through the Hype
    Serving consumers' voracious appetite for smartphones, tablets, e-readers, gaming consoles, connected TVs and apps, advancements in mobile technology are happening at warp speed. Manufacturers, carriers and app makers have all their chips on the table, launching dozens of unique devices per year, releasing new and improved operating systems, and aligning behind a multitude of browsers and OS standards. It's a dizzying task for test and engineering professionals to keep up with all the changes, let alone figure out which ones need to be supported in their industry, in their company, and in their department. Yet, despite all the fragmentation in today's mobile universe, tech professionals have to make difficult choices. Daunting? Yes. But there's something better than tea leaves and a crystal ball to take some of the "guess" out of our guesstimations: mobile market trends and statistics. We'll look at a wide variety of mobile metrics that cut through the hype, comparing the growing (and waning) popularity of different devices, operating systems and related tech that may influence attendees' testing and development decisions. Rounding out the discussion, we'll end with forward-looking insights into the most promising emerging technologies.
    Matt Johnston leads uTest's marketing and community efforts as CMO, with more than a decade of marketing experience at companies ranging from early-stage startups to publicly traded enterprises. He continues to lead uTest's efforts in shaping the brand, building awareness, generating leads and creating a world-class community of testers. Matt earned a B.A. in Marketing from Calvin College, as well as an MBA in Marketing Technology from New York University's Stern School of Business.
    Thursday, 18 October (9:00am – 10:00am)
    Karen N. Johnson, Founder, Software Test Management, Inc.
    The Discipline Aspect of Software Testing
    Your mission is to regression test a website for the umpteenth time, preferably with fresh eyes and a thorough review. After all, you're the one holding up this software from being used by paying customers. The pressure mounts, and yet procrastination takes hold. It's time to roll up your sleeves and finish the work at hand – and yet you just don't feel like it. How do you discipline yourself to get the job done? Discipline. Focus. How do you pull on the reservoirs of these necessary skills? How do we invoke discipline to get the job done? To begin with, you have to admit you have a challenge to overcome, and this presentation will discuss the reality of needing to be disciplined and focused to get work done, most especially getting work done under pressure. Tactics will be shared for getting through stacks of work when you don't feel inspired. Software testing takes a certain amount of discipline and rigor; it takes the ability to focus and think while frequently under stressful conditions. This presentation will provide an honest look at how to build rigor and discipline into your software testing practice, along with practical tips.
    Karen N. Johnson is an independent software test consultant and a frequent speaker at conferences. Karen is a contributing author to the book Beautiful Testing, released by O'Reilly. She is the co-founder of the WREST workshop; more information on WREST can be found at http://www.wrestworkshop.com/Home.html. She has published numerous articles and blogs about her experiences with software testing. You can visit her website at http://www.karennjohnson.com.
    Page 5
  • 6. October 15–18, 2012 – Schedule at a Glance

    Monday, October 15, 2012
    8:00am - 4:00pm   Registration & Information
    8:00am - 9:00am   Continental Breakfast & Networking
    9:00am - 4:00pm   Pre-Conference Workshops
                      Pre-1: Mobile Test Automation (Brad Johnson, Fred Beringer)
                      Pre-2: Testing Metrics: Process, Project, and Product (Rex Black)
                      Pre-3: New World Performance (Mark Tomlinson)
                      Pre-4: Hands-on: Remote Testing for Common Web Application Security Threats (David Rhoades)
    10:30am - 11:00am Morning Beverage Break
    12:30pm - 1:30pm  Lunch
    3:00pm - 3:30pm   Afternoon Beverage Break

    Tuesday, October 16, 2012
    8:00am - 8:00pm   Registration & Information
    8:00am - 9:00am   Breakfast
    9:00am - 10:15am  General Session: Testing Agility Leads to Business Fragility (Lee Henson)
    10:30am - 11:45am Session Block 1
                      101: Preparing the QA Budget, Effort & Test Activities – Part 1 (Paul Fratellone)
                      102: Testing with Chaos (James Sivak)
                      103: A to Z Testing in Production: Industry Leading Techniques to Leverage Big Data for Quality (Seth Eliot)
                      104: Mobile Test Automation for Enterprise Business Applications (Sreekanth Singaraju)
                      105: Test Like a Ninja: Hands-On Quick Attacks (Andy Tinkham)
    11:45am - 12:00pm Morning Beverage Break
    12:00pm - 1:15pm  Session Block 2
                      201: Preparing the QA Budget, Effort & Test Activities – Part 2 (Paul Fratellone)
                      202: Software Reliability, the Definitive Measure of Quality (Lia Johnson)
                      203: CSI: Miami – Solving Application Performance Whodunits (Kerry Field)
                      204: Building a Solid Foundation for Agile Testing (Robert Walsh)
                      205: Complement Virtual Test Labs with Service Virtualization (Wayne Ariola)
    1:15pm - 2:30pm   Lunch & Speed Geeking
    2:30pm - 3:45pm   Session Block 3
                      301: Leading Cultural Change in a Community of Testers (Keith Klain)
                      302: Software Testing Heuristics & Mnemonics (Karen Johnson)
                      303: Interpreting and Reporting Performance Test Results – Part 1 (Dan Downing)
                      304: Finding the Sweet Spot – Mobile Device Testing Diversity (Sherri Sobanski)
                      305: The Trouble with Troubleshooting (Brian Gerhardt)
    4:00pm - 4:30pm   Tools and Trends Showcase
    4:30pm - 4:45pm   Afternoon Beverage Break
    4:45pm - 6:00pm   Session Block 4
                      401: Don't Ignore the Man Behind the Curtain (Bradley Baird)
                      402: Optimizing Modular Test Automation (David Dang)
                      403: Interpreting and Reporting Performance Test Results – Part 2 (Dan Downing)
                      404: Agile vs. Fragile: A Disciplined Approach or an Excuse for Chaos (Brian Copeland)
                      405: Mobile Software Testing Experience (Todd Schultz)
    6:00pm - 8:00pm   Exhibitor Hours
    6:00pm - 8:00pm   Happy Hour Welcome Reception
    Page 6
  • 7. Schedule at a Glance (continued)

    Wednesday, October 17, 2012
    8:00am - 6:00pm   Registration & Information
    8:00am - 9:00am   Breakfast & Sponsor Roundtables
    9:00am - 10:15am  General Session & Sponsor Prize Giveaway: Uncrapify Your Life (Jeff Havens)
    10:30am - 11:45am Session Block 5
                      501: Building a Successful Test Center of Excellence – Part 1 (Mike Lyles)
                      502: How to Prevent Defects (Dwight Lamppert)
                      503: The Testing Renaissance Has Arrived – On an iPad in the Cloud (Brad Johnson)
                      504: Non-Regression Test Automation – Part 1 (Doug Hoffman)
                      505: Slow-Motion Performance Analysis (Mark Tomlinson)
    11:45am - 12:00pm Morning Beverage Break
    12:00pm - 1:15pm  Session Block 6
                      601: Building a Successful Test Center of Excellence – Part 2 (Mike Lyles)
                      602: Aetna Case Study – Model Office (Fariba Marvasti)
                      603: Application Performance Test Planning Best Practices (Scott Moore)
                      604: Non-Regression Test Automation – Part 2 (Doug Hoffman)
                      605: Keeping Up! (Robert Walsh)
    1:15pm - 2:45pm   Lunch
    2:00pm - 2:45pm   General Session: Mobile Market Metrics: Breaking Through the Hype (Matt Johnston)
    3:00pm - 4:15pm   Session Block 7
                      701: Technical Debt: A Treasure to Discover and Destroy (Lee Henson)
                      702: Dependencies Gone Wild: Testing Composite Applications (Wayne Ariola)
                      703: Top 3 Performance Land Mines and How to Address Them (Andreas Grabner)
                      704: Building Automation From the Bottom Up, Not the Top Down (Jamie Condit)
                      705: There Can Be Only One: A Testing Competition – Part 1 (Matt Heusser)
    4:15pm - 4:30pm   Afternoon Beverage Break
    4:30pm - 5:45pm   Session Block 8
                      801: Maintaining Quality in a Period of Explosive Growth – A Case Study (Todd Schultz)
                      802: Testing in the World of Kanban – The Evolution (Carl Shaulis)
                      803: Refocusing Testing Strategy Within the Context of Product Maturity (Anna Royzman)
                      804: Mobile Testing: Tools, Techniques & Target Devices (Uday Thongai)
                      805: There Can Be Only One: A Testing Competition – Part 2 (Matt Heusser)

    Thursday, October 18, 2012
    8:00am - 1:00pm   Registration & Information
    8:00am - 9:00am   Breakfast & Wrap-Up Roundtables
    9:00am - 10:00am  General Session: The Discipline Aspect of Software Testing (Karen N. Johnson)
    10:15am - 11:30am Session Block 9
                      901: 7 Habits of Highly Effective Testers (Rakesh Ranjan)
                      902: Advances in Software Testing – A Panel Discussion (Matt Heusser)
                      903: Performance Testing Metrics and Measures (Mark Tomlinson)
                      904: How and Where to Invest Your Testing Automation Budget (Sreekanth Singaraju)
                      905: Memory, Power and Bandwidth – Oh My! Mobile Testing Beyond the GUI (JeanAnn Harrison)
    11:30am - 11:45am Morning Beverage Break
    11:45am - 1:00pm  Session Block 10
                      1001: Redefining the Purpose of Software Testing (Joseph Ours)
                      1002: Evaluating and Improving Usability (Philip Lew)
                      1003: Real World Performance Testing in Production (Dan Bartow)
                      1004: Scaling Gracefully and Testing Responsively (Richard Kriheli)
                      1005: Quick, Easy & Useful Performance Testing: No Tools Required (Scott Barber)
    Page 7
  • 8. Advanced Education – Conference Tracks

    Leadership Perspectives for Testers
    Understanding the business side of testing is as important as sharpening our approaches and techniques. In this track you will learn how to build a testing budget, effectively manage test teams, communicate with stakeholders, and advocate for testing.
    101: Preparing the QA Budget, Effort & Test Activities – Part 1
    201: Preparing the QA Budget, Effort & Test Activities – Part 2
    401: Don't Ignore the Man Behind the Curtain
    701: Technical Debt: A Treasure to Discover and Destroy
    801: Maintaining Quality in a Period of Explosive Growth – A Case Study
    901: 7 Habits of Highly Effective Testers
    1001: Redefining the Purpose of Software Testing

    Strategic Business Alignment
    The best way to ensure that the goals of your project and the organization are met at the end of production is to make sure they are aligned from the beginning. This requires an ability to effectively lead diverse teams and gain buy-in and agreement throughout the life of the project. This track will offer sessions based on real-life experiences and case studies where true business alignment was achieved, resulting in a successful outcome.
    202: Software Reliability, the Definitive Measure of Quality
    301: Leading Cultural Change in a Community of Testers
    501: Building a Successful Test Center of Excellence – Part 1
    601: Building a Successful Test Center of Excellence – Part 2

    Test Strategy, Process and Design
    Before you begin testing on a project, your team should have a formal or informal test strategy. There are key elements you need to consider when formulating your test strategy; if you skip them, you may be wasting valuable time, money and resources. In this track you will learn strategic and practical approaches to software testing and test case design, based on the underlying software development methodology.
    102: Testing with Chaos
    302: Software Testing Heuristics & Mnemonics
    502: How to Prevent Defects
    602: Aetna Case Study – Model Office
    702: Dependencies Gone Wild: Testing Composite Applications
    802: Testing in the World of Kanban – The Evolution
    902: Advances in Software Testing – A Panel Discussion
    1002: Evaluating and Improving Usability

    Performance Testing
    Performance testing is about collecting data on how applications perform, to help the development team and the stakeholders make technical and business decisions related to performance risks. In this track you will learn practical skills, tools, and techniques for planning and executing effective performance tests. Topics include performance testing virtualized systems, performance anti-patterns, and how to quantify performance testing risk, all illustrated with practitioners' actual experiences doing performance testing.
    103: A to Z Testing in Production: Industry Leading Techniques to Leverage Big Data for Quality
    203: CSI: Miami – Solving Application Performance Whodunits
    303: Interpreting and Reporting Performance Test Results – Part 1
    403: Interpreting and Reporting Performance Test Results – Part 2
    603: Application Performance Test Planning Best Practices
    703: Top 3 Performance Land Mines and How to Address Them
    903: Performance Testing Metrics and Measures
    1003: Real World Performance Testing in Production
    Page 8
  • 9. Conference Tracks (continued)

    Test Automation
    Which tests can be automated? What tools and methodology can be used for automating functionality verification? Chances are these are some of the questions you are currently facing from your project manager. In this track you will learn how to implement an automation framework and how to organize test scripts for maintenance and reusability, and you will take away tips on how to make your automation framework more efficient.
    205: Complement Virtual Test Labs with Service Virtualization
    402: Optimizing Modular Test Automation
    504: Non-Regression Test Automation – Part 1
    604: Non-Regression Test Automation – Part 2
    704: Building Automation From the Bottom Up, Not the Top Down
    904: How and Where to Invest Your Testing Automation Budget

    Agile Testing
    The Manifesto for Agile Software Development was signed over a decade ago. The Agile framework's focus on agility is anything but undisciplined. This track will help participants understand how they can fit traditional test practices into an Agile environment, and will explore real-world examples of testing projects and teams in varying degrees of Agile adoption.
    204: Building a Solid Foundation for Agile Testing
    404: Agile vs. Fragile: A Disciplined Approach or an Excuse for Chaos
    605: Keeping Up!
    803: Refocusing Testing Strategy Within the Context of Product Maturity

    Mobile Application Testing
    The rapid expansion of mobile device software is altering the way we exchange information and do business. These days smartphones are integrated into a growing number of business processes, and developing and testing software for mobile devices presents its own set of challenges. In this track participants will learn mobile testing techniques from real-world experiences as presented by a selection of industry experts.
    104: Mobile Test Automation for Enterprise Business Applications
    304: Finding the Sweet Spot – Mobile Device Testing Diversity
    503: The Testing Renaissance Has Arrived (on an iPad in the Cloud)
    804: Mobile Testing: Tools, Techniques & Target Devices
    1004: Scaling Gracefully and Testing Responsively

    Hands-On Practicals
    This track is uniquely designed to combine the best software testing theories with real-world techniques. Participants will learn by doing, in a hands-on format simulating realistic test environments. After all, applying techniques learned in a classroom to your individual needs and requirements can be challenging. Seven technical sessions covering a wide range of testing topics will be presented. Participants must bring a laptop computer and power cord.
    105: Test Like a Ninja: Hands-On Quick Attacks
    305: The Trouble with Troubleshooting
    405: Mobile Software Testing Experience
    505: Slow-Motion Performance Analysis
    705: There Can Be Only One: A Testing Competition – Part 1
    805: There Can Be Only One: A Testing Competition – Part 2
    905: Memory, Power and Bandwidth – Oh My! Mobile Testing Beyond the GUI
    Page 9
  • 10. Pre-Conference Workshops – Monday, 15 October, 9:00am – 4:00pm

    Pre-1: Mobile Test Automation
    Fred Beringer, VP, Product Management, SOASTA
    Brad Johnson, VP, Product and Channel Marketing, SOASTA
    In this full-day workshop, you will leave with the skills and tools to begin implementing your mobile functional test automation strategy. SOASTA's Fred Beringer and Brad Johnson will walk you through the basics:
    - Creating your plan of "What to Test"
    - Enabling a mobile app to be testable
    - Capturing your first test cases
    - Setting validations and error conditions
    - Running and executing mobile functional test cases
    - Fun and Games: Mobile Test Automation Hackathon!
    Attendees will be provided free access and credentials to the CloudTest platform for this session. Tests will be built and executed using attendees' own iOS or Android mobile devices, so come with your smartphones charged and ready to become part of the STPCon Fall Mobile Device Test Cloud!
    Brad Johnson joined the front lines of the Testing Renaissance in 2009 when he signed on with SOASTA to deliver testing on the CloudTest platform to a skeptical and established software testing market. Fred Beringer has 15 years of software development and testing experience managing large organizations, where he was responsible for developing software and applications for customers.

    Pre-2: Testing Metrics: Process, Project, and Product
    Rex Black, President, RBCS
    Some of our favorite engagements involve helping clients implement metrics programs for testing. Facts and measures are the foundation of true understanding, but misuse of metrics is the cause of much confusion. How can we use metrics to manage testing? What metrics can we use to measure the test process? What metrics can we use to measure our progress in testing a project? What do metrics tell us about the quality of the product? In this workshop, Rex will share some things he's learned about metrics that you can put to work right away, and you'll work on some practical exercises to develop metrics for your testing.
    Workshop Outline:
    1. Introduction
    2. The How and Why of Metrics (presentation, case study, exercise)
    3. Process Metrics (presentation, case study, exercise)
    4. Project Metrics (presentation, case study, exercise)
    5. Product Metrics (presentation, case study, exercise)
    6. Conclusion
    Learning Objectives:
    - Understand the relationship between objectives and metrics
    - For a given objective, create one or more metrics and set goals for those metrics
    - Understand the use of metrics for process, project, and product measurement
    - Create metrics to measure effectiveness, efficiency, and stakeholder satisfaction for a test process
    - Create metrics to measure effectiveness, efficiency, and stakeholder satisfaction for a test project
    - Create metrics to measure effectiveness, efficiency, and stakeholder satisfaction for a product being tested
    Rex Black is a prolific author practicing in the field of software testing today. His first book, Managing the Testing Process, has sold over 50,000 copies.
    Page 10
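    The brochure does not reproduce Rex Black's formulas, but a minimal sketch can show the flavor of the test-process metrics the workshop covers. The function name `defect_detection_percentage` and the sample defect counts below are illustrative assumptions, not workshop material:

    ```python
    # Sketch of one widely used test-process metric: defect detection
    # percentage (DDP), the share of total defects that testing caught
    # before release. A generic illustration, not the workshop's content.

    def defect_detection_percentage(found_in_test: int, found_in_production: int) -> float:
        """DDP = defects found by testing / (defects found by testing + escapes), as a %."""
        total = found_in_test + found_in_production
        if total == 0:
            return 0.0  # no defects observed yet; avoid dividing by zero
        return 100.0 * found_in_test / total

    # Hypothetical release: 90 defects caught in test, 10 escaped to production.
    print(f"DDP: {defect_detection_percentage(90, 10):.1f}%")  # DDP: 90.0%
    ```

    Tracked release over release, a falling DDP is one concrete signal that the test process (not just the product) needs attention.
    
    
    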
  • 11. Pre-Conference Workshops (continued)

    Pre-3: New World Performance
    Mark Tomlinson, President, West Evergreen Consulting, LLC
    It's always a significant challenge for performance testers and engineers to keep up with the breadth of hot new technologies and innovations coming at a ferocious pace in our industry. Not only has the technology and system landscape changed in the last few years, but the methods and techniques for performance testing are also updating rapidly. Our responsibility as performance testers and engineers is to learn and master these new approaches to performance testing, optimization and management, and to serve our organizations with the most valuable knowledge and skill possible. In my travels I've often heard questions from fellow performance testers: Out of all the new technologies out there, what should I be learning? What new tools for performance are out there? How do I better fit in with agile development and operations? How do I update and refresh my skills to keep current and valuable to my company? Spend a day with Mark and you will find answers to these questions. Mark's workshop will help you be better prepared to chart your own course for the future of performance and your career. Mark will share with you the most recent developments in performance testing tools, approaches and techniques, and how to best consider your current and future strategy around performance.
    Workshop Outline:
    - Introduction to the New World of Performance
    - New Approaches and Methods
    - New Performance Roles
    - New Skills and Techniques
    - New Performance Tools
    - Conclusion
    Mark Tomlinson is a software tester and test engineer. His first test project in 1992 sought to prevent trains from running into each other – and Mark has metaphorically been preventing "train wrecks" for his customers for the past 20 years.

    Pre-4: Hands-on: Remote Testing for Common Web Application Security Threats
    David Rhoades, Senior Consultant, Maven Security Consulting, Inc.
    The proliferation of web-based applications has increased the enterprise's exposure to a variety of threats. There are overarching steps that can and should be taken at various points in the application's lifecycle to prevent or mitigate these threats, such as implementing secure design and coding practices, performing source code audits, and maintaining proper audit trails to detect unauthorized use. This workshop, through hands-on labs and demonstrations, will introduce the student to the tools and techniques needed to remotely detect and validate the presence of common insecurities in web-based applications. Testing will be conducted from the perspective of the end user (as opposed to a source code audit). Security testing helps to fulfill industry best practices and validate implementation, and it is especially useful since it can be done at various phases within the application's lifecycle (e.g., during development), or when source code is not available for review. This workshop will focus on the most popular and critical threats facing web applications, such as cross-site scripting (XSS) and SQL injection, based on the industry-standard OWASP "Top Ten." The foundation learned in this class will enable the student to go beyond the Top Ten via self-directed learning using other industry resources, such as the OWASP Testing Guide.
    David Rhoades is a senior consultant with Maven Security Consulting Inc., which provides information security assessments and training services to a global clientele. His expertise includes web application security, network security architectures, and vulnerability assessments.
    Page 11
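    The workshop's actual lab toolchain isn't described in this brochure, but the core idea behind remotely probing for reflected XSS can be sketched in a few lines: submit a distinctive marker and check whether the application echoes it back unescaped. The `reflects_unescaped` helper and the sample pages are hypothetical, and real probing must only be done against systems you are authorized to assess:

    ```python
    # Minimal sketch of reflected-XSS detection logic: a page that echoes the
    # raw probe back is a candidate finding; a page that HTML-escapes it is not.
    # Generic illustration only, not the workshop's tooling.
    import html

    PROBE = '<script>alert("stp-xss-probe")</script>'

    def reflects_unescaped(response_body: str, probe: str = PROBE) -> bool:
        """True if the raw probe appears in the response body without HTML-escaping."""
        return probe in response_body

    # Simulated responses from a hypothetical search page:
    escaped_page = f"<p>You searched for: {html.escape(PROBE)}</p>"
    vulnerable_page = f"<p>You searched for: {PROBE}</p>"

    print(reflects_unescaped(escaped_page))     # False
    print(reflects_unescaped(vulnerable_page))  # True
    ```

    Real scanners refine this with per-request unique markers and context-aware checks (attribute, script, and URL contexts), which is exactly the depth a hands-on workshop like this one goes into.
    
    
    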
  • 12. Session Block 1 – Tuesday, 16 October, 10:30am – 11:45am

    Speaker Bios:
    Seth Eliot is Senior Knowledge Engineer for Microsoft Test Excellence, focusing on driving best practices for services and cloud development and testing across the company.
    Paul Fratellone, a 25-year career veteran in quality and testing, has been a Director of QA and is well-seasoned in preparing department-wide budgets.
    Sreekanth Singaraju has more than 12 years of senior technology leadership experience and leads Alliance's QA Testing organization in developing cutting-edge solutions.
    James Sivak has been in the computer technology field for over 35 years, beginning with the Space Shuttle program; over the years his work has encompassed warehouse systems, cyclotrons, operating systems, and now virtual desktops.
    Andy Tinkham is an expert consultant focusing on all aspects of quality and software testing (but most of all, test automation). Currently, he consults for Magenic, a leading provider of .NET and testing services.

    101: Preparing the QA Budget, Effort & Test Activities – Part 1
    Paul Fratellone, Test Program Director, MindTree Consulting
    This session will be an introduction to preparing a test team budget, not only from a manpower perspective but also from the perspective of testing effort estimation and forecasting. The speaker will be pulling from real-life scenarios that have covered a myriad of situations of creating or inheriting a budget for testing services. Using these experiences, the speaker will highlight good practices and successes, in addition to pitfalls to avoid. How to develop a budget both for project-based work (large enterprise-wide projects through mid-sized) and for portfolio/testing-services support across the organization will be presented. In part one, the financial/accounting aspects of a budget (high-level) will be discussed: how full-time equivalents (FTEs) are truly accounted; the details of a "fully loaded" resource; and the difference between capital and operational expenditures and potential tax effects. We will also develop a resource estimation model that represents real-world situations and introduce confidence levels/iterations of a budget/resource plan.
    Session Takeaways:
    - How to prepare a QA/Testing department budget
    - How to create an estimation model

    102: Testing with Chaos
    James Sivak, Director of QA, Unidesk
    Testers can become so focused on the testing at hand, and on making it as efficient as possible (generally driven by management directives), that environments and tests unconsciously get missed. Broadening the vision to include ideas from chaos and complexity theories can help to discover why certain classes of bugs only get found by the customer. This presentation will introduce these ideas and how they can be applied to tests and test environments. Incorporating randomness into tests
    Page 12
and the test environments will be discussed. In addition, thoughts on what to analyze in customer environments, looking through the lens of complexity, will be presented. Comparisons of environments with and without facets of chaos will be detailed to give the audience concrete examples. Other examples of introducing randomness into tests will be provided. You will walk away with ideas on why and how to add chaos to your testing, catching another class of bugs before your customers do.

Session Takeaways:
• Ideas on what makes testing and test environments complex
• Benefits of adding chaos to the testing
• Vision perspectives in viewing current test environments and how to disrupt the testing paradigm

103 – Seth Eliot, Senior Knowledge Engineer in Test, Microsoft
A to Z Testing in Production: Industry Leading Techniques to Leverage Big Data for Quality

Testing in production (TiP) is a set of software methodologies that derive quality assessments not from test results run in a lab but from where your services actually run – in production. The big data pipe from real users and production environments can be used in a way that both leverages the diversity of production while mitigating risks to end users. By leveraging this diversity of production we are able to exercise code paths and use cases that we were unable to achieve in our test lab or did not anticipate in our test planning.

This session introduces test managers and architects to TiP and gives these decision makers the tools to develop a TiP strategy for their service. Methodologies like Controlled Test Flights, Synthetic Test in Production, Load/Capacity Test in Production, Data Mining, Destructive Testing and more are illustrated with examples from Microsoft, Netflix, Amazon, and Google. Participants will see how these strategies boost ROI by moving focus to live site operations as their signal for quality.

104 – Sreekanth Singaraju, VP of Testing Services, Alliance Global Services
Mobile Test Automation for Enterprise Business Applications

With the commercialization of IT, mobile apps are increasingly becoming a standard format of delivering applications in enterprises alongside Web applications. Testing organizations in these enterprises are facing the prospect of adding new testing interfaces and devices to the long list of existing needs. This session will showcase an approach and framework for developing a successful mobile test automation strategy using real life examples. Using a framework developed by integrating Robotium and Selenium, this session will walk participants through an approach to developing an integrated test automation framework that can test applications with web and mobile interfaces.

This session highlights the following key aspects of developing an integrated automation framework:
• Mobile test automation strategy
• Test data management
• Reporting
• Test lab configuration, including multi-device management
• Challenges, lessons learned and best practices to develop a successful framework

105 – Andy Tinkham, Principal Consultant, Magenic
Test Like a Ninja: Hands-On Quick Attacks

In this session, we'll talk about quick attacks – small tests that can be done rapidly to find certain classes of bugs. These tests are easy to run and require little to no upfront planning, which makes them ideal for quickly getting feedback on your application under test. We'll briefly talk about what quick tests are and then dive into specific attacks, trying them on sample applications (or maybe even some real applications!). This is a hands-on session, so bring a laptop or pair with someone who has one, and try these techniques live in the session!

Session Takeaways:
• Knowledge of some quick tests that you can use immediately in your own testing
• Specific tools and techniques to execute these quick tests
• Experience executing the quick tests
• References for further exploration of quick tests and how they can be applied
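Quick attacks of the kind session 105 describes need no test design up front: a handful of hostile inputs plus one invariant ("never crash") already catches a class of bugs. A minimal sketch – the input list is classic, but the function under test is a stand-in invented for this example:

```python
# Illustrative sketch of a "quick attack" pass: throw hostile inputs at an
# input handler with no upfront test design. handle_username is an invented
# stand-in for real application code.
QUICK_ATTACK_INPUTS = [
    "",                          # empty input
    " " * 10,                    # whitespace only
    "A" * 10_000,                # very long string (length limits)
    "0",                         # boundary-looking numerals
    "-1",
    "'; DROP TABLE users;--",    # SQL metacharacters
    "<b>hi</b>",                 # markup
    "Ωmega\u202e",               # non-ASCII and direction-control characters
]

def handle_username(value):
    """Stand-in application code: must never raise, always return a string."""
    cleaned = value.strip()
    if not cleaned or len(cleaned) > 64:
        return "rejected"
    return "accepted"

# The quick-attack loop only checks "does it survive?", not full correctness.
for attack in QUICK_ATTACK_INPUTS:
    result = handle_username(attack)
    assert isinstance(result, str), f"crashed or misbehaved on {attack!r}"
```

The payoff is speed: the same input list can be replayed against every text field in an application in minutes.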
Session Block 2 – Tuesday, 16 October, 12:00 pm – 1:15 pm

Wayne Ariola is Vice President of Strategy and Corporate Development at Parasoft, a leading provider of integrated software development management, quality lifecycle management, and dev/test environment management solutions.

Kerry Field has over 35 years of experience in IT, including applications development, product and service support, systems and network management, functional QA, capacity planning, and application performance.

Paul Fratellone, a 25-year career veteran in quality and testing, has been a Director of QA and is well-seasoned in preparing department-wide budgets.

Lia Johnson began her career at the CIA and NASA as a software developer upon graduation from Lamar University with a BBA in Computer Science and is now working for Baker Hughes Incorporated as a Software Testing Manager.

Robert Walsh, a proponent of Agile software development processes and techniques, believes strongly in delivering quality solutions that solve real customer problems and provide tangible business value.

201 – Paul Fratellone, Program Director, Quality Test Consulting, MindTree Consulting
Preparing the QA Budget, Effort & Test Activities – Part 2

In Part 2 of this session, the speaker will showcase Excel-based worksheets to demonstrate test effort, test cycle and test case modeling techniques, and how to introduce confidence in escalating levels/iterations of a budget as information increases in stability and accuracy throughout a project/testing support activity. Also covered will be the effects of the delivery life cycle methods on budgeting: how will testing budgets differ between waterfall and agile delivery methods/life cycles?

Session Takeaways:
• How to budget for test cycles
• Dealing with Agile vs. Waterfall project budgets

202 – Lia Johnson, Manager, Software Testing, Baker Hughes Inc. (Drilling & Evaluation)
Software Reliability, The Definitive Measure of Quality

Throughout the project life-cycle, measures of quality, test success, and test completion are indicative of progress. The culmination of these metrics is often difficult for some stakeholders to translate for acceptance. Software reliability dependencies begin with model-driven tests derived from use-cases/user stories in iterative or agile development methodologies. Aligning the test process with a project's development methodology, and in many cases PDM, is integral to achieving acceptable levels of software reliability. Relevant dependencies that impact software reliability begin with test first, test continuously. Some dependencies are within the scope of responsibility for software test professionals; others require collaboration with the entire project team. Developing applications according to standard coding practices is also essential to attain software quality. Unit tests developed and executed successfully impact software quality. These dependencies and more will be discussed as we explore the definitive measure of quality: software reliability.
Session Takeaways:
• The definitive measure of quality
• The importance of process alignment to development methodologies
• Dependencies of software reliability

203 – Kerry Field, APM Team Specialist, CSS Management, US Bank
CSI: Miami – Solving Application Performance Whodunits

Application performance problems continue to happen despite all the resources companies employ to prevent them. When the problem is difficult to solve, the organization will often bring in outside expertise. We see a similar theme played out on TV detective shows: when the local authorities are baffled, the likes of a Jim Rockford, Lt. Columbo, or Adrian Monk are brought in. These fictional detectives routinely solved difficult cases by applying their unique powers of observation, intuition, and deductive reasoning. The writers also gave each character a unique background or odd personality trait that enhanced their entertainment value. Solving real world performance problems requires much more than a combination of individual skills and an entertaining personality. This session will present examples of real life application performance issues taken from the author's case files to illustrate how successful resolution depends on effective teamwork, sound troubleshooting process, and appropriate forensic tools.

204 – Robert Walsh, Senior Consultant, Excalibur Solutions, Inc.
Building a Solid Foundation for Agile Testing

Software testing in Agile development environments has become a popular topic recently. Some believe that conventional testing is unnecessary in Agile environments. Further, some feel that all testing in Agile should be automated, diminishing both the role and the value of the professional tester. While automated testing is essential in Agile methodologies, manual testing has a significant part to play, too.

Successful Agile testing depends on a strategy built on four pillars: automated unit testing, automated acceptance testing, automated regression testing, and manual exploratory testing. The four are interdependent, and each provides benefits that are necessary for the others to succeed. Each pillar is the responsibility of a different group within the Agile development team, and when everyone does his part, the result is a solution where the whole is greater than the sum of the parts.

Session Takeaways:
• Gain an understanding of the challenges faced when testing in Agile environments
• Learn how the four pillars provide a solid foundation on which a successful testing organization can be built
• Learn why the pillars are interdependent and understand why the absence of one leaves a gap in the testing effort

205 – Wayne Ariola, VP of Strategy, Parasoft
Complement Virtual Test Labs with Service Virtualization

This session explains why service virtualization is the perfect complement to virtual test labs – especially when you need to test against dependent applications that are difficult or costly to access and/or difficult to configure for your testing needs.

Virtual test labs are ideal for "staging" dependent applications that are somewhat easy to access and have low to medium complexity. However, if the dependent application is very complex, difficult to access, or expensive to access, service virtualization might be a better fit for your needs. Service virtualization fills the gap around the virtual test lab by giving your team a virtual endpoint in order to complete your test environment. This way, developers and testers can access the necessary functionality whenever they want, as frequently as they want, without incurring any access fees. As another example, assume you have a mainframe that's easily accessible but difficult to configure for testing: service virtualization fills the gap by providing flexible access to the components of the mainframe that you need in order to exercise the application under test. It enables the team to test against a broad array of conditions – with minimal setup.

Session Takeaways:
• Remove roadblocks for performance testing, functional testing and Agile/parallel development
• Close the gap that exists with incomplete or capacity-constrained staged test environments
• Streamline test environment provisioning time and costs beyond traditional virtualization
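In miniature, the "virtual endpoint" idea behind session 205 is just a stand-in that answers the way the hard-to-provision dependency would. A hedged sketch of the general pattern using only the Python standard library (the canned payload and route are invented; this illustrates the concept, not any vendor's product):

```python
# Illustrative sketch: the smallest possible "virtual service" - a canned
# HTTP endpoint standing in for a dependent application. The payload and
# route below are invented for the example.
import json
import threading
import urllib.request
from http.server import BaseHTTPRequestHandler, HTTPServer

CANNED = {"accountId": "42", "status": "ACTIVE"}   # invented canned response

class VirtualService(BaseHTTPRequestHandler):
    def do_GET(self):
        body = json.dumps(CANNED).encode()
        self.send_response(200)
        self.send_header("Content-Type", "application/json")
        self.send_header("Content-Length", str(len(body)))
        self.end_headers()
        self.wfile.write(body)

    def log_message(self, *args):   # silence per-request logging
        pass

server = HTTPServer(("127.0.0.1", 0), VirtualService)   # port 0: any free port
threading.Thread(target=server.serve_forever, daemon=True).start()

# Point the application under test here instead of at the real dependency.
BASE_URL = f"http://127.0.0.1:{server.server_address[1]}"
with urllib.request.urlopen(BASE_URL + "/account/42") as resp:
    assert json.load(resp) == CANNED
```

Because the endpoint is local and canned, tests run as often as needed with no access fees, scheduling, or configuration of the real dependency.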
Session Block 3 – Tuesday, 16 October, 2:30 pm – 3:45 pm

Dan Downing teaches load testing and over the past 13 years has led hundreds of performance projects on applications ranging from eCommerce to ERP and companies ranging from startups to global enterprises.

A software tester for the past 15 years, Brian Gerhardt started off in production support, where his troubleshooting skills and relentless search for answers led to a career in Quality Assurance. He is currently with Liquidnet.

Karen N. Johnson is an independent software test consultant, a contributing author to the book Beautiful Testing, and the co-founder of the WREST workshop.

Keith Klain is the head of the Barclays Global Test Center, which provides functional and non-functional software testing services to the investment banking and wealth management businesses.

Sherri Sobanski is currently the Senior Strategic Architect Advisor focusing on technical test strategy and innovation supporting Aetna's testing team. Sherri is also the acting head of Aetna's Mobile Testing Capability Center.

301 – Keith Klain, Director, Head of Global Test Center, Barclays
Leading Cultural Change in a Community of Testers

When Keith Klain took over the Barclays Global Test Center, he found an organization focused entirely on managing projects, managing processes, and managing stakeholders – the last most unsuccessfully. Although the team was extremely proficient in test management, their misaligned priorities had the effect of continually hitting the bullseye on the wrong target. Keith immediately implemented changes to put a system in place to foster testing talent and drive out fear – abandoning worthless metrics and maturity programs, overhauling the training regime, and investing in a culture that rewards teamwork and innovation. The challenges of these monumental changes required a new kind of leadership – something quite different from traditional management.

Find out how Keith is leading the Barclays Global Test Center and hear his practical experiences defining objectives and relating them to people's personal goals. Learn about the Barclays Capital Global Test Center's "Management Guiding Principles" and how you can adapt these principles to lead you and your team to a new and better place.

302 – Karen N. Johnson, Founder, Software Test Management, Inc.
Software Testing Heuristics & Mnemonics

Are you curious about heuristics and how to use them? In this session, Karen Johnson explains what a heuristic is, what a mnemonic is, and how heuristics and mnemonics are sometimes used together. A number of both heuristics and mnemonics have been created in the software testing community; Karen reviews several of each and gives examples of how to use and apply them. She also outlines how to create your own mnemonics and heuristics, and explores ways to use both to guide exploratory testing efforts.
303 – Dan Downing, Principal Consultant, Mentora Group
Interpreting Performance Test Results – Part 1

You've worked hard to define, develop and execute a performance test on a new application to determine its behavior under load. Your initial test results have filled a couple of 52-gallon drums with numbers. What next? Crank out a standard report from your testing tool, send it out, and call yourself done? NOT. Results interpretation is where a performance tester earns the real stripes.

In the first half of this double session we'll start by looking at some results from actual projects and together puzzle out the essential message in each. This will be a highly interactive session where I will display a graph, provide a little context, and ask "what do you see here?" We will form hypotheses, draw tentative conclusions, determine what further information we need to confirm them, and identify key target graphs that give us the best insight on system performance and bottlenecks. Feel free to bring your own sample results (on a thumb drive so I can load and display) that we can engage participants in interpreting!

304 – Sherri Sobanski, Senior Strategic Architect Advisor, Aetna
Finding the Sweet Spot – Mobile Device Testing Diversity

How can you possibly do it all?! Infinite hardware/software combinations, BYOD, crowdsourcing, virtual capabilities, data security... the list goes on and on. Companies are struggling with insecure, complicated and expensive answers to the mobile testing challenges. The Aetna technical teams have created an innovative set of mobile testing solutions that not only protect data but allow for many different approaches to mobile testing across the lifecycle. Whether you want to test from the corporate lab, from your own backyard or halfway around the world, we have a solution… you choose.

Session Takeaways:
• How to tackle the complex nature of mobile testing
• Techniques to address BYOD, offshore vendors, crowdsourcing
• Software capabilities for virtual testing of mobile devices
• Private and public testing clouds

305 – Brian Gerhardt, Operations and Quality Lead, Liquidnet
The Trouble with Troubleshooting

One of the most valuable skill sets a software tester can possess is the ability to quickly track down the root cause of a bug. A tester may describe what the bug appears to be doing, but far too often it is not enough information for developers, causing a repetitive code/build/test, re-code/re-build/re-test cycle to ensue. In this interactive session, you will rotate between learning investigative techniques and applying them to actual examples. You will practice reproducing the bug, isolating it, and reporting it in a concise, clear manner.

This session will present several investigative methods and show how they can be used as a means of information gathering in determining the root cause of bugs. Common tests that are designed to eliminate false leads, as well as tests that can be used to focus in on the suspect code, will be shown and discussed in interactive examples.

Session Takeaways:
• Build a framework around your software to ease finding root causes of software issues
• Quickly identify where the most likely causes of errors are
• Use a new, empirical investigative method for software testing
• Deliver more concise, relevant information to developers and project managers when describing and detailing issues in the software, becoming more valuable in making fix/ship decisions
Session Block 4 – Tuesday, 16 October, 4:45 pm – 6:00 pm

Bradley Baird is an SQA professional who has worked at several software companies, where he has created SQA departments and test labs from scratch, trained test teams and educated stakeholders everywhere on the values of SQA.

Brian Copeland has been instrumental in the testing of critical business systems, from mission-critical applications to commercial software.

David Dang is an HP/Mercury Certified Instructor (CI) for QuickTest Professional, WinRunner, and Quality Center and has provided automation strategy and implementation plans to maximize ROI and minimize script maintenance.

Dan Downing teaches load testing and over the past 13 years has led hundreds of performance projects on applications ranging from eCommerce to ERP and companies ranging from startups to global enterprises.

Over the past 15 years Todd Schultz has tested many types of software, in many industries. Todd Schultz is passionate about professional development: mentoring Quality Assurance Engineers who see QA as a career, and not just a job.

401 – Bradley Baird, Senior Manager, Product Quality, Harman International
Don't Ignore the Man Behind the Curtain

Just as in the Wizard of Oz, where all the great pyrotechnics going on in public were controlled by the wizard hidden behind the curtain, so it is in QA, where all the great quality that the customers see is controlled by the QA professional. If you ignore the "Wizard" you do so at the risk of compromising quality. This session will cover the perils of ignoring the QA professional behind the curtain. We spend months or even years training a QA engineer only to see them walk out the door because a better opportunity came along. You don't just lose a person; you lose knowledge that is sometimes hard to replace. In this session we will explore ideas on how to develop the management traits and skills we perform on a daily basis to make sure our wizards keep pulling those levers and tooting those horns for us instead of our competitor.

Session Takeaways:
• Ideas on managing or leading a QA team
• How to treat a human as a person, not a resource
• How to give incentives with limited to no resources
• How to add value through happy team members
• How to show your value to upper management

402 – David Dang, Senior Automation Consultant, Zenegy Technologies
Optimizing Modular Test Automation

Many companies have recognized the value of test automation frameworks. Appropriate test automation frameworks maximize ROI on automation tools and minimize script maintenance. One of the most common frameworks is the modular test automation approach. This approach uses the same concept as software development: building components or modules shared within an application. For test automation, the modular approach decomposes the application under test into functions or modules. The functions or modules are
linked together to form automated test cases. While this approach encourages reusability and maintainability, there are many challenges that must be considered and addressed at the start of a project.

This presentation describes the key factors the QA group needs to consider during the design phase of implementing a modular test automation approach. The QA group will learn the aspects of the modular test automation approach, the benefits of implementing a modular approach, the pitfalls of a modular approach, and best practices to fully utilize the modular approach.

403 – Dan Downing, Principal Consultant, Mentora Group
Interpreting Performance Test Results – Part 2

In the second part of this session, we will try to codify the analytic steps we went through in the first session, where we learned to observe, form hypotheses, draw conclusions and take steps to confirm them, and finally report the results. The process can best be summarized in a CAVIAR approach for collecting and evaluating performance test results:
• Collecting
• Aggregating
• Visualizing
• Interpreting
• Analyzing
• Reporting

We will also discuss an approach for reporting results in a clear and compelling manner, with data-supported observations, conclusions drawn from these observations, and actionable recommendations. A link will be provided to the reporting template that you can adopt or adapt to your own context. Come prepared to participate actively!

404 – Brian Copeland, QA Practice Director, Northway Solutions Group
Agile vs Fragile: A Disciplined Approach or an Excuse for Chaos

The Manifesto for Agile Software Development was signed over a decade ago and establishes a set of principles aimed at increasing the speed at which customers can realize the value of a development undertaking. While the principles of Agile are a refreshing focus on delivery, they are often hijacked to become an excuse for the undisciplined development organization. The Agile framework's focus on agility is anything but undisciplined, with principles such as Test Driven Development (TDD); however, rogue development organizations have pointed to Agile as the impetus to abandon all vestiges of process, documentation, and in many ways quality. These organizations have added a fifth principle to the manifesto: speed of the delivery over quality of the delivered.

Session Takeaways:
• Agile or Fragile: How to tell which your organization is
• Agile Characteristics: The characteristics of excellent Agile teams
• Agile Manifesto: What does it mean for processes and documentation?
• Agile Testing: The role of "independent" testing in the Agile framework
• Agile Tools: Do testing tools have a place in Agile development?
• Agile Façade: Strategies for breaking down the façade

405 – Todd Schultz, Managing Senior Quality Assurance Engineer, Deloitte Digital
Mobile Software Testing Experience

In this hands-on session, with devices in hand, we will investigate the issues that mobile testers deal with on a daily basis. We will explore both mobile-optimized websites and native applications, in the name of consumer advocacy and continuous improvement, and gain a new perspective on the mobile computing era as we arm ourselves with the knowledge needed to usher in a quality future for this emerging market.

This session will focus on three of the biggest challenges in mobile application testing:
• Device Testing – Device fragmentation is a challenge that will continue to increase, and full device coverage is often cost prohibitive. We will investigate how device fragmentation affects quality and learn the considerations involved in device sampling size.
• Connectivity Testing – Although some applications have some form of offline mode, most were designed with constant connection in mind and can be near useless without a connection to the internet. We will investigate how vital connectivity is to the proper functioning of many mobile applications and explore various testing methodologies.
• Interrupt Testing – Phone calls, text messages, alarms, push notifications and other functionality within our devices can have a serious impact on the application under test. We will investigate the types of issues these interruptions can cause and some strategies for testing.
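Stripped to its core, the modular approach described in session 402 means each module is one reusable step against the application under test, and a test case is just an ordered composition of modules. A hedged sketch – the stand-in app, module names, and test case here are invented for illustration:

```python
# Illustrative sketch of the modular test automation approach: reusable
# modules composed into test cases. FakeApp and the modules are invented.
class FakeApp:
    """Stand-in for the application under test."""
    def __init__(self):
        self.user = None
        self.cart = []

# --- Reusable modules (shared across many test cases) ---
def login(app, user):
    app.user = user

def add_to_cart(app, item):
    app.cart.append(item)

def checkout(app):
    assert app.user is not None, "must be logged in to check out"
    return {"user": app.user, "items": list(app.cart)}

# --- A test case is just an ordered list of modules ---
def run_test_case(steps):
    app = FakeApp()
    result = None
    for step, *args in steps:
        result = step(app, *args)
    return result

order_test = [(login, "kim"), (add_to_cart, "book"), (checkout,)]
receipt = run_test_case(order_test)
assert receipt == {"user": "kim", "items": ["book"]}
```

The maintenance benefit the session promises follows from this shape: when the login screen changes, only the `login` module is edited, and every test case that composes it picks up the fix.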
Session Block 5 – Wednesday, 17 October, 10:30 am – 11:45 am

Doug Hoffman is a trainer in strategies for QA with over 30 years of experience. His technical focus is on test automation and test oracles. His management focus is on evaluating, recommending, and leading quality improvement programs.

Brad Johnson has been supporting testers since 2000 as head of monitoring and test products at Compuware, Mercury Interactive and Borland, and is now with SOASTA, where he delivers cloud testing on the CloudTest platform.

Dwight Lamppert has over 15 years of software testing experience in the financial services industry and currently manages Software Testing Process & Metrics at Franklin Templeton.

Mike Lyles has 19 years of IT experience. His current role comprises Test Management responsibilities for a major company domain covering Store Systems, Supply Chain, Merchandising and Marketing.

Mark Tomlinson is a software tester and test engineer. His first test project in 1992 sought to prevent trains from running into each other – and Mark has metaphorically been preventing "train wrecks" for his customers for the past 20 years.

501 – Mike Lyles, QA Manager, Lowe's Companies, Inc.
Building a Successful Test Center of Excellence – Part 1

In 2008, Lowe's initiated a Quality Assurance group dedicated to independent testing for IT. This was Lowe's first time separating development and testing efforts for the 62-year-old company. Mike Lyles was privileged to be part of this inaugural effort. The purpose of this two-part session is to review a high-level study of the building of a Test Center of Excellence within a mature IT organization that will be informative for companies of various sizes and growth related to Quality Assurance maturity.

Session Part 1:
• The Startup – this session will describe how the testing center was formed initially, steps taken to build the team, baselining the process and methodology documentation, initial metrics planned for reporting, the process involved in selecting the pilot business areas that would be supported, and how we prepared the organization for the changes.
• What Worked – we will review the decisions made during the initial stages, those decisions that stood with the team throughout the process, and how the organization was socialized with the stakeholders.

Session Takeaways:
• Documenting core processes required to build a team
• Selecting the appropriate team member skill sets when building a team
• Identifying core requirements when selecting a testing vendor
• Establishing core metrics needed for the organization
• Selecting the pilot group/business area for testing services

502 – Dwight Lamppert, Senior Test Manager, Franklin Templeton
How to Prevent Defects

The session shows how to prevent defects from leaking to later stages of the SDLC. This is a successful case study in which awareness of and focus on static testing was increased over a period of 2 years. Defect detection and leakage removal metrics were tracked for each of the projects. The first 2 projects exposed some issues. The metrics were shared and discussed with business partners – and various process improvements were implemented. The results on the 4 subsequent projects showed marked reductions in defects that leaked into system testing and production. The quality improvement also contributed to less cost and shorter timelines because re-work was controlled.

Session Takeaways:
• A focus on documentation reviews, walk-throughs, and inspections is a critical first step for building quality into the code early in the SDLC
  • 21. CONFERENCE FALL 2012

    • Defect metrics are essential to support things that testing exposes every day and are the basis for initiating process improvement discussions
    • When awareness of static testing is increased on the project team, the individuals participating often stop creating defects as they specify, program, and test the new functionality
    • The QA organization is the best group to promote processes that build quality into code early in the SDLC

    503 Brad Johnson, VP Product and Channel Marketing, SOASTA
    The Testing Renaissance Has Arrived (on an iPad in the Cloud)
    This is the best time in history to be a tester! Emerging from the most significant economic "dark age" since the Great Depression, the world testers now face is one of immense opportunity and challenge: cloud computing has blown in overnight and mobile devices have overtaken the planet. In the new world of speed and scale, Renaissance Testers are launching (or re-launching) their careers by testing transformational technologies while effectively utilizing those same transformational technologies.
    Attend this session to learn how you can catapult your career with cloud-based test automation and mobile testing techniques that are as exciting to learn and use as they are impactful to your end results and personal success!
    Attendees Will Learn To:
    • Become a cloud testing expert
    • Build and articulate a distributed mobile testing strategy
    • Champion a new approach to realistic, repeatable web and mobile performance testing
    • Establish yourself as an agile testing expert with Continuous Testing
    • Defend "Test" in a world heading toward "DevOps"
    Learn the new "arts" – embrace this new world – and become a Renaissance Tester!
    Session Takeaways:
    • Learn where testers are making a lasting impact on strategic projects
    • Identify areas to focus on for career growth
    • Learn to seek out strategic projects
    • Gain perspective and optimism about the future

    504 Doug Hoffman, Consultant, Software Quality Methods
    Non-Regression Test Automation – Part 1
    The principal advantages of automated tests are repeatability and speed. The principal disadvantages are that they are relatively more expensive to create than manual tests, require more maintenance than manual tests, and are more limited in the specificity of things they can compare. Part 1 of this presentation describes another way to approach test automation: testing things that cannot be tested manually. These tests enable us to focus on learning about the software, can go behind the UI to extend our reach, are not limited to doing the same thing each time, and can perform huge numbers of iterations and combinations unthinkable with manual testing or automated regression tests. This approach also encourages checking broader classes of test outcomes, thus improving the types of errors that can be discovered.
    Session Takeaways:
    • How to use automated tests to extend testers' capabilities
    • The virtues and drawbacks of automated regression tests
    • Design of automated tests for exploration
    • A model for understanding outside influences and hidden outcomes during testing

    505 Mark Tomlinson, Owner and CEO, Evergreen Consulting, LLC West
    Slow-Motion Performance Analysis
    Every modern-day performance engineer is now routinely asked to diagnose client-side sensitivity and vulnerability to performance failure due to slow page loading, high latency and limited bandwidth. And if you've read Steve Souders' books, you know that behind every browser is a whole lotta loading activity going on – activity which obeys a somewhat bizarre set of logical rules, depending on the browser type and version. This hands-on session will help you learn how to investigate client-side performance issues by configuring the browser and a few different tools to make the page render in slow motion. We will cover how to analyze the page rendering sequence to highlight objects or milestones in the rendering that are sensitive to high-latency performance issues. We will cover how to find functionality issues with the page that are caused by slow loading conditions. The techniques learned in this session will help you build core competency in performance root-cause analysis and troubleshooting.
  • 22. Session Block 6: Wednesday, 17 October, 12:00 pm – 1:15 pm

    Speaker Bios:
    Fariba Alim-Marvasti leads an innovative organization driving process improvements across Aetna, with delivery responsibility for testing/quality assurance within the Informatics and Medical Management domains.
    Doug Hoffman is a trainer in strategies for QA with over 30 years of experience. His technical focus is on test automation and test oracles. His management focus is on evaluating, recommending, and leading quality improvement programs.
    Mike Lyles has 19 years of IT experience. His current role comprises test management responsibilities for a major company domain covering Store Systems, Supply Chain, Merchandising and Marketing.
    Scott Moore has over 18 years of IT experience with various platforms and technologies and has tested some of the largest applications and infrastructures in the world.
    Robert Walsh, a proponent of Agile software development processes and techniques, believes strongly in delivering quality solutions that solve real customer problems and provide tangible business value.

    register @ www.STPCon.com or call 877.257.9531

    601 Mike Lyles, QA Manager, Lowe's Companies, Inc.
    Building a Successful Test Center of Excellence – Part 2
    In 2008, Lowe's initiated a Quality Assurance group dedicated to independent testing for IT. This was the first time the 62-year-old company separated development and testing efforts, and Mike Lyles was privileged to be part of this inaugural effort. The purpose of this two-part session is to review a high-level study of the building of a Test Center of Excellence within a mature IT organization – informative for companies of various sizes and stages of Quality Assurance maturity.
    Session Part 2:
    • The Evolution – this session will discuss the many changes an evolving testing team typically goes through, the steps toward maturity in some areas, the processes used to build an onboarding roadmap, and socializing the benefits and value of the organization to other IT and business groups
    • What Worked – we will review what the team learned along the way and also evaluate high-level processes and procedures that were effective from the beginning but evolved as the team grew over the years
    Session Takeaways:
    • Building an effective roadmap for onboarding additional groups/areas
    • Best practices for socializing the testing group with other IT teams and obtaining buy-in
    • Establishing a Quality Board to monitor, govern, and approve changes to the processes and testing methodologies
    • Integrating other groups with testing organizations
    • Spotlighting the need for relevance – growing the testing organization with emerging technologies

    602 Fariba Marvasti, Sr. QA Manager II, Aetna
    Aetna Case Study – Model Office
    The Model Office Program came into existence due to a contractual stipulation of a newly forged Aetna
  • 23.

    partnership with a key vendor partner. The project team recognized that Model Office was not traditional IT testing. This new testing had very little to do with the conventional IT aspect; instead, it was all about business process validation. It was promptly identified that this type of testing was not in the current portfolio of testing services and would therefore require a new methodology and a new team to execute it.
    The newly established Model Office team has successfully executed Model Office testing for several of the Program's work streams since its inception in early 2011. Model Office testing is an innovative focus on business process analysis and a concentration on risk identification. It was recognized that there is obvious benefit in programs comprehending risks and enabling proactive mitigation. As a result, it was determined that Model Office testing would be an available option to all programs across the enterprise. The Model Office process can be described in four phases: Engagement, Assessment, Execution, and Monitor and Review.

    603 Scott Moore, President and CEO, Northway Solutions Group
    Application Performance Test Planning Best Practices
    Many companies experience performance issues in production even after they have tested to endure load. Why? This session will demonstrate the value of performance testing for real-world scenarios and how to properly plan for your application's "perfect storm". How to create and react to chaos before it happens in production will be explored. Attendees will learn tried-and-true guidelines that can be used to plan for realistic peak loads and understand what the worst case looks like before customers do. In addition, we will discuss how cloud architecture and Software-as-a-Service introduce new problems. Attendees will walk away with a repeatable test execution methodology to use on any future performance testing project.

    604 Doug Hoffman, Consultant, Software Quality Methods
    Non-Regression Test Automation – Part 2
    Automated software testing has historically meant having the computer run individually crafted test cases that perform the same exercises as manual tests, only run by a machine. The principal advantages of these automated tests are repeatability and speed. The principal disadvantages are that they are relatively more expensive to create than manual tests, require more maintenance than manual tests, and are more limited in the specificity of things they can compare. Part 2 of this presentation describes oracle mechanisms that enable testers to take advantage of non-regression automation. The oracles determine whether the software's behavior appears to be normal or erroneous, and they allow non-regression tests to vary their behavior and still have predictable, checkable outcomes. This session presents over a dozen different types of oracle mechanisms.
    Session Takeaways:
    • Test oracle mechanisms for automated tests

    605 Robert Walsh, Senior Consultant, Excalibur Solutions, Inc.
    Keeping Up!
    One of the challenges many organizations face when adopting Agile processes is having enough time in each iteration to test what was developed in that iteration. Developers want to work up to the last minute of the iteration to finish stories and maximize business value. Further, as the project grows over time, the scope of the testing effort grows with it. How can an organization ensure it remains true to Agile without sacrificing quality?
    This session will provide some helpful tips to give testers the best chance of keeping pace with development. Additionally, the presenter will discuss several strategies that may be employed if, despite valiant efforts, testing falls behind. While no single approach may provide the silver-bullet solution, attendees will leave the session armed with ideas and concepts for dealing with the problem both proactively and reactively.
    Key Takeaways – You will learn:
    • The challenges testers face in fast-paced Agile/iterative development environments
    • Several strategies to help testers keep pace with development
    • Two specific ways to adapt your processes if you discover you cannot keep pace with development
    • Effective ways to deal with defects found in different phases of Agile testing
  • 24. Session Block 7: Wednesday, 17 October, 3:00 pm – 4:15 pm

    Speaker Bios:
    Wayne Ariola is Vice President of Strategy and Corporate Development at Parasoft, a leading provider of integrated software development management, quality lifecycle management, and dev/test environment management solutions.
    Andreas Grabner has 10 years of experience as an architect and developer in the Java and .NET space. In his current role, Andi works as a Technology Strategist on dynaTrace Software's Methods and Technology team.
    Lee Henson's 12 years of experience span a broad array of software production roles and responsibilities. He is currently one of just over 100 Certified Scrum Trainers (CST) worldwide.
    Matthew Heusser is a consulting software tester and software process naturalist who has spent his entire adult life developing, testing, or managing software projects.
    Jamie Burns started her career in software testing, has been in the same discipline for over 19 years in software testing and quality assurance, and has 10+ years in test automation.

    701 Lee Henson, Chief Agile Enthusiast, AgileDad
    Technical Debt: A Treasure to Discover and Destroy
    Some treasures are worth the plunder, as they yield vast amounts of riches and jewels. Other treasures are often better left untouched. This session is about a treasure that no organization wants to see or face – but the reward for acknowledging and destroying it is vast and bountiful. As we continue to see companies grow and expand, organizations are building VERY strong feature sets atop a crumbling architectural foundation. Learn what the seven deadly sins of technical debt are, and the steps you can take to eliminate them! See firsthand the very obstacles we are trying to avoid, yet keep falling right into. Learn what the tester's role is in helping to make certain we never fall into this environment again. Leave armed with the steps we can take to adjust the rigging and make certain we never sail down the path of endless debt again.

    702 Wayne Ariola, VP of Strategy, Parasoft
    Dependencies Gone Wild: Testing Composite Applications
    The move to SOA and composite applications has undeniably delivered tremendous benefits. However, the associated distribution and reuse create various challenges from the quality perspective. With many dependencies relying on a centralized endpoint to complete their business processes, it becomes essential that this component always work as expected. Additionally, the fragmentation of system or endpoint ownership (or control) shifts how individuals, third parties, and dependent applications must access and evolve their systems. Moreover, ensuring the quality of the system under test is complicated by the following factors:
    • End-to-end tests need to pass through multiple dependent systems, which are commonly unavailable, evolving, or difficult to access for testing
  • 25.

    • Accessing such systems often involves transaction and bandwidth fees
    • Teams need to test and tune the system under test against a realistic and broad range of performance and behavior conditions

    703 Andreas Grabner, Technology Strategist, Compuware dynaTrace
    Top 3 Performance Land Mines and How to Address Them
    As a tester, you do not have the luxury or the time to test repeatedly to try to fix slow, poorly performing applications during load tests. You need to get to the cause of performance problems quickly and fix issues fast and efficiently. There are tools that can help in identifying application issues, but many are not effective for testing in an environment where the boundaries between preproduction and production are increasingly blurred. This session will discuss the top 3 application issues that impact performance and offer load testing best-practice tips for finding and fixing performance problems.
    Attendees will learn how to:
    • Rapidly determine the root cause of problems down to the component, SQL statement, or method level
    • Reduce testing iterations and cycle time
    • Validate architecture for maximum scalability

    704 Jamie Burns, IT Team Lead, Denver, Colorado
    Building Automation from the Bottom Up, Not the Top Down
    Most software automation projects fail. Why? Because testers, managers, and companies think automation tools are magical and that automation is easy to implement – just record and playback, right? Wrong! Too many times, IT projects buy into the 'easily automate' software from the tools vendor only to find they have shelved the tool 6 months down the road. Building an automated test system takes just as much thought, design and planning as building the software application under test itself. Before even thinking about an automation tool, you must have the supporting infrastructure in place for test automation. Not only does the manual test case have to be repeatable, but the whole process needs to be repeatable. One of the biggest causes of automation failure is trying to automate chaos. You won't be successful at automation if you haven't taken the necessary steps to create a repeatable testing process.
    Session Takeaways:
    • What are the goals of testing?
    • Test environment – is it under control, or out of control?
    • Test data – do you know what you have, or is it constantly changing?
    • How have you built your test cases? Are they independent or dependent?
    • Get back to basics: develop a test automation plan and drive automation from the ground up

    705 Matt Heusser, Principal Consultant, Excelon Development
    There Can Be Only One: A Testing Competition – Part 1
    This double session integrates the previous talks and applies them, simulating real software testing. After a quick exercise to create teams of 2-6 testers (or bring your own), you will be plunged into the real world of software testing: fun challenges, delivered under time pressure and conditions of uncertainty. Teams will be judged on coverage, bugs found, the quality of the communication they have with "management" and "the rest of the technical staff", new approaches, and strategy/leadership. Stick around after the competition, because prizes will be awarded on the main stage Thursday morning.
    Bring your laptop, your mobile device, your mind, and your best test ideas to this high-paced race for test excellence. See you on the playing field.
  • 26. Session Block 8: Wednesday, 17 October, 4:30 pm – 5:45 pm

    Speaker Bios:
    Matthew Heusser is a consulting software tester and software process naturalist who has spent his entire adult life developing, testing, or managing software projects.
    Anna Royzman is the test lead in a cross-functional product development team that creates and delivers game-changing software in the financial industry, where "quality" is as important as "time to market."
    Over the past 15 years Todd Schultz has tested many types of software, in many industries. Todd is passionate about professional development, mentoring Quality Assurance Engineers who see QA as a career and not just a job.
    Carl Shaulis has been testing software for 10+ years and has over 15 years of management experience. Carl takes an analytical approach to testing and has the fortitude to embrace change.
    Uday Thonangi has over 10 years of experience testing applications built for client server, AS400 and mobile, and over 2 years of experience testing, setting up processes, building test environments and evaluating tools for mobile applications.

    801 Todd Schultz, Managing Senior Quality Assurance Engineer, Ubermind / Deloitte
    Maintaining Quality in a Period of Explosive Growth – A Case Study
    A 12-year-old software development firm began creating iOS applications with about 20 employees. Less than 2 years later they had 150 employees and were making award-winning iOS and Android applications. In early 2012 they were purchased by one of the Big 4 accounting agencies to be the mobile extension of its worldwide technology consulting business.
    Maturing process in a time of explosive growth is an immense challenge. Since before they launched their first iOS application, the executive leadership sought to hire the best and brightest technologists and turned them loose to traverse the terrain of this fast-advancing, emerging mobile market. Now, as part of a 188,000-person corporation, they continue to challenge the status quo, push the agile process boundaries, and make leading-edge mobile applications.
    This session is a case study of one Quality Assurance department managing explosive growth and becoming the headlights of an agile whole-team.
    Session Takeaways:
    • Software development methods can mature continually in a company of any size
    • Current quality assurance methods are over 30 years old and badly in need of some polish
    • Great benefit can be realized from fairly simple process modification and refocus
    • Quality assurance is often the last discipline to fully embrace agile methods

    802 Carl Shaulis, QA Engineering Manager, HomeAway.com
    Testing in the World of Kanban – The Evolution
    In this session, Carl Shaulis presents the journey of a team who adopted Kanban and went from 6-week release
  • 27.

    cycles down to almost daily. This session will then shift into second gear, illustrating how the team has evolved the practice and how Kanban migrated to other teams.
    Session Takeaways:
    • What is Kanban?
    • How do testers play a role in Kanban?
    • Why are more teams adopting Kanban?
    • How to elevate Kanban to deliver value

    803 Anna Royzman, QA Lead, Liquidnet Holdings, Inc.
    Refocusing Testing Strategy Within the Context of Product Maturity
    Just as "complete preplanning ahead" is not the Agile way of delivering software, the testing strategy within an agile project needs to be fine-tuned to the various stages of product development and maturity. While the team and product leads want to be confident about software quality, testing tasks and activities need to be lean enough to avoid unnecessary time/maintenance hurdles or bottlenecks. We will review testing methods and styles which fit best at different stages of product maturity (hint: the definition of "quality" may adjust for each stage). Session highlights:
    • Strategic ways to use automation, and when manual exploratory testing is a better choice
    • Testing the usability and business assumptions
    • Leveraging exploratory testing as a team activity (when and how)
    • Getting ready for live pilots and sales demos of the real product
    • Testing coverage refinement from customer feedback
    • Risk-based testing
    • Test automation: one tool doesn't 'fit all'
    • Designing a production "safety net" suite of automated tests
    Session Takeaways:
    • Recognizing what "quality" means for various groups of users and stakeholders (Sales, Business, Product, Operations, End Users)
    • Choosing the right testing strategy and solutions based on project needs and stakeholders' goals
    • Discovering quality risks at different stages of product maturity, and fine-tuning testing strategy accordingly (through simulation exercises)
    • Forming intelligent decisions on test tactics and methods applicable within context-specific situations

    804 Uday Thonangi, Software Test Engineer Lead, Progressive Insurance
    Mobile Testing: Tools, Techniques and Target Devices
    The nature of the mobile industry is in constant flux, with new devices, browser and operating system versions being released constantly. Mobile testers have to keep device, browser and OS combinations in mind, along with multiple IDEs and testing tools, to come up with risk assessments and test approaches. Usage of built-in SDK utilities, vendor tools and command-line goodies is explored in this session. Testing tool options for both native and mobile web applications, with ideas on what selection criteria should be used, are showcased with recommendations. The required skill level of mobile application testers is slightly higher. For instance, available testing tools expect access to application source code, with basic knowledge of Java/Objective-C for native app testing; an understanding of ecosystems (Eclipse/Android SDK, Xcode/iOS SDK, etc.); and knowledge of browsers and different rendering engines, with their effect on how pages are rendered, is essential. This session will provide a framework to manage all these dimensions of mobile testing.

    805 Matt Heusser, Principal Consultant, Excelon Development
    There Can Be Only One: A Testing Competition – Part 2
    This double session integrates the previous talks and applies them, simulating real software testing. After a quick exercise to create teams of 2-6 testers (or bring your own), you will be plunged into the real world of software testing: fun challenges, delivered under time pressure and conditions of uncertainty. Teams will be judged on coverage, bugs found, the quality of the communication they have with "management" and "the rest of the technical staff", new approaches, and strategy/leadership. Stick around after the competition, because prizes will be awarded on the main stage Thursday morning.
    Bring your laptop, your mobile device, your mind, and your best test ideas to this high-paced race for test excellence. See you on the playing field.
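Session 803 above lists risk-based testing among its highlights. One common way to put that idea into practice – offered here purely as an illustration, not as material from the session – is a simple scoring model: rate each feature area's likelihood of failure and business impact, then test in descending order of the product. The feature names and ratings below are invented.

```python
def prioritize(tests):
    """Risk-based ordering: score = likelihood of failure x business impact.
    Highest-risk areas get tested first, so a shortened cycle
    still covers what matters most."""
    return sorted(tests, key=lambda t: t["likelihood"] * t["impact"], reverse=True)

# Hypothetical feature areas with 1-5 ratings agreed on by the team.
tests = [
    {"name": "checkout flow",   "likelihood": 4, "impact": 5},
    {"name": "profile page",    "likelihood": 2, "impact": 2},
    {"name": "search results",  "likelihood": 3, "impact": 4},
    {"name": "help/about page", "likelihood": 1, "impact": 1},
]

for t in prioritize(tests):
    print(t["name"], t["likelihood"] * t["impact"])
# checkout flow 20, search results 12, profile page 4, help/about page 1
```

The multiplication is deliberately crude; its value is in forcing the team to state, and revisit, its assumptions about where failures are likely and what they would cost.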
  • 28. Session Block 9: Thursday, 18 October, 10:15 am – 11:30 am

    Speaker Bios:
    Jean Ann Harrison has been in the Software Testing and Quality Assurance field for over 12 years, including 4 years working within a regulatory environment. Her niche is system integration testing, specifically on mobile medical devices.
    Matthew Heusser is a consulting software tester and software process naturalist who has spent his entire adult life developing, testing, or managing software projects.
    Rakesh Ranjan manages a high-performance Test Engineering and QA team for IBM's flagship database product DB2 and has over 16 years of software design, development and test experience.
    Sreekanth Singaraju has more than 12 years of senior technology leadership experience and leads Alliance's QA & Testing organization in developing cutting-edge solutions.
    Mark Tomlinson is a software tester and test engineer. His first test project, in 1992, sought to prevent trains from running into each other – and Mark has metaphorically been preventing "train wrecks" for his customers for the past 20 years.

    901 Rakesh Ranjan, Manager and Software Test Architect, IBM Corporation
    7 Habits of Highly Effective Testers
    As the title of the talk implies, the speaker will describe and propose seven habits of highly effective testers and will present techniques for adopting those habits. A tester must make a paradigm shift in his or her job and adopt a professional attitude toward testing and test engineering processes in general. The session's focal point is a set of proven approaches for obtaining the personal and interpersonal effectiveness necessary to become a highly effective tester.
    Session Takeaways:
    • Learn what is causing the paradigm shift in test engineering
    • Identify bad assumptions and blind spots
    • Develop coverage, confidence and common sense in your everyday job
    • Use mathematics and statistics to measure results and effectiveness
    • Understand the psychology of testing

    902 Matt Heusser, Principal Consultant, Excelon Development
    Advances in Software Testing – A Panel Discussion
    If the pace of development and technology is two to three times as fast, what new advances are happening in software testing? The moderator will poll topics from the audience, then use the K-Card method to gauge the attention of the room, expanding on topics the room would like to discuss and moving on as energy fades. From cloud computing to mobile, Lean software testing to exploratory test automation, crowdsourced testing, privacy and security, this advanced session is sure to provide insights into today's challenges in software testing – and how to solve them. In addition to talking points, each expert will provide a paper summary for all attendees, including pointers and links.
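Session 901 above lists "use mathematics and statistics to measure results and effectiveness" among its takeaways. One widely cited measure of test-phase effectiveness is the Defect Detection Percentage (DDP); the sketch below is a generic illustration of that metric, not material from the talk, and the counts are invented.

```python
def defect_detection_percentage(found_in_test, found_after_release):
    """DDP = defects caught by testing / total defects eventually known.
    A coarse but common measure of how effective a test phase was."""
    total = found_in_test + found_after_release
    if total == 0:
        return None  # no defects known yet; the metric is undefined
    return 100.0 * found_in_test / total

# Example: 90 defects found during testing, 10 escaped to production.
ddp = defect_detection_percentage(90, 10)
print(f"DDP = {ddp:.1f}%")  # DDP = 90.0%
```

Like any single number, DDP lags (escapes take time to surface) and can be gamed, so it is best tracked as a trend rather than a target.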
  • 29.

    Session Takeaways:
    • A summary of cutting-edge issues in software testing, along with cutting-edge solutions
    • A handful of ideas to try on Monday, some with little or no cost
    • A balanced discussion of how those ideas have worked for a variety of experts
    • References for where to go for more

    903 Mark Tomlinson, Performance Guru, Evergreen Consulting, LLC West
    Performance Testing Metrics and Measures
    This session will build an expansive understanding of performance testing metrics and how they can be used to drive quality initiatives upstream and downstream in the application lifecycle. Too often today we see that performance measurements (CPU, disk, memory, network) are left out of early testing automation and pass/fail criteria, when it would be far more cost-effective to find and fix those bottlenecks earlier. Testers who are new to performance testing success criteria will learn new ways to correlate and analyze performance measurements and extrapolate the results in a reliable and repeatable way. Testers who are interested in taking their bug-finding skills to a whole new level will be delighted to expand into performance-related testing investigation, exploration and risk analysis.

    904 Sreekanth Singaraju, VP, QA and Testing Services, Alliance Global Services
    How and Where to Invest Your Testing Automation Budget
    Test automation promises substantial benefits to an organization, including accelerated test cycles, increased coverage, accuracy in reporting and enhanced productivity. The reality, however, is that in many cases automation projects fail before they deliver the projected ROI, or they substantially under-perform on the objectives. This session will cover five proven strategies for a successful testing automation program and how to best allocate your test automation budget.
    Session Takeaways:
    • A practical guide to developing a successful automation strategy, including where to spend your budget dollars and allocate resources
    • Case studies outlining strategies for choosing the following: end-to-end scenarios, automation frameworks, implementation approaches and test data
    • Definitions of metrics and how to capture them to validate that your strategies are working

    905 Jean Ann Harrison, Project Manager, ProjectRealms, Inc.
    Memory, Power and Bandwidth – Oh My! Mobile Testing Beyond the GUI
    Many testers today are asked to test mobile applications. The assumption is to test the mobile application in the same manner as testing a web application or a client/server application. However, the tester of mobile applications must now become aware of hardware and firmware conditions which were not normally a part of previous test case development. Charging the device while the software application is being downloaded, installed, or in use with the GUI engaged are all necessary tests for the mobile application tester. Add to charging the device the heat generated from charging the battery, and another level of tests can include network communication, database searches, and memory management. Understanding the combination of software, firmware and hardware all contained within one complex system is now part of being a software tester of mobile applications.
    Testers will work in teams to formulate heuristic oracles by introducing various hardware conditions to their mobile devices and witnessing different software behaviors. We will use exercises to develop test cases using exploratory testing to further inspire heuristic oracles. The result will generate a series of regression test ideas to apply to any mobile application testing project.
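Session 903 above argues for putting performance measurements into pass/fail criteria early. A recurring point in that discipline is that averages hide the slow outliers users actually feel, which is why percentile criteria are preferred. The sketch below is an illustration of that idea, not material from the session; the response times are simulated stand-ins for real load-test measurements.

```python
import random
import statistics

def percentile(samples, p):
    """Nearest-rank percentile: a simple, repeatable way to summarize
    response times for a pass/fail criterion."""
    ordered = sorted(samples)
    k = max(0, min(len(ordered) - 1, round(p / 100 * len(ordered)) - 1))
    return ordered[k]

rng = random.Random(1)
# Simulated response times (ms): mostly fast, with a small cluster of
# slow outliers -- the shape a real load test often produces.
times = [rng.gauss(120, 15) for _ in range(950)] + [rng.gauss(900, 100) for _ in range(50)]

mean = statistics.fmean(times)
p50 = percentile(times, 50)
p99 = percentile(times, 99)
print(f"mean={mean:.0f}ms  median={p50:.0f}ms  p99={p99:.0f}ms")
# The p99 dwarfs the mean: a criterion stated on the mean alone
# would pass while 5% of users wait many times longer.
```

A pass/fail gate such as "p99 under 500 ms" is both stricter and more honest than "average under 200 ms" against this distribution.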
  • 30. Session Block 10: Thursday, 18 October, 11:45 am – 1:00 pm

    Speaker Bios:
    Scott Barber is Co-Founder of the Workshop on Performance and Reliability, and co-author of Performance Testing Guidance for Web Applications, Beautiful Testing, and Reducing the Cost of Testing.
    Dan Bartow is VP of Product Management at SOASTA, the leader in performance testing from the cloud. Prior to joining SOASTA he was Sr. Engineering Manager at Intuit.
    Richard Kriheli's career spans fifteen years of quality assurance and design on accounts such as Walmart, L'Oreal, Nokia, Vogue, Best Buy and others. He currently heads testing efforts for Nike at R/GA.
    Philip Lew is an Adjunct Professor at Alaska Pacific University and the Project Management College, teaching graduate courses in software engineering, IT project management, and IT governance.
    Joseph Ours draws on 15 years' experience providing executive-level leadership while managing high-profile initiatives, with a demonstrated ability to lead people toward successful delivery.

    1001 Joseph Ours, Director of QA and Testing Services, Cohesion Software
    Redefining the Purpose of Software Testing
    Throughout the history of software testing, the profession has evolved from expectations of just meeting requirements to ensuring fitness for use. Along the way, testers have been saddled with the burdensome expectation of ensuring overall quality in each of these ways. In this presentation we will discuss the history of software testing in order to understand where the industry needs to go. We will talk about the common challenges our stakeholders have in understanding our craft, as well as some of the common negative perceptions of its value. We will address these with a newly defined purpose of software testing, where the emphasis is placed on providing information, not just raw data, to stakeholders – information of value. This transition will transform testing into a service-minded group whose value is transparent and which is ultimately empowered by its stakeholders instead of just tolerated by them.
    Session Takeaways:
    • The evolution of software testing
    • A new definition for the purpose of software testing
    • How to deliver value to stakeholders in order to be empowered by them
    • A better-defined testing role that aligns with business value

    1002 Philip Lew, CEO, XBOSoft
    Evaluating and Improving Usability
    Today's web-based applications (WebApps), containing complex business logic and sometimes critical to operating the business, must now have an increased focus on usability as well as the newer and broader term, user experience. Mobile apps are following their lead and are gaining more and more sophisticated functionality even with limited screen real estate. This has led to both usability and user experience becoming paramount, as there is no up-front investment forcing a user to stay. Even though usability has some formal ISO model definitions,
there are no models or formal definitions for user experience, nor its relation to usability. For usability, ISO provides general guidelines but lacks implementation specifics. UX, on the other hand, does not have any formal standard definition, although some models have been developed regarding its elements. Models and research have been used mostly for the purpose of understanding, rather than evaluating and improving. In this talk, we draw relationships between usability and user experience and discuss measurement and evaluation methods that can be used as the first step toward improvement.

Session Takeaways:
• Understand usability and develop a usability model
• Understand why it is important and how it impacts user acceptance
• A methodology for continuous usability improvement
• Practical measurements
• Learn how one organization conducted usability testing via a case study

1004 Richard Kriheli, Associate Director of Quality Assurance, R/GA
Scaling Gracefully and Testing Responsively

The web isn't just desktop anymore. The web is everywhere, creating a fluid conundrum for the QA professional. As the testing matrix grows and grows into a multiplatform expanse, we must be ready for not just the size of a screen, but also user browsing habits. In this session, we'll explore the problems presented by responsive design, examine the flexible grid and forecast systematic solutions.

Session Takeaways:
• Walk away with an enhanced understanding of the nuances of an increasingly fluid web landscape
• Get an overview of responsive web design and a recommended approach to test beyond the desktop
• Witness the various permutations of the flexible grid applied to a range of platforms

1003 Dan Bartow, Vice President, Product Management, SOASTA
Real World Performance Testing in Production

Online application performance is critical – no one would challenge this statement. Yet the "dirty little secret" in the web world is that the amount of performance testing done on applications is appallingly low. When performance testing is done, it's usually conducted in a test lab. Even with thorough testing in a lab environment, when applications are deployed to production they very frequently topple under the pressure of real-world users. The results from this investment in lab testing are not delivering the answers that leadership needs about web application performance.

Testing in production is an essential component of world-class performance methodologies. However, this approach is not without its own set of challenges, the three most common of which are security, test data in production, and potential live customer impact. In this session you'll learn key elements from the methodology that companies are using to enable production testing and subsequently gain the highest confidence possible in their production application performance and reliability. Come to this session and learn how businesses from the New York Stock Exchange through Netflix have solved these problems to enable ongoing production testing.

1005 Scott Barber, Chief Technologist, PerfTestPlus
Quick, Easy, Useful Performance Testing: No Tools Required

When most people think of performance testing, they think about the hard parts and about the expensive and complicated tools that are required to simulate the activity of thousands of end-users all at the same time, while collecting tens or hundreds of thousands of measurements. In reality, many performance issues can be detected and diagnosed with exactly the tools and knowledge you already have at your disposal, using information obtained from quick, easy and cheap performance tests. In fact, much of the performance-related information that stakeholders need to make good decisions, and that development teams need to dramatically improve system performance, is easily obtainable by the performance-testing layman.

Scott will introduce you to several techniques that the performance-testing layperson can use to speed up and simplify the collection of valuable performance-related information, many of which you can use during the tutorial to test your current website if it's accessible from the classroom.

Session Takeaways:
• What performance testing can be done quickly and easily
• The resources necessary to do it
• The experience of having done it
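To illustrate the kind of "no tools required" check this session describes, here is a minimal sketch using only Python's standard library: repeatedly fetch a page and report best, average and worst response times. This is an illustrative example, not material from the session itself; the URL shown is a placeholder.

```python
# Quick, cheap response-time sampling with only the Python standard
# library -- no commercial load-testing tool required.
import time
import urllib.request


def time_page(url, samples=5):
    """Fetch `url` `samples` times; return (best, average, worst) seconds."""
    timings = []
    for _ in range(samples):
        start = time.perf_counter()
        with urllib.request.urlopen(url) as resp:
            resp.read()  # read the whole body so the full transfer is timed
        timings.append(time.perf_counter() - start)
    return min(timings), sum(timings) / len(timings), max(timings)


# Example usage (placeholder URL):
# best, avg, worst = time_page("http://example.com/")
# print(f"best {best:.3f}s  avg {avg:.3f}s  worst {worst:.3f}s")
```

A check like this won't replace a full load test, but it surfaces gross response-time problems with nothing beyond what a tester already has installed.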
Register now!

Discounts
Early Bird Discount: Register on or before September 7 to receive $400.00 off any full conference package.

Team Discounts
Number of Attendees: Rate Before Early Bird (9/7) / Rate After Early Bird (9/7)
1-2: $1,295 / $1,695
3-5: $1,195 / $1,395
6-9: $1,095 / $1,295
10-14: $995 / $1,195
15+: $895 / $1,095

*Prices above are for access to the Main Conference Package. Please add $400 to your registration if you are registering for the Main Conference Package PLUS 1-Day.
**Team discounts are not combinable with any other discounts/offers. Teams must be from the same company and should be submitted into the online registration system on the same day.

Register at www.STPCon.com or call 877.257.9531

50 Conference Breakout Sessions in 8 Comprehensive Tracks: Leadership Perspectives; Test Automation for Testers; Strategic Business Alignment; Agile Testing; Test Strategy, Process and Design; Mobile Application Testing; Performance Testing; Hands-On Practicals

Mailed from: 1115 Elkton Drive, Suite 301, Colorado Springs, CO 80907 (PRSRT STD, U.S. Postage PAID, SLC UT 84115, Permit #6563)
