Automated Software Testing Magazine, April 2013



An AUTOMATED TESTING INSTITUTE Publication - www.automatedtestinginstitute.com

Automated Software Testing MAGAZINE
April 2013 | $10.95

Navigating Continuous Change and Developer Tools

• A Tester In An Ocean of Developer Tools: Making the transition from waterfall to agile
• Test Automation on Embedded Systems: The unique challenges associated with "embedded" automation
• Automated Testing In Agile Development: When test automation is not your only task
• THE RIGHT TOOL: Building a Mobile Automation Testing Matrix
TestKIT ONDEMAND: If you can't be live, be virtual.

OnDemand Sessions: anywhere, anytime access to testing and test automation sessions.
>>>> Available anywhere you have an internet connection <<<<
>>>> Learn about automation tools, frameworks & techniques <<<<
>>>> Explore mobile, cloud, virtualization, agile, security & management topics <<<<
>>>> Connect & communicate with testing experts <<<<

ONDEMAND.TESTKITCONFERENCE.COM
Automated Software Testing
April 2013, Volume 5, Issue 1

Contents

Continuous Change and Developer Tools
A test automator in a "fast-paced environment" is faced with little time for automation and unclear information about non-standard, non-GUI systems for which they have no comprehensive tool. This issue is dedicated to approaches necessary for successful automation in these types of environments, including close coordination with developers and the use of tools that may not traditionally be used by testers.

Features

A Tester in an Ocean of Developer Tools ... 12
This article describes one team's journey from a waterfall environment to an agile-like environment where testers were more greatly exposed to developer processes and tools. By Michael Albrecht

Overcoming Challenges of Test Automation on Embedded Systems ... 18
This article addresses approaches for effectively adjusting your automation techniques when faced with a non-conventional system such as an embedded system. By David Palm

How Automated Testing Fits Into Agile Software Development ... 28
This article offers a roadmap for test automation implementation when test automation is not your only task. By Bo Roop

Columns & Departments

Editorial ... 4: Continuous Change and Development Tools. The challenge of "fast-paced" environments.
Authors and Events ... 6: Learn about AST authors and upcoming events.
Shifting Trends ... 8: Automation Ups and Downs. Analyze trends from the ATI Honors.
Open Sourcery ... 10: Introducing... New Mobile Source. Mobile tools in the ATI Honors.
Local Chapter News ... 16
I 'b'log to U ... 36: Read featured blog posts from the web.
Go On A Retweet ... 38
Hot Topics in Automation ... 40: The Right Tool For the Job. Building a Mobile Automation Testing Matrix.

The AST Magazine is a companion to the ATI Online Reference.

April 2013 | Automated Software Testing Magazine | 3
Editorial
Navigating Continuous Change and Development Tools
by Dion Johnson

Working in a "fast-paced environment" is often a daunting task, particularly for software quality engineers. There are often no straightforward answers for anything; system requirements are spread out in a loose collection of stories contained in a tool organized largely by build, sprint and/or release. And just when you think you've got things figured out, there is another layer to be peeled back, revealing some fundamental aspects of the system that you, to that point, were never aware of.

The task is even more challenging for a software test automator. Test automators are often faced with non-traditional or non-GUI systems and a lower than normal tolerance for investment compared with their "slower-paced" counterparts - which is scary given the fact that even these so-called "slower-paced" projects typically have a low tolerance for investment themselves. This lower tolerance for investment finds roots in the fact that "fast-paced" projects are often pretty accepting of whatever the finished product is. Given the frequent delivery of updates, little hesitation is given to moving a feature back several releases. As long as the release contains something of value, no one bats an eye at delaying functionality or even the testing of the functionality.

In addition, processes are often left undocumented, which makes it difficult to measure the effectiveness of those processes, thus also making it difficult to make a case for investment in process improvement and tools that don't directly affect the software developers' perception of convenience and efficiency.

A test automator is therefore faced with little time for automation and unclear information about non-standard, non-GUI systems for which they have no comprehensive tool for quickly or effectively dealing with. Test automation implementation in this type of situation relies on a little ingenuity, close coordination with developers and the use of tools that may not traditionally be used by testers. This issue of the magazine focuses on automation under these circumstances.

The first feature, entitled "A Tester in An Ocean of Developer Tools" by Michael Albrecht, describes one team's journey from a waterfall environment to an agile-like environment where testers were more greatly exposed to developer processes and tools. Next, we are plunged even further into an environment that is often less conventional for software testers. Entitled "Overcoming the Unique Challenges of Test Automation on Embedded Systems", this feature written by David Palm addresses how to effectively adjust your automation techniques when faced with a non-conventional system such as an embedded system. Finally, we address adjusting your automation approaches to fit into agile development where multi-tasking is critical. In this article, Bo Roop offers a roadmap for test automation implementation when test automation is not your only task.
5th Annual ATI Automation Honors
Celebrating Excellence in the Discipline of Software Test Automation
Nominations Begin April 8th!
Authors and Events
Who's In This Issue?

Michael Albrecht has been working within the test profession since the mid 90s. He has been a test engineer, test manager, technical project manager and test strategy/architect manager. From late 2002, Michael has been working intensely with test management, test process improvement and test automation architectures. He has also been teaching ways to adapt testing to agile system development methodologies like Scrum.

Darren Madonick is a mobile testing specialist at Keynote Systems. He has six years of experience in the mobile industry, along with a lifetime of experience as a technology enthusiast. As a mobile evangelist, Madonick works closely with Keynote customers to find the right mix of products and services that meet their current needs, as well as future needs for mobile testing and development. He spends many hours both remotely and on site with many customers in various verticals to help plan their mobile testing and development efforts. Madonick's experience in this industry has enabled him to work directly with many Fortune 500 companies on a regular basis, and he helps these organizations make important decisions regarding the future of their mobile enterprise.

David Palm is lead test engineer at Trane, a leading global provider of indoor comfort systems and services and a brand of Ingersoll Rand. After gaining 18 years of experience in embedded software development in the materials testing, automotive, hydraulics, and heating, ventilation and air conditioning industries, Palm transitioned to embedded software testing in 2005. He has extensive experience managing the testing process for complex and interconnected embedded control systems, from concept to production. His greatest expertise is in applying test automation to embedded control systems.

Bo Roop is employed as a senior software quality assurance engineer at the world's largest designer and manufacturer of color measurement systems. He is responsible for testing their retail paint matching systems, which are created using an agile-based software development methodology. Bo helps gather and refine user requirements, prototypes user interfaces, and ultimately performs the software testing of the final product. Testing is a passion of Bo's, and he is involved with local software groups as well as a few online forums. He's an advocate for software that meets the customer's needs and expectations and can frequently be heard trying to redirect teams toward a more customer-centric point of view.

Automated Software Testing
Managing Editor: Dion Johnson
Contributing Editors: Donna Vance, Edward Torrie
Director of Marketing and Events: Christine Johnson
A PUBLICATION OF THE AUTOMATED TESTING INSTITUTE

Contact Us: AST Magazine | ATI Online Reference | ATI and Partner Events

Now Available: TestKIT On Demand Virtual Conference
June 17-18, 2013: ATI Europe TABOK Training
September 23-25: TestKIT 2013 Conference

The Automated Software Testing (AST) Magazine is an Automated Testing Institute (ATI) publication.
TestKIT Conference 2013: Testing & Test Automation Conference
Save the Date! September 23-25, 2013
Crowne Plaza Hotel, Arlington, VA
The KIT is Koming... Back
www.testkitconference.com
Shifting Trends
Automation Ups and Downs
The 4th Annual ATI Automation Honors reveal contemporary tool sentiment

There seems to be no definitive favorite in the Best Open Source Functional Automated Test Tool – Java/Java Toolkits subcategory. FEST, a finalist that made its first appearance this year, has taken the top spot, but it had better watch its back, because there has been a different winner each year since this subcategory was introduced. The jury is still out on whether the community will eventually lock into a long-term favorite, but for now, FEST is the champion.

[The ATI Honors tell us a lot about current and future tool trends]

Much like the Best Open Source Functional Automated Test Tool – Java/Java Toolkits subcategory, the Best Commercial Functional Automated Test Tool – Web subcategory has also experienced its fair share of turnover. This seems largely due to the in-and-out nature of QuickTest Professional (QTP), aka HP Functional Tester. Since this tool seems to only have an eligible release every other year, it has only been named a finalist in the awards every other year. In years that it has been a finalist, it has dominated this and other subcategories. Years that QTP has not been a finalist seem to be open season for multiple other tools to shine. The 2nd Annual awards saw SilkTest win this subcategory in the absence of QTP. This year saw newcomer Automation Anywhere win the subcategory. Congratulations to Automation Anywhere... at least for now.

GorillaLogic is no stranger to victories in the ATI Honors, with its FlexMonkey tool winning first place in the Best Functional Automated Test Tool – Flash/Flex subcategory in both 2010 and 2011, while also being named the runner up in the Best Functional Automated Test Tool – Overall subcategory in 2011, coming in just behind the ever popular Selenium. This organization seems to have truly hit its stride, however, in the 4th Annual awards with FlexMonkey's successor tool, known as MonkeyTalk. MonkeyTalk not only picked up where FlexMonkey left off by winning the Best Functional Automated Test Tool – Flash/Flex subcategory, it also swept all subcategories in the newly added Best Mobile Automated Test Tool category. This included the Android, iOS, and Overall subcategories.

This is the first year that the top performance tool LoadRunner has not been a finalist, due to the fact that it had no eligible release. SilkPerformer was the clear beneficiary of the high-profile absence. SilkPerformer has had a steady ascent to the top over the years, coming in as the Runner Up to LoadRunner in the Best Commercial Performance Automated Test Tool – Overall subcategory in the 2nd Annual and 3rd Annual awards. With LoadRunner out of the picture, SilkPerformer was unrelenting in its quest for number one and it finally achieved the spot this year.

Google lost. How many times do you get to say that? Well, in this year's ATI Honors, this is an accurate statement, as Google lost the crown it held for two years in the Best Open Source Unit Automated Test Tool – C++ subcategory. ATF, a tool that entered the fray as the runner up in this category last year, pulled an upset by beating Google for the number one spot. I'm sure Google is not too concerned by this minor setback (if they are aware of it at all), but our community has spoken and made their voices clear.
Crowdamation: Crowdsourced Test Automation

It will offer the flexibility to use a tool of choice (open source and commercial), have teams operate out of different locations, and address the challenges of different platforms introduced by mobile and other technologies, all while still maintaining and building a cohesive, standards-driven automated test implementation that is meant to last.

"It's OK To Follow the Crowd"
Inquire at
Open Sourcery
Introducing... New Mobile Source
Mobile Open Source Tools that Made Their First Appearance in the ATI Honors

As the testing community gears up for the 5th Annual ATI Automation Honors, let's take a look at the current open source finalists and winners that made their first entry into the Honors during the 4th Annual Awards. Mobile automated test tool categories were added for consideration during the 4th Annual ATI Automation Honors, which brought several new tools to the forefront for recognition in the awards. These tools are highlighted in this article.

MonkeyTalk

While MonkeyTalk has not technically been in the ATI Honors before, it is not totally new to the awards. MonkeyTalk was in the awards under one of its former names: FlexMonkey. Its other former name was FoneMonkey. FoneMonkey and FlexMonkey have now combined to form a tool known as MonkeyTalk: a free and open source, cross-platform, functional testing tool from GorillaLogic that supports test automation for native iOS and Android apps, as well as mobile web and hybrid apps. Its name change was apparently well received, as it claimed the top prize in each of the three mobile test subcategories. (Table 1: Best Open Source Mobile Test Tool Finalists)

Frank

The dog inside of a bun trotted into the ATI Automation Honors as the runner-up in the Best Open Source Mobile Automated Test Tool – iOS subcategory. Frank is a tool for writing structured acceptance tests and requirements using Cucumber and having them execute against an iOS application.

KIF (Keep It Functional)

Ever heard of K.I.S.S., which stands for "Keep It Simple, Stupid"? It is a principle that asserts the power of simplicity in system design. A test tool came into the ATI Honors with a name that may be inferred to follow a similar principle, but instead of just keeping things simple, this framework touts the power of keeping things functional. KIF, which stands for Keep It Functional, is an iOS integration test framework that leverages undocumented iOS APIs for easy automation of iOS apps.

Zucchini

Let's see if we can link Frank, one of the previously mentioned finalists, with our next finalist, Zucchini, through a series of associations. While discussing the Frank automated tool, a tool by the name of Cucumber was mentioned. As you are probably already aware, a cucumber is not only a tool, but the name of a food as well. A food that is often mistaken for a cucumber is a zucchini. Tada! Maybe this association is how our next tool was able to make its first appearance in the ATI Honors. Or maybe it's because the community likes the way this tool uses natural language for interacting with and developing automated tests for iOS-based applications.

Calabash

Most of the mobile tools in the ATI Honors supported either iOS or Android, but not both. Although only nominated in the Android category, Calabash is one of the few tools that supports both mobile platforms. In addition, like a couple of the other tools, this LessPainful-supported tool also supports Cucumber for developing automated scripts.

Robotium

The final mobile tool that entered the ATI Honors, in the Best Open Source Mobile Automated Test Tool – Android subcategory, is Robotium. With Robotium, test automators can write functional, system and acceptance test scenarios for testing multiple Android activities. Robotium was not only a finalist in the Android subcategory, but was also the Runner-up in the Overall subcategory, behind MonkeyTalk.
"If you can't beat them, join them"

A Tester In an Ocean of Developer Tools
by Michael Albrecht

For years we'd been walking in the protected world of waterfall projects, far away from requirement negotiations and acceptance tests, and suddenly awoke to a project with short iterations (two weeks) and in the lap of the customer.
Imagine an organization used to a slow pace, annual deliveries, always with a Graphical User Interface (GUI), and very little test automation. Suddenly that organization is faced with short iterations (two weeks), no GUI and very high performance requirements. This was the situation my test team faced, forcing us to deal with the fact that our testing approaches were no longer going to be effective. As a result, we formed a small, technical group to assess our approaches and identify new quality assurance tactics. This article follows our journey to agility prior to the modern popularity of agile.
The birth of new tactics

The first order of business was to find a way of testing without a GUI. Scary! Due to lack of tools knowledge, the QA people turned to the developers to find useful tools and experience. To be able to proceed, the project created a few simple rules:

1. Sit together
2. Hold daily status meetings
3. Learn some basic programming (same language as the developers)
4. Select tools already used by the developers for unit testing if possible
5. Create semi-automatic and automatic tests
6. Learn from the past
7. Get going!

Customer requirements and test cycles

For the first time in my life I found the customer requirements very detailed, but at the same time flexible. The requirements were created as use cases from an actor's point of view. All requirements started with a picture describing the basic flow, followed by short informative text explaining the flow throughout the system. The requirements were divided into groups:

• Basic flow
• Alternative flows
• Error flows
• XML structure expected from the system

(Figure 1: Meeting Notes)

Short iterations and continuous delivery

As I mentioned, a complete system delivery had to take place every fortnight, but we actually delivered software every week. A typical fortnight cycle is illustrated in Figure 2. Since the project only had one team, the team focus shifted over time during this two-week cycle.

The birth of the technical test automator

The absence of a GUI increased the need for more technical skills to accompany excellent domain knowledge, but finding all that knowledge in one person is very hard. So we discarded the traditional project approach, and got everyone together (both testers and developers). In our team we got two testers with some development skills, and one tester with very good business knowledge. The lack of development skills within the test group was cured by getting expert help from the developers within the team.

Developer automation and tools

As is often the case, an agreement with the customer was made without talking to the test team. The delivery cycles were built strictly upon the developers' delivery capacity. In addition, the development team began relying on APIs and open source tools for implementing a Test Driven Development (TDD) approach to product development. To adjust to the new development practices, the test team identified pertinent tools for low-level test automation, including tools for:

• XML schema validation
• SOAP testing
• "Check in" and nightly test execution

Traditional testers seeking usable tools!

With little time and money, we abandoned our thoughts of acquiring a new tool. The developers had been using simple, open-source tools for some time in other projects, so we just started to use the same tools for our test scenarios.

Performance test at no cost

The company had no earlier experience with performance tests, and the customer now had very precise and demanding transaction time requirements. No environment except for production was sufficient for executing the tests. Running tests in production was not a big issue since we agreed to limit the performance tests to search functions, and not updates. Doing the tests at night limited outside interference. The challenge was in building our own performance tool. Once again we could thank team spirit for solving this. The developers implemented extended logging in databases and APIs, together with a simple GUI to control parameters such as the number of concurrent users, time intervals between executing tests, and sequence order. As testers, we developed functional tests and scenarios that could be run independently and as part of load/performance scenarios. The test cases were both created in Java code and saved as batches in our SOAP test tools. The tricky part came when we wanted to measure transaction times throughout the system during performance test execution. In Excel, we collected data from the database, API and GUI logging, connected each transaction to the loggings, and created a macro in Excel that calculated the duration for each and every one. While we did not spend any money on tools, we did spend a LOT of time. In our situation it was easier to explain (hide) time costs than tool costs.

The conclusions of a success story

Teamwork, teamwork and once again teamwork – combined with continuous improvements and automated test batches – resulted in a successful test effort. Being forced to use the same tools as the developers was a blessing from above, but we probably would've been better off without creating our own performance tool. It was a mess from start to end. I have several other lessons learned; my top three are:

1. Work as a team
2. Use the same tools
3. Involve the customer in continuous feedback loops.

(Figure 2: Two Week Cycle)
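The article lists XML schema validation among the team's low-level automation tools, and "XML structure expected from the system" among the requirement groups. As a rough illustration of that kind of check, here is a minimal Python sketch that verifies a response contains the expected elements. The payload and element names are invented for illustration; a real project would more likely use a full XML Schema validator rather than this lightweight path check.

```python
import xml.etree.ElementTree as ET

# Hypothetical response from the system under test. The payload and
# element names are invented; the article does not show the project's XML.
RESPONSE = """\
<searchResult>
    <status>OK</status>
    <items>
        <item id="1"><name>Alpha</name></item>
        <item id="2"><name>Beta</name></item>
    </items>
</searchResult>
"""

# The structure each response is expected to contain, expressed as
# relative element paths (a lightweight stand-in for XML Schema validation).
EXPECTED_PATHS = ["status", "items", "items/item", "items/item/name"]

def check_structure(xml_text, expected_paths):
    """Parse the response and return the expected paths it is missing."""
    root = ET.fromstring(xml_text)
    return [path for path in expected_paths if root.find(path) is None]

print(check_structure(RESPONSE, EXPECTED_PATHS))  # -> [] (structure matched)
```

A check like this fits naturally into the "check in" and nightly execution batches the team describes, since it needs no GUI at all.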
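The Excel macro described above, which joined database, API and GUI log entries by transaction and computed each transaction's duration, can be sketched in a few lines of scripting. This is a hedged illustration of the idea, not the team's actual tooling; the log rows, layer names and timestamp format are all invented.

```python
from datetime import datetime

# Invented sample rows standing in for the database, API and GUI logs
# described in the article; each row is (transaction_id, timestamp).
START_LOG = [                       # e.g. entry logged when a request arrives
    ("tx-001", "2013-04-01 02:00:00.120"),
    ("tx-002", "2013-04-01 02:00:01.300"),
]
END_LOG = [                         # e.g. entry logged when the response is sent
    ("tx-001", "2013-04-01 02:00:00.870"),
    ("tx-002", "2013-04-01 02:00:01.925"),
]

FMT = "%Y-%m-%d %H:%M:%S.%f"

def transaction_durations(start_log, end_log):
    """Join the two logs on transaction id and return {id: seconds}."""
    starts = {tx: datetime.strptime(ts, FMT) for tx, ts in start_log}
    ends = {tx: datetime.strptime(ts, FMT) for tx, ts in end_log}
    return {tx: (ends[tx] - starts[tx]).total_seconds()
            for tx in starts if tx in ends}

for tx, seconds in sorted(transaction_durations(START_LOG, END_LOG).items()):
    print(f"{tx}: {seconds:.3f} s")
```

The join-on-transaction-id step is the part the author reports doing by hand in Excel; scripting it is one way the "LOT of time" cost might have been reduced.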
Local Chapter News
Latest From the Local Chapters

Training Announcement! ATI Europe is organizing a TABOK training class in Rotterdam from June 17-18! If you are interested in this training please contact us at
ATI Local Chapter Program

• Enhance the awareness of test automation as a discipline that, like other disciplines, requires continuous education and the attainment of a standard set of skills
• Help provide comprehensive, yet readily available resources that will aid people in becoming more knowledgeable and equipped to handle tasks related to testing and test automation
• Offer training and events for participation by people in specific areas around the world

ATI's Local Chapter Program is established to help better facilitate the grassroots, global discussion around test automation. In addition, the chapter program seeks to provide a local base from which the needs of automation practitioners may be met.

Start a Local Chapter Today: Email contact(at) to learn more
ATI - Meeting Local Needs In Test Automation
Challenges of, techniques for, and return on investment from automation of embedded systems

Overcoming the Unique Challenges of Test Automation on Embedded Systems
By David Palm

Test automation on an embedded system presents a unique set of challenges not encountered when automating tests in more conventional computing environments. If these differences are recognized and managed, the benefits of such automation—seen in terms of both expanded test coverage and time savings—can be great.

Speaking broadly, an embedded system is a computer system designed to interface with, and control, some sort of electromechanical device(s). That amalgamation of computing power with an interface to external devices creates special challenges when it comes time to test the system software. Most software shares certain potential anomalies in common: incorrect logic, math, algorithm implementation, program flow (branching, looping, etc.), bad data, data boundary issues, initialization problems, mode switching errors, data sharing, etc. Techniques to discover these software anomalies are well documented in the software testing field. Embedded systems, however, are unique.
Tools, Interfaces and Techniques for Automation on Embedded Systems
What Makes Test Automation on Embedded Systems Unique?

Embedded systems introduce many factors that can result in anomalous system behavior. These factors include:

• Processor loading
• Watchdog servicing
• Power modes (low, standby, sleep, etc.)
• Bad interfacing to external peripherals
• Peripheral loading (e.g. network traffic, user interface requests)
• Signal conditioning anomalies (e.g. filtering)
• Thread priority inversion
• Noise conditions

While it is necessary to address more conventional software defects, it is also necessary to consider these additional factors when creating tests. Otherwise the test coverage will be inadequate and the system will likely ship with an unacceptable number of potentially serious defects.

While most, if not all, of these obstacles can be overcome—given enough time and resources—the fact remains that addressing them requires effort above and beyond what would be required in a more conventional computing system. Test automation on an embedded system requires three things: special tools, a customized interface or test "harness" between the tester and the system under test, and special automation techniques to cover not only common software defects but also those that are unique to embedded systems.

Special Tools

First, choose a tool for test automation on an embedded system. This tool should include provisions to manipulate physical analog and binary inputs and outputs interfaced to the system being tested. And often an embedded system will utilize one or more communications protocols—the automation tool will need to be able to support these as well. This may, in fact, require a separate automation tool.

For example, the tester might evaluate the portions of code that manipulate hardware input/output (I/O) using one automation tool and then utilize a different automation tool to test portions of code that communicate using communications protocols such as BACnet or ModBus. The ideal situation, however, is when a given automation tool can handle all of the system inputs and outputs—whether hard wired or communicated—together.

In a similar vein, if the embedded system includes a user display it may be possible to automate here as well, but frequently this will require a separate tool specifically designed to test user interfaces.

Debugging an automated test script for an embedded system presents many of the same challenges as debugging the system software. So the same kind of tools that the software developers use to manipulate system inputs and view the outputs in real-time will be needed.

An automation tool must also produce manageable and maintainable test artifacts. Test automation is, after all, just software created to test software, so it runs into many of the same maintenance difficulties faced in more conventional software development. Specifically, there are a number of test automation tools that utilize graphical programming languages. These can be extremely useful for rapid prototyping and easy comprehension of specific test steps. But more than one test developer has found that once these graphical programs grow beyond a certain size, the task becomes daunting even for the original developer - let alone somebody else - to understand, modify and extend.

The way inputs and outputs are handled in the test tool should be abstracted from their particular implementation in hardware. A test script should not "care" whether a temperature setpoint, for example, comes from a thermocouple, a thermistor, or an RTD. It should not "care" if a communicated value comes from a BACnet network, a LAN, or the Internet. Otherwise, a change in system implementation will break all of the scripts.

It is also useful for post-test analysis if the tool bundles both the script and the results from a specific run of that test and archives them in a single file. This eliminates any confusion that might arise as the test script is updated or expanded—there will always be a record of exactly what test steps yielded a given set of results.

A number of commercial tools on the market cover some or all of these criteria. But the embedded test automation market has some notable gaps and could be better served with specialized tools.

Special Interfaces

Because embedded systems represent an amalgamation of a computer system with external devices, a complete test automation system will require some sort of interface or "harness" between the automated test tool and the system under test.

Developing this test harness can be both complex and expensive. This time and monetary cost has to be factored into the project in order to get an accurate test schedule and return-on-investment calculation. It is necessary to work closely with both software and hardware engineers to design this test harness, particularly if expertise in those disciplines is insufficient.

Be sure to consider one important, overarching principle before planning and beginning work: an automated test harness should not, if at all possible, require any special "hooks" in the software or any special modifications to the hardware. Both software "hooks" and hardware modifications automatically mean that what is being tested is not the same as what the customer will be using.

Special software "hooks" add overhead and therefore affect the performance of the system under test. They also can result in a Catch-22—if the hooks have to be taken out of the software just before shipment, the software has been changed in a fundamental way while the ability to test it has been lost.

And hardware modifications to facilitate interfacing to an automated test system mean that standard, production hardware cannot be used for tests. This can open the door to shipping the product with subtle defects that appear on production hardware but not on the modified system. They also require spending precious time and money acquiring and modifying hardware for use in the test system. This can become especially burdensome if the hardware itself is going through numerous revisions.

Sometimes software "hooks" and hardware modifications cannot be avoided, and the payoff may be more than sufficient to justify their use—as long as the potential pitfalls are fully understood. But in general, try to avoid these technical compromises.

Here are some more challenges that may be encountered when designing a test harness for embedded automation:

• High voltages and currents in the system require due attention to the safety of both human beings and the system under test.
• The interface to each input or output from the system under test may need to be conditioned in order to interface with the available test hardware. For example, an analog voltage may need to be divided down before it is applied to an analog input, or there may need to be optical isolation on some or all connections between the test harness and the system under test.
• Non-linear sensors such as thermocouples can be notoriously difficult to mimic, especially if a very high degree of accuracy is necessary. Achieving accuracy to ±0.5 ºC over the entire operating range may not be too difficult, but 0.01 ºC is probably going to be very difficult.
• Presenting a system with a simple DC voltage (e.g. 0-10 VDC) or current (e.g. 4-20 mA) is not difficult with off-the-shelf hardware. But presenting it with high voltage, variable resistance, or variable capacitance will be significantly more difficult and will likely require some custom hardware development.
• End-points and extreme values may be difficult to reproduce with the test harness. For example, when using a simple resistor voltage divider to condition an analog output to interface with an analog input, it often is not possible to drive the input all the way to its extremes (especially on the high side) to simulate a "shorted" or "open" input condition.
• Complex and fast communications protocols are a challenge to automate.
• User intervention is often still necessary via key pads, touch screens, etc. You can automate these things, but it may not be cost effective. On the other hand, the user interface may be the only part of an embedded system that can be cost-effectively automated, and this may be well worth doing.
• While the subsystems may be manageable on a case-by-case basis, the ability to service all of the system inputs and outputs simultaneously can require a prohibitive amount of processing power in the automated test tool.
And hardware modifications to output from the system under facilitate interfacing to an automated test may need to be conditioned • While the subsystems may be test system mean that standard, in order to interface with the manageable on a case-by-case production hardware cannot be used available test hardware. For basis, the ability to service all for tests. This can open the door to example, an analog voltage of the system inputs and outputs shipping the product with subtle may need to be divided down simultaneously can require a defects that appear on production before it is applied to an analog prohibitive amount of processing hardware but not on the modified input, or there may need to be power in the automated test tool.April 2013 Automated Software Testing Magazine 21
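The signal-conditioning bullet above can be made concrete with a little divider arithmetic. This is only a sketch: the function name and resistor values are invented for illustration, and a real harness would of course do this conditioning in hardware.

```python
def divider_output(v_in, r_top, r_bottom):
    """Tap voltage of a simple two-resistor divider used to condition
    an analog output down to an acquisition card's input range."""
    return v_in * r_bottom / (r_top + r_bottom)

# A 3k/1k divider scales a 0-10 V output down to 0-2.5 V. Note the
# limitation called out above: the conditioned signal can never exceed
# the scaled maximum, so "shorted" or "open" extremes at the rails
# cannot be simulated through the divider.
print(divider_output(10.0, 3000.0, 1000.0))
```

The same arithmetic, run in reverse inside the test scripts, converts acquired tap voltages back into the engineering units the test cases are written against.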
That last point brings up yet another factor that must be considered when designing a complete embedded test automation system. The automation system will have to run fast enough to sample inputs at a sufficient rate and assert outputs in a timely fashion. What that means varies from system to system.

In the HVAC industry, for example, being able to respond within one second is usually quite sufficient, with many events taking place in the 5 to 10 second range. This makes test automation very feasible. On the other hand, something like an automobile engine controller or a flight guidance system may need to process inputs hundreds or thousands of times per second and assert outputs within milliseconds of detecting a given condition. A test automation system capable of that level of performance may be prohibitively difficult and expensive. But even faced with such a scenario, can useful testing be done at reduced speeds? If so, some automation may still be possible and warranted.

The bottom line is that it is necessary to factor test harness development, fabrication, and testing of the harness itself into the project schedule. It is an added bonus if the test harness is designed to be generic and/or expandable, so that it can be applied to more than one product. This can enhance the long-term return on investment, so watch for these opportunities. And given that there may be technical obstacles that would prevent test automation on the entire system, it may still be worthwhile to automate a portion of a project, provided that the return on the investment of time and effort promises a payoff.

Special Automation Techniques: Some Typical "Gotchas" in Embedded Software Test Automation

Once appropriate automation tools have been selected and the test harness for the embedded system has been built, it is time to create some test scripts. Here again there are a number of special considerations that should be factored into the automation effort on an embedded system.

First, embedded systems can be vulnerable to initialization problems. You can write scripts and have them pass ordinarily, just because some system input is typically sitting at a given value. But if a prior test script left that system input at a non-standard value, suddenly a subsequent script may fail. So the same test on a different test set-up, facility, etc. can fail unexpectedly because a less than comprehensive initialization has been performed.

To address this, try to have a comprehensive initialization sequence that can be called by all test scripts. Make it a matter of policy that this initialization sub-script is called at the start of each script. Yes, people are going to complain that it seems to be a waste of time to execute all of these steps at the start of every single test script. But in the end the time will be well spent, since chasing errant conditions caused by initialization
problems will be avoided.

Managing tolerances is crucial to successful embedded automation. Real-world systems do not lend themselves well to absolutes. It is not useful for a system requirement to say that a system needs to control to a setpoint of 72 °F. It is only useful to say that the control must control to the setpoint plus or minus some tolerance. Automated tests need to be written to handle the tolerances rather than absolutes. Otherwise numerous testing errors will be logged when the real-world system deviates, even slightly, from those absolutes.

Race conditions are caused specifically by timing tolerances. A race condition "is a flaw in an electronic system or process whereby the output or result of the process is unexpectedly and critically dependent on the sequence or timing of other events. The term originates with the idea of two signals racing each other to influence the output first" (http://en.wikipedia.org/wiki/Race_condition). In the case of test automation, it most often manifests itself in a condition in which the test script execution gets to a check point first—perhaps even by just a millisecond—and fails the step because the process it's checking has not caught up. Conversely, the process on the embedded system may have just completed and moved on—so the test script fails to detect the desired process state because it has already moved on.

Fortunately, race conditions are relatively easy to avoid. Tests can use a simple "Wait While" followed by a "Wait For" construct. As long as the timing requirements for the event that's being tested are understood, this combination will not only prevent false errors because of the race condition, but will also verify that the system is working inside of its formal timing requirements.

A very big potential "gotcha" in automated testing on an embedded system occurs when the test is not comprehensive enough to catch an unexpected glitch on a system output that might seem to fall outside of the specific test case. The difficulty is that a given system may have dozens or even hundreds of outputs. It is usually impossible to check the status of all of them in every test step. At the very least, be sure that each test case explicitly checks the status of all known critical values. But on the flip side, so as not to add unnecessary execution overhead, if it's truly a "don't care" then don't include it. Formal script reviews are the solution here—other test engineers, hardware engineers and software developers may identify system outputs that were not considered, but that really should be included in the test case.

To Automate or Not to Automate: Finding the Return on Investment

The first question is whether the embedded system testing should be fully automated. The answer is no, generally not. At the very least, relying completely on automated tests is probably a bad idea. As mentioned above, in a system of any significant complexity there are simply too many inputs and outputs for the tests to be absolutely comprehensive. Many times manual tests run by individuals with significant understanding of the system will catch defects that would have been missed by a more narrowly scripted automated test.

Total reliance on automated testing will generally not result in sufficient coverage. There are aspects to most embedded systems that will defy full coverage through automation without enormous effort. And in certain embedded systems there are human and machine safety considerations—in these cases, although the safety tests can be automated, they should also be run manually so that a human being verifies the safety of the system.
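The scripting patterns described above (a comprehensive initialization sub-script called at the start of every test, tolerance-band checks rather than absolute comparisons, and a "Wait For" construct with a timeout) can be sketched in a few lines. The harness callable, input names, and default values here are illustrative assumptions, not taken from the article.

```python
import time

# Hypothetical default state; a real harness would cover every known input.
DEFAULT_INPUTS = {"supply_voltage": 24.0, "zone_temp": 72.0, "fan_switch": 0}

def initialize_system(set_input):
    """Comprehensive initialization sub-script: drive every known input to
    a known default so no test depends on what a prior script left behind."""
    for name, value in DEFAULT_INPUTS.items():
        set_input(name, value)

def within_tolerance(actual, setpoint, tol):
    """Check a reading against a tolerance band, never an absolute value."""
    return abs(actual - setpoint) <= tol

def wait_for(predicate, timeout_s, poll_s=0.01):
    """'Wait For' with a timeout: poll until the condition holds, or fail
    once the system's formal timing requirement has been exceeded."""
    deadline = time.monotonic() + timeout_s
    while time.monotonic() < deadline:
        if predicate():
            return True
        time.sleep(poll_s)
    return predicate()  # one final check at the deadline
```

A "Wait While" is simply `wait_for` with the predicate negated; pairing the two brackets the event being tested, and a failure of either wait doubles as evidence that the system missed its own timing requirement.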
So how does one decide whether to automate or not to automate? Ultimately this will be determined by calculating the return on investment (ROI) for the automation effort. Remember, first, that almost every obstacle can be overcome: it is purely a function of how much time, money, and effort can be expended for an ROI. The better the metrics available about your automation process, the more information can be provided to management concerning the ROI when automating new systems.

There are many models available to calculate ROI for test automation. Any of these can be applied to test automation on an embedded system. The main difference will be to factor in the time it takes to design, build and troubleshoot the test harness(es).

Here are some good rules of thumb to maximize return on investment from embedded test automation:

• First and foremost, regression is the primary key to ROI. Repetition pays the bills. Automate tests that will be run numerous times over multiple test cycles.

• Intelligent selection of the scope of automation is the secondary key to ROI. Don't bite off more than can be handled (or paid for). The low-hanging fruit would be tests that require large amounts of time to execute and where catastrophic results could result if the software is defective. For example, signal conditioning algorithms such as piece-wise filtering and linearization applied to analog inputs can have bugs at the transition points that are relatively difficult to detect but can throw the input value wildly out of range. It is easy to create a test that sweeps the entire range of analog values in small increments looking for these anomalies. Such a test would be daunting to run manually, is easy to automate, and can catch software defects that could have catastrophic consequences in the embedded system. (But note that, in this example at least, a good code inspection would go pretty far in eliminating the risk of such a software defect.)

• Another way embedded system test automation can have a huge payoff is to reproduce faults that require large numbers of iterations to occur, so many that manual testing would be impractical or impossible. For example, I once worked on a serious field issue that occurred very infrequently and at just a few job sites. The software engineers eventually came up with a set of conditions they thought could reproduce the problem. An automated test was developed to repeatedly present those conditions to the system, and it turned out that on average the error would occur approximately every 300 presentations. The ability to reproduce the error, even that infrequently, enabled the software engineers to craft a fix. The test was then run for thousands of cycles and we were able to calculate, to a statistically exact level of confidence, just how certain we were that the remediation actually fixed the problem. The payoff of the automation was a little difficult to quantify in dollar terms, but the payoff in increased management confidence in the competence of the engineering group was very high.

• Remember that partial automation of a given test may still be worthwhile. Even if an automated test has to stop execution to prompt a user for certain intervention, the test might still provide better coverage, better reporting, better consistency, and be less mind-numbing—and therefore more prone to being run accurately during regression—than a fully manual test.

Conclusion

Test automation on an embedded system presents a unique set of challenges not encountered when automating tests in more conventional computing environments. Test automation on embedded systems requires a unique set of software tools. And since embedded systems involve an amalgamation of hardware and software, a specific tester-to-controller interface is required. Developing this interface can be complicated, challenging, and costly. The test professional must factor in the cost and time needed to create the automation interface, or the testing schedule is incomplete. It is also necessary to be aware of any extensions to the test tool that may be required to provide coverage for parts of the embedded system that the tool does not already support, such as a new communications protocol or hardware I/O type.

Because of the real-time nature of embedded systems, the test professional must also employ specific automation techniques. Being aware of these unique challenges will greatly decrease the time needed to debug automated tests, which will result in successful automation attempts and greater likelihood of management satisfaction.

Test automation on an embedded system can greatly expand the scope of testing and eliminate defects that would have been virtually impossible to identify using manual testing alone. Awareness of the unique challenges posed by embedded systems can help the test professional to decide on an appropriate scope of automation, avoid pitfalls during test development, and deliver a successful product.
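The range-sweep test suggested in the ROI rules of thumb above can be sketched briefly. The setter and getter callables stand in for a real harness, and the ranges and step counts are invented for illustration.

```python
def sweep_analog_input(set_input, read_output, lo, hi, steps, out_lo, out_hi):
    """Step an analog input across its full range in small increments and
    collect every (input, output) pair whose response lands out of band."""
    anomalies = []
    for i in range(steps + 1):
        v = lo + (hi - lo) * i / steps
        set_input(v)
        out = read_output()
        if not (out_lo <= out <= out_hi):
            anomalies.append((v, out))
    return anomalies
```

A sweep of a few thousand increments is trivial for an automated tool but daunting to perform by hand, which is exactly what makes this kind of test good ROI.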
How Automated Testing Fits Into Agile Software Development

By Bo Roop

Being an Automator When Automation Is Not Your Only Task
How does automated software testing fit into the big picture of agile software development? In my case, it was more about the tester than it necessarily was about the testing.

I was the newest member of an existing eXtreme Programming (XP) team. This software team had been working together for a few years, but had just begun its transition into using the agile methodologies. The company I worked for had a standardized testing team that was used as a shared resource among each of the individual software development teams. Its members (re)learned each software package as it came, following the developers at the end of the software development cycle only. There was no early testing integration, and the software quality was suffering because of the waterfall approach.

I was asked to join this new agile team as a tester, but quickly found that my role would be so much more. Once I got up to speed on using the software and understanding our customer's goals, I found that I had a better understanding of the whole system than the developers who were focused on just the small areas for which they were writing code. So I transitioned into a role of product champion and customer advocate. I worried about the whole product and how our customers would use it, like it, and recommend it.

The Beginning

Our development team had a dedicated XP customer in our marketing person, who knew what he wanted the software to do, but he couldn't write a realistic requirement. So I spent many hours each iteration taking his broad requirements and changing them into achievable software tasks for the team to implement.

When we started using him as a customer, we'd receive requirements like, "make the software save faster." This was not truly helpful, nor necessarily achievable. While we could have made it save faster, it still might not have met his desired speed improvement since it was never clearly defined. So I was tasked with morphing that ambiguous requirement into something achievable: "Make the saving of new customer records complete in less than 50 percent of the current rate." We benchmarked results from the existing software to establish a baseline, and then aimed at making the software faster.

Once the new software pieces were implemented, I then performed ad-hoc and exploratory testing on the new builds, and I found bugs. This testing was performed within the two week iteration, and the feedback to the development staff members was almost instantaneous. The developers immediately corrected the problems and moved on to the next task, and I, of course, verified their fixes.

At the beginning of a new iteration, I would spend the first two to three days running the existing automated testing suite to verify we didn't have any regression issues during the previous iteration, and once all the existing tests passed I'd take off automating the new features from the last two weeks. If they didn't all pass, either because of regression issues or changes in the way the code behaved, I'd go through and modify the automated scripts or I would work with the developers to help correct the regression issues.

By sharing the results of the automated test runs with the developers in the early stages of the new iteration, I could get bugs fixed quicker. Those regression issues were knocked out before we built on them and made them unmanageable. We sometimes found incomplete areas of the software that automated testing was able to reveal even though the application hid it from the user. Finding "Todo:" comments in the code always raised a red flag, and set me off on a more in-depth hunt.

Our development team manager also appreciated the prompt feedback, which included metrics like code coverage and pass/fail statistics. Those are the visible items that upper management liked to track, even though we constantly reminded everyone that those were just a few metrics of interest, not all.

Within the first three days of the new iteration I would have automated the last iteration's newly created features, passed the results on to management and the rest of the team, and then begun looking at the new features the developers were working on over the past few days in the current iteration.

We found that running with the automation a week behind gave us a few benefits. The code to be automated had already been manually tested and verified (more on that later), and it was already in a shippable state at the end of the last iteration. Remember that we were following strict XP practices.

Stable software is always easier to automate than software that's in a state of flux. These automation tasks also filled that gap when the new code hadn't been developed yet, and manual testing would simply be repeating tests from the previous iteration.
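The "save faster" refinement described above (the new save completing in less than 50 percent of the benchmarked baseline) can be checked mechanically. This sketch assumes the save operation is callable from test code; the function names are illustrative, not from the team's actual suite.

```python
import time

def average_seconds(operation, runs=5):
    """Benchmark an operation's average wall-clock time over several runs."""
    start = time.perf_counter()
    for _ in range(runs):
        operation()
    return (time.perf_counter() - start) / runs

def meets_save_target(new_avg, baseline_avg):
    """Refined requirement: the new save completes in less than
    50 percent of the benchmarked baseline time."""
    return new_avg < 0.5 * baseline_avg
```

Capturing the baseline once, before the new work starts, is what turns a vague "make it faster" into a pass/fail test result.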
Exploratory & Hostile

After the software automation tasks were completed, I utilized exploratory testing models to find the bugs in the new code. The new features were tested manually before any of the existing code was regression tested again. The developers needed the feedback on newly implemented features while that code was still fresh in their minds. During our daily stand-up meetings, I received information from the developers telling me which areas of the code were in flux or in need of greater attention. We were working together toward a common goal.

I then performed ad-hoc testing as a hostile user. We had customers who were forced to use our software by their managers, so they would try to find problems with our software or create reasons to not use it. Since our software provides feedback on the quality of their work in a production environment, many viewed the software as a threat to their jobs instead of as a tool to make their jobs easier.

It was a difficult task for our team to improve the perception of our software with those customers. So if they were going to try to break the software, I had to try to beat them to the punch. I had to try to find the areas where the problems existed before they did. By using the software in the same fashion as our destructive customers, it became very solid.

Requirements

After a few days of performing manual testing of the new builds, I'd change gears and begin working with our internal XP customer, the marketing guy, to refine his requirements into something usable. I had a number of hurdles to overcome with him as our internal customer: He had no background in software development and didn't understand how engineers implemented new code. While he was great at creating brochures and marketing campaigns, he was not as gifted with time management and interpersonal skills. In his haste to get the new requirements completed, he would rush, meaning we would frequently receive concept-only requirements. We'd hear things like "we need to make the software pretty."

Pretty? Really? So I would get started with him during the second week of the iteration and figure out what his "make it pretty" requirement really meant. He didn't like the way the software looked and he wanted it to be more closely aligned with the Windows operating system. So we changed his requirement into, "make the software use some of the newer Windows look and feel. Stuff like rounded buttons, gradients and transparency." This was better, but it was still very ambiguous. But by removing some of our developer's creative license, the software started looking prettier (in his eyes and ours); and by taking baby steps toward a real requirement, we were at least moving in the right direction.

My goal was to get all of the requirements from the marketing guy translated into developer-speak, and ready to share during the next iteration planning meeting. In the beginning, it took about five to eight days to get everything defined. Toward the end, we were able to knock out the requirements much faster.

New Features/Controls

While we were working to get requirements refined, I would still grab the new builds and manually test the new features. On the project board, anything that was moved into the "ready for test" column was fair game. Sometimes it was complete and ready, and other times it was not. So that was another balancing act I had to learn. Some developers have thicker skin than others, and appreciate the immediate feedback, and others despise being told their code is broken (especially if it's not 100 percent feature-complete). After the next iteration's requirements were delivered, I'd change over to running the regression test suite, and continue my manual testing of new builds. These last few days of the iteration allowed me to test some of the undocumented requirements that needed attention.

Before beginning automation of the new controls and interfaces, I'd take a dry run at learning the new controls in the automated software package. I needed to ensure that controls could be scriptable. They needed to be properly and consistently named in order to keep the automated scripts as readable as possible. Hotkeys and shortcuts needed to be unique, there needed to be stability in the code, and the software needed to be ready to be released at the end of each iteration.

I would also pick a few specialized types of testing to focus in on during the tail end of the iteration. Sometimes I would focus on the ease of use of the overall software, finding out if it was easy to learn and contained clear and useful warning and/or error messages. Other times I would focus on Windows