  • 1. An Automated Testing Institute Publication - www.automatedtestinginstitute.com

Automated Software Testing MAGAZINE
April 2013, $10.95

Navigating Continuous Change and Developer Tools

A Tester In An Ocean of Developer Tools: Making the transition from waterfall to agile
Automated Testing In Agile Development: When automation is not your only task
Test Automation on Embedded Systems: Unique challenges associated with 'embedded' automation
THE RIGHT TOOL: Building a Mobile Automation Testing Matrix
  • 2. TestKIT OnDemand: If you can't be live, be virtual

OnDemand Sessions: anywhere, anytime access to testing and test automation sessions
>>>> Available anywhere you have an internet connection <<<<
>>>> Learn about automation tools, frameworks & techniques <<<<
>>>> Explore mobile, cloud, virtualization, agile, security & management topics <<<<
>>>> Connect & communicate with testing experts <<<<

ONDEMAND.TESTKITCONFERENCE.COM
  • 3. Automated Software Testing, April 2013, Volume 5, Issue 1

Contents

Continuous Change and Developer Tools
A test automator in a "fast-paced environment" is faced with little time for automation and unclear information about non-standard, non-GUI systems for which they have no comprehensive tool. This issue is dedicated to approaches necessary for successful automation in these types of environments, including close coordination with developers and the use of tools that may not traditionally be used by testers.

Features
  • A Tester in an Ocean of Developer Tools (page 12): This article describes one team's journey from a waterfall environment to an agile-like environment where testers were more greatly exposed to developer processes and tools. By Michael Albrecht
  • Overcoming Challenges of Test Automation on Embedded Systems (page 18): This article addresses approaches for effectively adjusting your automation techniques when faced with a non-conventional system such as an embedded system. By David Palm
  • How Automated Testing Fits Into Agile Software Development (page 28): This article offers a roadmap for test automation implementation when test automation is not your only task. By Bo Roop

Columns & Departments
  • Editorial (page 4): Continuous Change and Development Tools. The challenge of "fast-paced" environments.
  • Authors and Events (page 6): Learn about AST authors and upcoming events.
  • Shifting Trends (page 8): Automation Ups and Downs. Analyze trends from the ATI Honors.
  • Open Sourcery (page 10): Introducing... New Mobile Source. Mobile tools in the ATI Honors.
  • Local Chapter News (page 16)
  • I 'b'log to u (page 36): Read featured blog posts from the web.
  • Go On A Retweet (page 38)
  • Hot Topics in Automation (page 40): The Right Tool For the Job. Building a Mobile Automation Testing Matrix.

The AST Magazine is a companion to the ATI Online Reference.
http://www.astmagazine.automatedtestinginstitute.com
  • 4. Editorial

Navigating Continuous Change and Development Tools
by Dion Johnson

Working in a "fast-paced environment" is often a daunting task, particularly for software quality engineers. There are often no straightforward answers for anything; system requirements are spread out in a loose collection of stories contained in a tool organized largely by build, sprint and/or release. And just when you think you've got things figured out, there is another layer to be peeled back, revealing some fundamental aspects of the system that you, to that point, were never aware of.

The task is even more challenging for a software test automator. Test automators are often faced with non-traditional or non-GUI systems and a lower than normal tolerance for investment compared with their "slower-paced" counterparts - which is scary given the fact that even these so-called "slower-paced" projects typically have a low tolerance for investment themselves. This lower tolerance for investment finds roots in the fact that "fast-paced" projects are often pretty accepting of whatever the finished product is. Given the frequent delivery of updates, little hesitation is given to moving a feature back several releases. As long as the release contains something of value, no one bats an eye at delaying functionality or even the testing of the functionality. In addition, processes are often left undocumented, which makes it difficult to measure the effectiveness of those processes, thus also making it difficult to make a case for investment in process improvement and tools that don't directly affect the software developers' perception of convenience and efficiency.

A test automator is therefore faced with little time for automation and unclear information about non-standard, non-GUI systems for which they have no comprehensive tool for quickly or effectively dealing with. Test automation implementation in this type of situation often relies on a little ingenuity, close coordination with developers and the use of tools that may not traditionally be used by testers. This issue of the magazine focuses on automation under these circumstances.

The first feature, entitled "A Tester in An Ocean of Developer Tools" by Michael Albrecht, describes one team's journey from a waterfall environment to an agile-like environment where testers were more greatly exposed to developer processes and tools. Next, we are plunged even further into an environment that is often less conventional for software testers. Entitled "Overcoming the Unique Challenges of Test Automation on Embedded Systems", this feature written by David Palm addresses how to effectively adjust your automation techniques when faced with a non-conventional system such as an embedded system. Finally, we address adjusting your automation approaches to fit into agile development where multi-tasking is critical. In this article, Bo Roop offers a roadmap for test automation implementation when test automation is not your only task.
  • 5. 5th Annual ATI Automation Honors
Celebrating Excellence in the Discipline of Software Test Automation
Nominations Begin April 8th!
www.atihonors.automatedtestinginstitute.com
  • 6. Authors and Events: Who's In This Issue?

Michael Albrecht has been working within the test profession since the mid 90s. He has been a test engineer, test manager, technical project manager and test strategy/architect manager. From late 2002, Michael has been working intensely with test management, test process improvement and test automation architectures. He has also been teaching ways to adapt test to agile system development methodologies like Scrum.

Darren Madonick is a mobile testing specialist at Keynote Systems. He has six years of experience in the mobile industry, along with a lifetime of experience as a technology enthusiast. As a mobile evangelist, Madonick works closely with Keynote customers to find the right mix of products and services that meet their current needs, as well as future needs for mobile testing and development. He spends many hours both remotely and on site with many customers in various verticals to help plan their mobile testing and development efforts. Madonick's experience in this industry has enabled him to work directly with many Fortune 500 companies on a regular basis, and he helps these organizations make important decisions regarding the future of their mobile enterprise.

David Palm is lead test engineer at Trane, a leading global provider of indoor comfort systems and services and a brand of Ingersoll Rand. After gaining 18 years of experience in embedded software development in the materials testing, automotive, hydraulics, and heating, ventilation and air conditioning industries, Palm transitioned to embedded software testing in 2005. He has extensive experience managing the testing process for complex and interconnected embedded control systems, from concept to production. His greatest expertise is in applying test automation to embedded control systems.

Bo Roop is employed as a senior software quality assurance engineer at the world's largest designer and manufacturer of color measurement systems. He is responsible for testing their retail paint matching systems, which are created using an agile-based software development methodology. Bo helps gather and refine user requirements, prototypes user interfaces, and ultimately performs the software testing of the final product. Testing is a passion of Bo's, and he is involved with local software groups as well as a few online forums. He's an advocate for software that meets the customer's needs and expectations and can frequently be heard trying to redirect teams toward a more customer-centric point of view.

Automated Software Testing Magazine
Managing Editor: Dion Johnson
Contributing Editors: Donna Vance, Edward Torrie
Director of Marketing and Events: Christine Johnson
A publication of the Automated Testing Institute

Contact Us
AST Magazine: astmagazine@automatedtestinginstitute.com
ATI Online Reference: contact@automatedtestinginstitute.com

ATI and Partner Events
Now Available: TestKIT On Demand Virtual Conference, http://ondemand.testkitconference.com
June 17-18, 2013: ATI Europe TABOK Training, contact@automatedtestinginstitute.com
September 23-25: TestKIT 2013 Conference, http://www.testkitconference.com

The Automated Software Testing (AST) Magazine is an Automated Testing Institute (ATI) publication. For more information regarding the magazine visit http://www.astmagazine.automatedtestinginstitute.com
  • 7. TestKIT Conference 2013: Testing & Test Automation Conference
Save the Date! September 23 - 25, 2013, Crowne Plaza Hotel, Arlington, VA
The KIT is Koming... Back
www.testkitconference.com
  • 8. Shifting Trends

Automation Ups and Downs
The 4th Annual ATI Automation Honors reveal contemporary tool sentiment

There seems to be no definitive favorite in the Best Open Source Functional Automated Test Tool – Java/Java Toolkits subcategory. FEST, a finalist that made its first appearance this year, has taken the top spot, but it had better watch its back, because there has been a different winner each year since this subcategory was introduced. The jury is still out on whether the community will eventually lock into a long-term favorite, but for now, FEST is the champion.

[The ATI Honors tells us a lot about current and future tool trends]

Much like the Best Open Source Functional Automated Test Tool – Java/Java Toolkits subcategory, the Best Commercial Functional Automated Test Tool – Web subcategory has also experienced its fair share of turnover. This seems largely due to the in-and-out nature of QuickTest Professional (QTP), aka HP Functional Tester. Since this tool seems to only have an eligible release every other year, it has only been named a finalist in the awards every other year. In years that it has been a finalist, it has dominated this and other subcategories. Years that QTP has not been a finalist seem to be open season for multiple other tools to shine. The 2nd Annual awards saw SilkTest win this subcategory in the absence of QTP. This year, it saw newcomer Automation Anywhere win the subcategory. Congratulations to Automation Anywhere… at least for now.

GorillaLogic is no stranger to victories in the ATI Honors, with its FlexMonkey tool winning first place in the Best Functional Automated Test Tool – Flash/Flex subcategory in both 2010 and 2011, while also being named the runner up in the Best Functional Automated Test Tool – Overall subcategory in 2011, coming in just behind the ever popular Selenium. This organization seems to have truly hit its stride, however, in the 4th Annual awards with FlexMonkey's successor tool known as MonkeyTalk. MonkeyTalk not only picked up where FlexMonkey left off by winning the Best Functional Automated Test Tool – Flash/Flex subcategory, it also swept all subcategories in the newly added Best Mobile Automated Test Tool category. This included the Android, iOS, and Overall subcategories.

This is the first year that the top performance tool LoadRunner has not been a finalist, due to the fact that it had no eligible release. SilkPerformer was the clear benefactor of the high profile absence. SilkPerformer has had a steady ascent to the top over the years, coming in as the Runner Up to LoadRunner in the Best Commercial Performance Automated Test Tool – Overall subcategory in the 2nd Annual and 3rd Annual awards. With LoadRunner out of the picture, SilkPerformer was unrelenting in its quest for number one, and it finally achieved the spot this year.

Google lost. How many times do you get to say that? Well, in this year's ATI Honors, this is an accurate statement, as Google lost the crown it held for two years in the Best Open Source Unit Automated Test Tool – C++ subcategory. ATF, a tool that entered the fray as the runner up in this category last year, pulled an upset by beating Google for the number one spot. I'm sure Google is not too concerned by this minor setback (if they are aware of it at all), but our community has spoken and made their voices clear.
  • 9. Crowdamation: Crowdsourced Test Automation

It will offer the flexibility to use a tool of choice (open source and commercial), have teams operate out of different locations, and address the challenges of different platforms introduced by mobile and other technologies, all while still maintaining and building a cohesive, standards-driven automated test implementation that is meant to last.

"It's OK To Follow the Crowd"
Inquire at contact@automatedtestinginstitute.com
  • 10. Open Sourcery

Introducing... New Mobile Source
Mobile Open Source Tools that Made Their First Appearance in the ATI Honors

As the testing community gears up for the 5th Annual ATI Automation Honors, let's take a look at the current open source finalists and winners that made their first entry into the Honors during the 4th Annual Awards. Mobile automated test tool categories were added for consideration during the 4th Annual ATI Automation Honors, which brought several new tools to the forefront for recognition in the awards. These tools are highlighted in this article.

MonkeyTalk
While MonkeyTalk has not technically been in the ATI Honors before, it is not totally new to the awards. MonkeyTalk was in the awards under one of its former names: FlexMonkey. Its other former name was FoneMonkey. FoneMonkey and FlexMonkey have now combined to form a tool known as MonkeyTalk, a free and open source, cross-platform, functional testing tool from GorillaLogic that supports test automation for native iOS and Android apps, as well as mobile web and hybrid apps. Its name change was apparently well received, as it claimed the top prize in each of the three mobile test subcategories.

[Table 1: Best Open Source Mobile Test Tool Finalists]

  • 11. Frank
The dog inside of a bun trotted into the ATI Automation Honors as the runner-up in the Best Open Source Mobile Automated Test Tool – iOS subcategory. Frank is a tool for writing structured acceptance tests and requirements using Cucumber and having them execute against an iOS application.

Zucchini
Let's see if we can link Frank, one of the previously mentioned finalists, with our next finalist, Zucchini, through a series of associations. While discussing the Frank automated tool, a tool by the name of Cucumber was mentioned. As you are probably already aware, a cucumber is not only a tool, but the name of a food as well. A food that is often mistaken for a cucumber is a zucchini. Tada! Maybe this association is how our next tool was able to make its first appearance in the ATI Honors. Or maybe it's because the community likes the way this tool uses natural language for interacting with and developing automated tests for iOS based applications.

Calabash
Most of the mobile tools in the ATI Honors either supported iOS or Android, but not both. Although only nominated in the Android category, Calabash is one of the few tools that supports both mobile platforms. In addition, like a couple of the other tools, this LessPainful-supported tool also supports Cucumber for developing automated scripts.

KIF (Keep It Functional)
Ever heard of K.I.S.S., which stands for "Keep It Simple Stupid"? It is a principle that asserts the power of simplicity in system design. A test tool came into the ATI Honors with a name that may be inferred to follow a similar principle, but instead of just keeping things simple, this framework touts the power of keeping things functional. KIF, which stands for Keep It Functional, is an iOS integration test framework that leverages undocumented iOS APIs for easy automation of iOS apps.

Robotium
The final mobile tool that entered the ATI Honors in the Best Open Source Mobile Automated Test Tool – Android subcategory is Robotium. With Robotium, test automators can write functional, system and acceptance test scenarios for testing multiple Android activities. Robotium was not only a finalist in the Android subcategory, but was also the Runner-up in the Overall subcategory behind MonkeyTalk.
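The column only names the tools, so for readers who have not seen a Robotium-style test, here is a minimal, hypothetical Java sketch of the kind of Android activity test it supports. The activity class, button label and expected text are invented placeholders, and the exact package and API names can vary between Robotium versions.

    import android.test.ActivityInstrumentationTestCase2;
    import com.robotium.solo.Solo;

    // Hypothetical Robotium sketch. MainActivity, the "Save" button label and the
    // confirmation text are placeholders, not names taken from the article.
    public class SaveCustomerTest extends ActivityInstrumentationTestCase2<MainActivity> {

        private Solo solo;

        public SaveCustomerTest() {
            super(MainActivity.class);
        }

        @Override
        protected void setUp() throws Exception {
            super.setUp();
            // Solo drives the user interface of the activity under test.
            solo = new Solo(getInstrumentation(), getActivity());
        }

        public void testSaveShowsConfirmation() {
            solo.enterText(0, "New Customer");              // type into the first edit field
            solo.clickOnButton("Save");                     // tap the button by its label
            assertTrue(solo.waitForText("Customer saved")); // wait for the confirmation text
        }

        @Override
        protected void tearDown() throws Exception {
            solo.finishOpenedActivities();                  // close anything the test opened
            super.tearDown();
        }
    }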
  • 12. "If you can't beat them, join them"

A Tester In an Ocean of Developer Tools
by Michael Albrecht

For years we'd been walking in the protected world of waterfall projects, far away from requirement negotiations and acceptance tests, and suddenly awoke to a project with short iterations (two weeks) and in the lap of the customer.
  • 13. Imagine an organization used to a slow pace, annual deliveries, always with a Graphical User Interface (GUI), and very little test automation. Suddenly that organization is faced with short iterations (two weeks), no GUI and very high performance requirements. This was the situation my test team faced, forcing us to deal with the fact that our testing approaches were no longer going to be effective. As a result, we formed a small, technical group to assess our approaches and identify new quality assurance tactics. This article follows our journey to agility prior to the modern popularity of agile.
  • 14. The birth of new tactics
The first order of business was to find a way of testing without a GUI. Scary! Due to lack of tools knowledge, the QA people turned to the developers to find useful tools and experience. To be able to proceed, the project created a few simple rules:
1. Sit together
2. Hold daily status meetings
3. Learn some basic programming (same language as the developers)
4. Select tools already used by the developers for unit testing if possible
5. Create semi-automatic and automatic tests
6. Learn from the past
7. Get going!

[Figure 1: Meeting Notes]

The birth of the technical test automator
The absence of a GUI increased the need for more technical skills to accompany excellent domain knowledge, but finding all that knowledge in one person is very hard. So we discarded the traditional project approach, and got everyone together (both testers and developers). In our team we got two testers with some development skills, and one tester with very good business knowledge. The lack of development skills within the test group was cured by getting expert help from the developers within the team.

Customer requirements and test cycles
For the first time in my life I found the customer requirements very detailed, but at the same time flexible. The requirements were created as use cases from an actor's point of view. All requirements started with a picture describing the basic flow, followed by short informative text explaining the flow throughout the system. The requirements were divided into groups:
  • Basic flow
  • Alternative flows
  • Error flows
  • XML structure expected from the system

Short iterations and continuous delivery
As I mentioned before, a complete system delivery had to take place every fortnight, but we actually delivered software every week. A typical fortnight cycle is illustrated in Figure 2. Since the project only had one team, the team focus shifted over time during this two-week cycle.

Developer automation and tools
As is often the case, an agreement with the customer was made without talking to the test team. The delivery cycles were built strictly upon the developers' delivery capacity. In addition, the development team began relying on APIs and open source tools for implementing a Test Driven Development (TDD) approach to product development. To adjust to the new development practices, the test team identified pertinent tools for low-level test automation, including tools for the following (a minimal sketch of one such check appears after this list):
  • XML schema validation
  • SOAP testing
  • "Check in" and nightly test execution
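The article does not show the team's test code, so the following is only a hedged Java sketch of the kind of low-level, non-GUI check described: validating a system response against the agreed XML schema using the standard javax.xml.validation API. The file names are placeholders.

    import java.io.File;
    import javax.xml.XMLConstants;
    import javax.xml.transform.stream.StreamSource;
    import javax.xml.validation.Schema;
    import javax.xml.validation.SchemaFactory;
    import javax.xml.validation.Validator;

    // Minimal sketch: validate a system response against the expected XML structure.
    // "contract.xsd" and "response.xml" are hypothetical placeholder file names.
    public class XmlContractCheck {
        public static void main(String[] args) throws Exception {
            SchemaFactory factory =
                    SchemaFactory.newInstance(XMLConstants.W3C_XML_SCHEMA_NS_URI);
            Schema schema = factory.newSchema(new File("contract.xsd"));
            Validator validator = schema.newValidator();
            try {
                // validate() throws a SAXException if the document violates the schema.
                validator.validate(new StreamSource(new File("response.xml")));
                System.out.println("Response matches the expected XML structure.");
            } catch (org.xml.sax.SAXException e) {
                System.out.println("Schema violation: " + e.getMessage());
            }
        }
    }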
  • 15. Traditional testers seeking usable tools!
With little time and money, we abandoned our thoughts of acquiring a new tool. The developers had been using simple, open-source tools for some time in other projects, so we just started to use the same tools for our test scenarios.

Performance test at no cost
The company had no earlier experience with performance tests, and the customer now had very precise and demanding transaction time requirements. No environment except for production was sufficient for executing the tests. Running tests in production was not a big issue since we agreed to limit the performance tests to search functions, and not updates. Doing the tests at night limited outside interference. The challenge was in building our own performance tool. Once again we could thank team spirit for solving this. The developers implemented extended logging in databases and APIs together with a simple GUI to control parameters such as the number of concurrent users, time intervals between executing tests and sequence order. As testers, we developed functional tests and scenarios that could be run independently and as part of load/performance scenarios. The test cases were both created in Java code, as well as saved batches in our SOAP test tools. The tricky part came when we wanted to measure transaction times throughout the system during performance test execution. In Excel, we collected data from the database, API and GUI logging, connected each transaction to the loggings, and created a macro in Excel that calculated the duration for each and every one. While we did not spend any money on tools, we did spend a LOT of time. In our situation it was easier to explain (hide) time than tool costs.

The conclusions of a success story
Teamwork, teamwork and once again teamwork – combined with continuous improvements and automated test batches – resulted in a successful test effort. Being forced to use the same tools as the developers was a blessing from above, but we probably would've been better off without creating our own performance tool. It was a mess from start to end. I have several other lessons learned; my top three are:
1. Work as a team
2. Use the same tools
3. Involve the customer in continuous feedback loops.

[Figure 2: Two Week Cycle]
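The team's home-grown performance tool is not shown in the article, so the following is only a rough Java sketch of the idea it describes: a configurable number of concurrent users repeatedly running a scenario, with transaction durations collected for later analysis. The scenario body and all parameter values are placeholders.

    import java.util.ArrayList;
    import java.util.List;
    import java.util.concurrent.Callable;
    import java.util.concurrent.ExecutorService;
    import java.util.concurrent.Executors;
    import java.util.concurrent.Future;

    // Rough sketch of a home-grown load driver: N concurrent "users" each run a
    // scenario repeatedly and report transaction durations. The scenario body is
    // a placeholder for the real SOAP or functional test.
    public class SimpleLoadDriver {

        static void runScenario() throws Exception {
            // Placeholder: call the system under test here (e.g. a search request).
            Thread.sleep(50);
        }

        public static void main(String[] args) throws Exception {
            int concurrentUsers = 10;        // parameter the article's GUI exposed
            int iterationsPerUser = 20;      // how many times each user repeats the scenario
            long pauseBetweenTestsMs = 100;  // time interval between executions

            ExecutorService pool = Executors.newFixedThreadPool(concurrentUsers);
            List<Future<List<Long>>> results = new ArrayList<>();

            for (int u = 0; u < concurrentUsers; u++) {
                results.add(pool.submit((Callable<List<Long>>) () -> {
                    List<Long> durations = new ArrayList<>();
                    for (int i = 0; i < iterationsPerUser; i++) {
                        long start = System.nanoTime();
                        runScenario();
                        durations.add((System.nanoTime() - start) / 1_000_000); // milliseconds
                        Thread.sleep(pauseBetweenTestsMs);
                    }
                    return durations;
                }));
            }

            long total = 0, count = 0;
            for (Future<List<Long>> f : results) {
                for (long d : f.get()) { total += d; count++; }
            }
            pool.shutdown();
            System.out.println("Average transaction time: " + (total / count) + " ms");
        }
    }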
  • 16. Local Chapter News: Latest From the Local Chapters

Local Chapter Training Announcement
ATI Europe is organizing a TABOK training class in Rotterdam from June 17-18! If you are interested in this training please contact us at training@automatedtestinginstitute.com
  • 17. ATI Local Chapter Program
www.automatedtestinginstitute.com

  • Enhance the awareness of test automation as a discipline that, like other disciplines, requires continuous education and the attainment of a standard set of skills
  • Help provide comprehensive, yet readily available resources that will aid people in becoming more knowledgeable and equipped to handle tasks related to testing and test automation
  • Offer training and events for participation by people in specific areas around the world

ATI's Local Chapter Program is established to help better facilitate the grassroots, global discussion around test automation. In addition, the chapter program seeks to provide a local base from which the needs of automation practitioners may be met.

Start a Local Chapter Today. Email contact(at)automatedtestinginstitute.com to learn more.
ATI - Meeting Local Needs In Test Automation
  • 18. Overcoming the Unique Challenges of Test Automation on Embedded Systems
By David Palm
Challenges of, techniques for, and return on investment from automation of embedded systems

Test automation on an embedded system presents a unique set of challenges not encountered when automating tests in more conventional computing environments. If these differences are recognized and managed, the benefits of such automation—seen in terms of both expanded test coverage and time savings—can be great.

Speaking broadly, an embedded system is a computer system designed to interface with, and control, some sort of electromechanical device(s). That amalgamation of computing power with an interface to external devices creates special challenges when it comes time to test the system software. Most software shares certain potential anomalies in common: incorrect logic, math, algorithm implementation, program flow [branching, looping, etc.], bad data, data boundary issues, initialization problems, mode switching errors, data sharing, etc. Techniques to discover these software anomalies are well documented in the software testing field. Embedded systems, however, are unique.
  • 19. Tools, Interfaces and Techniques for Automation on Embedded Systems
  • 20. What Makes Test Automation on Embedded Systems Unique?

Embedded systems introduce many factors that can result in anomalous system behavior. These factors include:
  • Processor loading
  • Watchdog servicing
  • Power modes (low, standby, sleep, etc.)
  • Bad interfacing to external peripherals
  • Peripheral loading (e.g. network traffic, user interface requests)
  • Signal conditioning anomalies (e.g. filtering)
  • Thread priority inversion
  • Noise conditions

While it is necessary to address more conventional software defects, it is also necessary to consider these additional factors when creating tests. Otherwise the test coverage will be inadequate and the system will likely ship with an unacceptable number of potentially serious defects.

While most, if not all, of these obstacles can be overcome—given enough time and resources—the fact remains that addressing them requires effort above and beyond what would be required in a more conventional computing system. Test automation on an embedded system requires three things: special tools, a customized interface or test "harness" between the tester and the system under test, and special automation techniques to cover not only common software defects but also those that are unique to embedded systems.

Special Tools

First, choose a tool for test automation on an embedded system. This tool should include provisions to manipulate physical analog and binary inputs and outputs interfaced to the system being tested. And often an embedded system will utilize one or more communications protocols—the automation tool will need to be able to support these as well. This may, in fact, require a separate automation tool.

For example, the tester might evaluate the portions of code that manipulate hardware input/output (I/O) using one automation tool and then utilize a different automation tool to test portions of code that communicate using communications protocols such as BACnet or ModBus. The ideal situation, however, is when a given automation tool can handle all of the system inputs and outputs—whether hard wired or communicated—together.

In a similar vein, if the embedded system includes a user display it may be possible to automate here as well, but frequently this will require a separate tool specifically designed to test user interfaces.

Debugging an automated test script for an embedded system presents many of the same challenges as debugging the system software. So the same kind of tools that the software developers use to manipulate system inputs and view the outputs in real-time will be needed.

An automation tool must also produce manageable and maintainable test artifacts. Test automation is, after all, just software created to test software, so it runs into many of the same maintenance difficulties faced in more conventional software development. Specifically, there are a number of test automation tools that utilize graphical programming languages. These can be extremely useful for rapid prototyping and easy comprehension of specific test steps. But more than one test developer has found that once these graphical programs grow beyond a certain size, the task becomes daunting even for the original developer - let alone somebody else - to understand, modify and extend.

The way inputs and outputs are handled in the test tool should be abstracted from their particular implementation in hardware. A test script should not "care" whether a temperature setpoint, for example, comes from a thermocouple, a thermistor, or an RTD. It should not "care" if a communicated value comes from a BACnet network, a LAN, or the Internet. Otherwise, a change in system implementation will break all of the scripts.

It is also useful for post-test analysis if the tool bundles both the script and the results from a specific run of that test and archives them in a single file. This eliminates any confusion that might arise as the test script is updated or expanded—there will always be a record of exactly what test steps yielded a given set of results.

A number of commercial tools on the market cover some or all of these criteria. But the embedded test automation market has some notable gaps and could be better served with specialized tools.
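The abstraction principle above is stated but not shown as code; the following is a hypothetical Java sketch of one way to express it, where a test depends only on an interface and neither knows nor cares which sensor or protocol supplies the value. All type and method names here are invented for illustration.

    // Hypothetical sketch of abstracting test I/O from its hardware implementation.
    // The test script depends only on the interface, not on the sensor or protocol.
    interface TemperatureInput {
        double readCelsius();
    }

    // One possible source: a thermocouple channel on the test harness.
    class ThermocoupleInput implements TemperatureInput {
        public double readCelsius() {
            // Placeholder: read and linearize the thermocouple channel here.
            return 21.5;
        }
    }

    // Another possible source: a value communicated over a building network.
    class NetworkTemperatureInput implements TemperatureInput {
        public double readCelsius() {
            // Placeholder: request the point value over the communications protocol.
            return 21.5;
        }
    }

    class SetpointCheck {
        // The "script" only sees the abstraction; swapping the sensor type or the
        // protocol does not break it.
        static boolean holdsSetpoint(TemperatureInput input, double setpoint, double tolerance) {
            double actual = input.readCelsius();
            return Math.abs(actual - setpoint) <= tolerance;
        }

        public static void main(String[] args) {
            TemperatureInput input = new ThermocoupleInput(); // or NetworkTemperatureInput
            System.out.println("Within tolerance: " + holdsSetpoint(input, 22.0, 1.0));
        }
    }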
  • 21. Special Interfaces

Because embedded systems represent an amalgamation of a computer system with external devices, a complete test automation system will require some sort of interface or "harness" between the automated test tool and the system under test.

Developing this test harness can be both complex and expensive. This time and monetary cost has to be factored into the project in order to get an accurate test schedule and return-on-investment calculation. It is necessary to work closely with both software and hardware engineers to design this test harness, particularly if expertise in those disciplines is insufficient.

Be sure to consider one important, overarching principle before planning and beginning work: an automated test harness should not, if at all possible, require any special "hooks" in the software or any special modifications to the hardware. Both software "hooks" and hardware modifications automatically mean that what is being tested is not the same as what the customer will be using.

Special software "hooks" add overhead and therefore affect the performance of the system under test. They also can result in a Catch 22—if the hooks have to be taken out of the software just before shipment, the software has been changed in a fundamental way while the ability to test it has been lost. And hardware modifications to facilitate interfacing to an automated test system mean that standard, production hardware cannot be used for tests. This can open the door to shipping the product with subtle defects that appear on production hardware but not on the modified system. They also require spending precious time and money acquiring and modifying hardware for use in the test system. This can become especially burdensome if the hardware itself is going through numerous revisions.

Sometimes software "hooks" and hardware modifications cannot be avoided, and the payoff may be more than sufficient to justify their use—as long as the potential pitfalls are fully understood. But in general, try to avoid these technical compromises.

Here are some more challenges that may be encountered when designing a test harness for embedded automation:

  • High voltages and currents in the system require due attention to the safety of both human beings and the system under test.
  • The interface to each input or output from the system under test may need to be conditioned in order to interface with the available test hardware. For example, an analog voltage may need to be divided down before it is applied to an analog input, or there may need to be optical isolation on some or all connections between the test harness and the system under test.
  • Non-linear sensors such as thermocouples can be notoriously difficult to mimic, especially if a very high degree of accuracy is necessary. Achieving accuracy to ± 0.5 ºC over the entire operating range may not be too difficult, but 0.01 ºC is probably going to be very difficult.
  • Presenting a system with a simple DC voltage (e.g. 0-10 VDC) or current (e.g. 4-20 mA) is not difficult with off-the-shelf hardware. But presenting it with high voltage, variable resistance, or variable capacitance will be significantly more difficult and will likely require some custom hardware development.
  • End-points and extreme values may be difficult to reproduce with the test harness. For example, when using a simple resistor voltage divider to condition an analog output to interface with an analog input, it often is not possible to drive the input all the way to its extremes (especially on the high side) to simulate a "shorted" or "open" input condition.
  • Complex and fast communications protocols are a challenge to automate.
  • User intervention is often still necessary via key pads, touch screens, etc. You can automate these things, but it may not be cost effective. On the other hand, the user interface may be the only part of an embedded system that can be cost-effectively automated, and this may be well worth doing.
  • While the subsystems may be manageable on a case-by-case basis, the ability to service all of the system inputs and outputs simultaneously can require a prohibitive amount of processing power in the automated test tool.
  • 24. That last point brings up yet another factor that must be considered when designing a complete embedded test automation system. The automation system will have to run fast enough to sample inputs at a sufficient rate and assert outputs in a timely fashion. What that means varies from system to system.

In the HVAC industry, for example, being able to respond within one second is usually quite sufficient, with many events taking place in the 5 to 10 second range. This makes test automation very feasible. On the other hand, something like an automobile engine controller or a flight guidance system may need to process inputs hundreds or thousands of times per second and assert outputs within milliseconds of detecting a given condition. A test automation system capable of that level of performance may be prohibitively difficult and expensive. But even faced with such a scenario, can useful testing be done at reduced speeds? If so, some automation may still be possible and warranted.

The bottom line is that it is necessary to factor in test harness development, fabrication, and testing of the harness itself into the project schedule. It is an added bonus if the test harness is designed to be generic and/or expandable, so that it can be applied to more than one product. This can enhance the long-term return on investment, so watch for these opportunities.

And given that there may be technical obstacles that would prevent test automation on the entire system, it may still be worthwhile to automate even a portion of a project, provided that the return on the investment of time and effort promises a payoff.

Special Automation Techniques: Some Typical "Gotchas" in Embedded Software Test Automation

Once appropriate automation tools have been selected and designing and building a test harness for the embedded system has been completed, it is time to create some test scripts. Here again there are a number of special considerations that should be factored in to the automation effort on an embedded system.

First, embedded systems can be vulnerable to initialization problems. You can write scripts and have them pass ordinarily, just because some system input is typically sitting at a given value. But if a prior test script left that system input at a non-standard value, suddenly a subsequent script may fail. So the same test on a different test set-up/facility/etc. can fail unexpectedly because a less than comprehensive initialization has been performed.

To address this, try to have a comprehensive initialization sequence that can be called by all test scripts. Make it a matter of policy that this initialization sub-script is called at the start of each script. Yes, people are going to complain that it seems to be a waste of time to execute all of these steps at the start of every single test script. But in the end the time will be well spent, since chasing errant errors and conditions caused by initialization problems will be avoided.
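The article recommends the initialization sub-script as a policy but does not show one; a minimal, hypothetical Java sketch of that policy might look like the following, with each step a placeholder for whatever the system actually needs.

    // Hypothetical sketch: one shared initialization routine that every test script
    // calls first, so no test depends on values left behind by a previous script.
    public class TestInitialization {

        public static void initializeSystemUnderTest() {
            resetDigitalOutputsToDefaults();   // known state for every binary output
            resetAnalogInputsToNominal();      // e.g. sensors forced to mid-range values
            clearCommunicatedOverrides();      // remove any values written over the network
            waitForControllerIdle();           // let the controller settle before testing
        }

        private static void resetDigitalOutputsToDefaults() { /* placeholder */ }
        private static void resetAnalogInputsToNominal()    { /* placeholder */ }
        private static void clearCommunicatedOverrides()    { /* placeholder */ }
        private static void waitForControllerIdle()         { /* placeholder */ }
    }

    // Each test script starts by calling the shared routine, as a matter of policy.
    class HighLimitTripTest {
        public static void main(String[] args) {
            TestInitialization.initializeSystemUnderTest();
            // ...test steps follow from a known, fully initialized state...
        }
    }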
  • 25. Managing tolerances is crucial to successful embedded automation. Real-world systems do not lend themselves well to absolutes. It is not useful for a system requirement to say that a system needs to control to a setpoint of 72 F. It is only useful to say that the control must control to the setpoint plus or minus some tolerance. Automated tests need to be written to handle the tolerances rather than absolutes. Otherwise numerous testing errors will be logged when the real world system deviates, even slightly, from those absolutes.

Race conditions are caused specifically by timing tolerances. A race condition "is a flaw in an electronic system or process whereby the output or result of the process is unexpectedly and critically dependent on the sequence or timing of other events. The term originates with the idea of two signals racing each other to influence the output first" (http://en.wikipedia.org/wiki/Race_condition). In the case of test automation, it most often manifests itself in a condition in which the test script execution gets to a check point first—perhaps even by just a millisecond—and fails the step because the process it's checking has not caught up. Conversely, the process on the embedded system may have just completed and moved on—so the test script fails to detect the desired process state because it has already moved on.

Fortunately, race conditions are relatively easy to avoid. Tests can use a simple "Wait While" followed by a "Wait For" construct. As long as the timing requirements for the event that's being tested are understood, this combination will not only prevent false errors because of the race condition, but will also verify that the system is working inside of its formal timing requirements.
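Neither the tolerance check nor the "Wait While"/"Wait For" construct is shown as code in the article; the following is a hedged Java sketch of how such helpers might be written, with the polling interval, timeouts and sensor reading chosen purely for illustration.

    import java.util.function.DoubleSupplier;

    // Hedged sketch of two helpers the article describes: a tolerance-based check
    // and a "wait while" / "wait for" pair to avoid race conditions.
    public class EmbeddedTestHelpers {

        // Pass or fail on a band around the setpoint instead of an absolute value.
        public static boolean withinTolerance(double actual, double expected, double tolerance) {
            return Math.abs(actual - expected) <= tolerance;
        }

        // Wait while a condition is still true, up to a timeout (the old state persists).
        public static boolean waitWhile(java.util.function.BooleanSupplier condition,
                                        long timeoutMs) throws InterruptedException {
            long deadline = System.currentTimeMillis() + timeoutMs;
            while (condition.getAsBoolean()) {
                if (System.currentTimeMillis() > deadline) return false;
                Thread.sleep(10); // polling interval, illustrative only
            }
            return true;
        }

        // Wait for a condition to become true, up to a timeout (the new state appears).
        public static boolean waitFor(java.util.function.BooleanSupplier condition,
                                      long timeoutMs) throws InterruptedException {
            return waitWhile(() -> !condition.getAsBoolean(), timeoutMs);
        }

        public static void main(String[] args) throws InterruptedException {
            DoubleSupplier roomTemperature = () -> 22.3; // placeholder reading from the harness

            // Command a setpoint change, wait out the old state, wait for the new one
            // within its formal timing requirement, then verify with a tolerance.
            boolean leftOldState = waitWhile(() -> roomTemperature.getAsDouble() > 23.0, 5_000);
            boolean reachedNewState = waitFor(() -> withinTolerance(
                    roomTemperature.getAsDouble(), 22.0, 0.5), 10_000);
            System.out.println("Old state cleared: " + leftOldState
                    + ", new state reached in time: " + reachedNewState);
        }
    }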
And in certain only prevent false errors because of the race condition, but will also verify difficult and embedded systems there are human and machine safety considerations— that the system is working inside of in these cases, although the safety its formal timing requirements. expensive. tests can be automated, they should A very big potential “gotcha” in also be run manually so that a human automated testing on an embedded being verifies the safety of the system.April 2013 www.automatedtestinginstitute.com Automated Software Testing Magazine 25
  • 26. So how does one decide whether to automate or not to automate? Ultimately this will be determined by calculating the return on investment (ROI) for the automation effort. Remember, first, that almost every obstacle can be overcome: it is purely a function of how much time, money, and effort can be expended for an ROI. The better the metrics available about your automation process, the more information can be provided to management concerning the ROI when automating new systems.

There are many models available to calculate ROI for test automation. Any of these can be applied to test automation on an embedded system. The main difference will be to factor in the time it takes to design, build and troubleshoot the test harness(es). It is also necessary to be aware of any extensions to the test tool that may be required to provide coverage for parts of the embedded system that the tool does not already support, such as a new communications protocol or hardware I/O type.

Here are some good rules of thumb to maximize return on investment from embedded test automation:

  • First and foremost, regression is the primary key to ROI. Repetition pays the bills. Automate tests that will be run numerous times over multiple test cycles.
  • Intelligent selection of the scope of automation is the secondary key to ROI. Don't bite off more than can be handled (or paid for). The low-hanging fruit would be tests that require large amounts of time to execute and where catastrophic results could occur if the software is defective. For example, signal conditioning algorithms such as piece-wise filtering and linearization applied to analog inputs can have bugs at the transition points that are relatively difficult to detect but can throw the input value wildly out of range. It is easy to create a test that sweeps the entire range of analog values in small increments looking for these anomalies (a sketch of such a sweep appears after this list). Such a test would be daunting to run manually, is easy to automate, and can catch software defects that could cause catastrophic problems in the embedded system. (But note that, in this example at least, a good code inspection would go pretty far in eliminating the risk of such a software defect.)
  • Another way embedded system test automation can have a huge payoff is to reproduce faults that require large numbers of iterations to occur, so many that manual testing would be impractical or impossible. For example, I once worked on a serious field issue that occurred very infrequently and at just a few job sites. The software engineers eventually came up with a set of conditions they thought could reproduce the problem. An automated test was developed to repeatedly present those conditions to the system, and it turned out that on average the error would occur approximately every 300 presentations. The ability to reproduce the error, even that infrequently, enabled the software engineers to craft a fix. The test was then run for thousands of cycles and we were able to calculate, to a statistically exact level of confidence, just how certain we were that the remediation actually fixed the problem. The payoff of the automation was a little difficult to quantify in dollar terms, but the payoff in increased management confidence in the competence of the engineering group was very high.
  • Remember that partial automation of a given test may still be worthwhile. Even if an automated test has to stop execution to prompt a user for certain intervention, the test might still provide better coverage, better reporting, better consistency, and be less mind-numbing—and therefore more prone to being run accurately during regression—than a fully manual test.
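The range-sweep test mentioned in the list above is easy to picture in code; here is a hedged Java sketch in which AnalogHarness and its two methods stand in for whatever API the real test harness exposes.

    // Hedged sketch of an analog sweep test: drive an input through its full range in
    // small steps and flag any conditioned reading that jumps wildly out of range.
    // AnalogHarness and its methods are hypothetical placeholders, not a real API.
    public class AnalogSweepTest {

        interface AnalogHarness {
            void driveInputVolts(double volts);   // set the raw signal on the test harness
            double readConditionedValue();        // read the value after filtering/linearization
        }

        public static int sweep(AnalogHarness harness,
                                double minVolts, double maxVolts, double stepVolts,
                                double lowestPlausible, double highestPlausible) {
            int anomalies = 0;
            for (double v = minVolts; v <= maxVolts; v += stepVolts) {
                harness.driveInputVolts(v);
                double conditioned = harness.readConditionedValue();
                // A defect in piece-wise filtering or linearization often shows up as a
                // value far outside the physically plausible range near a transition point.
                if (conditioned < lowestPlausible || conditioned > highestPlausible) {
                    anomalies++;
                    System.out.printf("Anomaly at %.3f V: conditioned value %.2f%n", v, conditioned);
                }
            }
            return anomalies;
        }
    }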
The maximize return on investment from automated tests, which will result in ability to reproduce the error, embedded test automation: successful automation attempts and even that infrequently, enabled greater likelihood of management • First and foremost, regression is the software engineers to craft satisfaction. the primary key to ROI. Repetition a fix. The test was then run for pays the bills. Automate tests that thousands of cycles and we were able to calculate, to a statistically Test automation on an embedded will be run numerous times over system can greatly expand the scope multiple test cycles. exact level of confidence, just how certain we were that the of testing and eliminate defects that • Intelligent selection of the scope remediation actually fixed the would have been virtually impossible of automation is the secondary problem. The payoff of the to identify using manual testing alone. key to ROI. Don’t bite off more automation was a little difficult to Awareness of the unique challenges than can be handled (or paid for). quantify in dollar terms, but the posed by embedded systems can help The low-hanging fruit would be payoff in increased management the test professional to decide on an tests that require large amounts confidence in the competence of appropriate scope of automation, of time to execute and where the engineering group was very avoid pitfalls during test development, catastrophic results could result high. and deliver a successful product.26 Automated Software Testing Magazine www.automatedtestinginstitute.com April 2013
  • 27. Training That's Process Focused, Yet Hands On
Software Test Automation Training
www.training.automatedtestinginstitute.com

Public Courses: Software Test Automation Foundations; Automated Test Development & Scripting; Designing an Automated Test Framework; Advanced Automated Test Framework Development; Mobile Application Testing & Tools

Virtual Courses: Automated Test Development & Scripting; Designing an Automated Test Framework; Advanced Automated Test Framework Development; Mobile Application Testing & Tools

Come participate in a set of test automation courses that address both fundamental and advanced concepts from a theoretical and hands-on perspective. These courses focus on topics such as test scripting concepts, automated framework creation, ROI calculations and more. In addition, these courses may be used to prepare for the TABOK Certification exam.

Public and Virtual Training Available
  • 28. How Automated Testing Fits Into Agile Software Development
By Bo Roop
Being an Automator When Automation Is Not Your Only Task
  • 30. How does automated software testing fit into the big picture of agile software development? In my case, it was more about the tester than it necessarily was about the testing.

I was the newest member of an existing eXtreme Programming (XP) team. This software team had been working together for a few years, but had just begun its transition into using the agile methodologies. The company I worked for had a standardized testing team that was used as a shared resource among each of the individual software development teams. Its members (re)learned each software package as it came, but they were following the developers at the end of the software development cycle only. There was no early testing integration, and the software quality was suffering because of the waterfall approach.

I was asked to join this new agile team as a tester, but quickly found that my role would be so much more. Once I got up to speed on using the software and understanding our customer's goals, I found that I had a better understanding of the whole system than the developers, who were focused on just the small areas for which they were writing code. So I transitioned into a role of product champion and customer advocate. I worried about the whole product and how our customers would use it, like it, and recommend it.

The Beginning

Our development team had a dedicated XP customer in our marketing person, who knew what he wanted the software to do, but he couldn't write a realistic requirement. So I spent many hours each iteration taking his broad requirements and changing them into achievable software tasks for the team to implement.

When we started using him as a customer, we'd receive requirements like, "make the software save faster." This was not truly helpful, nor necessarily achievable. While we could have made it save faster, it still might not have met his desired speed improvement since it was never clearly defined. So I was tasked with morphing that ambiguous requirement into something achievable: "Make the saving of new customer records complete in less than 50 percent of the current rate." We benchmarked results from the existing software to establish a baseline, and then aimed at making the software faster.
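The article describes benchmarking the existing save operation and then holding the new build to half the baseline; the sketch below is a hypothetical Java illustration of that kind of timing check, with the baseline value, run count and the save call itself as placeholders.

    // Hypothetical sketch of the benchmarked requirement "saving a new customer record
    // completes in less than 50 percent of the baseline time". The baseline value and
    // the save call are placeholders, not values from the article.
    public class SaveSpeedCheck {

        static void saveNewCustomerRecord() throws Exception {
            // Placeholder: drive the application's save operation here.
            Thread.sleep(120);
        }

        public static void main(String[] args) throws Exception {
            long baselineMs = 300;   // measured once against the previous release
            int runs = 10;           // average several runs to smooth out noise

            long totalMs = 0;
            for (int i = 0; i < runs; i++) {
                long start = System.nanoTime();
                saveNewCustomerRecord();
                totalMs += (System.nanoTime() - start) / 1_000_000;
            }
            long averageMs = totalMs / runs;

            boolean meetsRequirement = averageMs < baselineMs / 2;
            System.out.println("Average save time: " + averageMs + " ms (baseline "
                    + baselineMs + " ms) -> requirement met: " + meetsRequirement);
        }
    }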
Once the new software pieces were implemented, I then performed ad-hoc and exploratory testing on the new builds, and I found bugs. This testing was performed within the two week iteration, and the feedback to the development staff members was almost instantaneous. The developers immediately corrected the problems and moved on to the next task, and I, of course, verified their fixes.

We found that running with the software automation a week behind gave us a few benefits. The code to be automated had already been manually tested and verified (more on that later), and it was already in a shippable state at the end of the last iteration. Remember that we were following strict XP practices. Stable software is always easier to automate than software that's in a state of flux. These automation tasks also filled that gap when the new code hadn't been developed yet, and manual testing would simply be repeating tests from the previous iteration.

At the beginning of a new iteration, I would spend the first two to three days running the existing automated testing suite to verify we didn't have any regression issues during the previous iteration, and once all the existing tests passed I'd take off automating the new features from the last two weeks. If they didn't all pass, either because of regression issues or changes in the way the code behaved, I'd go through and modify the automated scripts or I would work with the developers to help correct the regression issues.

By sharing the results of the automated test runs with the developers in the early stages of the new iteration, I could get bugs fixed quicker. Those regression issues were knocked out before we built on them and made them unmanageable. We sometimes found incomplete areas of the software that automated testing was able to reveal even though the application hid it from the user. Finding "Todo:" comments in the code always raised a red flag, and set me off on a more in-depth hunt.

Our development team manager also appreciated the prompt feedback, which included metrics like code coverage and pass/fail statistics. Those are the visible items that upper management liked to track, even though we constantly reminded everyone that those were just a few metrics of interest, not all.

Within the first three days of the new iteration I would have automated the last iteration's newly created features, passed the results on to management and the rest of the team, and then begun looking at the new features the developers were working on over the past few days in the current iteration.
Exploratory & Hostile

After the software automation tasks were completed, I utilized exploratory testing models to find the bugs in the new code. The new features were tested manually before any of the existing code was regression tested again. The developers needed the feedback on newly implemented features while that code was still fresh in their minds. During our daily stand-up meetings, I received information from the developers telling me which areas of the code were in flux or in need of greater attention. We were working together toward a common goal.

"Once the new software pieces were implemented, I then performed ad-hoc and exploratory testing on the new builds, and I found bugs."

I then performed ad-hoc testing as a hostile user. We had customers who were forced to use our software by their managers, so they would try to find problems with our software or create reasons to not use it. Since our software provides feedback on the quality of their work in a production environment, many viewed the software as a threat to their jobs instead of as a tool to make their jobs easier.

It was a difficult task for our team to improve the perception of our software with those customers. So if they were going to try to break the software, I had to try to beat them to the punch. I had to try to find the areas where the problems existed before they did. By using the software in the same fashion as our destructive customers, it became very solid.

Requirements

After a few days of performing manual testing of the new builds, I'd change gears and begin working with our internal XP customer, the marketing guy, to refine his requirements into something usable. I had a number of hurdles to overcome with him as our internal customer: He had no background in software development and didn't understand how engineers implemented new code. While he was great at creating brochures and marketing campaigns, he was not as gifted with time management and interpersonal skills. In his haste to get the new requirements completed, he would rush, meaning we would frequently receive concept-only requirements. We'd hear things like, "We need to make the software pretty."

Pretty? Really?

So I would get started with him during the second week of the iteration and figure out what his "make it pretty" requirement really meant. He didn't like the way the software looked and he wanted it to be more closely aligned with the Windows operating system. So we changed his requirement into, "Make the software use some of the newer Windows look and feel. Stuff like rounded buttons, gradients and transparency." This was better, but it was still very ambiguous. But by removing some of our developer's creative license, the software started looking prettier (in his eyes and ours); and by taking baby steps toward a real requirement, we were at least moving in the right direction.

My goal was to get all of the requirements from the marketing guy translated into developer-speak, and ready to share during the next iteration planning meeting. In the beginning, it took about five to eight days to get everything defined. Toward the end, we were able to knock out the requirements much faster.

New Features/Controls

While we were working to get requirements refined, I would still grab the new builds and manually test the new features. On the project board, anything that was moved into the "ready for test" column was fair game. Sometimes it was complete and ready, and other times it was not. So that was another balancing act I had to learn. Some developers have thicker skin than others, and appreciate the immediate feedback, and others despise being told their code is broken (especially if it's not 100 percent feature-complete). After the next iteration's requirements were delivered, I'd change over to running the regression test suite, and continue my manual testing of new builds. These last few days of the iteration allowed me to test some of the undocumented requirements that needed attention.

Before beginning automation of the new controls and interfaces, I'd take a dry run at learning the new controls in the automated software package. I needed to ensure that controls could be scriptable. They needed to be properly and consistently named in order to keep the automated scripts as readable as possible. Hotkeys and shortcuts needed to be unique, there needed to be stability in the code, and the software needed to be ready to be released at the end of each iteration.
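The automation package isn't named, so here is a tool-agnostic sketch of that kind of dry-run check: given the names assigned to the new controls (harvested however your tool allows), it flags anything that would hurt script readability. The sample names and the btn/txt/chk prefix convention are illustrative assumptions, not details from the team's project.

    import re
    from collections import Counter

    # Control names harvested from the new build -- illustrative values only.
    NEW_CONTROL_NAMES = ["btnSave", "btnCancel", "txtUserName", "btnSave", ""]

    # Hypothetical team convention: a type prefix followed by a CamelCase name.
    NAME_PATTERN = re.compile(r"^(btn|txt|chk|lst|mnu)[A-Z][A-Za-z0-9]*$")

    def audit_control_names(names):
        """Return the problems that would make automated scripts hard to read."""
        problems = []
        for name, count in Counter(names).items():
            if not name:
                problems.append("unnamed control found")
            elif count > 1:
                problems.append(f"duplicate control name: {name}")
            elif not NAME_PATTERN.match(name):
                problems.append(f"non-standard control name: {name}")
        return problems

    if __name__ == "__main__":
        for problem in audit_control_names(NEW_CONTROL_NAMES):
            print("WARN:", problem)

A quick pass like this at the start of automation is cheap, and it gets the naming feedback to the developers while the new controls are still fresh in their minds.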
I would also pick a few specialized types of testing to focus on during the tail end of the iteration. Sometimes I would focus on the ease of use of the overall software, finding out if it was easy to learn and contained clear and useful warning and/or error messages. Other times I would focus on Windows-specific platform support. While much of my automation was done on one or two platforms, the software had to run on many other computer systems. Windows 2000, XP and Vista, as well as a large list of foreign languages, were all officially supported. It was my job to maintain these test environments, and to test the software on each of the platforms with the latest service packs, running each of the languages.

Now throw different hardware configurations into the mix, and my testing matrix continued to grow. More or less RAM, larger hard drives, filled disk space, no virtual memory, authenticated on a corporate domain or not, missing drive letters: none of these things were ever specifically called out in the testing requirements. They were all sourced from customers' desires or problems reported from the field.
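A matrix like that is easier to keep honest when it is written down and enumerated rather than carried in someone's head. This sketch only illustrates the idea: the operating systems come from the list above, while the language and hardware values are stand-ins for the real lists.

    from itertools import product

    platforms = ["Windows 2000", "Windows XP", "Windows Vista"]
    languages = ["English", "German", "Japanese"]          # stand-ins for the full list
    hardware = ["low RAM", "full disk", "no virtual memory", "on corporate domain"]

    # Full cross product: one row per environment that needs a pass.
    matrix = list(product(platforms, languages, hardware))

    for row, (os_name, language, config) in enumerate(matrix, start=1):
        print(f"{row:3d}. {os_name} / {language} / {config}")

    print(f"\n{len(matrix)} environments in the matrix")

In practice a team would prune the cross product (pairwise selection, risk-based sampling) rather than run every combination, but even a pruned list benefits from being generated instead of remembered.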
Documentation

If these areas were found to be working at the end of the iteration, I'd switch over to worrying about how the documentation team was being integrated into the development team. Could the software be quickly and easily translated? Would it work within the confines of their translation tools? Could we support field translations? Did our documentation staff understand how the new features worked? If not, I would provide them with training. If we weren't ready for documentation and translation, I could test the software's logging capabilities to ensure that our technical support team would be able to support the product in the field.

These items were specifically called out from the XP customer or the software manager as corporate goals, but were not usually given individual story or task cards. We had to just work them in whenever we could.

Conclusion

So to recap, my two-week iterations (10 working days per iteration) would look something like this:

• Days 1-3: Automate the new features that were coded over the past two weeks
• Days 4-5: Perform manual testing of the newly added features as they are completed
• Days 6-7: Refine requirements with marketing
• Days 8-10: Run regression tests. Continue working to strive toward shippable software at the end of the iteration … focusing on the non-defined testing requirements

And then we'd start over again. I loved it as a tester because I had complete knowledge of the requirements (how should the software behave); I was helping write them! The developers loved it because I went through the hassle of refining the vague, ambiguous or incomplete requirements into something useful. Our development team rule was that the software developers didn't have to work on a story or task if it was ambiguous. The pressure to make difficult decisions was on the marketing guy. You want it faster? How much faster? Would you like it two seconds or 30 seconds faster? What if the team can't achieve that speed? Shall we time-box this research to a half day? What's it worth to you? By talking about these issues ahead of time, during the previous iteration, there was time to fill the gaps, make the necessary changes, and get the team useful requirements. It made for some great requirements and very happy team members.

Teamwork

The marketing guy loved it because he got exactly what he wanted with little-to-no arguing with the developers. I had a calmer demeanor with him and could get the answers out of him without bringing in a lot of people ... which meant the developers could continue developing while I went to the meetings.

Finally, our manager loved it because everyone on his team was happy and engaged and the customers were getting great software from us. Win-win-win.
Crowdamation
Crowdsourced Test Automation

It will offer the flexibility to use a tool of choice (open source and commercial), have teams operate out of different locations, address the challenges of different platforms introduced by mobile and other technologies, all while still maintaining and building a cohesive, standards-driven automated test implementation that is meant to last.

"It's OK To Follow the Crowd"

Inquire at contact@automatedtestinginstitute.com
I 'B'Log To U: Latest From the Blogosphere

Automation blogs are one of the greatest sources of up-to-date test automation information, so the Automated Testing Institute has decided to keep you up-to-date with some of the latest blog posts from around the web. Read below for some interesting posts, and keep an eye out, because you never know when your post will be spotlighted.

Blog Name: ATI User Blog Post
Post Date: January 2, 2013
Post Title: Top 5 Things You Should Consider for Test Automation Investments
Author: Sudhir G Patil

Software changes are more frequent and demand stringent Quality parameters which enforce a highly efficient and automated development and quality process. Test Automation for this reason has seen a sea change in its adoption levels in the recent years. Though there are multiple factors that are responsible for delivering successful test automation, the key is selecting the right approach.

Read More at: http://www.automatedtestinginstitute.com/home/index.php?option=com_k2&view=item&id=1158:top-5-things-you-should-consider-for-test-automation-investments&Itemid=231

Blog Name: ATI User Blog Post
Post Date: February 4, 2013
Post Title: Think Your Automation Framework is Better
Author: Patrick Quilter

"You think your framework is better than mine?" This is the result of the 'framework' stage or those that expand on structured programming to build robustness into their programming efforts. This stage produces the best results by modularizing code into reusable functions, components, and parameterizing test data. User-friendliness is an important characteristic so it can be handed off to system analysts and alleviate the amount of expensive programmers.

Read More at: http://www.automatedtestinginstitute.com/home/index.php?option=com_k2&view=item&id=2643:http-wwwquilmontcom-blog&Itemid=231
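To make the "framework stage" described above a little more concrete, here is a small sketch of a step that has been modularized into a reusable function and driven by parameterized test data. It is our illustration, not code from the post, and the data and function names are made up.

    # Test data kept separate from the logic so non-programmers can extend it.
    LOGIN_CASES = [
        {"user": "standard_user", "password": "secret", "should_pass": True},
        {"user": "locked_user",   "password": "secret", "should_pass": False},
    ]

    def login(user, password):
        """Reusable step; stubbed here, it would drive the application under test."""
        return user == "standard_user"   # stand-in for the real driver call

    def test_login_cases():
        for case in LOGIN_CASES:
            result = login(case["user"], case["password"])
            assert result == case["should_pass"], f"unexpected result for {case['user']}"

    if __name__ == "__main__":
        test_login_cases()
        print("all login cases passed")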
Blog Name: MyLoadTest
Post Date: December 29, 2012
Post Title: LoadRunner Password Encoder
Author: Stuart Moncrieff

If you ever need to disguise a password in a VuGen script, you will no doubt have used the lr_decrypt() function. If you have stopped to think for a second or two, you will have realised "encrypting" the password in your script doesn't make it more secure in any meaningful way. Anyone with access to the script can decode the password with a single line of code.

Read More at: http://www.myloadtest.com/

Blog Name: Agile Testing
Post Date: January 29, 2013
Post Title: IT stories from the trenches #1
Author: Grig Gheorghiu

On one of these (production) servers I typed 'ci /etc/passwd' instead of 'vi /etc/passwd'. This had the unfortunate effect of invoking the RCS check-in command line utility ci, which then moved '/etc/passwd' to a file named '/etc/passwd,v'. Instead of trying to get back the passwd file, I panicked and exited the ssh shell. Of course, at this point there was no passwd file, so nobody could log in anymore. Ouch. I had to go to my boss, admit my screw-up ...

Read More at: http://agiletesting.blogspot.com/2013_01_01_archive.html
Go On A Retweet: Paying a Visit To The Microblogs

Microblogging is a form of communication based on the concept of blogging (also known as web logging) that allows subscribers of the microblogging service to broadcast brief messages to other subscribers of the service. The main difference between microblogging and blogging is in the fact that microblog posts are much shorter, with most services restricting messages to about 140 to 200 characters. Popularized by Twitter, there are numerous other microblogging services, including Plurk, Jaiku, Pownce and Tumblr, and the list goes on-and-on. Microblogging is a powerful tool for relaying an assortment of information, a power that has definitely not been lost on the test automation community. Let's retreat into the world of microblogs for a moment and see how automators are using their 140 characters.

Twitter Name: CaitlinBuxton2 | Post Date/Time: Mar 27 | Topic: Dev & Testing
Should Programming Classes be Covering Software Testing, Too? http://lnkd.in/6nnyTx

Twitter Name: sheg80 | Post Date/Time: Mar 12 | Topic: Test Controllers
Atomic test cases are awesome http://tinyurl.com/aw36bre

Twitter Name: testchick | Post Date/Time: Mar 30 | Topic: Continuous Learning
The more I learn about #testing (or anything else really) the more I realise I haven't even scratched the surface #keeplearning

Twitter Name: shubhi_barua | Post Date/Time: Mar 18 | Topic: Test Data Visualization
This is so freaking awesome visualisation of test data coverage. Kind courtesy of @Hexawise at Moolya! pic.twitter.com/CBXWcfmIGL

Twitter Name: KadharMR | Post Date/Time: Feb 22 | Topic: Parameterizing Selenium
Parameterizing Selenium WebDriver Tests using TestNG - A Data Driven Approach http://wp.me/p2RSUo-jx

Twitter Name: onloadtesting | Post Date/Time: Dec 26 | Topic: Load Test with Virtual Users
This demo clip shows step by step how to design a test with different types of virtual users: http://www.youtube.com/watch?v=EjZoXwTAELs
Hot Topics in Automation

The Right Tool for the Job: Building a Mobile Automation Testing Matrix
By Darren Madonick

For Desktop-based testing it's a no-brainer: use object-based scripting to maximize reuse across platforms and browsers. In today's mobile world it really isn't that simple. There are many different platforms, OS versions, form factors and carrier/manufacturer customizations. Multiply that by mobile web, native app, or some hybrid in-between and you've got yourself a healthy testing matrix, a daunting task for even the most skilled Automation Engineer.

In order to tackle this problem, an Automation Engineer cannot simply look at it from a "one size fits all" perspective and create a set of objects to re-use across all combinations of platforms. For example, there are fundamental differences in how an app behaves on iOS and Android; even something as basic as a "back button" has its quirks. Although these fundamental differences can be grouped together as a step or action, they are unique enough that you cannot simply share an object between the two OS's.

In some cases with mobile testing you may be able to get to the object level; however, this usually requires that you instrument your app or test on an emulator. While this fulfills a piece of your testing matrix, you will probably need to seek out a couple of tools to get this done across all platforms. In other cases, the content you are testing might be HTML-based and you can test by WebKit profiling. Again, part of your testing matrix is fulfilled, but you aren't quite there.

This may be enough to satisfy a short-term goal, but at some point you need to be testing on real mobile devices. In order to truly automate on mobile, your mobile testing "Utility Belt" needs to be designed in a way that allows for testing by object when possible and by element when possible, and also lets you quickly fall back on text or image verification in order to satisfy all areas of your testing matrix and assure the highest quality of your mobile product.

Having the flexibility to choose how to get the testing done is paramount, since as an Automation Engineer you very rarely have a say in how a particular app or mobile web site is developed. The job requires you to sometimes understand functionality without necessarily being privy to the construction, and there is always a tight timeline to achieve results. The right tool for the job is a tool that takes all of this into consideration and provides a platform to consolidate all of these different types of testing approaches.

The first step is to determine the type of app you are testing. Is it fully native, fully web, or somewhere in between?

• If it's fully native, you may be able to get to some objects on a per-platform basis, but you will probably be falling back on text and image based verification, especially if you are trying to go cross-platform.
• If it's fully web, a lot of testing can be done up front in a WebKit profiler. When it comes to real devices, element-based testing can be done cross-platform if you are willing to instrument, or you can fall back on text and image verification.
• If it's somewhere in between, you'll need to mix and match.

The second step is to find which pieces or steps of your test cases are reusable between each other and can accept parameterization to fulfill the task. For instance, automating the selection of an item or link on the main screen of your app or landing page: maximize reuse by engineering a parameter to accept different values, and reuse the step across each test case. Although you may need to individually determine what type of verification you will use to achieve this on a per-platform or per-device level, you will save time in the long run when you write additional test cases.
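As a sketch of that second step, the function below captures "select an item on the landing page" once, takes the item as a parameter, and takes the verification style as a parameter too, since that may differ by platform. The driver wrapper and its method names are hypothetical stand-ins, not the API of any particular tool.

    class StubDriver:
        """Stand-in for whatever tool drives the device; a real project would wrap it."""
        def tap_element(self, label): print(f"[element] tapped {label}")
        def tap_text(self, label): print(f"[text]    tapped {label}")
        def tap_image(self, path): print(f"[image]   tapped {path}")

    def select_menu_item(driver, item_label, verify="element"):
        """Reusable step: select an item on the main screen / landing page."""
        if verify == "element":
            driver.tap_element(item_label)                  # instrumented app or web content
        elif verify == "text":
            driver.tap_text(item_label)                     # text-recognition fallback
        else:
            driver.tap_image(f"images/{item_label}.png")    # image-based last resort

    if __name__ == "__main__":
        # The same step reused across test cases; only the parameters change.
        select_menu_item(StubDriver(), "Settings", verify="element")
        select_menu_item(StubDriver(), "Settings", verify="text")

The point is the shape, not the tool: one step, one parameter for the value, one parameter for how that platform is best verified.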
The third step is to then group those pieces or steps together by device screens or pages. This way, as you write the test cases you have an organizational structure that makes it easy to identify where you are within the app or site and where you need to navigate to next.

Following these steps will provide a structure that can be grown to accommodate new features within an app or new sections within a mobile web site. As mobile devices become easier to automate against, this structure can easily adapt to emerging technologies that allow for greater reuse across platforms.
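A minimal sketch of that screen-by-screen grouping, again with hypothetical names rather than a specific tool's API:

    class StubDriver:
        """Stand-in for the device driver (hypothetical API)."""
        def tap(self, label): self.last = label
        def read_title(self): return self.last

    class LandingScreen:
        """Groups the reusable steps that belong to the app's landing page."""
        def __init__(self, driver):
            self.driver = driver

        def open_item(self, label):
            self.driver.tap(label)          # element, text, or image under the hood
            return DetailScreen(self.driver)

    class DetailScreen:
        """Steps that are only meaningful once an item has been opened."""
        def __init__(self, driver):
            self.driver = driver

        def title_is(self, expected):
            return self.driver.read_title() == expected

    if __name__ == "__main__":
        # A test case then reads as navigation: where you are, where you go next.
        detail = LandingScreen(StubDriver()).open_item("Settings")
        assert detail.title_is("Settings")
        print("navigation sketch passed")

Each new feature or new section of the site becomes another screen class, which is exactly the kind of growth this structure is meant to absorb.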
Are You Contributing Content Yet?

The Automated Testing Institute relies heavily on the automated testing community in order to deliver up-to-date and relevant content. That's why we've made it even easier for you to contribute content directly to the ATI Online Reference! Register and let your voice be heard today! As a registered user you can submit content directly to the site, providing you with content control and the ability to network with like-minded individuals.

>> Community Comments Box - This comments box, available on the home page of the site, provides an opportunity for users to post micro comments in real time.

>> Announcements & Blog Posts - If you have interesting tool announcements, or you have a concept that you'd like to blog about, submit a post directly to the ATI Online Reference today. At ATI, you have a community of individuals who would love to hear what you have to say. Your site profile will include a list of your submitted articles.

>> Automation Events - Do you know about a cool automated testing meetup, webinar or conference? Let the rest of us know about it by posting it on the ATI site. Add the date, time and venue so people will know where to go and when to be there.

Learn more today at http://www.about.automatedtestinginstitute.com
http://www.googleautomation.com
Training That's Process Focused, Yet Hands On

Software Test Automation Training
www.training.automatedtestinginstitute.com

Public Courses
Software Test Automation Foundations
Automated Test Development & Scripting
Designing an Automated Test Framework
Advanced Automated Test Framework Development
Mobile Application Testing & Tools

Virtual Courses
Automated Test Development & Scripting
Designing an Automated Test Framework
Advanced Automated Test Framework Development
Mobile Application Testing & Tools

Come participate in a set of test automation courses that address both fundamental and advanced concepts from a theoretical and hands-on perspective. These courses focus on topics such as test scripting concepts, automated framework creation, ROI calculations and more. In addition, these courses may be used to prepare for the TABOK Certification exam.

Public and Virtual Training Available