Presentation - CFUnited 2010: A ColdFusion, Flex
Presentation Transcript

  • Unit and Functional Testing Your Flex Applications
    Mike Nimer, Dir. of Engineering, nomee.com
  • The goal(s) of testing
      • Isolate each part of the program and show that the individual parts are correct.
      • Provide a strict, written contract that the piece of code must satisfy.
      • Find problems early in the development cycle.
      • Unit testing allows the programmer to refactor code at a later date and make sure the module still works correctly.
      • -- Wikipedia
  • Types of Testing
    • Manual
      • Humans create test scripts: step-by-step instructions (e.g., a Word doc).
      • Humans execute those tests.
      • Low cost to begin; high cost over the life of a project.
      • Testing generally happens less often.
    • Automated
      • Humans create test methods; machines execute those tests.
      • Higher cost to develop; much lower cost over the life of a project.
      • Testing can happen continually, on every check-in.
  • Automated Testing
    • Testing as part of development
      • As developers create code, they create tests.
      • Test-Driven Development: tests first, code second.
      • Tests are continually executed by the developer to ensure new code does not break existing features.
    • Automated build/test environments
      • Part of “Continuous Integration”.
      • Code is checked out and built automatically.
      • Code is continually tested with your tests, and reports are generated.
      • Automated QA ensures a last-minute check-in does not break a release.
  • Types of Automated Testing
    • Unit
      • Tests the smallest unit of code possible, usually a single method or function.
      • Multiple tests for every path through a method.
      • Usually created by the developer.
    • Integration / Functional
      • Tests a combination of units.
      • Can be used to test UI components.
      • Confirms that individual pieces of code work together.
      • Created by the developer or QA.
  • Tools for Unit Testing
    • Unit Testing
    • (before)
      • FlexUnit - The original Flex/ActionScript 3.0 unit testing framework.
      • Fluint – Written for better asynchronous support and support for testing UI components.
    • (today)
      • FlexUnit 4 – A merge of FlexUnit 3 and Fluint, providing better asynchronous testing support, support for UIComponent testing, and IDE integration.
  • Tools for Automated Testing
    • These products all use the Flex Automation API to monitor, manipulate, and control the Flex application.
    • FlexMonkey
    • TestComplete
    • Selenium (with SeleniumFlex project)
    • Mercury QuickTest Pro 9.1 (QTP)
    • Borland SilkTest
    • IBM Rational Functional Tester
  • Testing Terminology
    • Test Method - A test method is the smallest unit of the testing process. A test method executes code and checks the outcome.
    • At the end of the test method the developer generally makes an “assertion”, stating the expected outcome of the executed code.
    • Test Case - A test case is a collection of test methods that share a common test environment, also referred to as a test fixture.
    • Test Suite - A test suite is a collection of test cases.
    • Test Runner - A test runner is an application that executes the test methods in your suites and cases.
  • Understanding Assertions
    • assertEquals() —Accepts two parameters and is a valid assertion if the two values are equal (==).
    • assertStrictlyEquals() —Accepts two parameters and is a valid assertion if the two values are strictly equal (===).
    • assertTrue() —Accepts a single parameter and is a valid assertion if the value is true.
    • assertFalse() —Accepts a single parameter and is a valid assertion if the value is false.
    • assertNull() —Accepts a single parameter and is a valid assertion if the value is null.
    • assertNotNull() —Accepts a single parameter and is a valid assertion if the value is not null.
    • fail() — Marks the test as failed.
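    • A minimal sketch of these assertions in use (the user object and its fields are made up purely for illustration):
        [Test]
        public function testAssertions():void {
            var user:Object = { name: "Ann", age: 30, manager: null };

            Assert.assertEquals( 30, user.age );              // passes if the values are == equal
            Assert.assertStrictlyEquals( "Ann", user.name );  // passes if the values are === equal
            Assert.assertTrue( user.age > 18 );
            Assert.assertFalse( user.age > 65 );
            Assert.assertNull( user.manager );
            Assert.assertNotNull( user.name );

            if ( user.age < 0 ) {
                Assert.fail( "age should never be negative" ); // explicitly fail the test
            }
        }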
  • What's new in FlexUnit 4
    • Support for all JUnit 4 features
    • Support for UI testing
    • Metadata driven
    • Exception handling
        • A new attribute of the [Test] metadata lets you declare expected exceptions (instead of using try/catch); see the sketch after this list.
        • [Test(expects="TypeError")]
    • Async test support
    • Support for multiple Runners
    • Hamcrest support
        • Uses the Hamcrest AS3 project for a richer set of asserts.
    • Theories and Assumptions
    • IDE Support
    • Beta 1 now available.
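    • A minimal sketch of the expected-exception support (the null dereference below is just an easy way to raise a TypeError):
        import flash.display.Sprite;

        [Test(expects="TypeError")]
        public function testNullReferenceThrows():void {
            // The test passes only if this code throws a TypeError; no try/catch needed.
            var sprite:Sprite = null;
            var count:int = sprite.numChildren;   // accessing a property of null throws a TypeError
        }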
  • Understanding Metadata
    • FlexUnit 4 can be driven by specific Flex metadata (similar to Java annotations). This allows you to define tests however you see fit, with your own naming conventions or in existing code; several of these tags are combined in the sketch after this list.
    • Before metadata support, you had to name your test methods a certain way and your tests had to extend specific classes.
    • [Test]
    • [Suite]
    • [Before]
    • [After]
    • [BeforeClass]
    • [AfterClass]
    • [Ignore]
    • [DataPoints]
    • [Theory]
    • [RunWith]
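    • A sketch combining several of these tags in one test case (the class, methods, and data are made up for illustration):
        package simpleTest.tests {
            import org.flexunit.Assert;

            public class CalculatorTest {
                private var values:Array;

                [BeforeClass]
                public static function setUpClass():void {
                    // Runs once, before any test in this class.
                }

                [Before]
                public function setUp():void {
                    // Runs before every test method.
                    values = [ 1, 2, 3 ];
                }

                [After]
                public function tearDown():void {
                    // Runs after every test method.
                    values = null;
                }

                [Test]
                public function sumIsComputed():void {
                    Assert.assertEquals( 6, values[0] + values[1] + values[2] );
                }

                [Ignore("Not implemented yet")]
                [Test]
                public function divisionByZero():void {
                }
            }
        }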
  • Creating A Basic Unit Test
      package simpleTest.tests {
          import org.flexunit.Assert;

          public class SimpleTestCase {

              [Test]
              public function testMath():void {
                  var x:int = 5 + 3;
                  Assert.assertEquals( 8, x );
              }
          }
      }
  • Creating A Basic Test Suite
      package simpleTest {
          import flexunit.framework.TestSuite;
          import simpleTest.tests.SimpleTestCase;

          [Suite]
          [RunWith("org.flexunit.runners.Suite")]
          public class SimpleTestSuite {
              // Define a public var for each test case in the suite.
              public var simpleTestCase:SimpleTestCase;
          }
      }
  • Creating A Test Runner
      <?xml version="1.0" encoding="utf-8"?>
      <mx:Application xmlns:mx="http://www.adobe.com/2006/mxml"
              xmlns:flexUnitUIRunner="http://www.adobe.com/2009/flexUnitUIRunner"
              creationComplete="runMe()">

          <mx:Script><![CDATA[
              import simpleTest.SimpleTestSuite;
              import org.flexunit.listeners.UIListener;
              import org.flexunit.runner.FlexUnitCore;

              private var core:FlexUnitCore;

              public function runMe():void {
                  core = new FlexUnitCore();
                  // Listener for the UI; optional.
                  core.addListener( new UIListener( uiListener ) );
                  core.run( SimpleTestSuite );
              }
          ]]></mx:Script>

          <flexUnitUIRunner:TestRunnerBase id="uiListener"
                  width="100%" height="100%" />

      </mx:Application>
  • Asynchronous Testing
    • Examples of asynchronous tests
    • Server calls
    • Timers
    • AIR SQLite
    • AIR File Access
  • Creating an Asynchronous Unit Test – Setup/Teardown
      [Before(async)]
      public function setUp():void {
          timer = new Timer( 500, 3 );
      }

      [After(async)]
      public function tearDown():void {
          if ( timer ) {
              timer.stop();
          }
          timer = null;
      }
  • Creating an Asynchronous Test – Test Method
      [Test(async)]
      public function testInTimePass():void {
          var obj:Object = { expected: 3 };
          timer.delay = 100;
          timer.addEventListener( TimerEvent.TIMER_COMPLETE,
              Async.asyncHandler( this, completeHandler, 500, obj, null ),
              false, 0, true );
          timer.start();
      }

      protected function completeHandler( event:Event, data_:Object ):void {
          if ( event.type == "timerComplete" ) {
              Assert.assertEquals( Timer( event.target ).currentCount, data_.expected );
          } else {
              Assert.fail( "wrong event type" );
          }
      }
  • Integration Testing
    • UIComponents
    • A common goal when developing Flex applications is to test UIComponents in a repeatable way
    • UIComponents are internally asynchronous
  • Integration Testing (FlexUnit)
    • FlexUnit Sequences
    • Sometimes a test of a UIComponent can require many steps to set up.
    • For example, you may need to create a login form, populate it, and then click a button before you can decide whether everything worked.
    • Sequences are a shorthand that simplifies that work into a readable set of steps.
  • Testing UIComponents – setup
      public var form:LoginForm;

      [Before(async,ui)]
      public function setUp():void {
          form = new LoginForm();
          Async.proceedOnEvent( this, form, FlexEvent.CREATION_COMPLETE, 200 );
          UIImpersonator.addChild( form );
      }

      [After(async,ui)]
      public function tearDown():void {
          UIImpersonator.removeChild( form );
          form = null;
      }
  • Testing UIComponents – test method
      [Test(async,ui)]
      public function testLogin():void {
          var passThroughData:Object = new Object();
          passThroughData.username = 'myuser1';
          passThroughData.password = 'somepsswd';

          var sequence:SequenceRunner = new SequenceRunner( this );
          sequence.addStep( new SequenceSetter( form.usernameTI, { text: passThroughData.username } ) );
          sequence.addStep( new SequenceWaiter( form.usernameTI, FlexEvent.VALUE_COMMIT, 100 ) );
          sequence.addStep( new SequenceSetter( form.passwordTI, { text: passThroughData.password } ) );
          sequence.addStep( new SequenceWaiter( form.passwordTI, FlexEvent.VALUE_COMMIT, 100 ) );
          sequence.addStep( new SequenceEventDispatcher( form.loginBtn, new MouseEvent( 'click', true, false ) ) );
          sequence.addStep( new SequenceWaiter( form, 'loginRequested', 100 ) );
          sequence.addAssertHandler( handleLoginEvent, passThroughData );
          sequence.run();
      }

      protected function handleLoginEvent( event:TextEvent, passThroughData:Object ):void {
          Assert.assertEquals( passThroughData.password, event.text );
      }
  • Testing UIComponents – test method
      var sequence:SequenceRunner = new SequenceRunner( this );
      sequence.addStep( new SequenceSetter( form.usernameTI, { text: passThroughData.username } ) );
      sequence.addStep( new SequenceWaiter( form.usernameTI, FlexEvent.VALUE_COMMIT, 100 ) );
      sequence.addStep( new SequenceSetter( form.passwordTI, { text: passThroughData.password } ) );
      sequence.addStep( new SequenceWaiter( form.passwordTI, FlexEvent.VALUE_COMMIT, 100 ) );
    • Create a new sequence.
    • Set a property on a UIComponent.
    • Wait for the “valueCommit” event to fire, so you know the property was set correctly.
  • Testing UIComponents – test method
      sequence.addStep( new SequenceEventDispatcher( form.loginBtn, new MouseEvent( 'click', true, false ) ) );
      sequence.addStep( new SequenceWaiter( form, 'loginRequested', 100 ) );
      sequence.addAssertHandler( handleLoginEvent, passThroughData );
      sequence.run();
    • Add a step that simulates a user clicking the “login” button.
    • Wait for the “loginRequested” event.
    • Add an “assertHandler” to check the login when it has completed.
    • Run the test.
  • Testing UIComponents – test method
      // sequence.addAssertHandler( handleLoginEvent, passThroughData );
      protected function handleLoginEvent( event:TextEvent, passThroughData:Object ):void {
          Assert.assertEquals( passThroughData.password, event.text );
      }
    • Use the assert handler to test the results of the login action (server call).
    • If the right data exists, the test succeeds.
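    • For reference, the LoginForm component itself is not shown in the deck. A minimal MXML sketch that would satisfy the sequence above – the ids (usernameTI, passwordTI, loginBtn), the “loginRequested” event, and its payload are all assumptions made for illustration:
        <?xml version="1.0" encoding="utf-8"?>
        <mx:VBox xmlns:mx="http://www.adobe.com/2006/mxml">
            <mx:Metadata>
                [Event(name="loginRequested", type="flash.events.TextEvent")]
            </mx:Metadata>
            <mx:Script><![CDATA[
                import flash.events.TextEvent;
                // Dispatch the event the sequence waits for, carrying the password
                // so the assert handler can compare it.
                private function requestLogin():void {
                    dispatchEvent( new TextEvent( "loginRequested", true, false, passwordTI.text ) );
                }
            ]]></mx:Script>
            <mx:TextInput id="usernameTI"/>
            <mx:TextInput id="passwordTI" displayAsPassword="true"/>
            <mx:Button id="loginBtn" label="Login" click="requestLogin()"/>
        </mx:VBox>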
  • Integration Testing (FlexMonkey)
    • Recording Scripts
    • Replaying Scripts
  • Tips
    • All software projects should be thoroughly tested!!
      • Automated testing saves you money by preventing bugs from returning again and again.
    • A getting-started rule of thumb for writing a new test case (see the sketch after this list):
        • Write one test that passes the “usual” case.
        • Write one test that passes by testing the extremes.
        • Write one test that is supposed to fail.
    • Continue building your test cases by writing a test case for every bug that is submitted – before you close the bug as fixed.
    • Test cases should “clean up” after themselves.
    • Any developer should be able to run the tests without custom setup – if possible.
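    • A sketch of that rule of thumb applied to a hypothetical MathUtil.divide() helper (assumed, for illustration, to throw an ArgumentError on a zero divisor):
        [Test]
        public function testDivideUsualCase():void {
            Assert.assertEquals( 5, MathUtil.divide( 10, 2 ) );            // the "usual" case
        }

        [Test]
        public function testDivideExtremes():void {
            Assert.assertEquals( 0, MathUtil.divide( 0, int.MAX_VALUE ) ); // an extreme input
        }

        [Test(expects="ArgumentError")]
        public function testDivideByZeroThrows():void {
            MathUtil.divide( 10, 0 );                                      // the case that is supposed to fail
        }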
  • Questions?
  • FlexUnit 4: http://opensource.adobe.com/wiki/display/flexunit/FlexUnit
  • FlexMonkey: http://code.google.com/p/flexmonkey/
  • More on Continuous Integration: http://www.martinfowler.com/articles/continuousIntegration.html