Automated testing of NASA Software - part 2



  1. CESE: Automated Testing of Large Multi-language SW Systems using Cloud Computing
     Technical Presentation
     Principal Investigator (PI): Dr. Mikael Lindvall, CESE
     NASA POC: Markland Benson, White Sands
     Team members: Dharma Ganesan, Dr. Chris Ackermann (CESE)
     GMSEC, CFS, Space Network, MSL
     © 2011 Fraunhofer USA, Inc. - Center for Experimental Software Engineering
  2. Problems
     • Test cases are developed manually
     • Some test execution is automated (e.g., JUnit)
     • Test cases miss valid "corner" cases
     • Test cases are also programs: not easy for non-technical stakeholders to understand
     • Difficult to summarize what was tested
     Approach: Lightweight Model-based Test Generation and Execution
  3. Test Generation Workflow
  4. The structure of the approach
     [Diagram: Model → ITestMonkey (list of abstract actions) → TestMonkeyImpl;
      Model → ITestDataProvider (list of abstract input data provider methods) → TestDataProviderImpl;
      the implementations drive the System Under Test]
     • The model is agnostic to the test execution technology
     • The ITestMonkey interface hides the test execution framework
     • TestMonkeyImpl uses interfaces of the test execution framework
  5. Tools Infrastructure
     • Modeling: yEd graphical editor from yWorks
     • Model traversal and test generation: JUMBL (University of Tennessee)
     • Test execution:
       – JUnit (Java)
       – CuTest (C)
       – Selenium (Web)
       – UISpec (Java Swing)
       – Sikuli (image-based testing of legacy systems)
     • Glue scripts:
       – Conversion of yEd models to JUMBL models
       – Preparing a test suite from generated test cases
       – Generation of system-specific build files (e.g., makefiles)
       – Helper scripts to clean up generated files
  6. Test Generation @ GMSEC
     • State of the practice: test cases are hand-crafted
     • New initiative started to evaluate the feasibility of the FAST approach
     • Modeled a portion of the GMSEC Software Bus based on existing test cases and documentation
     • Automatically generated test cases
     • Found a few problems (already fixed now)
  7. Hand-crafted test case (snippet)
     public static void main( String args[] ) {
         Status result = new Status();
         Connection conn = new Connection();
         ConnectionConfig cfg = new ConnectionConfig( args );

         // Create the connection
         result = ConnectionFactory.Create( cfg, conn );
         checkError( false, result, "Creating the connection object" );

         // Disconnect
         result = conn.Disconnect();
         checkError( true, result, "Disconnecting before connection is established" );

         // Connect
         result = conn.Connect();
         checkError( false, result, "Establishing the connection to the middleware" );
     } //..main()
  8. Manually developed test cases: a source of inspiration
     • We reviewed existing Java test cases
     • Found that the tester had used certain permutations of API usage
     • Both good and "evil" cases are considered
     • We used these test cases as a reference for building API usage models
  9. Test Generation @ GMSEC: APIs of the module under test
     public interface IConnection {
         public Status Connect();
         public Status Disconnect();
         …
     }
 10. Structure of cFE/CFS
 11. Structure of OSAL
 12. Sample APIs
     /* Directory API */

     // Makes a new directory
     int32 OS_mkdir(const char *path, uint32 access);

     // Opens a directory for searching
     os_dirp_t OS_opendir(const char *path);

     // Closes an open directory
     int32 OS_closedir(os_dirp_t directory);

     // Removes an empty directory from the file system
     int32 OS_rmdir(const char *path);
 13. Example of an OSAL model
 14. API doc of make directory (OS_mkdir)
     /*
      Name:    OS_mkdir
      Purpose: makes a directory specified by path
      Returns: OS_FS_ERR_INVALID_POINTER if path is NULL
               OS_FS_ERR_PATH_TOO_LONG if the path is too long to be stored locally
               OS_FS_ERR_PATH_INVALID if path cannot be parsed
               OS_FS_ERROR if the OS call fails
               OS_FS_SUCCESS if success
      Note:    The access parameter is currently unused.
     */
     int32 OS_mkdir(const char *path, uint32 access);
 15. Inside Open Invalid Directory
 16. Sample IMonkey Interface
     • int32 removeDirectoryValid(void);
     • int32 removeDirectoryPathNull(void);
     • int32 removeDirectoryPathTooLong(void);
     • int32 removeDirectoryPathUnparsable(void);
     • int32 removeDirectoryCurrent(void);
     • int32 removeDirectoryNotEmpty(void);
     • …
 17. Sample generated test in CuTest
     void Testosal_Filesystem_min_2(CuTest* tc) {
         status = makeFilesystemValid();
         CuAssertIntEquals_Msg(tc, "Filesystem could not be created", OS_FS_SUCCESS, status);

         status = mountFilesystemValid();
         CuAssertIntEquals_Msg(tc, "Filesystem could not be mounted", OS_FS_SUCCESS, status);

         pointer = openDirectoryValid();
         CuAssertTrue(tc, pointer != NULL);
         …

         status = removeFilesystemValid();
         CuAssertIntEquals_Msg(tc, "Filesystem could not be removed", OS_FS_SUCCESS, status);
     }
 18. Issues found using this method
     • File descriptors leak after removing a file system:
       – After somewhat long tests we would run out of file descriptors
       – This would happen even with a newly created file system
       – OSAL does not release the descriptors of files that are still open when the file system is removed
     • Unable to create and open files
     • Some wrong error codes returned
 19. Current Results
     • An end-to-end approach for test generation
       – Successfully used on GMSEC and CFS; detected bugs
     • Next steps: apply the approach to the Space Network and MSL projects