  • 1. Software Development Processes in CSED: The Future? Chris Greenough & David Worth Software Engineering Group Computational Science & Engineering Rutherford Appleton Laboratory [email_address]
  • 2. Contents
    • Background & Motivation
    • The Process
    • Some Definitions
    • Software Quality
    • What is a software process?
    • Review of CSED software development
    • Process models & their shortcomings
    • Similar exercises & their outcomes
    • CSED Best Practice
    • Classifying CSED Software
    • Current and future SESP Tools
    • Challenges & Difficulties
    • An Example
    • Next Steps
    • Conclusions
  • 3. Background
    • Over the past year we have been making a critical study of the development processes within CSED in response to one of the recommendations from the EPSRC SLA Review Panel.
    • The recommendation was - CSED should consider implementing minimum standards of design, documentation and testing to ensure that software distributed by Daresbury (and Rutherford Appleton) and used by the scientific community acquires a reputation of being of the highest quality.
    • A draft document has been written, discussed and modified by a small working group within CSED.
    • This presentation will outline the study process and describe the current content of the draft Software Development Best Practice document. It is an opportunity to add your comments to the process as a prelude to starting to develop an implementation plan.
  • 4. Plan of Action
    • In response to the SLA Review CSED has started a process to address this recommendation. The process has had three main strands:
      • A review of current software development practice within CSED.
      • A review of current best practice in software development taking into account software development initiatives in similar scientific areas.
      • The development of a CSED software development best practice.
    • The goal of this activity is to promote an incremental improvement of the current practices within CSED using a set of appropriate best practices supported by a selection of suitable software engineering methods and tools.
  • 5. Definition: Software Engineering
    • Software engineering is an engineering discipline that is concerned with all aspects of software production.
    • Software engineers should adopt a systematic and organised approach to their work and use appropriate tools and techniques depending on the problem to be solved, the development constraints and the resources available.
    • Sommerville, Software Engineering, 8th Edition
  • 6. Software engineering and computer science?
    • Computer science is concerned with theory and fundamentals; software engineering is concerned with the practicalities of developing and delivering useful software.
    • Computer science theories are still insufficient to act as a complete underpinning for software engineering (unlike e.g. physics and electrical engineering).
    • Sommerville, Software Engineering, 8th Edition
  • 7. What is software quality?
    • Software quality is a very difficult term to define!
    • The IEEE definition: Software quality is:
      • The degree to which a system, component, or process meets specified requirements.
      • The degree to which a system, component, or process meets customer or user needs or expectations.
    • The Pressman definition:
      • Conformance to explicitly stated functional and performance requirements, explicitly documented development standards, and implicit characteristics that are expected of all professionally developed software.
    • These are very different: one focuses on the users, the other on the process.
    • Both definitions agree that quality comes through conformance to a requirements specification.
    • Not particularly useful!
  • 8. …software quality continued…
    • Quality criteria include but are not limited to:
      • Economy Correctness Resilience Integrity
      • Reliability Usability Documentation Modifiability
      • Clarity Understandability Validity Maintainability
      • Flexibility Generality Portability Interoperability
      • Testability Efficiency Modularity Reusability
    • Because there are so many criteria, we cannot use them all for measuring software quality.
    • Some of the desired characteristics can be used in software product standards to guide actual development work.
    • To be of high quality, software must run correctly and consistently, have few defects, handle abnormal situations gracefully, and need little installation effort.
  • 9. What is a software process?
    • A set of activities whose goal is the development or evolution of software.
    • Generic activities in all software processes are:
      • Specification - what the system should do and its development constraints
      • Development - production of the software system
      • Validation - checking that the software is what the customer wants
      • Evolution - changing the software in response to changing demands.
    • Sommerville, Software Engineering, 8th Edition
  • 10. Review of CSED Software Development
    • Requirements
    • Design
    • Formal Methods
    • Languages
    • Graphics
    • Integrated Design Environments
    • Component Systems
    • Parallel/Distributed Technology
    • Data/Web Technology
    • Version Control
    • Software Processes
    • Process Tools
    • Performance
    • Software Quality
    • Testing
    • Distribution
    • Bug Tracing
    • Documentation
    The questionnaire was based on that used by the Harrison CCP Scoping Study with a few minor additions. The intention was to gather information on all elements of software development.
  • 11. Current Practice within CSED (1)
    • Requirements: Requirements gathering is done in an informal way if performed at all. Some documentation is written but not in general. There was only one instance of a detailed requirements document being written.
    • Design: This was not broken down into architectural and detailed design but the responses indicate only informal design is performed. There is no mention of the designs being documented.
    • Formal Methods: Only Object Orientation was mentioned as a formal method.
    • Languages: The dominant language is Fortran in some form: 77, 90 and 95 are mentioned. Some mention of C, C++, Java, perl and python.
    • Graphics: With one exception, DLV, generally only simple graphical systems are currently being used by most Groups. Those developing GUIs and visualisation tools are using OpenGL and Tcl/Tk.
  • 12. Current Practice within CSED (2)
    • Integrated Design Environments: Only Visual Studio on Windows systems is being used.
    • Component Systems: No group is using component style technology in their developments.
    • Parallel/Distributed: MPI is the dominant software harness and SPMD the most common paradigm of parallelism. Also Global Arrays (GA), OpenMP and Shmem are mentioned.
    • Data/Web Technology: Only use is in the CCPForge project and the HSL Archive.
    • Version Control: CVS and Subversion are used with a variety of GUI front ends.
    • Software Processes: In general no formal/documented SDP are used.
    • Process Tools: A general array of compilers, editors (emacs, vi) and debuggers (DDT, Totalview) are used. No group uses software quality tools in general development.
  • 13. Current Practice within CSED (3)
    • Performance: This is an issue for most groups. Tools such as Vampir, xprofile, VTune and gprof are used.
    • Software Quality: Compile options are the main tool for software quality where considered. In general no software quality planning and processes.
    • Testing: In general no testing plans. Some unit testing and testing through the use of data sets as part of regression testing. Some in house tools to monitor and control regression testing.
    • Distribution: Most distribution is performed using tar files and CVS repositories. Web sites and SourceForge are also used. It is unclear whether tools such as automake or autoconf are used and the level of documentation was unspecified.
    • Bug Tracking: Bugzilla and forums are in use.
    • Documentation: Very few documentation tools are in use. LaTeX and html appear to be the most common mark up languages. Some user documentation is evident but other documentation is unclear.
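The regression testing mentioned above is typically driven by a small script that compares program output against stored reference results. A minimal sketch only: the directory layout and file names are illustrative, and `cat` stands in for the code under test, none of which is taken from an actual CSED tool:

```shell
# Minimal regression-test driver: run each stored input through the
# program under test and diff the output against a stored reference.
# PROG, the tests/ layout and the file names are illustrative only.
PROG=${PROG:-cat}          # stand-in for the code under test
mkdir -p tests

# One demonstration case; a real suite would hold many such pairs.
printf 'input data\n' > tests/case1.in
printf 'input data\n' > tests/case1.ref

status=0
for input in tests/*.in; do
    name=$(basename "$input" .in)
    "$PROG" < "$input" > "tests/$name.out" 2>&1
    if diff -q "tests/$name.ref" "tests/$name.out" > /dev/null; then
        echo "PASS $name"
    else
        echo "FAIL $name"; status=1
    fi
done
[ "$status" -eq 0 ] && echo "all regression tests passed"
```

Keeping the reference outputs under version control alongside the source makes it cheap to rerun the whole suite before each release.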
  • 14. Elements of Software Development
    • User Requirements
    • Software Requirements: target systems, languages
    • Architectural Design: target systems
    • Detailed Software Design: language, structure, libraries
    • Implementation: languages, quality, version control
    • Unit Testing
    • Integration Testing
    • System Testing
    • Deployment and Acceptance Testing
    • Maintenance: regression testing, quality control, bug tracking, version control
    • Documentation: paper, online (HTML/PDF), system, user
  • 15. The software process
    • A structured set of activities required to develop a software system
      • Specification;
      • Design;
      • Validation;
      • Evolution.
    • A software process model is an abstract representation of a process. It presents a description of a process from some particular perspective.
  • 16. The Waterfall Model: Requirements → Design → Implementation → Testing → Deployment → Maintenance
    • The requirements define the required functionality and operation of the final package. These are driven by what the end-user expects the system to provide.
    • Given a set of user and system requirements, design often takes two steps: an overall system architecture followed by a detailed design of the system’s modules and interfaces.
    • Program units are developed according to the detailed design. These are often functionally tested as a unit before integration into the complete system.
    • Testing is one of the most important phases of a system’s development. It takes the form of testing the functionality of each component and then testing the complete integrated system.
    • Deployment and maintenance of the system are the final steps in the process but often two of the hardest. A badly designed and implemented system can be very difficult to maintain.
  • 17. Problems with the Waterfall model
    • The main drawback of the waterfall model is the difficulty of accommodating change after the process is underway.
    • One phase has to be complete before moving onto the next phase
    • Inflexible partitioning of the project into distinct stages makes it difficult to respond to changing customer requirements.
    • Therefore, this model is only appropriate when the requirements are well-understood and changes will be fairly limited during the design process.
    • Few business systems have stable requirements.
    • The waterfall model is mostly used for large systems engineering projects where a system is developed at several sites.
  • 18. Evolutionary development
    • Exploratory development
      • Objective is to work with customers and to evolve a final system from an initial outline specification. Should start with well-understood requirements and add new features as proposed by the customer.
    • Throw-away prototyping
      • Objective is to understand the system requirements. Should start with poorly understood requirements to clarify what is really needed.
  • 19. Evolutionary development [diagram: an outline description feeds concurrent specification, design & implementation and validation activities, producing an initial version, intermediate versions and a final version]
  • 20. Problems with Evolutionary development
    • Problems
      • Lack of process visibility;
      • Systems are often poorly structured;
      • Special skills (e.g. in languages for rapid prototyping) may be required.
    • Applicability
      • For small or medium-size interactive systems;
      • For parts of large systems (e.g. the user interface);
      • For short-lifetime systems.
  • 21. Spiral development
    • Process is represented as a spiral rather than as a sequence of activities with backtracking.
    • Each loop in the spiral represents a phase in the process.
    • No fixed phases such as specification or design - loops in the spiral are chosen depending on what is required.
    • Risks are explicitly assessed and resolved throughout the process.
    • Proposed by Boehm (1988)
  • 22. Boehm Spiral Development Model
  • 23. Spiral Process Activities
    • Objective setting
      • Specific objectives for the phase are identified.
    • Risk assessment and reduction
      • Risks are assessed and activities put in place to reduce the key risks.
    • Development and validation
      • A development model for the system is chosen which can be any of the generic models.
    • Planning
      • The project is reviewed and the next phase of the spiral is planned.
  • 24. Post’s thoughts on software development
  • 25. Some Model Conclusions
    • Software development is a complex task
    • No one model is sufficient to represent the process
    • What is required is a recognition of the important processes and outputs to ensure quality software
  • 26. Software Development – What Others Tried
    • Reviewed current practice in similar activities:
      • ASC Programme in general
      • Software Modernisation Program
      • Lawrence Livermore National Lab
      • Sandia National Lab
      • Los Alamos National Lab
      • SEI Carnegie Mellon
    • Review some of the many standards available (IEEE, ISO, CMM..)
  • 27. ASCI Lessons Learned – Post & Kendall (LANL)
    • Build on successful code development history and prototypes
    • Highly competent and motivated people in a good team are essential
    • Risk identification, management and mitigation are essential
    • Software Project Management: Run the code project like a project
    • Schedule and resources are determined by the requirements
    • Customer focus is essential
      • For code teams and for stakeholder support
    • Better physics is much more important than better computer science
    • Use modern but proven Computer Science techniques,
      • Don’t make the code project a Computer Science research project
    • Provide training for the team
    • Software Quality Engineering: Best Practices rather than Processes
    • Validation and Verification are essential
  • 28. Elements of Software Development
    • User Requirements
    • Software Requirements: target systems, languages
    • Architectural Design: target systems
    • Detailed Software Design: language, structure, libraries
    • Implementation: languages, quality, version control
    • Unit Testing
    • Integration Testing
    • System Testing
    • Deployment and Acceptance Testing
    • Maintenance: regression testing, quality control, bug tracking, version control
    • Documentation: paper, online (HTML/PDF), system, user
  • 29. Steps in CSED Best Practice (1)
    • User and System Requirements: The user requirements of a system should describe the functional and non-functional requirements and expand them so that software engineers/implementers have a starting point for the system design. Output: User and System Requirements Document.
    • Management Planning: will contain details of the methods and techniques to be used, with (where necessary) task breakdowns, time scales and dependencies. It should also contain details of any standards being adopted by the project and of any specific methods/techniques being used, including the choice of licence, document storage etc. Output: Management Plan
  • 30. Steps in CSED Best Practice (2)
    • Quality Assurance Planning: Decide how the quality of the development will be monitored, measured and maintained throughout the project. This could contain detailed processes and use of software tools where thought appropriate. It details how the quality of the system will be audited (measured and demonstrated, verified). Should also define roles and responsibilities. Output: Quality Assurance Plan
    • Software Design: Work out the overall architecture of the system and its components. Draft global data structures and their access methods. For larger projects, design each routine/module in relation to the architecture. Sort out languages, libraries and system constraints. Specify language standards and quality metrics to be applied and describe reasons for any exceptions/extensions used. Define project-specific coding conventions. Output: Software Design Document
  • 31. Steps in CSED Best Practice (3)
    • Test Planning: Develop test methodology and test cases to exercise the software. Must include deployment and acceptance testing and should consider other approaches including unit testing, coverage analysis and black and white box approaches.
      • Unit Testing: Test each unit using the agreed methods – API testing if necessary to cover all possible input values – coverage testing to exercise the code fully.
      • Integration Testing: Combine units within the call tree and test their interaction and interfaces.
      • Coverage Analysis: Examine whether test suite executes all statements and redesign test suite as necessary.
      • Deployment and Acceptance Testing: Run a sample of representative user test cases and distribute the system using the agreed mechanism.
    • Output: Test Plan
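The unit and integration testing steps above are commonly collected into a small driver that runs each test program and tallies failures. A minimal sketch only: the directory layout is hypothetical, and the two artificial cases stand in for real routine-level tests:

```shell
# Minimal unit-test driver: every executable in tests/unit is one
# test program that exits 0 on success.  Layout is illustrative only.
mkdir -p tests/unit

# Two demonstration "tests"; real ones would call library routines.
cat > tests/unit/t_add <<'EOF'
#!/bin/sh
[ $((2 + 2)) -eq 4 ]
EOF
cat > tests/unit/t_bad <<'EOF'
#!/bin/sh
[ $((2 + 2)) -eq 5 ]
EOF
chmod +x tests/unit/t_add tests/unit/t_bad

failures=0
for t in tests/unit/*; do
    if "$t"; then
        echo "PASS $(basename "$t")"
    else
        echo "FAIL $(basename "$t")"
        failures=$((failures + 1))
    fi
done
echo "$failures unit test(s) failed"
```

The one-test-per-executable convention keeps individual tests independent, which matters once integration tests start combining units within the call tree.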
  • 32. Steps in CSED Best Practice (4)
    • Implementation: Use the Software Design to do the development according to the agreed processes. Apply any software quality tests and ensure compliance. Includes writing documentation in-line. Output: The Software
    • System Documentation: will contain a very detailed description of all elements of the system. It will bring together how the user and system requirements have been implemented in the architecture and detailed design. It will include details of the software architecture, call trees, external dependencies and the algorithms used, together with a detailed description of the implementation. It should contain everything a trainee developer would need to engage in a new development. Output: The System Documentation
  • 33. Steps in CSED Best Practice (5)
    • Testing: Test software in accordance with agreed Test Plan. Output: Test Report
    • User Documentation to include: User Manual - a complete description of the purpose and functionality of the system together with detailed instructions on how users can use the program. It should have not only a reference section giving details on each option available to the user but also some form of walk-through tutorial covering the main elements of the system’s usage. It should also contain suitable information on the algorithms being used and their implementation. (essential for non-prototype) Installation Guide - details on how the system is installed and tested on the target systems.
  • 34. Steps in CSED Best Practice (6)
    • Project Web Site: The Web is now the main method of informing the scientific community about software and scientific results. The majority of CCPs and major CSED activities have a web presence. As part of the documentation of a software system information should be posted on the Web in a variety of formats. HTML, PDF and PostScript are the three most commonly used formats in the scientific community. The web pages should contain information on bug reporting and release developments as well as contact points.
    • Maintenance: Manage bug reports and fixes together with regression testing and re-distribution.
  • 35. Classes of CSED Software Projects - Scope
    • Production package: Software is being well used by the community and is either under development and/or being supported with bug fixes and support. Regular releases are made and it is available to the wider community through a distribution mechanism.
    • Project software: Software that is only being used and/or developed within a specific project for CSED or their project partners. There is no general release to the wider community.
    • Prototype software: Software developed in-house to test specific modelling or algorithmic ideas. Not intended to be used outside a specific group.
  • 36. CSED Software Development Standard - Summary
  • 37. Classes of CSED Software Projects - Age
    • Legacy: Software that has been around for a long time and is a mixture of programming styles and language dialects. This class of software may not have most of the basic documentation. Action: Audit and apply remedial activities – transformation, standardisation, documentation…
    • Mature: Recent software that is developed and maintained that may have been designed and developed according to some standards and for which the majority of documentation exists. Action: Audit and update current status
    • New: A completely new development. This software will be developed according to the requirements of the project and proposed target community (production, project or prototype). Action: Define and follow best practice
  • 38. Software tools – acquired or licensed
    • After a survey of the available tools for the Fortran language the following tools have been selected:
      • ftnchek (public domain)
      • FORCHECK (FORCHECK)
      • NAGWare Tools (Numerical Algorithms Group Ltd)
      • plusFORT suite (Polyhedron Software Ltd)
      • Understand for Fortran (Scientific Toolworks Inc.)
      • a range of different compilers (g77, gfortran, lf95, f95, fort..)
      • additional polish and metrics tools under development
      • test generation and management tools to follow
    • These tools provide a basic working set to help the developer produce and maintain good language quality during the implementation, testing and maintenance phases.
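Some of the simpler checks these tools perform can also be scripted directly. A sketch that flags free-form source lines longer than the 132-column limit of the Fortran 95 standard; the sample file is illustrative only:

```shell
# A simple language-quality check in the spirit of the tools above:
# flag free-form Fortran source lines longer than the 132-column
# limit of the Fortran 95 standard.  The sample file is illustrative.
cat > sample.f90 <<'EOF'
program demo
  print *, 'short line'
end program demo
EOF

awk 'length($0) > 132 {
         printf "%s:%d: %d columns exceeds the 132-column limit\n",
                FILENAME, FNR, length($0)
     }' sample.f90
echo "line-length check complete"
```

A check like this is cheap enough to run on every commit, whereas the full metrics tools might only be run before a release.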
  • 39. Challenges and Difficulties
    • Provision of Software Tools
      • there are a limited number of Fortran 95 based tools
      • many tools are not particularly tolerant of mixed code
      • tools generally do not recognise pre-processors
      • language conformance is generally covered
      • transformations are limited
      • the available tools often create too much output, which is difficult to navigate and interpret
      • license issues
    • People and Processes
      • Changing any development process is long term – years not months
      • There are very few short term gains to be seen
      • Using additional methods and processes takes additional time and effort
      • Author recognition is a serious problem
      • The benefits of using these additional processes are often not immediately obvious
  • 40. An Example: The HSL Subroutine Library
    • HSL is the main software product of CSED’s Numerical Analysis Group
    • HSL is a library of portable, fully documented and tested Fortran packages for large-scale scientific computations
    • A - Computer algebra
    • D - Differential equations
    • E - Eigenvalues and eigenvectors
    • F - Mathematical functions
    • G - Geometrical problems
    • I - Integer valued functions
    • K - Sorting
    • L - Linear programming
    • M - Matrices and linear algebra
    • N - Nonlinear equations
    • O - Input and output aids
    • P - Polynomial and rational functions
    • Q - Numerical integration
    • S - Statistics
    • T - Interpolation and approximation
    • V - Optimization and nonlinear data fitting
    • Y - Test program generators
    • Z - Fortran system facilities
  • 41. HSL Software Development Documentation
    • HSL is a product which has legacy, mature and new elements
    • HSL has been developed over many years
    • The motivation:
      • new staff working on the project
      • external collaborator wishing to contribute
      • the realisation that there were too many undocumented rules
      • the requirement to maintain quality standards
      • the need to reduce the time taken in preparing and polishing the software for new releases
  • 42. The HSL Software Development Document
    • The document addresses new developments
    • The document contains 21 pages (plus two check lists)
    • The aim of the report is to provide clear and comprehensive guidelines for those involved in the
      • design
      • development and
      • maintenance
    • of software for HSL.
    • It explains the organisation of HSL, including:
      • the use of version numbers
      • naming conventions
      • the aims and format of the user documentation
      • the programming language standards and style and
      • the verification and testing procedures
  • 43. Guidelines for the development of HSL software
    • Introduction
    • Organisation of HSL
      • Versions
      • Naming conventions
      • File naming conventions
    • Documentation
      • HSL specification documents
        • Summary
        • How to use the package
        • General information
        • Method
        • Example of Use
        • Accompanying reports
    • HSL catalogue
    • Programming language and style
      • Use of Fortran 95
      • HSL rules and conventions
      • Use of control parameters
      • Checking of the user's data
      • The role of information parameters
      • Communication between procedures
      • Use of other library routines
    • Verification and testing
      • Simple examples
      • Comprehensive test
      • Independent testing
    • Use of compilers
    • Role of the Librarian
      • Check lists
    • Licence issues
    • Acknowledgements
    • Appendices
      • Profiling with g95
      • Profiling with nag coverage95
      • Checking line lengths
      • Debugging and checking conformance with standards
      • Polishing Fortran 95 code
      • Polishing Fortran 77 code
      • Other tools
      • Check list for new packages
      • Check list for revised packages
  • 44. Some examples
    • Programming language and style
      • … Packages must adhere to the Fortran 95 standard except that, from September 2006, we allow the use of allocatable structure components and dummy arguments. These are part of the official extension that is defined by Technical Report TR 15581(E) and is included in Fortran 2003. It allows arrays to be of dynamic size without the computing overheads and memory-leakage dangers of pointers….
    • Verification and testing
      • … It should exercise all the modes that the user is permitted to employ (including testing of different settings for the control parameters, errors and warnings, and the different levels of printing). Ideally, it would exercise all possible execution sequences, but we recognize that this would usually not be practicable. Instead, we aim to exercise every statement at least once. Even this is often not practicable; for example, it is desirable to test the stat= variable of a deallocate statement, but it is probably impossible to provoke the statement to fail….
  • 45. …some more examples
    • Use of compilers
      • … having access to a range of up-to-date Fortran compilers (currently, Nag, g95, lf95, ifort). The developer should test his or her software using each of these compilers….
    • Debugging and check for conformance with standards
      • For debugging and checking conformance with standards, it is important to run several compilers. Our experience is that different compilers can find different problems. The following command lines may be used
        • Nag compiler: f95 -C=all -gline
        • Lahey-Fujitsu compiler: lf95 --chk aesux
        • Gnu g95 compiler, allowing allocatable components and dummy arguments:
        • g95 -std=f95 -ftr15581 -Wall -Wimplicit-none -fbounds-check -ftrace=full
        • Intel compiler: ifort -C -u
        • Forchk syntax analysis for Fortran 77: forchk -f77 -nwarn -ninf
      • Since each of these finds different problems, they should all be used.
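A short driver can apply all of these command lines in turn, skipping compilers that are not installed. A sketch only: the flag sets are those quoted above, and `pkg.f90` is an illustrative file name, not an HSL convention:

```shell
# Run every available checking compiler over one source file, since
# different compilers find different problems.  The flag sets follow
# the command lines quoted above; pkg.f90 is an illustrative name.
SRC=${1:-pkg.f90}
checked=0
while read -r cmd flags; do
    if command -v "$cmd" > /dev/null 2>&1; then
        echo "--- $cmd $flags $SRC"
        "$cmd" $flags "$SRC"    # unquoted $flags: word splitting intended
        checked=$((checked + 1))
    else
        echo "--- $cmd not installed, skipped"
    fi
done <<'LIST'
f95 -C=all -gline
lf95 --chk aesux
g95 -std=f95 -ftr15581 -Wall -Wimplicit-none -fbounds-check -ftrace=full
ifort -C -u
LIST
echo "checked with $checked of 4 compilers"
```

The `command -v` guard lets the same script run unchanged on machines where only a subset of the compilers is licensed.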
  • 46. Check List for a Revision
    • Package name: . . . . . . . . . . . . . . .
    • Start date: . . . . . . . . . . . . . . .
    • Version: . . . . . . . . . . . . . . .
    • Start with the latest HSL version of codes and documentation: YES/NO
    • Update documentation:
      • Version number: YES/NO
      • Comments on changes: YES/NO
      • Changes: YES/NO
    • Add comments to start of code: YES/NO
    • Simple tests: YES/NO
    • Comprehensive tests:
    • Profile: Coverage: YES/NO
    • Different modes: YES/NO
    • Compilers: g95: YES/NO Nag: YES/NO
    • ifort: YES/NO lf95: YES/NO
    • xlf95: YES/NO
    • Use source BLAS/LAPACK: YES/NO
    • Output portability: YES/NO
    • Independent testing (optional): Tested by . . . . . . . . . . . . . . .
    • Generate other versions (e.g. single): YES/NO
    • Simple tests: YES/NO
    • Comprehensive testing: YES/NO
  • 47. Comparison with CSED Best Practice
  • 48. Next Steps
    • Classify the software systems currently active within CSED: Product, Project or Prototype – Legacy, Mature or New.
    • A review of all documentation against documentation check list appropriate to the class.
    • A review of all development practice against development phases - to include quality controls, testing processes and change control appropriate to the class.
    • Development and execution of a remedial plan to bring the system up to the required level of conformance, with tasks and time scales.
    • Monitor progress against the plan depending on the complexity and scale of the plan (monthly reviews would be recommended).
    • Define specific software metrics and acceptable ranges for conformance – use of tools such as compilers or other tools such as FORCHECK and nag_metrics.
    • Put in place a process of regular audits
  • 49. Things we need to do
    • Develop and maintain processes for our software portfolio
    • Establish good software development practices
    • Capture/rescue important legacy software
    • Improve the quality of our software base:
      • language conformance (not just Fortran)
      • portability (over many platforms and architectures)
      • design and structure
      • testing
      • maintenance
      • improve software quality
    • Where possible use tools to automate, to make the process easier and more measurable
    • Ensure our developers adopt some good practices and develop their own process of improvement!
  • 50. Conclusions
    • Much of Best Practice is common sense – however it is formalised common sense
    • The essence of Best Practice is in definition and documentation
    • Software development should be managed like any project
    • The Best Practice outlined is a framework in which to work
    • The process steps have not been defined – these are the remit of the project to define, implement and audit
    • There are some tools that can aid a process step
    • There is no magic bullet – software and development process improvement is a long term activity
