VCE IT Theory Slideshows 2011+ By Mark Kelly McKinnon Secondary College Vceit.com PSM - Problem Solving Methodology
Contents The Steps Analysis Design Development Testing Documentation Evaluation
Causes of information problems Here are some examples See the  PSM Analysis Activities slideshow
PSM The “Problem Solving Methodology” ( PSM ) is a model of steps to take when solving a problem.
The Official 2011 VCAA Glossary’s Complete PSM Steps Analysis Design Development Evaluation
The missing steps Documentation  is not a separate step (but it is still done: it’s put under the  development  heading) (Informal)  Testing  also occurs during development.
MIA There is no  formal testing  phase in the new PSM. Implementation  is also not mentioned.
ANALYSIS
Analysis Investigates the problem before attempting to solve it. Observe the existing system before doing anything else Like a doctor examines you before prescribing pills or surgery
Observing Measure the system’s performance Interview users Refer to system logs (e.g. errors, complaints, repairs) Examine system’s output
First question to ask Is there really a problem? Sometimes there is no problem: that’s the best the system can do E.g. your home wireless network is not reaching the claimed 54 Mbps transfer speed mentioned on the box Not a problem: that’s only a theoretical maximum which is unlikely ever to be achieved in practice.
Don’t bother Can’t solve problems that don’t actually exist But you can waste a lot of time and money  trying .
Second question  Can it be fixed ? Technical feasibility Some problems cannot be fixed, e.g. a horse’s broken leg. Pointless to even try. Don’t waste time and money on a hopeless cause.
Third question Is it worth fixing? Economic feasibility Some problems are not worth the necessary time, money and effort You  can  maybe get a pushbike to do 150km/h but surely it’s easier to get a motorbike instead?
And more questions Legal feasibility  – can you fix it without breaking any laws? Operational feasibility  – if you fix it, do you have the staffing, equipment, skill base, money etc to continue to operate it?
Scope of the solution What can the solution  do ?  What  can't  the solution do?  The boundaries or parameters of the solution.  How will the solution benefit the user?
Constraints What conditions need to be considered when designing a solution?  E.g.  cost,  speed of processing,  requirements of users,  legal requirements,  security,  compatibility,  level of expertise,  capacity,  availability of equipment
If the answers are not all “YES” GIVE UP Cancel the project Better to give up now than to proceed and waste far more time and money on a doomed project.
Determine solution requirements What  information  does the solution have to provide?  What  data  is needed to produce the information?  What  functions  does the solution have to provide?
Solution Requirements Put into a  logical design . Lays down the  specifications  of the new or modified system. Specifies what it should be able to achieve.
Logical Design Like a “wish list” of features Only lists  specifications , e.g. “should be able to produce 20,000 invoices in 2 hours with 99.9% accuracy”
But The logical design does not attempt to say how these objectives will be achieved. Don’t jump to conclusions about what software, hardware etc will be needed. Just define what you want it to do.
These requirements can be Functional  - what the solution is required to do Non-functional , which attributes the solution should possess, such as user-friendliness,  reliability,  portability,  robustness,  maintainability.
Tools Logical design tools to assist in determining the solution requirements include ... context diagrams,  data flow diagrams and  use case diagrams Logical Data Dictionaries (e.g. “What data should be in a sales contract?”) Hierarchy Charts / Organisational Charts Decision Trees
If all the answers are “YES”... All data collected during analysis needs to be documented for later reference. Each step in the PSM must be fully and properly finished before the next step begins
Finally The client, or boss, gives approval to move onto the next step: design
DESIGN
Design... How the solution will  work What interfaces and output will  look  like
Need to design Hardware needs Software needs Training and documentation requirements Procedures that need to be created or changed Evaluation criteria
Tools data dictionaries and data structure diagrams,  input-process-output (IPO) charts, flowcharts,  pseudocode,  object descriptions.
Designing how bits of the solution fit together – e.g. Website  pages, style sheets, scripts;  Database  queries, forms, reports;  Program  modules, procedures, functions.
Useful tools for this Website : storyboards, site maps Database : entity-relationship diagrams, data flow diagrams, structure charts Multipurpose : hierarchy charts, context diagrams, use cases.
Design Consider alternative designs More than one way to skin a cat Some designs may be technically great, but unacceptable because of  constraints
Design Every  solution has both pros and cons What’s acceptable to a developer might be impossible for the client Selecting a design strategy means achieving maximum “pros” with minimum “cons” What is a pro or con varies from client to client
Physical Design Tools These actually plan how to  build  a system They give instructions on what to do Physical data dictionaries (e.g. “Customer_Surname is string, 25 characters”) Data Flow Diagrams...
Physical Design Tools Storyboards Flow Charts, Nassi-Shneiderman charts Structure Charts IPO charts  Layout diagrams / mockups  Pseudocode
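As an example of the last tool, a hypothetical invoice-printing routine might be sketched in pseudocode before any real code is written:

```
BEGIN
  READ customer records
  FOR EACH customer
    IF balance owing > 0 THEN
      PRINT invoice for customer
    END IF
  END FOR
END
```

The pseudocode pins down the logic of the routine without committing to any particular programming language, which is exactly what a physical design tool is for.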
Designing appearances layout diagrams,  annotated diagrams/mock ups prototypes
Designing the evaluation criteria What measures will be used to judge whether or not the solution requirements have been met?  Should relate to the solution requirements identified in the  analysis  stage’s logical design.
Design Must be documented If successful, will lead to Development phase. If not, cancel the project. Is used as a strict instruction manual during the following phases, particularly during…
DEVELOPMENT
Development Often involves a coordinated team using project management to monitor their progress and manage their actions and resources.
Acquire equipment Hardware is bought  off-the-shelf  or  custom  built. Software is bought  off-the-shelf  or  custom  programmed. Custom  equipment is very expensive and slow to create, but suits the user’s needs perfectly. Off-the-shelf  equipment is far cheaper, easier to get and has more support, but you must check that it offers all the functionality the user needs.
Validation Checks for the reasonableness of data being input, mainly: Range checks Existence checks Type checks Can be done  manually  or  electronically
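These three checks can be sketched in code. A minimal Python example, validating a hypothetical “age” input field (the field name and the 0–120 limits are invented for illustration):

```python
def validate_age(raw):
    """Apply the three common validation checks to one input field."""
    # Existence check: the field must not be empty or missing
    if raw is None or str(raw).strip() == "":
        return False, "missing value"
    # Type check: the value must be a whole number
    try:
        age = int(raw)
    except ValueError:
        return False, "not a number"
    # Range check: the value must be reasonable for an age
    if not 0 <= age <= 120:
        return False, "out of range"
    return True, "ok"
```

For example, `validate_age("42")` passes all three checks, while `validate_age("")`, `validate_age("abc")` and `validate_age("150")` each fail a different one. Note that validation only checks data is *reasonable*, not that it is *correct*.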
INFORMAL TESTING
Informal testing Software and hardware are assembled Informal  testing carried out at each stage of development: Component testing : ensures each component works as it should in isolation (e.g. a software module, a hardware device) Integration testing : ensures components work properly together.  e.g. a memory chip can communicate properly with a CPU; a software module receives the right data from another module.
Testing Activities Deciding: What  needs to be tested (e.g. Formulas, buttons, media, links, communications)? How  will each thing be tested? What  test data , if any, will be used? What are the  expected results? Conducting the tests Recording the actual results Correcting any identified errors.
Testing types Unit testing  – tests system components in isolation to see if they behave properly  Integration testing  – tests components once they have been integrated with other components. Often components are all fine by themselves, but errors arise when they try to exchange data.
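The difference between the two can be shown with two tiny hypothetical modules — a price parser and an invoice totaller that depends on it (both invented for this example):

```python
def parse_price(text):
    """Module A: convert a price string like '$4.50' to whole cents."""
    return round(float(text.strip().lstrip("$")) * 100)

def invoice_total(prices):
    """Module B: total a list of price strings, in cents."""
    return sum(parse_price(p) for p in prices)

# Unit test: check Module A in isolation.
assert parse_price("$4.50") == 450

# Integration test: check the modules working together --
# invoice_total relies on parse_price returning cents, not dollars.
assert invoice_total(["$4.50", "$0.50"]) == 500
```

If Module A were later changed to return dollars, its own unit test could be updated to pass, but the integration test would catch the mismatch when the two modules exchange data.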
User Documentation Once the system is finished and mature, documentation is written. See here for the documentation slideshow See  Onscreen user doc slideshow
After development: formal testing Informal “check as you go” testing occurred during development Formal  testing is an official procedure that proves the system works as it should The objectives specified in the  analysis phase’s logical design are confirmed one by one
Formal testing May also involve  user acceptance   testing  (UAT) The customer, or typical people who will be using the system for real, use it and report their findings.
Formal testing Client  – “Show me that it can produce 10,000 invoices in 5 hours” Developer  – OK.  Watch… [Time passes] Client  – “OK. Now show me the improved readability” Developer  – “Compare this new output with sample output from the old system.” Client   - “Good.  Now show me…” etc
EVALUATION
Evaluation Evaluation is  not  testing! Not trying to prove the system works – that was established during testing! Establishes the success or failure of a project Studies the new system to determine whether it has achieved the goals set out in the logical design.
When to evaluate? Not too soon after implementation – users may still be uncomfortable with it, and may not feel “expert” yet Not too long after implementation – users will have forgotten the old system and be unable to compare them Let users use the new system long enough to be comfortable (e.g. a month of  daily  use, six months of  weekly  use)
How to evaluate Measure  whenever possible – time operations, count errors, add up costs etc Interview or survey  only when opinions are being evaluated – e.g. Is it easy to use?, Do you feel comfortable?, Is it fun? Is the output attractive? Don’t  ask “Do you think the new system is faster/more accurate than the old one?” Cold, hard facts are more reliable.
What  to evaluate Evaluate the criteria laid down during design. These criteria should be based on the specifications set out in the logical design. If the original aim was to make a system  faster  and  more accurate , evaluate its  speed  and  error rate .
Evaluation methods For each criterion to evaluate (e.g. speed) there needs to be a method with which to evaluate it.  E.g. Speed –  Time  how long it takes to produce output Accuracy –  count  the number of errors recorded in the error log Fun to use –  interview  users and ask their opinion
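The “measure, don’t ask” methods can be sketched as a short script, assuming a hypothetical `produce_report` operation and a sample error log (both invented for illustration):

```python
import time

def produce_report(rows):
    """Hypothetical stand-in for the operation whose speed is evaluated."""
    return [f"Invoice {i}" for i in range(rows)]

# Speed criterion -> timing method: measure how long output takes
start = time.perf_counter()
report = produce_report(10_000)
elapsed = time.perf_counter() - start
print(f"Produced {len(report)} invoices in {elapsed:.3f} s")

# Accuracy criterion -> counting method: count errors in the log
error_log = ["bad postcode", "missing surname"]  # sample log entries
print(f"Errors logged: {len(error_log)}")
```

The timing and the error count are hard evidence that can be compared directly against the figures promised in the logical design; only the “fun to use” criterion still needs an interview.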
Remember Evaluation  criteria  are  topics  to study Evaluation  methods  are  actions taken  to study the topic. Each  criterion  has a corresponding  method .
Efficiency criteria Speed Output produced in a given time Amount of labour required Total Cost of Ownership of the system Including initial cost, running costs, repairs, upgrades, consumables, training
Effectiveness criteria Generally:  Quality  of the product. Accuracy / error rate Reliability Attractiveness of output, readability Ease of use
Effectiveness criteria Fun factor Accessibility Portability Security Compatibility with existing equipment Expandability, flexibility Ruggedness Etc etc
By Mark Kelly McKinnon Secondary College vceit.com IT APPLICATIONS SLIDESHOWS These slideshows may be freely used, modified or distributed by teachers and students anywhere on the planet (but not elsewhere). They may NOT be sold.  They must NOT be redistributed if you modify them.

Problem Solving Methodology 2011 - 2014
