
# Problem Solving Methodology 2011 - 2014

Definition of the Problem Solving Methodology (PSM)

Published in: Education

#### 1. VCE IT Theory Slideshows 2011+
By Mark Kelly, McKinnon Secondary College, vceit.com

PSM: Problem Solving Methodology

#### 2. Contents
- The Steps
- Analysis
- Design
- Development
  - Testing
  - Documentation
- Evaluation

#### 3. Causes of information problems
- Here are some examples
- See the PSM Analysis Activities slideshow

#### 4. PSM
- The "Problem Solving Methodology" (PSM) is a model of steps to take when solving a problem.

#### 5. The official 2011 VCAA Glossary's complete PSM steps
- Analysis
- Design
- Development
- Evaluation

#### 6. The missing steps
- Documentation is not a separate step (but it is still done: it's put under the development heading).
- (Informal) testing also occurs during development.

#### 7. MIA
- There is no formal testing phase in the new PSM.
- Implementation is also not mentioned.

#### 8. ANALYSIS

#### 9. Analysis
- Investigates the problem before attempting to solve it.
- Observe the existing system before doing anything else.
- Like a doctor examining you before prescribing pills or surgery.

#### 10. Observing
- Measure the system's performance
- Interview users
- Refer to system logs (e.g. errors, complaints, repairs)
- Examine the system's output

#### 11. First question to ask
- Is there really a problem?
- Sometimes there is no problem: that's the best the system can do.
- E.g. your home wireless network is not reaching the 54 Mbps transfer speed claimed on the box.
- Not a problem: that's only a theoretical maximum which is unlikely ever to be achieved.

#### 12. Don't bother
- You can't solve problems that don't actually exist.
- But you can waste a lot of time and money trying.

#### 13. Second question
- Can it be fixed? (Technical feasibility)
- Some problems cannot be fixed, e.g. a horse's broken leg.
- Pointless to even try.
- Don't waste time and money on a hopeless cause.

#### 14. Third question
- Is it worth fixing? (Economic feasibility)
- Some problems are not worth the necessary time, money and effort.
- You could maybe get a pushbike to do 150 km/h, but surely it's easier to get a motorbike instead?

#### 15. And more questions
- Legal feasibility: can you fix it without breaking any laws?
- Operational feasibility: if you fix it, do you have the staffing, equipment, skill base, money etc. to continue to operate it?

#### 16. Scope of the solution
- What can the solution do?
- What can't the solution do?
- The boundaries or parameters of the solution.
- How will the solution benefit the user?

#### 17. Constraints
- What conditions need to be considered when designing a solution? E.g.
  - cost
  - speed of processing
  - requirements of users
  - legal requirements
  - security
  - compatibility
  - level of expertise
  - capacity
  - availability of equipment

#### 18. If the answers are not all "YES"
- GIVE UP.
- Cancel the project.
- Better to give up now than to proceed and waste far more time and money on a doomed project.

#### 19. Determine solution requirements
- What information does the solution have to provide?
- What data is needed to produce the information?
- What functions does the solution have to provide?

#### 20. Solution requirements
- Put into a logical design.
- Lays down the specifications of the new or modified system.
- Specifies what it should be able to achieve.

#### 21. Logical design
- Like a "wish list" of features.
- Only lists specifications, e.g. "should be able to produce 20,000 invoices in 2 hours with 99.9% accuracy".

#### 22. But
- The logical design does not attempt to say how these objectives will be achieved.
- Don't jump to conclusions about what software, hardware etc. will be needed.
- Just define what you want it to do.

#### 23. These requirements can be
- Functional: what the solution is required to do.
- Non-functional: which attributes the solution should possess, such as
  - user-friendliness
  - reliability
  - portability
  - robustness
  - maintainability

#### 24. Tools
- Logical design tools to assist in determining the solution requirements include:
  - context diagrams
  - data flow diagrams
  - use case diagrams
  - logical data dictionaries (e.g. "What data should be in a sales contract?")
  - hierarchy charts / organisational charts
  - decision trees

#### 25. If all the answers are "YES"...
- All data collected during analysis needs to be documented for later reference.
- Each step in the PSM must be fully and properly finished before the next step begins.

#### 26. Finally
- The client, or boss, gives approval to move on to the next step: design.

#### 27. DESIGN

#### 28. Design...
- How the solution will work.
- What the interfaces and output will look like.

#### 29. Need to design
- Hardware needs
- Software needs
- Training and documentation requirements
- Procedures that need to be created or changed
- Evaluation criteria

#### 30. Tools
- data dictionaries and data structure diagrams
- input-process-output (IPO) charts
- flowcharts
- pseudocode
- object descriptions
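An IPO chart lists a solution's inputs, its processing steps and its outputs before any code is written. As a minimal sketch of how such a chart maps onto a program, here is a hypothetical invoice-total example (the function name, prices and GST rate are illustrative, not from the slideshow):

```python
# Sketch: one row of an IPO chart expressed as code.
# INPUT: a list of item prices and a GST rate (hypothetical values).
# PROCESS: sum the prices, then add the tax.
# OUTPUT: the final invoice total, rounded to cents.

def invoice_total(prices, gst_rate=0.10):
    subtotal = sum(prices)             # PROCESS step 1: sum the inputs
    total = subtotal * (1 + gst_rate)  # PROCESS step 2: apply GST
    return round(total, 2)             # OUTPUT

print(invoice_total([19.95, 5.00, 12.50]))
```

The point of the chart (and the sketch) is that inputs, processing and outputs are identified separately before development begins.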
#### 31. Designing how bits of the solution fit together, e.g.
- Website pages, style sheets, scripts
- Database queries, forms, reports
- Program modules, procedures, functions

#### 32. Useful tools for this
- Website: storyboards, site maps
- Database: entity-relationship diagrams, data flow diagrams, structure charts
- Multipurpose: hierarchy charts, context diagrams, use cases

#### 33. Design
- Consider alternative designs.
- There's more than one way to skin a cat.
- Some designs may be technically great, but unacceptable because of constraints.

#### 34. Design
- Every solution has both pros and cons.
- What's acceptable to a developer might be impossible for the client.
- Selecting a design strategy means achieving maximum "pros" with minimum "cons".
- What counts as a pro or a con varies from client to client.

#### 35. Physical design tools
- These actually plan how to build a system.
- They give instructions on what to do:
  - physical data dictionaries (e.g. "Customer_Surname is string, 25 characters")
  - data flow diagrams...
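A physical data dictionary can be sketched directly as a data structure. This example uses the slide's `Customer_Surname` entry; the second field and the checking function are hypothetical additions for illustration:

```python
# Sketch: a physical data dictionary as a Python dict, and a checker
# that enforces it. Only Customer_Surname comes from the slide; the
# Customer_Age entry is a hypothetical example.

data_dictionary = {
    "Customer_Surname": {"type": str, "max_size": 25},
    "Customer_Age":     {"type": int, "max_size": 3},   # hypothetical
}

def fits_dictionary(field, value):
    """Return True if value matches the field's type and size rules."""
    entry = data_dictionary[field]
    if not isinstance(value, entry["type"]):
        return False
    return len(str(value)) <= entry["max_size"]

print(fits_dictionary("Customer_Surname", "Kelly"))   # True
print(fits_dictionary("Customer_Surname", "X" * 30))  # False: too long
```

Unlike the logical data dictionary ("what data should be in a sales contract?"), the physical version pins down exact types and sizes that developers must implement.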
#### 36. Physical design tools
- storyboards
- flow charts, Nassi-Shneiderman charts
- structure charts
- IPO charts
- layout diagrams / mockups
- pseudocode

#### 37. Designing appearances
- layout diagrams
- annotated diagrams / mockups
- prototypes

#### 38. Designing the evaluation criteria
- What measures will be used to judge whether or not the solution requirements have been met?
- Should relate to the solution requirements identified in the analysis stage's logical design.

#### 39. Design
- Must be documented.
- If successful, leads to the development phase. If not, cancel the project.
- Is used as a strict instruction manual during the following phases, particularly during...

#### 40. DEVELOPMENT

#### 41. Development
- Often involves a coordinated team using project management to monitor their progress and manage their actions and resources.

#### 42. Acquire equipment
- Hardware is bought off-the-shelf or custom built.
- Software is bought off-the-shelf or custom programmed.
- Custom equipment is very expensive and slow to create, but suits the user's needs perfectly.
- Off-the-shelf equipment is far cheaper, easier to get and has more support, but needs to offer all the functionality the user needs.

#### 43. Validation
- Checks the reasonableness of data being input, mainly:
  - range checks
  - existence checks
  - type checks
- Can be done manually or electronically.
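The three validation checks named above can be sketched in a few lines. The field being validated (a test mark between 0 and 100) is a hypothetical example:

```python
# Sketch of the three validation checks: existence, type and range.
# The "mark out of 100" rule is a hypothetical example field.

def existence_check(value):
    """Existence check: a required field must not be blank."""
    return value is not None and str(value).strip() != ""

def type_check(value):
    """Type check: the input must be numeric at all."""
    try:
        float(value)
        return True
    except (TypeError, ValueError):
        return False

def range_check(mark):
    """Range check: a valid mark lies between 0 and 100 inclusive."""
    return 0 <= mark <= 100

mark = "87"
valid = (existence_check(mark)
         and type_check(mark)
         and range_check(float(mark)))
print(valid)  # True
```

Note the order matters: existence and type must pass before the range check can safely convert the input to a number.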
#### 44. INFORMAL TESTING

#### 45. Informal testing
- Software and hardware are assembled.
- Informal testing is carried out at each stage of development:
  - Component testing: ensures each component works as it should in isolation (e.g. a software module, a hardware device).
  - Integration testing: ensures components work properly together, e.g. a memory chip can communicate properly with a CPU; a software module receives the right data from another module.

#### 46. Testing activities
- Deciding:
  - What needs to be tested (e.g. formulas, buttons, media, links, communications)?
  - How will each thing be tested?
  - What test data, if any, will be used?
  - What are the expected results?
- Conducting the tests
- Recording the actual results
- Correcting any identified errors

#### 47. Testing types
- Unit testing: tests system components in isolation to see if they behave properly.
- Integration testing: tests components once they have been integrated with other components.
- Often components are all fine by themselves, but errors arise when they try to exchange data.
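The unit/integration distinction above can be sketched with two tiny, hypothetical software modules: one that totals an invoice line and one that formats it. The first assertion is a unit test (one module alone); the second is an integration test (the modules exchanging data):

```python
# Sketch: unit test vs integration test, using two hypothetical modules.

def line_total(qty, price):
    """Module A: compute the cost of one invoice line."""
    return qty * price

def format_invoice_line(qty, price):
    """Module B: format a line. It depends on module A's output."""
    return f"{qty} @ ${price:.2f} = ${line_total(qty, price):.2f}"

# Unit test: module A in isolation.
assert line_total(3, 2.50) == 7.50

# Integration test: module B receiving data from module A.
assert format_invoice_line(3, 2.50) == "3 @ $2.50 = $7.50"

print("all tests passed")
```

Each module can pass its unit test yet still fail integration, e.g. if the formatter expected the total in cents while the totaller returned dollars.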
#### 48. User documentation
- Once the system is finished and mature, documentation is written.
- See the documentation slideshow.
- See the onscreen user documentation slideshow.

#### 49. After development: formal testing
- Informal "check as you go" testing occurred during development.
- Formal testing is an official procedure that proves the system works as it should.
- The objectives specified in the analysis phase's logical design are confirmed one by one.

#### 50. Formal testing
- May also involve user acceptance testing (UAT).
- The customer, or typical people who will be using the system for real, use it and report their findings.

#### 51. Formal testing
- Client: "Show me that it can produce 10,000 invoices in 5 hours."
- Developer: "OK. Watch..."
- [Time passes]
- Client: "OK. Now show me the improved readability."
- Developer: "Compare this new output with sample output from the old system."
- Client: "Good. Now show me..."
- etc.

#### 52. EVALUATION

#### 53. Evaluation
- Evaluation is not testing!
- Not trying to prove the system works: that was established during testing.
- Establishes the success or failure of a project.
- Studies the new system to determine whether it has achieved the ambitions set out in the logical design.

#### 54. When to evaluate?
- Not too soon after implementation: users may still be uncomfortable with it, and may not feel "expert" yet.
- Not too long after implementation: users will have forgotten the old system and be unable to compare the two.
- Let users use the new system long enough to be comfortable (e.g. a month of daily use, six months of weekly use).

#### 55. How to evaluate
- Measure whenever possible: time operations, count errors, add up costs etc.
- Interview or survey only when opinions are being evaluated, e.g. Is it easy to use? Do you feel comfortable? Is it fun? Is the output attractive?
- Don't ask "Do you think the new system is faster/more accurate than the old one?"
- Cold, hard facts are more reliable.

#### 56. What to evaluate
- Evaluate the criteria laid down during design.
- These criteria should be based on the specifications set out in the logical design.
- If the original aim was to make a system faster and more accurate, evaluate its speed and error rate.

#### 57. Evaluation methods
- For each criterion to evaluate (e.g. speed) there needs to be a method with which to evaluate it. E.g.
  - Speed: time how long it takes to produce output.
  - Accuracy: count the number of errors recorded in the error log.
  - Fun to use: interview users and ask their opinion.
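The "measure whenever possible" advice can be sketched directly: time an operation for the speed criterion, and count log entries for the accuracy criterion. The operation and the log lines below are hypothetical sample data:

```python
# Sketch of two measurable evaluation methods: timing an operation
# (speed) and counting logged errors (accuracy). The workload and
# log entries are hypothetical.

import time

def produce_output(n):
    """Stand-in for the operation whose speed is being evaluated."""
    return [i * 2 for i in range(n)]

# Speed: time how long it takes to produce output.
start = time.perf_counter()
produce_output(100_000)
elapsed = time.perf_counter() - start
print(f"speed: output produced in {elapsed:.3f} s")

# Accuracy: count the number of errors recorded in the error log.
error_log = [
    "2014-03-01 OK    invoice 1001",
    "2014-03-01 ERROR invoice 1002 rejected",
    "2014-03-02 OK    invoice 1003",
]
errors = sum(1 for line in error_log if "ERROR" in line)
print(f"accuracy: {errors} error(s) logged")
```

Both methods yield the "cold, hard facts" the previous slide prefers over opinion surveys.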
#### 58. Remember
- Evaluation criteria are topics to study.
- Evaluation methods are actions taken to study the topic.
- Each criterion has a corresponding method.

#### 59. Efficiency criteria
- Speed
- Output produced in a given time
- Amount of labour required
- Total cost of ownership (TCO) of the system
  - including initial cost, running costs, repairs, upgrades, consumables, training
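Total cost of ownership is just the sum of the cost categories listed above. As a sketch with entirely hypothetical dollar figures:

```python
# Sketch: total cost of ownership as the sum of the cost categories
# named on the slide. All amounts are hypothetical.

costs = {
    "initial cost":          12_000,
    "running costs (3 yr)":   3_600,
    "repairs":                  800,
    "upgrades":               1_500,
    "consumables (3 yr)":     2_400,
    "training":               1_000,
}

tco = sum(costs.values())
print(f"Total cost of ownership: ${tco:,}")  # Total cost of ownership: $21,300
```

The useful point for evaluation is that the initial purchase price is often a minority of the TCO once ongoing costs are counted.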
#### 60. Effectiveness criteria
- Generally: quality of the product.
- Accuracy / error rate
- Reliability
- Attractiveness of output, readability
- Ease of use

#### 61. Effectiveness criteria
- Fun factor
- Accessibility
- Portability
- Security
- Compatibility with existing equipment
- Expandability, flexibility
- Ruggedness
- Etc.

#### 62. IT APPLICATIONS SLIDESHOWS
By Mark Kelly, McKinnon Secondary College, vceit.com

These slideshows may be freely used, modified or distributed by teachers and students anywhere on the planet (but not elsewhere). They may NOT be sold. They must NOT be redistributed if you modify them.