
Earth Viewing Infrared Space System Sensor



Earth Viewing Systems Satellite Sensor Project, for Professor DiNardo's course. The presentation was given on May 14, 2009.


I realize that some of the graphics do not have their sources cited. I did not make those slides, and the group members who made them did not remember their sources. Please forgive this oversight: I consider it important to students of the earth surveillance class at The City College of New York (and elsewhere) that old presentations be available to them.

If, however, you can identify the sources of any graphics you see here, I will be grateful and happy to cite them.



1. Earth Viewing Infrared Space System Sensor
By Earth Viewing Systems (EVS)
2. Our Team
Figure 1 – Company Organization. Earth Viewing System:
- Program Manager: Nafiseh Pishbin
- Chief Architect: Mohammed Faissal Halim
- Chief Scientist: Ibrahim Siddo
- Sub-Contract Manager: Lina Cordero
- Program Scheduler: Erika Garofalo
3. Tasks
- Program Manager: assigns tasks and supervises the progress of the program.
- Chief Architect: manages the strategic objectives and current content of our scientific programs.
- Chief Scientist: provides the design of the subsystems and improves the technical aspects of the design.
- Sub-Contract Manager: communicates with other vendors.
- Program Scheduler: advises management on budget and timeline.
4. Summary
This is our response to the government of West Bali's request for an EVIRS, which will assess natural and man-made events that may require immediate action, over the region:
- From 100° West to 160° East longitude
- From 70° North to 0° latitude
Figure 2 – Area of Interest
5. Mission Concept
Figure 3 – Mission Concept. Elements: Mission Operations; Ground Element; Launch Element; Space Element (Spacecraft Bus, Payload); Orbit and Constellation; Subject; Command, Control and Communications Architecture.
6. Mission Requirements for EVIRS

Table 1 – Mission Requirements

| Target | Radiance / Temp | Emissivity | Spectral Band (µm) | Length × Width (m) | Altitude | Time Duration | Mean Bkgd | Bkgd Std Dev | Cloud Cover | KTR Pd/Pcc (%) | OTR Pd/Pcc (%) | Pfa/Picc (%) |
| Volcano | 800 K | 0.8 | -- | 400×400 | Ground | 2 hours | 400 K | N/C | CFLOS | 90 | 95 | 1 |
| Volcanic Lava Flow | 400 K | 0.8 | -- | 2000×2000 | Ground | Continuous | 300 K | N/C | CFLOS | 95 | 97 | 1 |
| Fire | 500 K | 0.6 | -- | 10 acres | Ground | 10 min (flare-up) | 300 K | N/C | CFLOS | 90 | 95 | 0.1 |
| Aircraft | 1000 W/sr | -- | 2.7–3.0 | Point source | 10 km | 5 min | 1 µFlick | 0.0 µFlick | CFLOS | 70 | 90 | 0.1 |
| Oil Spillage | 60 °F | Variable: 0.5 to unity from 2 to 5 µm | -- | 2000×2000 | Sea level | Continuous | 60 °F | N/C | CFLOS | 90 | 99 | 0.01 |
| Rocket | 50,000 W/sr | -- | 2.7–3.0 | Point source | Ground and 10 km | 10 s | 1 µFlick | 1 µFlick | CBLOS (10 km) | 90 | 95 | 0.01 |
| Missile | 500,000 W/sr | -- | 2.7–3.0 | Point source | See profile | 100 s | 1 µFlick | 1 µFlick | CBLOS (10 km) | 97 | 98 | 0.001 |
| High Energy Laser | 10 W, 10 µrad beam width | -- | 2.971365 | Equiv. point source | Ground | 100 ms | 1 µFlick | 0.0 µFlick | CFLOS | 90 | 95 | 1 |
| Oil Rig Burn Off | 1000 K | See oil table | 2.7–3.0, 4.2–4.5 | Equiv. point source | Ground | 1 hour | 300 K | 0.0 µFlick | CFLOS | 90 | 99 | 1 |
| Industrial Facilities | 305 K | 0.75 | -- | 100×100 | Ground | 24 hours | 287 K | N/C | CFLOS | 85 | 95 | 10 |
| Special Event | Unknown | Unknown | -- | 50×50 | Ground | 1 s | 1 µFlick | 0.0 µFlick | CFLOS | 90 | 100 | 0.1 |

(Pd/Pcc = probability of detection or correct classification; Pfa/Picc = probability of false detection or incorrect classification.) All requirements shall be met simultaneously over the required surveillance area as defined in paragraph 1.0 of this RFP. As an OTR only, probability of stereo coverage for any event shall be 0.90.
7. Elevator Viewgraph
Figure 4 – Elevator Viewgraph
- Full surveillance coverage
- Mission control station
- Timely communication to the users, 24-7-365
- Meets all target detection requirements
- Thorough test program
Targets meeting sensor performance requirements: Volcano ✓, Volcanic Lava Flow ✓, Forest Fire ✓, Aircraft ✓, Oil Spillage ✓, Rocket ✓, Missile ✓, High Energy Laser ✓, Oil Rig Burn Off ✓, Industrial Facilities ✓, Special Events ✓.
8. Targets
Industrial Facilities, Missile, High Energy Laser, Volcanic Lava Flow, Aircraft, Volcano, Forest Fire, Rocket, Oil Rig Burn Off, Oil Spillage.
9. House of Quality

Table 2 – House of Quality

| Item | Weight | Meets | Quality |
| Cost | 6 | 0.95 | 5.7 |
| Dimensions | 7 | 0.85 | 5.95 |
| Weight | 9 | 0.9 | 8.1 |
| Software | 8 | 0.8 | 6.4 |
| Hardware | 8 | 0.8 | 6.4 |
| Staffing | 4 | 0.9 | 3.6 |
| Management Controls | 8 | 0.9 | 7.2 |
| Risk Management | 9 | 0.78 | 7.02 |
| Reviews | 3 | 0.65 | 1.95 |
| Technical Requirements | 8 | 0.9 | 7.2 |
| Timeline | 5 | 0.75 | 3.75 |
| Power | 9 | 0.9 | 8.1 |
| Expertise of Labor | 8 | 0.95 | 7.6 |
| Testing | 9 | 0.9 | 8.1 |
| Target Detection | 9 | 0.8 | 7.2 |
| TOTAL | 110 | | 94.27/110 ≈ 85% |
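The bottom-line figure in Table 2 is just a weighted average. A quick sketch reproducing it, assuming (as the numbers suggest) that each Quality entry is Weight × Meets and the total is divided by the summed weights:

```python
# Re-computation of Table 2: each row's Quality is Weight * Meets,
# and the overall figure is the summed quality over the total weight.
rows = {
    "Cost": (6, 0.95), "Dimensions": (7, 0.85), "Weight": (9, 0.90),
    "Software": (8, 0.80), "Hardware": (8, 0.80), "Staffing": (4, 0.90),
    "Management Controls": (8, 0.90), "Risk Management": (9, 0.78),
    "Reviews": (3, 0.65), "Technical Requirements": (8, 0.90),
    "Timeline": (5, 0.75), "Power": (9, 0.90), "Expertise of Labor": (8, 0.95),
    "Testing": (9, 0.90), "Target Detection": (9, 0.80),
}
total_weight = sum(w for w, _ in rows.values())       # 110
total_quality = sum(w * m for w, m in rows.values())  # 94.27
score = total_quality / total_weight                  # ~0.857, reported as 85%
```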
10. Organizational Planning
Figure 5 – Organizational Planning of System Design. Flow: Preliminary Requirements → Mission Objectives → Mission CONOPS → Mission Architecture → Trade Study → Evaluate Cost / Performance / Effectiveness (No: iterate; Yes: Document Baseline for Further Study → Proposal).
11. Organizational Planning
Figure 6 – Organizational Planning of the System (V-Model)
12. System Architecture
Figure 7 – Flow-down Graph. Detection; Power: Solar Panel; Processing Electronics; Cooling System; Optics & Detector: Telescope (Beam Splitter, Reflector, Refractor), FPA and nTDI, FOV, PSF.
13. Geostationary Orbit
Definition: a geosynchronous orbit that is fixed with respect to a position on the Earth.
Figure 8 – Geostationary Orbit
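The geostationary altitude used throughout this design (about 35,786 km) follows from Kepler's third law with an orbital period of one sidereal day; a short sketch of the calculation:

```python
import math

# Kepler's third law: r^3 = mu * T^2 / (4 pi^2), with T = one sidereal day
MU_EARTH = 3.986004418e14   # Earth's gravitational parameter, m^3/s^2
T_SIDEREAL = 86164.0905     # sidereal day, s
R_EARTH_KM = 6378.137       # Earth equatorial radius, km

r_m = (MU_EARTH * T_SIDEREAL**2 / (4 * math.pi**2)) ** (1.0 / 3.0)
altitude_km = r_m / 1e3 - R_EARTH_KM   # ~35786 km above the equator
```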
14. Technical Plan
15. Viewing Area
Figure 9 – Viewing Area (9 degrees × 15 degrees)
16. Bore-sight with 16° FOV
Figure 10 – Bore-sight with 16° FOV
17. Bore-sight with 16° FOV
- Low resolution for any given detector pitch: ground footprint = 9.8 km × 9.8 km
- Telescope optics difficult to engineer
- Low availability of wide-FOV telescopes → few vendors → higher risk
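The 9.8 km figure is consistent with spreading the 16° FOV across a 1024-pixel detector row and projecting the resulting IFOV from geostationary altitude. The 1024 count is an assumption here, but it matches the 1024×1024 starer FPA in Table 4 and also reproduces the starer's quoted 6.1 km footprint:

```python
import math

GEO_ALT_KM = 35786.0  # geostationary altitude above the equator

def nadir_footprint_km(fov_deg, n_pixels, alt_km=GEO_ALT_KM):
    """Small-angle nadir footprint of one pixel: (FOV / N) * altitude."""
    return math.radians(fov_deg) / n_pixels * alt_km

wide_km = nadir_footprint_km(16.0, 1024)    # ~9.8 km, as quoted on this slide
starer_km = nadir_footprint_km(10.0, 1024)  # ~6.1 km, matching the starer slide
```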
18. Telescope: Scanner
Figure 11 – Scanner Three-Mirror Anastigmat; Figure 12 – Scanner 2D Design
Characteristics:
- FOV 4.5°
- 28 steps to scan the entire area of interest
- 14 cm aperture
  - Monitors low-intensity, long-duration events
  - Monitors high-intensity, long-term events, for close support of operations
19. Scanning System
- Multispectral: 2–3 µm, 3–4 µm, 4–5 µm
- 4.5° FOV
- 512-detector array, 12 TDI, 15 µm pitch
- Nominal ground footprint = 2.4 km × 2.4 km
- Revisit time = 2 minutes
- Provides spectral data for brightness-temperature calculations → finds dim targets
- BW = 96 MBps
20. Telescope: Scanner
Figure 13 – Scanner Signature
21. Focal Plane

Table 3 – Scanner Technical Specifications
- Array Size: 12×512
- Detector Size (Pixel Pitch): 15 µm
- Mux Part Number: N/A
- Spectral Band: 2–5 µm
- Detector Cut-off: >5 µm
- Operating Temperature: 90 to 150 K
- Nominal Irradiance (measured on FP): — (cm⁻² s⁻¹)
- NEI or NEDT at nominal background: — (cm⁻² s⁻¹)
- Max Irradiance: — (cm⁻² s⁻¹)
- Min Irradiance: — (cm⁻² s⁻¹)
- Raw Charge Capacity: N/A (Me)
- Integration Time: 100 to 20000 µs
- Integration Mode: —
- Operational Mode: —
- Frame Rate: 1 to 50 Hz
- Frame Time: — (ms)
- Number of Signal Output Channels: N/A
- Max Data Rate per Video Tap: N/A (MHz)
- Cross Talk: 1 %
- Power Dissipation: N/A (mW)
22. Telescope: Starer
Figure 14 – Starers; Figure 15 – Starer 2D Design (8.96 degrees × 8.96 degrees)
Characteristics:
- 5 stares to view the entire area of interest
- FOV 10° each
  - Detects high-intensity, short-duration events
  - Detects low-intensity, long-duration events
23. Telescope: Starer
- ZnS lens
- 10° FOV → built-in tolerance
- Nominal ground footprint = 6.1 km × 6.1 km
- 3-step oversampling resolution = 2.05 km × 2.05 km
- Pointing accuracy = 50 µrad
- Revisit time = 4.7 minutes; BW = 66 MBps
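Two of these numbers can be sanity-checked with one-line arithmetic: the oversampled resolution is roughly the nominal footprint divided by the 3-step dither (an assumed simplification of the oversampling scheme), and the 50 µrad pointing accuracy maps to about 1.8 km on the ground at geostationary range:

```python
GEO_ALT_KM = 35786.0  # geostationary altitude

footprint_km = 6.1
oversampled_km = footprint_km / 3        # 3-step oversampling: ~2.03 km (quoted 2.05 km)
pointing_err_km = 50e-6 * GEO_ALT_KM     # 50 urad at GEO: ~1.79 km ground error
```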
24. Telescope: Starer
Figure 16 – Starer Signature
25. Focal Plane

Table 4 – Starer Technical Specifications
- Array Size: 1024×1024
- Detector Size (Pixel Pitch): 18 µm
- Mux Part Number: N/A
- Spectral Band: 2–5 µm
- Detector Cut-off: >5 µm
- Operating Temperature: 90 to 150 K
- Nominal Irradiance (measured on FP): — (cm⁻² s⁻¹)
- NEI or NEDT at nominal background: — (cm⁻² s⁻¹)
- Max Irradiance: — (cm⁻² s⁻¹)
- Min Irradiance: — (cm⁻² s⁻¹)
- Raw Charge Capacity: N/A (Me)
- Integration Time: 100 to 20000 µs
- Integration Mode: —
- Operational Mode: —
- Frame Rate: 1 to 50 Hz
- Frame Time: — (ms)
- Number of Signal Output Channels: N/A
- Max Data Rate per Video Tap: N/A (MHz)
- Cross Talk: 1 %
- Power Dissipation: N/A (mW)
26. Processor
SIDECAR™ ASIC (Figure 17 – SIDECAR™ ASIC):
- Designed to manage all aspects of imaging-array operation and output digitization.
- Up to 10 MHz A/D conversion with 12-bit resolution per channel.
- Powerful, low-cost, highly flexible, small-footprint, low-power solution.
27. Cooler
Passive cooler:
- Carbon-carbon radiator with aluminum honeycomb filling.
- Used by NASA and LA-II.
- Designed by K-Technology Corp.
- Average cooling temperature is 23°.
Figure 18 – Panel Internal Surface; Figure 19 – Panel External Surface
28. Communication Systems
Figure 20 – Communication Systems
- Compress and Communications
- First Level Threshold Algorithm
- Target Detection and Report Generation
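The slide does not define the "First Level Threshold Algorithm"; a minimal sketch of one common approach is an exceedance test against the background statistics. The k value, frame layout, and function name below are illustrative assumptions, not the actual flight algorithm:

```python
import statistics

def first_level_threshold(frame, k=5.0):
    """Return (row, col) of pixels exceeding background mean + k * sigma."""
    pixels = [v for row in frame for v in row]
    mu = statistics.fmean(pixels)
    sigma = statistics.pstdev(pixels)
    return [(r, c) for r, row in enumerate(frame)
                   for c, v in enumerate(row) if v > mu + k * sigma]

# A flat background with one hot pixel: only the hot pixel is flagged.
frame = [[1.0] * 8 for _ in range(8)]
frame[3][4] = 100.0
hits = first_level_threshold(frame)
```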
29. Other Peripherals
MODEL-25 PAN & TILT GIMBAL (Figure 21 – Gimbal System):
- Provides closed-loop control
- Pointing accuracy ~25 µrad
- Higher payload capacity
- Power = 125 W; Weight = 54 lbs
- Height = 22.3"; Elevation width = 11.5"; Base diameter = 16.0"
30. Confusion Matrix
The four quadrants are: target detection, false alarm, missing a target, and background.
TARGET: Volcano
| Sensor \ Reality | Target | No target |
| Target | 100 | 4 |
| No target | 0 | 96 |
31. Confusion Matrix
Volcano:
| Sensor \ Reality | Target | No target |
| Target | 100 | 4 |
| No target | 0 | 96 |
Forest Fire:
| Sensor \ Reality | Target | No target |
| Target | 90 | 30 |
| No target | 10 | 70 |
High Energy Laser:
| Sensor \ Reality | Target | No target |
| Target | 100 | 0 |
| No target | 0 | 100 |
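Reading each column as 100 ground-truth trials (a normalization assumption consistent with the counts shown), these matrices convert directly into detection and false-alarm probabilities:

```python
def detection_stats(tp, fp, fn, tn):
    """Pd = detections among real targets; Pfa = alarms among real non-targets."""
    pd = tp / (tp + fn)
    pfa = fp / (fp + tn)
    return pd, pfa

# Matrices above: sensor rows, reality columns
volcano_pd, volcano_pfa = detection_stats(tp=100, fp=4, fn=0, tn=96)  # 1.00, 0.04
fire_pd, fire_pfa = detection_stats(tp=90, fp=30, fn=10, tn=70)       # 0.90, 0.30
```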
32. Management Plan
33. Kick-Off Meeting
Our kick-off meeting is scheduled for June 1, 2009. We will:
- Review the completeness and correctness of the requirements.
- Establish and verify a program plan.

34. System Requirements Review Meeting
Our SRR meeting is scheduled for Dec. 7, 2009. We will:
- Identify our subcontractors.
- Verify our proposed sensor design.
- Strategize for technical risk.
- Address any deviation from the proposed sensor.

35. Program Management Review Meeting
Our PMR meeting is scheduled for Mar. 1, 2010. We will:
- Present our program management plan.
- Report the organizational structure.
- Define cost and schedule control.
- Establish a risk management system.

36. Preliminary Design Review Meeting
Our PDR meeting is scheduled for July 5, 2010. We will:
- Show a complete flow-down structure.
- Present subsystem requirements.
- Display evidence of hardware/software maturity.
- Review the action plan.

37. In-Process Review Meeting
Our IPR meeting is scheduled for Aug. 2, 2010. We will present:
- Our program progress.
- Status of the design.
- Status of subcontractor performance.
- Current actual cost and cost-growth issues.
- Deviations from the proposed design.

38. Critical Design Review Meeting
Our CDR meeting is scheduled for July 4, 2011. We will present:
- Final review of sensor performance.
- Full analysis of detection and false alarms.
- Complete subsystem specifications.
- Integration plan and facility.
- Test plan and facility.
- Sell-off strategies.

39. Test Readiness Review Meeting
Our TRR meeting is scheduled for Sep. 5, 2011. We will present:
- Our plan to conduct the sensor test.
- Test data analysis, control, and procedure.
- Test facility and hardware readiness.
- Test data quality review.
- Quality control plan.

40. Consent to Ship Review Meeting
Our CTSR meeting is scheduled for July 2, 2012. We will present:
- Completed tests.
- Functional performance data.
- Deficiencies in test data collection and a plan to resolve them.

41. Requirements Verification Review Meeting
Our RVR meeting is scheduled for Dec. 17, 2012. We will:
- Present sensor success.
- Provide written documentation of the EVIRS.
- Provide a formal signature sheet of delivery.
43. Risk Management
Risk sources:
- Financial markets
- Project failures
- Legal liabilities
- Credit risk
- Accidents
- Natural causes and disasters
Risk treatment:
- Avoidance (eliminate)
- Reduction (mitigate)
- Transfer (outsource or insure)
- Retention (accept and budget)
44. Vendors – Telescope
Corning will provide the mirrors. L3 Communications SSG – Tinsley will polish the mirrors for our telescopes.
45. Vendors – Detectors
Sensors for the focal plane array (2–5 µm, IR).
Figure 22 – IR Sensors
46. Vendors – Software
- Ray-tracing software for modeling the telescope.
- Our team has experience designing optical systems using this software.
Figure 22 – Software Capabilities Sample; Figure 23 – Software Capabilities Sample
47. Vendors
Vendor diagram: telescope (cost and design information, space programs); cooling system; sensors (space programs); gimbal system (space programs); processor.
48. Budget Breakdown

Table 5 – Budget Breakdown

| Category | Item | Quantity | Price ($) |
| FPA Subsystem | Starer | 1 | 3.7M |
| | Scanner | 1 | 4.0M |
| Telescope Subsystem | Starer | 5 | 5.0M |
| | Scanner | 1 | 1.0M |
| Communication Systems | Compress and Communications | 1 | 3.0M |
| | First Level Threshold Algorithm | 1 | 12.0M |
| | Target Detection and Report Generation | 1 | 20.0M |
| Signal Processor | SIDECAR™ ASIC | 1 | 0.052M |
| Gimbaled Subsystem | MODEL-25 PAN & TILT GIMBAL | 6 | 36.0M |
| Thermal Control | Carbon-Carbon Radiator (Passive Cooler) | 2 | 4.0M |
| Human Resources | Program Management Office | 6 | 1.5M |
| | System, Integration and Test Engineering | 15 | 3.0M |
| | Quality Engineering | 4 | 1.0M |
| | Documentation and Specification Config. Manag. | 10 | 2.0M |
| | Software Engineering | 300K SLOC | 60.0M |
| TOTAL | | | ~160.0M |
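Summing the Price column (treating each entry as a line total rather than a unit price, which is an assumption) gives about $156.3M, consistent with the quoted ~$160M total:

```python
# Line totals from Table 5, in $M (assumes Price is already the line total,
# not a unit price times Quantity).
items = {
    "FPA Starer": 3.7, "FPA Scanner": 4.0,
    "Telescope Starer (x5)": 5.0, "Telescope Scanner": 1.0,
    "Compress and Communications": 3.0, "First Level Threshold Algorithm": 12.0,
    "Target Detection and Report Generation": 20.0, "SIDECAR ASIC": 0.052,
    "Gimbals (x6)": 36.0, "Thermal Control (x2)": 4.0,
    "Program Management Office": 1.5, "System, Integration and Test": 3.0,
    "Quality Engineering": 1.0, "Documentation and Config. Manag.": 2.0,
    "Software Engineering (300K SLOC)": 60.0,
}
total = sum(items.values())   # ~156.3, quoted as ~160.0M
```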
49. Additional Specifications

Table 6 – Additional Specifications

| Item | Weight | Power | Length | Width | Height |
| Gimbal System | 180.0 lb | 750 W | 48.0" | 34.5" | 56.0" |
| Cooling System | 11.0 lb | 0 | 0.022" | 28.25" | 57.24" |
| FPA Scanner | N/A | 11 mW | 0 | 0.0071" | 0.3024" |
| FPA Starer | N/A | 11 mW | 0 | 0.72" | 0.72" |
| Processing System | 16.0 lb | 300 mW | 17.5" | 5.3" | 13.4" |
| Telescope Starer | 354.2 lb | N/A | 78.0" | 45.0" | 47.76" |
| Telescope Scanner | 190.0 lb | N/A | 5.886" | 3.0" | 3.071" |
| TOTAL | 751.2 lb < 1000 lb | 750.322 W < 1000 W | 92" < 96" | 48" < 48" | 57.24" < 72" |
50. Thanks & Acknowledgements
- Professor Anthony DiNardo
- Dr. Leona Charles
- David Santoro
- Professor Fred Moshary
- Professor Barry Gross
- Dr. Yaujen Wang
- Northrop Grumman Corp.
51. References
- Northrop Grumman Corporation
- Santa Barbara Infrared, Inc.
- Teledyne
- L3 Communications – SSG Tinsley
- Mother Planet
- Optenso
- National Aeronautics and Space Administration
52. References (continued)
- National Oceanic and Atmospheric Administration
- Defense Acquisition Guidebook
- Department of Defense, "Systems Engineering Fundamentals"
- The National Academies Press
- Aerospace
- Larson, W. and Wertz, J., Space Mission Analysis and Design, 3rd ed., Space Technology Library.
53. Follow-Up Technical Errata 1
Professor DiNardo's responses were in red in the original slides; they are marked "Response:" here.
- Q: Where do you get the 4096×4096 FPAs for the starer (we could not find one on the Teledyne website)? Teledyne only had a 1024×1024 FPA system for our wavelength interval (2–5 µm). Considering that they did not even have a 2048×2048 system, we decided it would be too risky to ask for a custom-made 4096×4096 system; asking for such a system would be like asking SSG to make an F# = 1 IR telescope for us. So, who makes these mythical 4096×4096 FPA systems in the 2–5 µm range? In how many years from now can a team use this device for their system and still call it realistic?
  Response: Yes – Teledyne is working on 4k by 4k.
- Q: How do we calculate the satellite's reliability and availability? During my conversations after the competition was closed, it sounded like we were only in a position to calculate the reliability of our system, but not its availability, because the flywheel and spacecraft bus system, as I understood the RFP, were beyond the scope of its requirements.
  Response: This has nothing to do with the satellite; neither you nor anyone showed that the sensors would need time for calibration when on-orbit, which takes away from availability. No one suggested redundant electronics to improve reliability.
54. Follow-Up Technical Errata 2
- Q: Does anyone use the oversampling system that Ibrahim and I invented? [Well, we invented it, then I coded it in MATLAB while Ibrahim looked for similar work; after that we found out it had been patented in 1994, and we took the name from that invention.]
  Response: No – you didn't get it right. You cannot have both spectral diversity and coverage without accounting for time. Dr. Crouse has a technology for tine filters, but other than that your design exposure was really flawed.
- Q: Dr. Hugus made it sound like the algorithm for locating a source that showed up during the step-stare would be rather difficult (though I did not see a problem there, since all processing would be done on the ground), so I am wondering if my oversampling system failed to win any popularity contests in the real world.
  Response: Your step-stare of 50 millisec for four areas was judged not doable.
55. Follow-Up Technical Errata 3
- Q: What is the wavelength-selector/color-filter scheme commonly used on satellites? When Leona mentioned that we could simply put color filters on our detectors (otherwise, I might have just used a hyperspectral system), I thought that either the filters could be placed in a repeating [2–3 µm, 3–4 µm, 4–5 µm] pattern, or they could be placed so that the first 1/3 of the TDI detectors would detect 2–3 µm, the next 1/3 3–4 µm, and the last 1/3 4–5 µm. Talking to the evaluation team, however, I got the impression that the latter approach is very "iffy" in its fabricability, while the former is not fabricable at all.
  Response: You don't even have to ask this question... the filtering should be an optical component. Since you've done so much optical modeling, put the filter anywhere in the optical path and see what its ramification is.
56. Follow-Up Technical Errata 4
- Q: Now, I would really appreciate this one: do you know a good book for calculating the F# of a compound reflector/refractor combination system? The reason Ibrahim and I decided to use a modification of an existing file in the OpTaliX library is that we could not come up with the F# of anything we designed, and we were not very good at coming up with the field-flattening optics. My ideal system would have been one that did not just flatten the field, but bent it the other way, so as to compensate for the Earth's curving away from the sensor, starting from a zenith angle of zero degrees; of course, a very flat field might require too large an F#.
  Response: Jenkins and White, or any optical design book... F# is just the atan of the angle of the rays on the focal plane.
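The professor's closing remark can be stated precisely: in the paraxial picture, the working F-number and the half-angle of the ray cone converging on the focal plane are related by the following standard identity (not from the slides):

```latex
% Paraxial relation between F-number and the marginal-ray half-angle
% \theta at the focal plane (aperture D, focal length f):
\tan\theta = \frac{D}{2f},
\qquad
F/\# = \frac{f}{D} = \frac{1}{2\tan\theta}
\quad\Longrightarrow\quad
\theta = \arctan\!\left(\frac{1}{2\,(F/\#)}\right)
```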
57. Follow-Up: Architect's Personal Notes 1
- Start with a complete radiometric model (including noise figures) for the sources that you are trying to detect; use an aperture size in the right ballpark to get the model started:
  - Do not wait to have a designable and fabricable telescope before you build this model. You might have to divide your telescope's collecting aperture area among multiple apertures, but you will need a complete radiometric model to:
    - Decide how much flux-collecting aperture area you will need.
    - Decide the target signature characteristics that you will use to pick out targets.
- ALWAYS make presentation/PowerPoint slides of any idea that you settle on using, as soon as you decide to use it, fully detailing the idea. Pretend that you have to justify the idea to yourself, if necessary. It can be very difficult to make slides at a later time, and you may end up skimming details that your audience needs to understand your idea but which you thought were trivial. In our case, we had to explain quite a few key ideas by hand, with a lot of verbiage.
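A minimal version of the radiometric model recommended above, using the volcano requirement from Table 1 (800 K, emissivity 0.8, 400 m × 400 m) viewed by the 14 cm scanner aperture from geostationary range. The small-source approximation and the numerical integration scheme are assumptions of this sketch, and noise terms are omitted:

```python
import math

# Physical constants (SI)
H, C, K = 6.626e-34, 2.998e8, 1.381e-23

def planck_radiance(lam_m, temp_k):
    """Spectral radiance of a blackbody, W / (m^2 sr m)."""
    return (2 * H * C**2 / lam_m**5) / (math.exp(H * C / (lam_m * K * temp_k)) - 1)

def band_radiance(temp_k, lam1_m, lam2_m, n=500):
    """In-band radiance, W / (m^2 sr), by midpoint-rule integration."""
    dlam = (lam2_m - lam1_m) / n
    return sum(planck_radiance(lam1_m + (i + 0.5) * dlam, temp_k)
               for i in range(n)) * dlam

# Volcano requirement from Table 1: 800 K, emissivity 0.8, 400 m x 400 m
L_band = 0.8 * band_radiance(800.0, 2e-6, 5e-6)   # in-band radiance, W/(m^2 sr)
R = 35786e3                                       # geostationary range at nadir, m
E_ap = L_band * (400.0 * 400.0) / R**2            # irradiance at the sensor, W/m^2
P_collected = E_ap * math.pi * (0.14 / 2)**2      # power into the 14 cm aperture, W
```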
58. Follow-Up: Architect's Personal Notes 2
- Convey all your requirements to management ASAP. They may not like your changing decisions every time you find new information as you go along, but:
  - You are not doing them a favor by waiting to finalize your decision on a component before you tell them anything.
  - You will do them a favor by keeping them in the loop, even to their chagrin, because they are the support team that:
    - Informs you of the project timelines that you helped formulate, and can help you shift your focus to matters of more immediate concern.
    - Keeps you up to speed on which outside resources can and cannot be acquired, and how long these resources (parts, consultations, etc.) will take.
    - Acts as your control team: a well-meaning engineer can try to introduce ideas that are something like a KISS OF DEATH (one that may look good initially but turns out to be bad in the end). Listen to your management when they tell you to just stick to the requirements:
      - Think of this as work first, play later: work is accomplished when you fulfill your project requirements; play is what you do to squeeze in sweet ideas of your own.
    - Further, your management team is really doing you a favor if they ask questions about your approach: that way, you keep things in perspective and do not go off track.
59. Follow-Up: Architect's Personal Notes 3
- A very important piece of advice, modified from Professor DiNardo: do not throw in a new feature at the last minute. It is a surprise, and in a proposal a surprise may not be a good thing:
  - A last-minute addition may not be very well thought out, and what you thought was doable may, in fact, not be.
  - Also note: that last minute arrives a lot sooner than you think!
- Perhaps the best way to avoid this last-minute syndrome (hey, give me some license here) is to officially close the project's Blue Sky Phase (to borrow from Robert Colwell, author of The Pentium Chronicles and chief architect of the Intel Pentium processor) at some suitably early stage, so that every idea pursued afterward goes through a proper evaluation. Structure the project so that you actually have a Blue Sky Phase, and therefore a phase that you can close.
60. Follow-Up: Architect's Personal Notes 4
- There is nothing better than a little fight in the team (bear with me: I know this sounds a little counterproductive):
  - Whenever you have a team of highly motivated people in a risky but rewarding situation, you have to expect a few fights; not necessarily because of people's egos, but because by their very nature they want to do well, and each comes to the table with ideas that they think will benefit the team.
    - The technical components will improve as you debate with your teammates the various possible approaches that you could take in development.
    - The way management works can also improve if there is dissent from those who have learned other approaches. At the least, the team will get a chance to evaluate its methods, which could lead to the adoption of new ones.
    - The interactions between the technical and managerial parts of an organization may also improve if there are debates on how things could be done better.
  - Your teammates may think that they are being civilized by not voicing disapproval, but they are only holding back any good ideas that they may have; and let's face it: without dissent, you are running a dictatorship, and few people I know want to be that lonely.
- Sincerely, Faissal