Software blueprint

A software blueprint is the final product of a software blueprinting process. Its name derives from the analogy drawn with the popular use of the term blueprint in the traditional construction industry. A true software blueprint should therefore share a number of key properties with its building-blueprint counterpart.

Properties common to blueprints

Step-by-step procedure from blueprint to finished article

Software blueprinting processes advocate confining inspirational activity (problem solving) as far as possible to the early stages of a project, in the same way that the construction blueprint captures the inspirational activity of the construction architect. After the blueprinting phase, only procedural activity (following prescribed steps) is required. This means that a software blueprint must be prescriptive and therefore exhibit the same formality as other prescriptive languages such as C++ or Java. Software blueprinting proponents claim that this provides the following advantages over enduring inspiration:

• Potential for automatic machine translation to code
• Predictable timescales after the blueprinting phase
• The software architect's intentions reflected directly in code

Focused on a single application aspect
Software blueprints focus on one aspect to avoid becoming diluted by a compromised choice of description medium and to ensure that all of the relevant logic is localized.

Selection of optimal description medium

The single-aspect focus of a software blueprint means that the optimal description medium can be selected. For example, algorithmic code may be best represented using textual code, whereas GUI appearance may be best represented using a form design. The motivation behind selecting an intuitive description medium (i.e. one that matches well with mental models and designs for a particular aspect) is to improve:

• Ease of navigation
• Ease of understanding
• Fault detection rate
• Ability to manage complexity

Localization of aspect logic

The localization of aspect logic promoted by the software blueprinting approach is intended to improve navigability, based on the assumption that the application programmer most commonly wishes to browse application aspects independently.

Orthogonalization

Software blueprinting relies on realizing a clean separation between logically orthogonal aspects to facilitate the localization of related logic and the use of optimal description media described above.

Examples

GUI form design

The GUI form design (see GUI toolkit) is widely adopted across the software industry and allows the programmer to specify a prescriptive description of the appearance of GUI widgets within a window. Because this description is prescriptive, it can be translated directly to the code that draws the GUI.

Machine-translatable co-ordination languages (e.g. CDL)

Languages such as the Concurrent Description Language (CDL) separate an application's macroscopic logic (communication, synchronization and arbitration) from complex multi-threaded and/or multi-process applications into a single contiguous visual representation. The prescriptive nature of this description means that it can be machine-translated into an executable
framework that may be tested for structural integrity (detection of race conditions, deadlocks, etc.) before the microscopic logic is available.

Class designers

Class designers allow the specification of arbitrarily complex data structures in a convenient form, and the prescriptive nature of this description allows generation of executable code to perform list management, format translation, endian swapping and so on.

Software designers

Classes are used as building blocks by software designers to model more complex structures. In software architecture the Unified Modeling Language (UML) is an industry standard used for modeling the blueprint of software. UML represents structure, associations and interactions between various software elements, such as classes, objects or components. It helps the software designer to design, analyze and communicate ideas to other members of the software community.

Blueprint was born out of frustration with development environments, deployment processes, and the complexity of configuration management systems.

Blueprint insists that development environments realistically model production, and that starts with using Linux. Blueprint only works on Debian- or Red Hat-based Linux systems. We recommend VirtualBox, Vagrant, Rackspace Cloud, or AWS EC2 for development systems that use the same operating system (and version) as is used in production.

On top of the operating system, we recommend using the same web servers, databases, message queue brokers, and other software in development and production. This brings development visibility to entire classes of bugs that only occur due to interactions between production components.

When development and production share the same operating system and software stack, they also share the same interactive management tools, meaning developers and operators alike don't need to maintain two vocabularies. Well-understood tools like apt-get/dpkg, yum/rpm, and the whole collection of Linux system tools are available everywhere.
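The dev/prod parity idea described above can be sketched as a simple inventory diff. The package names and versions below are hypothetical placeholders; in practice the inventories would be captured with dpkg-query or rpm -qa on each host.

```python
# Hedged sketch of a dev/prod parity check: diff two package inventories.
# The package names and versions are invented for illustration; real
# inventories would come from `dpkg-query -W` or `rpm -qa` on each host.
dev = {"nginx": "1.18.0", "python3": "3.8.2", "redis": "5.0.7"}
prod = {"nginx": "1.14.0", "python3": "3.8.2"}

only_in_dev = set(dev) - set(prod)
version_drift = {p for p in set(dev) & set(prod) if dev[p] != prod[p]}

print(sorted(only_in_dev))    # packages present in dev but not prod
print(sorted(version_drift))  # shared packages at different versions
```

Any non-empty result flags a divergence between the stacks, which is exactly the class of bug the shared-stack recommendation is meant to surface early.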
Blueprint is unique relative to other configuration management tools in encouraging the use of these native tools.

What's common to all configuration management tools is the desire to manage the whole stack: from the operating system packages and services through language-specific packages, all the way to your applications. We need to span all of these across all our systems. To pick on RubyGems arbitrarily: RubyGems is purposely ignorant of its relationship to the underlying system, favoring compatibility with Windows and a wide variety of UNIX-like operating
systems. Blueprint understands the macro-dependencies between RubyGems itself and the underlying system and is able to predictably reinstall a selection of gems on top of a properly configured operating system.

When constructing this predictable order of operations used to reinstall files, packages, services, and source installations, Blueprint, along with other configuration management tools, takes great care to perform idempotent actions. Thus Blueprint prefers to manage the entire contents of a file rather than a diff or a line to append. Idempotency means you can apply a blueprint over and over again with confidence that nothing will change if nothing needs to change.

Because Blueprint can reverse-engineer systems, it is of particular use in migrating legacy systems into configuration management. It doesn't matter when you install Blueprint: changes made to the system even before Blueprint is installed will be taken into account.

It is just a blueprint: a representation of what the software will be like or how it will perform.

FEASIBILITY STUDY – SOFTWARE ENGINEERING

A feasibility study is carried out to select the best system that meets performance requirements. The main aim of the feasibility study activity is to determine whether it would be financially and technically feasible to develop the product. The feasibility study activity involves the analysis of the problem and the collection of all relevant information relating to the product, such as the different data items which would be input to the system, the processing required to be carried out on these data, the output data required to be produced by the system, as well as various constraints on the behaviour of the system.

Technical Feasibility

This is concerned with specifying equipment and software that will successfully satisfy the user requirement.
The technical needs of the system may vary considerably, but might include:

• The facility to produce outputs in a given time.
• Response time under certain conditions.
• Ability to process a certain volume of transactions at a particular speed.
• Facility to communicate data to distant locations.

In examining technical feasibility, the configuration of the system is given more importance than the actual make of the hardware. The configuration should give a complete picture of the system's requirements: how many workstations are required, and how these units are interconnected so that they can operate and communicate smoothly; what speeds of input and output should be achieved at a particular quality of printing.

Economic Feasibility

Economic analysis is the most frequently used technique for evaluating the effectiveness of a proposed system. More commonly known as cost/benefit analysis, the procedure is to determine the benefits and savings that are expected from a proposed system and compare them with its costs. If the benefits outweigh the costs, a decision is taken to design and implement the system. Otherwise, further justification of, or alternatives to, the proposed system will have to be found if it is to have a chance of being approved. This is an ongoing effort that improves in accuracy at each phase of the system life cycle.

Operational Feasibility

This is mainly related to human, organizational and political aspects. The points to be considered are:

• What changes will be brought about by the system?
• What organizational structures are disturbed?
• What new skills will be required? Do the existing staff members have these skills? If not, can they be trained in due course?

This feasibility study is carried out by a small group of people who are familiar with information system techniques and are skilled in the system analysis and design process.

Proposed projects are beneficial only if they can be turned into information systems that will meet the operating requirements of the organization.
This test of feasibility asks whether the system will work when it is developed and installed.

Posted by Sreejith at 9:18 PM

Feasibility

Feasibility is the practical extent to which a project can be performed successfully. To evaluate feasibility, a feasibility study is performed, which determines whether the solution considered to accomplish the requirements is practical and workable in the software or not. Information such as resource availability, cost estimates for software development, benefits of the software to the organization, and the cost to be incurred on its maintenance is considered. The objective of the feasibility study is to establish the reasons for developing software that is acceptable to users, adaptable to change, and conformant to established standards.
Types of Feasibility

The types of feasibility that are commonly considered include technical feasibility, operational feasibility and economic feasibility.

Technical Feasibility

Technical feasibility assesses the current resources and technology that are required to accomplish the user requirements in the software within the allocated time. For this, the software development team ascertains whether the current resources and technology can be upgraded or added to the software to accomplish the specified user requirements. Technical feasibility performs the following tasks:

> It analyses the technical skills and capabilities of the software development team members.
> It determines whether the relevant technology is stable and established.
> It ascertains that the technology chosen for software development has a large number of users, so that they can be consulted when problems arise or when improvements are required.

Operational Feasibility

Operational feasibility assesses the extent to which the required software performs a series of steps to solve business problems and user requirements. This feasibility is dependent on human resources and involves visualizing whether or not the software will operate after it is developed, and be operated once it is installed.

Economic Feasibility

Economic feasibility determines whether the required software is capable of generating financial gains for an organization. It involves the cost incurred on the software development team, the estimated cost of hardware and software, the cost of performing the feasibility study, and so on. For this, it is essential to consider the expenses made on purchases and the activities required to carry out software development.
In addition, it is necessary to consider the benefits that can be achieved by developing the software. Costs to account for include:

> Cost incurred on software development to produce long-term gains for an organization.
> Cost required to conduct a full software investigation.
> Cost of hardware, software, the development team, and training.

Feasibility Study Process

The feasibility study comprises the following steps:

1. Information assessment: identifies information about whether the system helps in achieving the objectives of the organization. It also verifies that the system can be implemented using new technology and within the budget, and whether the system can be integrated with the existing system.
2. Information collection: specifies the sources from which information about the software can be obtained. Generally, these sources include users and the software development team.
3. Report writing: uses a feasibility report, which is the conclusion of the feasibility study by the software development team. It includes a recommendation on whether or not software development should continue.
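The cost/benefit comparison at the heart of economic feasibility can be sketched numerically. All figures below are hypothetical, invented purely for illustration.

```python
# Hedged sketch of a cost/benefit (economic feasibility) comparison.
# All figures are hypothetical placeholders, not from the text.
costs = {
    "development team": 120_000,
    "hardware and software": 30_000,
    "feasibility study": 5_000,
    "training": 10_000,
}
annual_benefits = 70_000   # expected yearly savings after deployment
years = 3                  # evaluation horizon

total_cost = sum(costs.values())        # 165,000
total_benefit = annual_benefits * years # 210,000
net_benefit = total_benefit - total_cost

print(total_cost, total_benefit, net_benefit)
```

Here the benefits outweigh the costs over the horizon, so the project would pass this deliberately simplified, undiscounted economic test; a real study would also discount future benefits and revisit the figures at each life-cycle phase.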
TESTING

Software testing

Software testing is an investigation conducted to provide stakeholders with information about the quality of the product or service under test. Software testing can also provide an objective, independent view of the software to allow the business to appreciate and understand the risks of
software implementation. Test techniques include, but are not limited to, the process of executing a program or application with the intent of finding software bugs (errors or other defects).

Software testing can be stated as the process of validating and verifying that a computer program/application/product:

• meets the requirements that guided its design and development,
• works as expected,
• can be implemented with the same characteristics, and
• satisfies the needs of stakeholders.

Software testing, depending on the testing method employed, can be implemented at any time in the development process. Traditionally most of the test effort occurs after the requirements have been defined and the coding process has been completed, but in the Agile approaches most of the test effort is ongoing. As such, the methodology of the test is governed by the chosen software development methodology.

Different software development models will focus the test effort at different points in the development process. Newer development models, such as Agile, often employ test-driven development and place an increased portion of the testing in the hands of the developer, before it reaches a formal team of testers. In a more traditional model, most of the test execution occurs after the requirements have been defined and the coding process has been completed.

Overview

Testing can never completely identify all the defects within software. Instead, it furnishes a criticism or comparison that compares the state and behavior of the product against oracles: principles or mechanisms by which someone might recognize a problem. These oracles may include (but are not limited to) specifications, contracts, comparable products, past versions of the same product, inferences about intended or expected purpose, user or customer expectations, relevant standards, applicable laws, or other criteria.

A primary purpose of testing is to detect software failures so that defects may be discovered and corrected. Testing cannot establish that a product functions properly under all conditions but can only establish that it does not function properly under specific conditions. The scope of
software testing often includes examination of code as well as execution of that code in various environments and conditions, as well as examining the aspects of code: does it do what it is supposed to do and do what it needs to do? In the current culture of software development, a testing organization may be separate from the development team. There are various roles for testing team members. Information derived from software testing may be used to correct the process by which software is developed.

Every software product has a target audience. For example, the audience for video game software is completely different from that for banking software. Therefore, when an organization develops or otherwise invests in a software product, it can assess whether the software product will be acceptable to its end users, its target audience, its purchasers, and other stakeholders. Software testing is the process of attempting to make this assessment.

Defects and failures

Not all software defects are caused by coding errors. One common source of expensive defects is requirement gaps, e.g., unrecognized requirements that result in errors of omission by the program designer. A common source of requirements gaps is non-functional requirements such as testability, scalability, maintainability, usability, performance, and security.

Software faults occur through the following process. A programmer makes an error (mistake), which results in a defect (fault, bug) in the software source code. If this defect is executed, in certain situations the system will produce wrong results, causing a failure. Not all defects will necessarily result in failures. For example, defects in dead code will never result in failures. A defect can turn into a failure when the environment is changed. Examples of these changes in environment include the software being run on a new computer hardware platform, alterations in source data, or interacting with different software.
A single defect may result in a wide range of failure symptoms.

Input combinations and preconditions

A fundamental problem with software testing is that testing under all combinations of inputs and preconditions (initial state) is not feasible, even with a simple product. This means that the number of defects in a software product can be very large, and defects that occur infrequently are difficult to find in testing. More significantly, non-functional dimensions of quality (how it is supposed to be versus what it is supposed to do), such as usability, scalability, performance, compatibility and reliability, can be highly subjective; something that constitutes sufficient value to one person may be intolerable to another.

Software developers can't test everything, but they can use combinatorial test design to identify the minimum number of tests needed to get the coverage they want. Combinatorial test design enables users to get greater test coverage with fewer tests. Whether they are looking for speed or test depth, they can use combinatorial test design methods to build structured variation into their test cases.
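As an illustration of combinatorial test design, the sketch below greedily builds a pairwise (2-way) covering suite over a few test factors. The factors, their values, and the greedy strategy are illustrative assumptions, not taken from the article; real tools use more sophisticated algorithms.

```python
from itertools import combinations, product

# Hypothetical test factors, invented for illustration.
factors = {
    "browser": ["firefox", "chrome", "safari"],
    "os": ["linux", "windows", "macos"],
    "locale": ["en", "de"],
}

# Exhaustive testing: every combination of inputs (3 * 3 * 2 = 18).
exhaustive = list(product(*factors.values()))

# Every value pair that a 2-way (pairwise) suite must cover.
required = set()
for (i, name_a), (j, name_b) in combinations(enumerate(factors), 2):
    for va, vb in product(factors[name_a], factors[name_b]):
        required.add(((i, va), (j, vb)))

def pairs_of(test):
    """All (position, value) pairs exercised by one concrete test."""
    return {((i, test[i]), (j, test[j]))
            for i, j in combinations(range(len(test)), 2)}

# Greedy cover: keep picking the test that covers the most new pairs.
suite, uncovered = [], set(required)
while uncovered:
    best = max(exhaustive, key=lambda t: len(pairs_of(t) & uncovered))
    suite.append(best)
    uncovered -= pairs_of(best)

print(len(exhaustive), len(suite))  # the pairwise suite is smaller
```

Every pair of values still appears together in at least one test, which is the "structured variation" the text refers to: far fewer tests than the full cartesian product, with a stated (2-way) coverage guarantee.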
Economics

A study conducted by NIST in 2002 reports that software bugs cost the U.S. economy $59.5 billion annually. More than a third of this cost could be avoided if better software testing were performed.

It is commonly believed that the earlier a defect is found, the cheaper it is to fix. The following table shows the cost of fixing a defect depending on the stage at which it was found. For example, if a problem in the requirements is found only post-release, it would cost 10–100 times more to fix than if it had already been found by the requirements review. With the advent of modern continuous deployment practices and cloud-based services, the cost of re-deployment and maintenance may lessen over time.

Cost to fix a defect, by time detected:

  Time introduced   Requirements  Architecture  Construction  System test  Post-release
  Requirements          1×            3×           5–10×         10×        10–100×
  Architecture          -             1×            10×          15×        25–100×
  Construction          -             -             1×           10×        10–25×

Roles

Software testing can be done by dedicated software testers. Until the 1980s, the term "software tester" was used generally, but later testing was also seen as a separate profession. Regarding the periods and the different goals in software testing, different roles have been established: manager, test lead, test analyst, test designer, tester, automation developer, and test administrator.

History

The separation of debugging from testing was initially introduced by Glenford J. Myers in 1979. Although his attention was on breakage testing ("a successful test is one that finds a bug"), it illustrated the desire of the software engineering community to separate fundamental development activities, such as debugging, from that of verification. Dave Gelperin and William C. Hetzel classified in 1988 the phases and goals of software testing in the following stages:

• Until 1956 - Debugging oriented
• 1957–1978 - Demonstration oriented
• 1979–1982 - Destruction oriented
• 1983–1987 - Evaluation oriented
• 1988–2000 - Prevention oriented

Testing methods
Static vs. dynamic testing

There are many approaches to software testing. Reviews, walkthroughs, or inspections are referred to as static testing, whereas actually executing programmed code with a given set of test cases is referred to as dynamic testing. Static testing can be omitted, and unfortunately in practice it often is. Dynamic testing takes place when the program itself is run. Dynamic testing may begin before the program is 100% complete, in order to test particular sections of code, applied to discrete functions or modules. Typical techniques for this are either using stubs/drivers or execution from a debugger environment.

The box approach

Software testing methods are traditionally divided into white- and black-box testing. These two approaches describe the point of view that a test engineer takes when designing test cases.

White-box testing

Main article: White-box testing

White-box testing (also known as clear box testing, glass box testing, transparent box testing, and structural testing) tests the internal structures or workings of a program, as opposed to the functionality exposed to the end-user. In white-box testing an internal perspective of the system, as well as programming skills, are used to design test cases. The tester chooses inputs to exercise paths through the code and determine the appropriate outputs. This is analogous to testing nodes in a circuit, e.g. in-circuit testing (ICT).

While white-box testing can be applied at the unit, integration and system levels of the software testing process, it is usually done at the unit level. It can test paths within a unit, paths between units during integration, and between subsystems during a system-level test.
Though this method of test design can uncover many errors or problems, it might not detect unimplemented parts of the specification or missing requirements.

Techniques used in white-box testing include:

• API testing (application programming interface) - testing of the application using public and private APIs
• Code coverage - creating tests to satisfy some criteria of code coverage (e.g., the test designer can create tests to cause all statements in the program to be executed at least once)
• Fault injection methods - intentionally introducing faults to gauge the efficacy of testing strategies
• Mutation testing methods
• Static testing methods
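The path-exercising idea above can be sketched with a tiny example. The function and its thresholds are hypothetical, invented for illustration; the point is that the tester reads the code and picks one input per branch, plus the boundary between them.

```python
# White-box sketch: the tester inspects the code and chooses inputs
# that exercise both branches of the conditional. The function and
# its rates are hypothetical, not from the article.

def shipping_cost(weight_kg: float) -> float:
    """Flat rate up to 2 kg, per-kg surcharge above it."""
    if weight_kg <= 2.0:
        return 5.0                            # branch A: flat rate
    return 5.0 + 1.5 * (weight_kg - 2.0)      # branch B: per-kg rate

# One input per path, plus the boundary where the paths meet.
assert shipping_cost(1.0) == 5.0   # exercises branch A
assert shipping_cost(2.0) == 5.0   # boundary value
assert shipping_cost(4.0) == 8.0   # exercises branch B
```

With these three inputs every statement and both branches are executed at least once, which is the kind of coverage criterion the list above describes.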
Code coverage tools can evaluate the completeness of a test suite that was created with any method, including black-box testing. This allows the software team to examine parts of a system that are rarely tested and ensures that the most important function points have been tested. Code coverage as a software metric can be reported as a percentage for:

• Function coverage, which reports on the functions executed
• Statement coverage, which reports on the number of lines executed to complete the test

100% statement coverage ensures that every statement is executed at least once, though not that every branch or path is taken. This is helpful in ensuring correct functionality, but not sufficient, since the same code may process different inputs correctly or incorrectly.

Black-box testing

Main article: Black-box testing

Black-box testing treats the software as a "black box", examining functionality without any knowledge of internal implementation. The tester is only aware of what the software is supposed to do, not how it does it. Black-box testing methods include: equivalence partitioning, boundary value analysis, all-pairs testing, state transition tables, decision table testing, fuzz testing, model-based testing, use case testing, exploratory testing and specification-based testing.

Specification-based testing aims to test the functionality of software according to the applicable requirements. This level of testing usually requires thorough test cases to be provided to the tester, who then can simply verify that for a given input, the output value (or behavior) either "is" or "is not" the same as the expected value specified in the test case. Test cases are built around specifications and requirements, i.e., what the application is supposed to do. It uses external descriptions of the software, including specifications, requirements, and designs, to derive test cases.
These tests can be functional or non-functional, though usually functional. Specification-based testing may be necessary to assure correct functionality, but it is insufficient to guard against complex or high-risk situations.

One advantage of the black-box technique is that no programming knowledge is required. Whatever biases the programmers may have had, the tester likely has a different set and may emphasize different areas of functionality. On the other hand, black-box testing has been said to be "like a walk in a dark labyrinth without a flashlight." Because they do not examine the source code, there are situations when a tester writes many test cases to check something that could have been tested by only one test case, or leaves some parts of the program untested.
This method of testing can be applied to all levels of software testing: unit, integration, system and acceptance. It typically comprises most if not all testing at higher levels, but can also dominate unit testing as well.

What's Tangible Software Engineering Education?

Taichi Nakamura
Director, Tangible Software Engineering Education and Research Project
Tokyo University of Technology, Tokyo, Japan
email@example.com

1. Introduction

Computer systems have infiltrated many fields such as finance, distribution, manufacturing, education and electronic government, and must be safe and secure. On the other hand, the demands of customers are subject to bewildering change, corresponding to the rapid progress of information technologies and changes in the business environment. Enterprises have had to cultivate human resources with highly competent skills in information technology. However, enterprises have recently been urged to be selective and to focus their investment in a global competitive setting. Therefore, the IT industry has been asking universities to promote the development of advanced IT talent in their students.

A decrease in the desire to study IT among the young people who could become the human resources with the required talent has become a big problem. Universities need to work immediately on the establishment of a method of practical software education which is improved to the level that can be used by businesses, and on the development of the associated teaching material.

The tangible software education research project won the support of the private university science research upgrade promotion business of the Ministry of Education, Culture, Sports, Science and Technology in fiscal year 2007, and began work at the
Open Research Center which was set up by Tokyo University of Technology to deal with such a situation. The purpose of the project is to develop the teaching material and an education method that will promote the development of professional talent with a high degree of professionalism in the rapidly changing field of information technologies.

This paper describes the principles of software engineering education led by the educational philosophy of Tokyo University of Technology, the issues of software engineering education, and the purpose of the research.

2. The Idea of Software Engineering Education

2.1 Principles of the SE course in light of the University's principles

Tokyo University of Technology has three stated principles: (1) theoretical and technical education of professions contributing to society, (2) education through cutting-edge R&D and the social application of research results, and (3) the development of an ideal environment for an ideal education and research.

The project established its principle of software engineering education as "Develop human resources with design and development abilities that enable them to build software after analyzing and modeling the customer's needs, and with the adaptability and management ability to handle their role as a team member under various constraints", in order to realize the first principle of the university, as shown in Figure 1.

In order to apply this principle, the project is designing and developing software engineering education curricula based on Instructional Design (ID). One of the aims of the project is to realize PBE (Profile Based Education), which is a method of providing educational materials and an instructional approach according to each student's learning curve.
Figure 1. Principle of the software engineering course in light of the university's principles

2.2 Design of the curriculum for software engineering based on Instructional Design

The course curriculum should be designed systematically according to the Analysis, Design, Development, Implementation, and Evaluation (ADDIE) model. The most important aspect of the process of designing the curriculum is that students define the objectives to be attained, with a quantitative index measuring the acquired level of human-oriented knowledge about communication or team-building, using criteria that judge whether the knowledge is effective in a business environment.

2.3 The system of practical education

It is more important for practitioners to acquire the design methods and management methods required in each phase of the entire system development process than to explore techniques from statistics and psychology. Practical education to systematically acquire knowledge of software engineering involves repeatedly studying the design methods and the management methods which a software engineer should use in every development process. Figure 2 shows a structure of PBL that employs practical education cycles in each development process.

The universities provide a scenario-based curriculum that requires users to design or manage the development process in virtual projects, and such virtual projects have the advantage of providing many more simulated experiences than OJT. The curriculum has been designed with an emphasis on the importance of a process in which a learner solves exercises that are produced for each phase of the development process for the virtual project. The resulting process is a fusion of a PBE-oriented instructional design cycle and a practical education cycle.
Figure 2. A structure of PBL that employs practical education cycles in each development process

3. Quaternary tangible software education
The tangible software engineering education and research project should address the following issues. New students arrive at the departments of information engineering at universities every year with a variety of ambitions. They may be keen to study programming in order to be able to make game software, or they may secretly aspire to work as a system engineer in the future. Surprisingly, within six months of entering the university their dreams are smashed to smithereens and they lose their motivation to study software engineering. It is also true that quite a lot of students, who are living in a modern mature society where they can easily get everything they want, have not formed any hope or dream. On the other hand, information systems have been growing in both scale and complexity. Highly competent people who have received advanced education in IT and are highly motivated by their responsibility as members of a system development project are required in order to support information systems. Whatever their current situation, we have to encourage such young people to become involved in the information infrastructure supporting our society. One of the most important conditions for providing software engineering education that will be effective in solving this kind of problem is to motivate students to study IT and then to keep their motivation alive. Four tangible issues which education staff have to deal with in order to satisfy students' needs in software engineering education are as follows:

1) A tangible curriculum
In order for students to know the purpose of the course and to be able to plan their careers, course curricula should be designed by referring to the Information Technology Skill Standards (ITSS) defined by the Ministry of Economy, Trade and Industry, and should enable students to acquire skills related to their future occupation.

2) Tangible lectures
In order for students to understand the lectures and to be motivated to study IT, experience-based training should be provided in the course. Students can share awareness of issues relating to information systems. Students can subsequently learn how to understand abstract concepts and logical thinking in order to gain a deep understanding and to develop skills.

3) A tangible relationship with the IT industry
The significant difference between skills learned in a classroom and the skills required to develop an information system in a practical situation has been pointed out. We develop competent human resources who have the following practical abilities: analyzing a customer's requirements; building a system model; designing and implementing a software system which meets the customer's requirements; having an aptitude for working in a team under various real-life constraints; and managing money, time, and people.

4) A tangible profile of each student
The behavioral track records of each learner are gathered during class and analyzed. The relation between the track records and the level of skill acquired by each learner should be formulated so that teaching staff can quantify the acquired skill level using a formula derived from that relation. To realize PBE, the teaching staff have to fit the teaching materials to the students and establish appropriate teaching methods according to the learning curves drawn by the formula. In order to raise a student's motivation, maintain it, and indeed cause it to increase, the teaching staff present students with the whole picture of an information system, let students become interested in the information system, provide a comprehensible explanation, and let students gain confidence in developing the system. As a result, students have a feeling of satisfaction.

4. Tangible software engineering education and research
4.1. The purpose of tangible software engineering education and research
The tangible software engineering education and research project aims to realize software engineering education for the new era with the following two features: 1) practical software engineering education that takes students through the entire life cycle from requirements definition, through design, implementation and testing, to operation and maintenance, and bridges the gap between classroom and field practice; 2) software engineering education that cultivates skills and infuses students with the joy of achievement through the experience of actually writing and running programs. We are developing a curriculum system for practical software engineering by using the actual achievements of tangible software engineering education, and providing a project-based learning (PBL) environment and teaching materials.

4.2. Topics of research
The tangible software engineering education and research project develops an educational method for improving human-management-related skills, such as communication and leadership, which are necessary for a learner to contribute to team work. In particular, we monitor the behavior of each learner during a lecture, analyze the monitored behavioral track records, and provide the appropriate instructional approach and teaching materials to each learner. Role-play training is one of the most effective forms of PBL and an appropriate learning method for education in project management, which requires collaborative working with multiple stakeholders. The scenario is the most important factor in role-play training, because the progress of a role-play is affected by the behavior of the learners, who each play a role and operate the role-play according to the scenario. A web application system similar to a role-playing game has been built so that learners can carry out role-play training anywhere and at any time.

4.2.1. Profile based software engineering education
(1) Personal profile based software engineering education
We are developing a teaching method for the new era with the following features: taking account of the unique learning curve of each learner; looking beyond the scope of the existing curriculum; and flexibly varying the context of the teaching materials to adapt them to the learner's skill level. In order to achieve personal PBE, the profile data of each learner must be created, including the behavioral track record of the learner and a record of the times taken to acquire different skill levels. In addition, the features of the character of each learner are added to the created profile data.

(2) Team profile based software engineering education
For a learner to experience team work, it is desirable that a team should include at least one advanced professional. We extract the behavioral track records of the team interacting with the agent for the team that achieved the best performance in the role-play exercise. The collaborative model for project management is created using the best behavioral records and the individual profile data of the team members. This model can also provide a guideline for the actions a team should take to achieve the best performance.

4.2.2. Role-play training environment
(1) Method for developing a role-play scenario
The role-play scenario describes the problems which a learner should solve by using the management skills taught in class, the cause of those problems, and the change in the circumstances of the virtual project which has given rise to the problems. Points to remember when developing a role-play scenario are as follows:
1) The service being developed in the virtual project should be familiar to the learners.
2) The level of difficulty of an exercise should be matched to the students' skill level.
3) Contradictions must be eliminated.
4) The instructions that learners refer to in order to undertake the role-play should be written clearly.
5) The actions learners execute in the role-play should be useful in providing a quantitative indication of the skills achieved by the learners.
6) An integrated development environment should be built to improve the productivity of developing role-play scenarios, which are written in HTML with XML tags.

(2) Role-play training with an agent
A software agent system, which implements the advanced professional skills of project management and plays the role of a mentor in a role-play, may be useful in improving learning effectiveness. We are developing an agent system which includes not only an agent with the skills that are gained by experience but also various characters to provide training in human-related skills.

(3) The role-play training system
We have developed a web-based application system providing a role-play training facility. The system consists of a Web server, an application server that implements the role-play execution core required to run a virtual project and provides user management, and a database server, as shown in Figure 3. The role-play execution core implements user administration, PROMASTER administration, the lobby for selecting a scenario, the role-play engine, and the feedback engine.
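The role-play execution core described above can be sketched in miniature. Everything in the sketch below is an illustrative assumption: the class names, the scoring scheme, and the feedback calculation are invented for exposition and do not reflect the actual PROMASTER implementation, which is not described in detail here.

```python
# Illustrative sketch (not the real PROMASTER code): a scenario is a list of
# steps, each offering possible actions; the engine records every learner
# action as a behavioral track record, which a feedback routine summarizes.

from dataclasses import dataclass, field

@dataclass
class Step:
    prompt: str
    actions: dict  # action name -> assumed score contribution

@dataclass
class RolePlayEngine:
    scenario: list
    track_record: list = field(default_factory=list)
    position: int = 0

    def current_prompt(self):
        return self.scenario[self.position].prompt

    def act(self, action):
        """Record the learner's chosen action and advance the scenario."""
        step = self.scenario[self.position]
        score = step.actions.get(action, 0)
        self.track_record.append((self.position, action, score))
        self.position += 1
        return score

def feedback(engine):
    """Summarize the track record as a fraction of the best possible score."""
    total = sum(score for _, _, score in engine.track_record)
    possible = sum(max(step.actions.values()) for step in engine.scenario)
    return total / possible if possible else 0.0

# A two-step virtual-project scenario (contents invented for illustration).
scenario = [
    Step("A key engineer is reassigned mid-project.", {"replan": 2, "ignore": 0}),
    Step("The customer requests a scope change.", {"negotiate": 2, "accept_all": 1}),
]
engine = RolePlayEngine(scenario)
engine.act("replan")
engine.act("negotiate")
print(feedback(engine))  # -> 1.0
```

The point of the sketch is the data flow the paper describes: learner actions become a track record, and the feedback engine turns the track record into a quantitative skill indication.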
Figure 3. Outline of the role-play training system, PROMASTER

4.3. Curriculum and teaching materials for tangible software engineering education
We are developing a four-year unified curriculum with teaching materials to realize tangible software engineering education, for use not only in universities but also in high schools and industry.

4.4. Relationship between the FDmO in the University and high schools and the IT industry
In the tangible software engineering education approach, competent engineers working in the field are asked to provide coaching during practice, in order for students to acquire a correct understanding of work in the IT industry at an early stage in their college life and ultimately to be able to find a satisfying job. In addition, the teaching materials we are developing in the project provide information about up-to-date trends in the software industry to teaching staff in high schools, in order to encourage more high school students to take an interest in IT and to be willing to work in the IT industry. Consequently, the tangible software engineering education and research project will provide industry with advanced human resources by considering all stages from high school through university. The Software Engineering Education and research center (SEED) has been organized to formulate and maintain the software engineering education system, which closely connects universities to high schools, the IT industry, and research institutes. The project has also founded the Faculty Development management Office (FDmO) to liaise with other universities, as shown in Figure 4.
Figure 4. Relationship between the FDmO in the University and high schools and the IT industry

5. Conclusion
This paper has introduced the principles of tangible software engineering education and research. The activities of the project are significant, and are as follows: to provide highly competent software engineers who can design and implement advanced and complicated information systems and who have practical abilities; and to contribute to the healthy development of the IT industry, not only in Japan but also around the world, by means of a good relationship between universities and the IT industry. Young people often have little interest in developing advanced skills because, in the mature society of Japan, they can already get everything they want. Few students want a job in the IT field, due to the lack of any means of letting them gain a vision of work in this field after graduation. The tangible software engineering education and research project aims to achieve a dramatic improvement in the problematic situation of software engineering education, and its activities and the results obtained are extremely significant.

6. Acknowledgements
This research is supported by "Tangible Software Engineering (SE) Education and Research" as part of the "Program promoting the leveling of private university academic research," in which the Ministry of Education, Culture, Sports, Science and Technology invited public participation for the academic year 2007.

Tangibility
From Wikipedia, the free encyclopedia
Tangibility is the attribute of being easily detectable with the senses.
In criminal law, one of the elements of an offense of larceny is that the stolen property must be tangible.
In the context of intellectual property, expression in tangible form is one of the requirements for copyright protection.
Tangible property
From Wikipedia, the free encyclopedia
Tangible property in law is, literally, anything which can be touched; it includes both real property and personal property (or moveable property), and stands in distinction to intangible property.
In English law and some Commonwealth legal systems, items of tangible property are referred to as choses in possession (or a chose in possession in the singular). However, some property, despite being physical in nature, is classified in many legal systems as intangible property rather than tangible property because the rights associated with the physical item are of far greater significance than its physical properties. Principally, these are documentary intangibles. For example, a promissory note is a piece of paper that can be touched, but the real significance is not the physical paper but the legal rights the paper confers; hence the promissory note is defined by the legal debt rather than its physical attributes.
A unique category of property is money, which in some legal systems is treated as tangible property and in others as intangible property. Whilst most countries' legal tender is expressed in the form of intangible property ("The Treasury of Country X hereby promises to pay to the bearer on demand...."), in practice bank notes are now rarely ever redeemed in any country, which has led to bank notes and coins being classified as tangible property in most modern legal systems.

Tangible user interface
From Wikipedia, the free encyclopedia
A tangible user interface (TUI) is a user interface in which a person interacts with digital information through the physical environment. The initial name was Graspable User Interface, which is no longer used.
One of the pioneers in tangible user interfaces is Hiroshi Ishii, a professor in the MIT Media Laboratory who heads the Tangible Media Group. His particular vision for tangible UIs, called Tangible Bits, is to give physical form to digital information, making bits directly manipulable and perceptible. Tangible Bits pursues a seamless coupling between the two very different worlds of bits and atoms.

Contents
 1 Characteristics of tangible user interfaces
 2 Examples
 3 State of the art
 4 See also
 5 References
 6 External links

Characteristics of tangible user interfaces
 1. Physical representations are computationally coupled to underlying digital information.
 2. Physical representations embody mechanisms for interactive control.
 3. Physical representations are perceptually coupled to actively mediated digital representations.
 4. The physical state of tangibles embodies key aspects of the digital state of a system.
Five basic defining properties of tangible user interfaces have also been proposed:
 1. space-multiplexing of both input and output;
 2. concurrent access to and manipulation of interface components;
 3. strong specific devices;
 4. spatially aware computational devices;
 5. spatial reconfigurability of devices.

Examples
An example of a tangible UI is the Marble Answering Machine by Durrell Bishop (1992). A marble represents a single message left on the answering machine. Dropping a marble into a dish plays back the associated message or calls back the caller.
Another example is the Topobo system. The blocks in Topobo are like LEGO blocks which can be snapped together, but they can also move by themselves using motorized components. A person can push, pull, and twist these blocks, and the blocks can memorize these movements and replay them.
Another implementation allows the user to sketch a picture on the system's table top with a real tangible pen. Using hand gestures, the user can clone the image and stretch it along the X and Y axes just as one would in a paint program. This system integrates a video camera with a gesture recognition system.
Another example is jive. The implementation of a TUI helped make this product more accessible to elderly users. The friend passes can also be used to activate different interactions with the product.
Several approaches have been made to establish a generic middleware for TUIs. They aim at independence from application domains as well as flexibility in terms of the deployed sensor technology. For example, Siftables provides an application platform in which small gesture-sensitive displays act together to form a human-computer interface.
For collaboration support, TUIs have to allow spatial distribution, asynchronous activities, and dynamic modification of the TUI infrastructure, to name the most prominent requirements. One approach presents a framework based on the LINDA tuple space concept to meet these requirements.
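The LINDA tuple-space concept just mentioned can be sketched in a few lines. The sketch below is a generic, in-process illustration of the coordination model only; it is not the TUI middleware itself, which adds distribution, blocking operations, and sensor/actuator integration.

```python
# Minimal in-process sketch of a LINDA-style tuple space. Processes coordinate
# by writing tuples (out), reading them (rd), and consuming them (in_);
# None fields in a pattern act as wildcards. Tuple contents are invented
# for illustration (e.g. a tangible token with an id and x/y position).

class TupleSpace:
    def __init__(self):
        self._tuples = []

    def out(self, tup):
        """Write a tuple into the space."""
        self._tuples.append(tup)

    def _match(self, tup, pattern):
        return len(tup) == len(pattern) and all(
            p is None or p == v for p, v in zip(pattern, tup))

    def rd(self, pattern):
        """Read (without removing) the first tuple matching the pattern."""
        for t in self._tuples:
            if self._match(t, pattern):
                return t
        return None

    def in_(self, pattern):
        """Read and remove the first tuple matching the pattern."""
        t = self.rd(pattern)
        if t is not None:
            self._tuples.remove(t)
        return t

space = TupleSpace()
space.out(("token", 3, 120.0, 80.0))
space.out(("token", 7, 10.0, 15.0))
print(space.rd(("token", 7, None, None)))      # -> ('token', 7, 10.0, 15.0)
print(space.in_(("token", None, None, None)))  # removes ('token', 3, 120.0, 80.0)
```

In a distributed TUI setting, sensors would `out` tuples describing physical events and application components would `in_` or `rd` the tuples they care about, decoupling producers from consumers in both space and time.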
The implemented TUIpist framework deploys arbitrary sensor technology for any type of application, and actuators, in distributed environments.
A further example of a type of TUI is a projection augmented model.

State of the art
Since the invention of Durell Bishop's Marble Answering Machine (1992) two decades ago, interest in tangible user interfaces (TUIs) has grown constantly, and every year more tangible systems appear. In 1999 Gary Zalewski patented a system of moveable children's blocks containing sensors and displays for teaching spelling and sentence composition. A similar system is being marketed as "Siftables".
The MIT Tangible Media Group, headed by Hiroshi Ishii, is continuously developing and experimenting with TUIs, including many tabletop applications.
The Urp and the more advanced Augmented Urban Planning Workbench allow digital simulations of air flow, shadows, reflections, and other data based on the positions and orientations of physical models of buildings on the table surface.
Newer developments go one step further and incorporate the third dimension by allowing users to form landscapes with clay (Illuminating Clay) or sand (SandScape). Again, different simulations allow the analysis of shadows, height maps, slopes, and other characteristics of the interactively formable landmasses.
InfrActables is a back-projection collaborative table that allows interaction using TUIs that incorporate state recognition. Adding different buttons to the TUIs enables additional functions associated with them. Newer versions of the technology can even be integrated into LC displays by using infrared sensors behind the LC matrix.
The Tangible Disaster table allows the user to analyze disaster measures and simulate different kinds of disasters (fire, flood, tsunami, ...) and evacuation scenarios during collaborative planning sessions. Physical objects ("pucks") allow positioning disasters by placing them on the interactive map, and additionally tuning parameters (e.g. scale) using dials attached to them.
The commercial potential of TUIs has been identified recently. The repeatedly awarded Reactable, an interactive tangible tabletop instrument, is now distributed commercially by Reactable Systems, a spinoff company of the Pompeu Fabra University, where it was developed. With the Reactable, users can set up their own instrument interactively by physically placing different objects (representing oscillators, filters, modulators, ...) and parametrizing them by rotating them and using touch input.
Microsoft has been distributing its Windows-based platform Microsoft Surface since last year. Besides multi-touch tracking of fingers, the platform supports the recognition of physical objects by their footprints. Several applications, mainly for use in commercial spaces, have been presented. Examples range from designing an individual graphical layout for a snowboard or skateboard to studying the details of a wine in a restaurant by placing it on the table and navigating through menus via touch input. Interactions such as the collaborative browsing of photographs from a handycam or cell phone that connects seamlessly once placed on the table are also supported.
Another notable interactive installation is instant city, which combines gaming, music, architecture, and collaborative aspects. It allows the user to build three-dimensional structures and set up a city with rectangular building blocks, which simultaneously results in the interactive assembly of musical fragments by different composers.
The development of the Reactable and the subsequent release of its tracking technology, reacTIVision, under the GNU GPL, as well as the open specification of the TUIO protocol, have triggered an enormous amount of development based on this technology.
In the last few years many amateur and semi-professional projects, outside academia and commerce, have also been started. Thanks to open-source tracking technologies (reacTIVision) and the ever-increasing computational power available to end consumers, the required infrastructure is nowadays accessible to almost everyone. A standard PC, a webcam, and some handicraft work allow tangible systems to be set up with minimal programming and material effort. This opens doors to novel ways of perceiving human-computer interaction and gives the broad public room for new forms of creativity to experiment and play with.
It is difficult to keep track of the rapidly growing number of these systems and tools. While many of them seem only to utilize the available technologies, are limited to initial experiments and tests of some basic ideas, or merely reproduce existing systems, a few of them open out into novel interfaces and interactions and are deployed in public spaces or embedded in art installations.
The Tangible Factory Planning table is a tangible table based on reacTIVision that allows users to collaboratively plan and visualize production processes in combination with plans of new factory buildings; it was developed within a diploma thesis.
Another of the many reacTIVision-based tabletops is the ImpulsBauhaus Interactive Table, which was exhibited at the Bauhaus-University in Weimar to mark the 90th anniversary of the establishment of the Bauhaus. Visitors could browse and explore the biographies, complex relations, and social networks between members of the movement.

Definition of Raw Materials
A material or substance used in the primary production or manufacturing of a good. Raw materials are often natural resources such as oil, iron, and wood. Before being used in the manufacturing process, raw materials are often altered for use in different processes. Raw materials are often referred to as commodities, which are bought and sold on commodities exchanges around the world.

Investopedia explains Raw Materials
Raw materials are sold in what is called the factor market. This is because raw materials are factors of production, along with labor and capital. Raw materials are so important to the production process that the success of a country's economy can be determined by the amount of natural resources the country has within its own borders. A country that has abundant natural resources does not need to import as many raw materials, and has an opportunity to export the materials to other countries.
Definition (businessdictionary.com)
A basic substance in its natural, modified, or semi-processed state, used as an input to a production process for subsequent modification or transformation into a finished good.

Software portability
From Wikipedia, the free encyclopedia
This article is about portability in itself. For the work required to make software portable, see Porting.
Portability in high-level computer programming is the usability of the same software in different environments. The prerequirement for portability is the generalized abstraction between the application logic and system interfaces. When software with the same functionality is produced for several computing platforms, portability is the key issue for development cost reduction.

Contents
 1 Strategies for portability
 o 1.1 Similar systems
 o 1.2 Different operating systems, similar processors
 o 1.3 Different processors
 2 Recompilation
 3 See also
 4 Sources

Strategies for portability
Software portability may involve:
 - Transferring installed program files to another computer of basically the same architecture.
 - Reinstalling a program from distribution files on another computer of basically the same architecture.
 - Building executable programs for different platforms from source code; this is what is usually understood by "porting".

Similar systems
When operating systems of the same family are installed on two computers with processors with similar instruction sets, it is often possible to transfer the files implementing a program between them.
In the simplest case, the file or files may simply be copied from one machine to the other. However, in many cases the software is installed on a computer in a way which depends upon its detailed hardware, software, and setup, with device drivers for particular devices, using installed operating system and supporting software components, and using different drives or directories.
In some cases, software usually described as "portable software" is specifically designed to run on different computers with compatible operating systems and processors, without any machine-dependent installation; it is sufficient to transfer specified directories and their contents. Software installed on portable mass storage devices such as USB sticks can be used on any compatible computer simply by plugging the storage device in, and stores all configuration information on the removable device.
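The "portable software" strategy just described can be sketched as follows: configuration lives in a file next to the program itself, so it travels with the removable device instead of being scattered across machine-specific locations. The `settings.json` file name and the configuration keys are illustrative assumptions, not a standard.

```python
# Sketch of the "portable software" configuration strategy: store settings
# in a file beside the program (e.g. on the USB stick) rather than in
# machine-specific locations such as the Windows registry or a home
# directory. File and key names here are invented for illustration.

import json
import sys
from pathlib import Path

def config_path():
    # Directory containing the program itself; on a removable drive this
    # location travels with the software.
    return Path(sys.argv[0]).resolve().parent / "settings.json"

def load_config(defaults):
    path = config_path()
    if path.exists():
        # Saved values override the defaults.
        return {**defaults, **json.loads(path.read_text())}
    return dict(defaults)

def save_config(cfg):
    config_path().write_text(json.dumps(cfg, indent=2))

cfg = load_config({"language": "en", "recent_files": []})
cfg["language"] = "ja"
save_config(cfg)
print(load_config({})["language"])  # -> ja
```

Because nothing is written outside the program's own directory, copying that directory (or unplugging the stick) moves the software and its state together, which is exactly what distinguishes portable software from a conventional installation.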
Hardware- and software-specific information is often stored in configuration files in specified locations (e.g. the registry on machines running Microsoft Windows). Software which is not portable in this sense will have to be transferred with modifications to support the environment on the destination machine.

Different operating systems, similar processors
When the systems in question have compatible processors (usually x86-compatible processors on desktop computers), they will execute the low-level program instructions in the same manner, but the system calls are likely to differ between operating systems. Later operating systems of UNIX heritage, including Linux, BSD, Solaris, and OS X, are able to achieve a high degree of software portability by using the POSIX standard for calling OS functions. Such POSIX-based programs can be compiled for use in Windows by means of interface software such as Cygwin.

Different processors
As of 2011, the majority of desktop and laptop computers used microprocessors compatible with the 32- and 64-bit x86 instruction sets. Smaller portable devices use processors with different and incompatible instruction sets, such as ARM. The difference between larger and smaller devices is such that detailed software operation is different; an application designed to display
Software engineering
From Wikipedia, the free encyclopedia
A software engineer programming for the Wikimedia Foundation
Software engineering (SE) is the application of a systematic, disciplined, quantifiable approach to the design, development, operation, and maintenance of software, and the study of these approaches; that is, the application of engineering to software. In layman's terms, it is the act of using insights to conceive, model, and scale a solution to a problem. The term software engineering first appeared in the 1968 NATO Software Engineering Conference and was meant to provoke thought regarding the perceived "software crisis" of the time. Software development, a much-used and more generic term, does not necessarily subsume the engineering paradigm. The generally accepted concepts of software engineering as an engineering discipline have been specified in the Guide to the Software Engineering Body of Knowledge (SWEBOK). The SWEBOK has become an internationally accepted standard, ISO/IEC TR 19759:2005.
For those who wish to become recognized as professional software engineers, the IEEE offers two certifications (Certified Software Development Associate and Certified Software Development Professional). The IEEE certifications do not use the term Engineer in their titles for compatibility reasons. In some parts of the US, such as Texas, the use of the term Engineer is restricted to those who hold a Professional Engineer license. Further, in the United States, starting from 2013, the NCEES Professional Engineer licenses are also available for software engineers.

Contents
 1 History
 2 Profession
 o 2.1 Employment
 o 2.2 Certification
 o 2.3 Impact of globalization
 3 Education
 4 Comparison with other disciplines
 5 Software Process
 o 5.1 Models
   5.1.1 Waterfall model
 6 Subdisciplines
 7 Related disciplines
 o 7.1 Systems engineering
 o 7.2 Computer software engineers
 8 See also
 9 Notes
 10 References
 11 Further reading
 12 External links

History
Main article: History of software engineering
When the first modern digital computers appeared in the early 1940s, the instructions to make them operate were wired into the machine. Practitioners quickly realized that this design was not flexible and came up with the "stored program architecture" or von Neumann architecture. Thus the division between "hardware" and "software" began, with abstraction being used to deal with the complexity of computing.
Programming languages started to appear in the 1950s, and this was another major step in abstraction. Major languages such as Fortran, ALGOL, and COBOL were released in the late 1950s to deal with scientific, algorithmic, and business problems respectively. E.W. Dijkstra wrote his seminal paper, "Go To Statement Considered Harmful", in 1968, and David Parnas introduced the key concepts of modularity and information hiding in 1972 to help programmers deal with the ever-increasing complexity of software systems. A software system for managing the hardware, called an operating system, was also introduced, most notably exemplified by Unix in 1969. In 1967, the Simula language introduced the object-oriented programming paradigm.
These advances in software were met with more advances in computer hardware. In the mid-1970s, the microcomputer was introduced, making it economical for hobbyists to obtain a computer and write software for it. This in turn led to the now-famous personal computer (PC) and Microsoft Windows. The Software Development Life Cycle (SDLC) was also starting to appear as a consensus for the centralized construction of software in the mid-1980s. The late 1970s and early 1980s saw the introduction of several new Simula-inspired object-oriented programming languages, including Smalltalk, Objective-C, and C++.
Open-source software started to appear in the early 1990s in the form of Linux and other software, introducing the "bazaar" or decentralized style of constructing software. Then the World Wide Web and the popularization of the Internet hit in the mid-1990s, changing the engineering of software once again. Distributed systems gained sway as a way to design systems, and the Java programming language was introduced, with its own virtual machine, as another step in abstraction. Programmers collaborated and wrote the Agile Manifesto, which favored more lightweight processes to create cheaper and more timely software.
The current definition of software engineering is still being debated by practitioners as they struggle to come up with ways to produce software that is "cheaper, better, faster". Cost reduction has been a primary focus of the IT industry since the 1990s. Total cost of ownership represents the costs of more than just acquisition; it includes things like productivity impediments, upkeep efforts, and resources needed to support infrastructure.

Profession
Main article: Software engineer
Legal requirements for the licensing or certification of professional software engineers vary around the world.
In the UK, the British Computer Society licenses software engineers, and members of the society can also become Chartered Engineers (CEng), while in some areas of Canada, such as Alberta, Ontario, and Quebec, software engineers can hold the Professional Engineer (P.Eng) designation and/or the Information Systems Professional (I.S.P.) designation. In Canada, there is a legal requirement to hold the P.Eng designation when one wants to use the title "engineer" or practice "software engineering". In the USA, beginning in 2013, the path to licensure for software engineers became a reality. As with the other engineering disciplines, the requirements consist of earning an ABET-accredited bachelor's degree in software engineering (or any non-ABET degree plus an NCEES credentials evaluation), passing the Fundamentals of Engineering Exam, having at least four years of demonstrably relevant experience, and passing the Software Engineering PE Exam. In some states, such as Florida, Texas, Washington, and others, software developers cannot use the title "engineer" unless they are licensed professional engineers who have passed the PE Exam and possess a valid license to practice. This license has to be periodically renewed through continuing education, to ensure that engineers keep up to date with the latest techniques and safest practices.

The IEEE Computer Society and the ACM, the two main US-based professional organizations of software engineering, publish guides to the profession of software engineering. The IEEE's Guide to the Software Engineering Body of Knowledge - 2004 Version, or SWEBOK, defines the
field and describes the knowledge the IEEE expects a practicing software engineer to have. Currently, SWEBOK v3 is being produced and will likely be released in mid-2013. The IEEE also promulgates a "Software Engineering Code of Ethics".

Employment

In 2004, the U.S. Bureau of Labor Statistics counted 760,840 software engineers holding jobs in the U.S.; in the same time period there were some 1.4 million practitioners employed in the U.S. in all other engineering disciplines combined. Due to its relative newness as a field of study, formal education in software engineering is often taught as part of a computer science curriculum, and many software engineers hold computer science degrees.

Many software engineers work as employees or contractors. Software engineers work with businesses, government agencies (civilian or military), and non-profit organizations. Some software engineers work for themselves as freelancers. Some organizations have specialists to perform each of the tasks in the software development process. Other organizations require software engineers to do many or all of them. In large projects, people may specialize in only one role. In small projects, people may fill several or all roles at the same time. Specializations include: in industry (analysts, architects, developers, testers, technical support, middleware analysts, managers) and in academia (educators, researchers).

Most software engineers and programmers work 40 hours a week, but about 15 percent of software engineers and 11 percent of programmers worked more than 50 hours a week in 2008. Injuries in these occupations are rare. However, like other workers who spend long periods in front of a computer terminal typing at a keyboard, engineers and programmers are susceptible to eyestrain, back discomfort, and hand and wrist problems such as carpal tunnel syndrome.

The field's future looks bright according to Money Magazine and Salary.com, which rated software engineer as the best job in the United States in 2006.
In 2012, software engineering was again ranked as the best job in the United States, this time by CareerCast.com.

Certification

The Software Engineering Institute offers certifications on specific topics like security, process improvement, and software architecture. Apple, IBM, Microsoft, and other companies also sponsor their own certification examinations. Many IT certification programs are oriented toward specific technologies and managed by the vendors of these technologies. These certification programs are tailored to the institutions that would employ people who use these technologies.

Broader certification of general software engineering skills is available through various professional societies. As of 2006, the IEEE had certified over 575 software professionals as a Certified Software Development Professional (CSDP). In 2008 they added an entry-level certification known as the Certified Software Development Associate (CSDA). The ACM had a professional certification program in the early 1980s, which was discontinued due to lack of interest. The ACM examined the possibility of professional certification of software engineers in the late 1990s, but eventually decided that such certification was inappropriate for
the professional industrial practice of software engineering. In 2012, Validated Guru began offering the Certified Software Developer (VGCSD) certification, which is heavily influenced by the global community.

In the U.K., the British Computer Society has developed a legally recognized professional certification called Chartered IT Professional (CITP), available to fully qualified Members (MBCS). Software engineers may be eligible for membership of the Institution of Engineering and Technology and so qualify for Chartered Engineer status. In Canada, the Canadian Information Processing Society has developed a legally recognized professional certification called Information Systems Professional (ISP). In Ontario, Canada, software engineers who graduate from a Canadian Engineering Accreditation Board (CEAB) accredited program, successfully complete the PEO's (Professional Engineers Ontario) Professional Practice Examination (PPE), and have at least 48 months of acceptable engineering experience are eligible to be licensed through Professional Engineers Ontario and can become Professional Engineers (P.Eng). The PEO does not recognize any online or distance education, however, and does not consider computer science programs to be equivalent to software engineering programs, despite the tremendous overlap between the two. This has sparked controversy and a certification war. It has also kept the number of P.Eng holders for the profession exceptionally low. The vast majority of working professionals in the field hold a degree in CS, not SE, and given the difficult certification path for holders of non-SE degrees, most never bother to pursue the license.

Impact of globalization

The initial impact of outsourcing, and the relatively lower cost of international human resources in developing third world countries, led to a massive migration of software development activities from corporations in North America and Europe to India and later China, Russia, and other developing countries.
This approach had some flaws, mainly the distance and timezone difference that prevented human interaction between clients and developers, but also the lower quality of the software developed by the outsourcing companies and the massive job transfer. This had a negative impact on many aspects of the software engineering profession. For example, some students in the developed world avoid education related to software engineering because of the fear of offshore outsourcing (importing software products or services from other countries) and of being displaced by foreign visa workers. Although statistics do not currently show a threat to software engineering itself, a related career, computer programming, does appear to have been affected. Nevertheless, the ability to smartly leverage offshore and near-shore resources via the follow-the-sun workflow has improved the overall operational capability of many organizations. When North Americans are leaving work, Asians are just arriving to work. When Asians are leaving work, Europeans are arriving to work. This provides a continuous ability to have human oversight of business-critical processes 24 hours per day, without paying overtime compensation or disrupting a key human resource: sleep patterns.

While global outsourcing has several advantages, global (and generally distributed) development can run into serious difficulties resulting from the distance between developers. These include, but are not limited to, language, communication, cultural, or corporate barriers. Handling global development successfully is a subject of active research in the software engineering community.
Education

Knowledge of programming is a prerequisite to becoming a software engineer. In 2004 the IEEE Computer Society produced the SWEBOK, which has been published as ISO/IEC Technical Report 19759:2004, describing the body of knowledge that they believe should be mastered by a graduate software engineer with four years of experience. Many software engineers enter the profession by obtaining a university degree or training at a vocational school. One standard international curriculum for undergraduate software engineering degrees was defined by the CCSE and updated in 2004. A number of universities have software engineering degree programs; as of 2010, there were 244 campus programs, 70 online programs, 230 masters-level programs, 41 doctorate-level programs, and 69 certificate-level programs in the United States.

In addition to university education, many companies sponsor internships for students wishing to pursue careers in information technology. These internships can introduce the student to interesting real-world tasks that typical software engineers encounter every day. Similar experience can be gained through military service in software engineering.

Comparison with other disciplines

Major differences between software engineering and other engineering disciplines, according to some researchers, result from the costs of fabrication.

Software Process

A set of activities that leads to the production of a software product is known as a software process. Although most software is custom-built, the software engineering market is gradually shifting towards component-based development. Computer-aided software engineering (CASE) tools are used to support the software process activities. However, due to the vast diversity of software processes for different types of products, the effectiveness of CASE tools is limited. There is no ideal approach to the software process that has yet been developed.
Some fundamental activities, like software specification, design, validation, and maintenance, are common to all software processes.

Models

A software process model is an abstraction of a software process. These are also called process paradigms. Various general process models include the waterfall model, the evolutionary development model, and the component-based software engineering model. These are widely used in current software engineering practice. For large systems, they are often used together.

Waterfall model

The waterfall model was one of the first published models of the software process. This model divides the software process into various phases. These phases are:
 Requirements analysis
 Software design
 Unit testing
 System testing
 Maintenance

Theoretically the activities should be performed individually, but in practice they often overlap. During the maintenance stage, the software is put into use. During this stage, additional problems might be discovered and the need for new features may arise. This may require the software to undergo the previous phases once again.

Subdisciplines

Software engineering can be divided into ten subdisciplines. They are:

 Software requirements: The elicitation, analysis, specification, and validation of requirements for software.
 Software design: The process of defining the architecture, components, interfaces, and other characteristics of a system or component. It is also defined as the result of that process.
 Software construction: The detailed creation of working, meaningful software through a combination of coding, verification, unit testing, integration testing, and debugging.
 Software testing: The dynamic verification of the behavior of a program on a finite set of test cases, suitably selected from the usually infinite executions domain, against the expected behavior.
 Software maintenance: The totality of activities required to provide cost-effective support to software.
 Software configuration management: The identification of the configuration of a system at distinct points in time for the purpose of systematically controlling changes to the configuration, and maintaining the integrity and traceability of the configuration throughout the system life cycle.
 Software engineering management: The application of management activities—planning, coordinating, measuring, monitoring, controlling, and reporting—to ensure that the development and maintenance of software is systematic, disciplined, and quantified.
 Software engineering process: The definition, implementation, assessment, measurement, management, change, and improvement of the software life cycle process itself.
 Software engineering tools and methods: The computer-based tools that are intended to assist the software life cycle processes (see computer-aided software engineering), and the methods which impose structure on the software engineering activity with the goal of making the activity systematic and ultimately more likely to be successful.
 Software quality: The degree to which a set of inherent characteristics fulfills requirements.

Related disciplines
Software engineering is a direct subfield of computer science and has some relations with management science. It is also considered a part of overall systems engineering.

Systems engineering

Systems engineers deal primarily with the overall system requirements and design, including hardware and human issues. They are often concerned with partitioning functionality to hardware, software, or human operators. Therefore, the output of the systems engineering process serves as an input to the software engineering process.

Computer software engineers

Computer software engineers are usually systems-level (software engineering, information systems) computer science or software-level computer engineering graduates. This term also includes general computer science graduates with a few years of practical on-the-job experience involving software engineering.

See also

 Software portal
 Software Testing portal

Main article: Outline of software engineering

 Bachelor of Science in Information Technology
 Bachelor of Software Engineering
 List of software engineering conferences
 List of software engineering publications
 Software craftsmanship

Engineering
The steam engine, a major driver in the Industrial Revolution, underscores the importance of engineering in modern history. This beam engine is on display at the main building of the ETSIIM in Madrid, Spain.

Engineering is the application of scientific, economic, social, and practical knowledge in order to design, build, and maintain structures, machines, devices, systems, materials, and processes. It may encompass using insights to conceive, model, and scale an appropriate solution to a problem or objective. The discipline of engineering is extremely broad, and encompasses a range of more specialized fields of engineering, each with a more specific emphasis on particular areas of technology and types of application.

The American Engineers' Council for Professional Development (ECPD, the predecessor of ABET) has defined "engineering" as:

The creative application of scientific principles to design or develop structures, machines, apparatus, or manufacturing processes, or works utilizing them singly or in combination; or to construct or operate the same with full cognizance of their design; or to forecast their behavior under specific operating conditions; all as respects an intended function, economics of operation or safety to life and property.

One who practices engineering is called an engineer, and those licensed to do so may have more formal designations such as Professional Engineer, Chartered Engineer, Incorporated Engineer, Ingenieur, or European Engineer.

Contents
1 History
 o 1.1 Ancient era
 o 1.2 Renaissance era
 o 1.3 Modern era
2 Main branches of engineering
3 Methodology
 o 3.1 Problem solving
 o 3.2 Computer use
4 Social context
5 Relationships with other disciplines
 o 5.1 Science
 o 5.2 Medicine and biology
 o 5.3 Art
 o 5.4 Other fields
6 See also
7 References
8 Further reading
9 External links

History

Main article: History of engineering

Engineering has existed since ancient times, as humans devised fundamental inventions such as the pulley, lever, and wheel. Each of these inventions is consistent with the modern definition of engineering, exploiting basic mechanical principles to develop useful tools and objects.

The term engineering itself has a much more recent etymology, deriving from the word engineer, which itself dates back to 1325, when an engineer (literally, one who operates an engine) originally referred to "a constructor of military engines." In this context, now obsolete, an "engine" referred to a military machine, i.e., a mechanical contraption used in war (for example, a catapult). Notable exceptions to the obsolete usage which have survived to the present day are military engineering corps, e.g., the U.S. Army Corps of Engineers.

The word "engine" itself is of even older origin, ultimately deriving from the Latin ingenium (c. 1250), meaning "innate quality, especially mental power, hence a clever invention."

Later, as the design of civilian structures such as bridges and buildings matured as a technical discipline, the term civil engineering entered the lexicon as a way to distinguish between those specializing in the construction of such non-military projects and those involved in the older discipline of military engineering.

Ancient era
The Ancient Romans built aqueducts to bring a steady supply of clean fresh water to cities and towns in the empire.

The Pharos of Alexandria, the pyramids in Egypt, the Hanging Gardens of Babylon, the Acropolis and the Parthenon in Greece, the Roman aqueducts, Via Appia and the Colosseum, Teotihuacán and the cities and pyramids of the Mayan, Inca and Aztec Empires, the Great Wall of China, the Brihadeshwara temple of Tanjavur and the tombs of India, among many others, stand as a testament to the ingenuity and skill of the ancient civil and military engineers.

The earliest civil engineer known by name is Imhotep. As one of the officials of the Pharaoh Djosèr, he probably designed and supervised the construction of the Pyramid of Djoser (the Step Pyramid) at Saqqara in Egypt around 2630-2611 BC. He may also have been responsible for the first known use of columns in architecture.

Ancient Greece developed machines in both the civilian and military domains. The Antikythera mechanism, the first known mechanical computer, and the mechanical inventions of Archimedes are examples of early mechanical engineering. Some of Archimedes' inventions, as well as the Antikythera mechanism, required sophisticated knowledge of differential gearing or epicyclic gearing, two key principles in machine theory that helped design the gear trains of the Industrial Revolution and are still widely used today in diverse fields such as robotics and automotive engineering.

Chinese, Greek, and Roman armies employed complex military machines and inventions, such as artillery, which was developed by the Greeks around the 4th century B.C., the trireme, the ballista, and the catapult. In the Middle Ages, the trebuchet was developed.

Renaissance era

The first electrical engineer is considered to be William Gilbert, with his 1600 publication of De Magnete, who coined the term "electricity".

The first steam engine was built in 1698 by mechanical engineer Thomas Savery.
The development of this device gave rise to the Industrial Revolution in the coming decades, allowing for the beginnings of mass production.
With the rise of engineering as a profession in the 18th century, the term became more narrowly applied to fields in which mathematics and science were applied to these ends. Similarly, in addition to military and civil engineering, the fields then known as the mechanic arts became incorporated into engineering.

Modern era

The International Space Station represents a modern engineering challenge from many disciplines.

Electrical engineering can trace its origins to the experiments of Alessandro Volta in the 1800s, the experiments of Michael Faraday, Georg Ohm, and others, and the invention of the electric motor in 1872. The work of James Maxwell and Heinrich Hertz in the late 19th century gave rise to the field of electronics. The later inventions of the vacuum tube and the transistor further accelerated the development of electronics, to such an extent that electrical and electronics engineers currently outnumber their colleagues of any other engineering specialty.

The inventions of Thomas Savery and the Scottish engineer James Watt gave rise to modern mechanical engineering. The development of specialized machines and their maintenance tools during the Industrial Revolution led to the rapid growth of mechanical engineering, both in its birthplace Britain and abroad.

John Smeaton was the first self-proclaimed civil engineer and is often regarded as the "father" of civil engineering. He was an English civil engineer responsible for the design of bridges, canals, harbours, and lighthouses. He was also a capable mechanical engineer and an eminent physicist. Smeaton designed the third Eddystone Lighthouse (1755–59), where he pioneered the use of hydraulic lime (a form of mortar which will set under water) and developed a technique involving dovetailed blocks of granite in the building of the lighthouse. His lighthouse remained in use until 1877 and was then dismantled and partially rebuilt at Plymouth Hoe, where it is known as Smeaton's Tower.
He is important in the history, rediscovery, and development of modern cement, because he identified the compositional requirements needed to obtain "hydraulicity" in lime, work which led ultimately to the invention of Portland cement.

Chemical engineering, like its counterpart mechanical engineering, developed in the nineteenth century during the Industrial Revolution. Industrial-scale manufacturing demanded new materials and new processes, and by 1880 the need for large-scale production of chemicals was such that a new industry was created, dedicated to the development and large scale