2. APPLICATIONS OF COMPUTER IN RESEARCH
Introduction:
Computers have become an essential tool in the research process.
The main components of a computer are an input device, a central processing unit (CPU) and an output device.
Computers are indispensable in research, whether for academic or commercial purposes.
Computers play a major role today in every field of scientific research, from genetic engineering to astrophysics.
Computers connected to the internet led the way to a globalized information portal, the World Wide Web.
Using the WWW, researchers can conduct research on a massive scale.
Various programs and applications have simplified the computational side of the research process.
In this module, various computer software applications and tools are discussed with respect to research activities
such as data collection, analysis, etc.
Objectives:
To understand the features of computers.
To know the various steps involved in the research process.
To understand the role of computers in research publication.
To introduce the analysis tools used in the research process.
3. Features of a Computer:
There are many reasons why computers are so important in scientific research; some of them are:
SPEED: A computer can process numbers and information in a very short time, so researchers can process and analyze data quickly.
The time saved allows researchers to conduct further research. A calculation that may take a person several hours to process will take
a computer mere minutes, if not seconds.
STORAGE: A computer can store and retrieve huge amounts of data, available whenever needed. There is no risk of forgetting or
losing data.
ACCURACY: Computers are incredibly accurate, and accuracy is vital in scientific research. A wrong calculation could
result in an entire research project being filled with incorrect information.
ORGANIZATION: We can store millions of pages of information using simple folders, word processors and computer programs.
A computer is more productive and safer than a paper filing system, in which anything can easily be misplaced.
CONSISTENCY: A computer does not make mistakes through tiredness or lack of concentration the way a human being does. This characteristic
makes it exceptionally important in scientific research: large calculations can be done with both accuracy and speed with the help of a
computer.
4. AUTOMATION: Programs run on a computer automatically, following the instructions they are given.
Computational Tools: Computers began as powerful calculators, and that service remains important to research today.
Huge amounts of data can be processed with the help of computers. Statistical programs, modelling programs and spatial mapping
tools are all possible uses of computers. Researchers can use information in new ways, for example layering different types of maps
on one another to discover new patterns in how people use their environment.
Communication: Building knowledge through research requires communication between experts, both to identify new areas
requiring research and to debate results and ideas. Before the invention of computers, this was accomplished through papers
and workshops. Now, the world's experts can communicate via web chats or email, and information can be spread in various ways,
for example through virtual conferences.
Mobility: Researchers can take computers anywhere, making it easier to conduct field research and collect large amounts of data. New
research in remote areas or at the community level is made possible by the mobility of computers. Social media sites provide a
new medium for interacting with society and collecting information.
5. The Steps in the Research Process: The research process consists of a series of actions necessary to carry out research work effectively.
The sequence of these steps is listed below:
(1) Formulating the research problem;
(2) Extensive literature survey;
(3) Developing the hypothesis;
(4) Preparing the research design;
(5) Determining sample design;
(6) Data Collection;
(7) Project Execution;
(8) Data Analysis;
(9) Hypothesis testing;
(10) Generalizations and interpretation;
(11) Preparation of the report or presentation of the results, i.e., formal write-up of conclusions of the research.
6. Computers in Research:
There are various steps necessary to carry out research effectively, and a desired sequencing of these steps in the research
process.
Computers can be used at every phase of the research process.
There are five major phases of the research process:
1. Conceptual phase
2. Design and planning phase
3. Data collection phase
4. Data Analysis phase and
5. Research Publication phase
7. Conceptual Phase and Computer :
The conceptual phase consists of formulating the research problem, an extensive literature survey, building the theoretical framework and
developing the hypothesis.
Computers help in searching the existing literature in the relevant field of research.
They help in finding relevant existing research papers so that the researcher can identify the gap in the existing literature.
Computers can search literature and bibliographic references stored in electronic databases on the World Wide
Web.
They can also store relevant published articles to be retrieved whenever needed.
This has a clear advantage over searching for literature in journals, books and other newsletters at libraries, which
consumes considerable time and effort.
Bibliographic references can also be stored on the World Wide Web.
With modern software, references can easily be written in different styles.
The researcher need not visit libraries, which frees up more time for research.
Computers also help researchers see how a theoretical framework can be built.
8. Design and Planning Phase and Computer
Computers can be used for deciding the population sample, designing questionnaires and collecting data.
There are various internet sites which help to design questionnaires.
Software can be used to calculate the sample size.
This makes a pilot study of the research practical.
In a pilot study, sample size calculations and standard deviations are required.
The computer helps in doing all these activities.
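The sample size calculation mentioned above can be sketched in a few lines of Python. This is a minimal illustration using the standard formula n = (z·σ/E)² for estimating a population mean; the pilot-study values below are invented for illustration, not taken from this module.

```python
import math

def sample_size_for_mean(sd, margin_of_error, z=1.96):
    """Minimum sample size to estimate a population mean.

    sd: standard deviation estimated from a pilot study.
    margin_of_error: acceptable error E, in the same units as sd.
    z: z-value for the desired confidence level (1.96 for 95%).
    """
    return math.ceil((z * sd / margin_of_error) ** 2)

# e.g. a pilot study gives sd = 12.5 and we want the mean within +/- 2 units
n = sample_size_for_mean(12.5, 2.0)
print(n)
```

The same one-line formula is what most "sample size calculator" websites compute behind the scenes.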
Role of Computers in the Data Collection Phase
The empirical phase consists of collecting and preparing the data for analysis.
In research studies, the preparation and computation of data are the most labor-intensive and time-consuming aspects of the
work.
Typically the data are initially recorded on a questionnaire or record in a form suitable for acceptance by the computer.
To do this, the researcher, working with the statistician and the programmer, converts the data into a Microsoft Word file, an
Excel spreadsheet or a statistical software data file.
These data can then be used directly with statistical software for analysis.
9. Data collection and Storage:
The data obtained from the research subjects are stored in computers as Word files, Excel spreadsheets or
statistical software data files.
This has the advantage of allowing corrections, or editing of the whole layout of tables if needed, which is impossible
or very time-consuming with handwritten records.
Thus, computers help in data editing, data entry and data management, including follow-up actions. Computers also allow
greater flexibility in recording and processing the data as they are collected, as well as greater ease during the analysis of
these data.
Data exposition:
Researchers are eager to see the data: what they look like, how they are distributed, and so on. Researchers also examine
different dimensions of the variables, or plot them in various charts using a statistical application.
Data Analysis and Computer:
This phase consists of the statistical analysis of the data, the interpretation of results and hypothesis testing.
Data analysis and interpretation can be done with the help of computers.
Software is available for data analysis.
Such software supports analysis techniques like averages, percentages, correlation and other mathematical
calculations.
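As a minimal illustration of the techniques just listed (average, percentage, correlation), the following Python sketch computes them for a small invented dataset; the variable names and values are hypothetical, not from any real study.

```python
from statistics import mean

# hypothetical data: hours studied vs. exam score for six respondents
hours  = [2, 4, 5, 7, 8, 10]
scores = [50, 58, 60, 68, 74, 82]

avg_hours = mean(hours)                                          # average
avg_score = mean(scores)
pct_passing = 100 * sum(s >= 60 for s in scores) / len(scores)   # percentage

def pearson(x, y):
    """Pearson correlation coefficient, computed from the definition."""
    mx, my = mean(x), mean(y)
    cov = sum((a - mx) * (b - my) for a, b in zip(x, y))
    sx = sum((a - mx) ** 2 for a in x) ** 0.5
    sy = sum((b - my) ** 2 for b in y) ** 0.5
    return cov / (sx * sy)

r = pearson(hours, scores)
print(avg_hours, round(avg_score, 1), round(pct_passing, 1), round(r, 3))
```

Statistical packages such as SPSS perform the same calculations through menus instead of code.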
10. Software used for data analysis includes SPSS, GRETL, STATA, SYSTAT, etc.
Computers are useful not only for statistical analysis, but also for monitoring the accuracy and completeness of the data as they
are collected.
This software can also display the results as charts or graphs.
Computers are used in interpretation as well.
They can check the accuracy and authenticity of the data.
They help in drafting tables from which a researcher can interpret the results easily.
These tables give clear support for the interpretation made by the researcher.
Role of Computers in Research Publication
After interpretation, the computer helps in converting the results into a research article or report that can be published.
This phase consists of preparing the report or presenting the results, i.e., the formal write-up of the conclusions reached.
This is the research publication phase.
The research article, research paper, thesis or dissertation is typed in word
processing software, converted to Portable Document Format (PDF) and stored and/or published on the World Wide Web.
Online sites are available through which a Word file can be converted into other formats such as HTML or PDF.
Various online applications are also available for this purpose.
One can even prepare a document using online word processing software and store, edit and access it from anywhere over the
internet.
11. References and Computers:
After completing the document, a researcher needs to give the sources of the literature studied and discussed in the references.
Computers also help in preparing references.
References can be written in different styles.
The details of authors, journals, publication volumes and books can be entered into the "References" feature of a word processor, which
automatically formats the information in the required style.
Reference management software is used to manage the references.
A researcher need not worry about remembering all the articles from which literature was taken; this can easily be managed
with the help of computers.
Simulation:
Simulation is the imitation of the operation of a real-world process or system over time.
Simulation is used in many contexts, such as simulation of technology for performance optimization, safety engineering,
testing, training, education, and video games.
Often, computer experiments are used to study simulation models.
Simulation can be used to show the eventual real effects of alternative conditions and courses of action.
Simulation is mainly used when the real system cannot be engaged, because it may not be accessible, or it may be dangerous
or unacceptable to engage, or it is being designed but not yet built, or it may simply not exist.
Using computers, simulation is carried out in research across various fields.
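As a minimal illustration of computer simulation, the classic Monte Carlo method estimates a quantity by running many random trials; the sketch below estimates π by sampling random points in the unit square and counting how many fall inside the quarter circle. The parameter values are illustrative.

```python
import random

def estimate_pi(n_points, seed=42):
    """Monte Carlo simulation: sample n_points random points in the unit
    square and count the fraction inside the quarter circle of radius 1."""
    rng = random.Random(seed)
    inside = sum(
        1 for _ in range(n_points)
        if rng.random() ** 2 + rng.random() ** 2 <= 1.0
    )
    return 4 * inside / n_points   # area ratio times 4 approximates pi

print(estimate_pi(100_000))
```

Real scientific simulations follow the same pattern at much larger scale: model the system, run many trials, and study the distribution of outcomes.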
12. Role of Computers in Scientific Research:
There are various computer applications used in scientific research.
Some of the most important applications used in scientific research are data storage, data analysis, scientific simulations,
instrumentation control and knowledge sharing.
Data Storage
Experimentation is the basis of scientific research.
A scientific experiment in any of the natural sciences generates a lot of data that must be stored and analyzed to derive
important conclusions and to validate or disprove hypotheses.
Computers attached to experimental apparatus directly record data as they are generated and subject them to analysis through
specially designed software.
Data can be stored in an SPSS data file, a Lotus spreadsheet, an Excel spreadsheet, a DOS text file, etc.
Data Analysis
Analyzing huge amounts of statistical data is made possible by specially designed algorithms implemented on
computers.
This reduces the extremely time-consuming job of data analysis to a matter of a few minutes.
In genetic engineering, computers have made the sequencing of the entire human genome possible.
Data obtained from different sources can be stored and accessed via computer networks set up in research labs, which makes
collaboration simpler.
13. Scientific Simulations
One of the prime uses of computers in pure science and engineering projects is the running of simulations.
A simulation is a mathematical modeling of a problem and a virtual study of its possible solutions.
For example, astrophysicists carry out structure formation simulations, which are aimed at studying how large-scale structures
like galaxies are formed.
Space missions to the Moon, satellite launches and interplanetary missions are first simulated on computers to determine the
best path that can be taken by the launch vehicle and spacecraft to reach its destination safely.
Instrumentation Control
Most advanced scientific instruments come with their own on-board computer, which can be programmed to execute various
functions.
For example, the Hubble Space Telescope has its own onboard computer system, which is remotely programmed to probe deep
space.
Instrumentation control is one of the most important applications of computers
14. Knowledge Sharing through Internet
In the form of Internet, computers have provided an entirely new way to share knowledge.
Today, anyone can access the latest research papers that are made available for free on websites.
Sharing of knowledge and collaboration through the Internet has made international cooperation on scientific projects
possible.
Through various kinds of analytical software programs, computers are contributing to scientific Research in every discipline,
ranging from biology to astrophysics, discovering new patterns and providing novel insights.
As work on neural-network-based artificial intelligence advances and computers gain the ability to learn and
think for themselves, future advances in technology and research will be even more rapid.
15. Tools and Applications Used in the Research Process
Statistical Analysis Tool:
SPSS
SPSS is the most popular tool for statisticians.
SPSS stands for Statistical Package for the Social Sciences.
It provides analysis facilities such as the following, and many more:
Data view & variable view
Measures of central tendency & dispersion
Statistical inference
Correlation & regression analysis
Analysis of variance
Non-parametric tests
Hypothesis tests: t-test, chi-square, z-test, ANOVA, bivariate tests, etc.
Multivariate data analysis
Frequency distribution
Data exposition using various graphs such as line, scatter, bar, ogive, histogram and pie charts
16. Computation is the process of converting input of one form into some desired output form using certain control
actions.
In computation, the input is called the antecedent and the output is called the consequent.
A mapping function converts the input of one form into the desired output form using certain control actions.
The concept of computing applies mainly to computer science and engineering.
There are two types of computing:
hard computing and soft computing.
Hard computing is a process in which we program the computer to solve problems using existing mathematical
algorithms, which provide a precise output value.
One of the fundamental examples of hard computing is a numerical problem.
Hard computing is used in solving deterministic problems, prototyping, and modeling. Applications of hard
computing include mobile robot coordination and forecasting combinatorial problems.
17. Deterministic problem:
Given the same input, a deterministic program produces the same output every time.
Given the same input, it also takes the same amount of time, memory and other resources every time it is run.
What is prototype model example in real life?
In manufacturing, a prototype is a refined version of your product based on user feedback. For example, when
developing a car, the manufacturer starts with a prototype— or model — that costs less and incorporates new
technology.
Modeling involves making a representation of something. Creating a tiny, functioning volcano is an example of
modeling.
As an example of a combinatorial decision problem, consider the Graph Colouring Problem: given a graph G and
a number of colours, find an assignment of colours to the vertices of G such that two vertices that are connected by an
edge are never assigned the same colour.
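For small instances, the Graph Colouring Problem described above can be solved by backtracking search. The sketch below is a minimal illustration; the graphs and vertex names are invented for the example.

```python
def color_graph(adjacency, n_colors):
    """Backtracking search for the Graph Colouring Problem: assign one of
    n_colors to each vertex so that no edge joins two same-coloured
    vertices. Returns a dict vertex -> colour, or None if impossible."""
    vertices = list(adjacency)
    assignment = {}

    def ok(v, c):
        # colour c is allowed if no already-coloured neighbour uses it
        return all(assignment.get(u) != c for u in adjacency[v])

    def solve(i):
        if i == len(vertices):
            return True
        v = vertices[i]
        for c in range(n_colors):
            if ok(v, c):
                assignment[v] = c
                if solve(i + 1):
                    return True
                del assignment[v]   # backtrack
        return False

    return assignment if solve(0) else None

# a 4-cycle is 2-colourable; a triangle needs 3 colours
square = {"a": ["b", "d"], "b": ["a", "c"], "c": ["b", "d"], "d": ["a", "c"]}
triangle = {"x": ["y", "z"], "y": ["x", "z"], "z": ["x", "y"]}
print(color_graph(square, 2))
print(color_graph(triangle, 2))   # no valid 2-colouring exists
print(color_graph(triangle, 3))
```

Backtracking works here because any partial colouring that violates a constraint can be abandoned immediately; for large graphs the problem is NP-hard and heuristic methods are used instead.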
18. What is Soft Computing?
Soft computing is an approach in which we compute solutions to existing complex problems whose output results
are imprecise or fuzzy in nature. One of the most important features of soft computing is that it should be adaptive, so that
any change in the environment does not affect the present process.
The following are the characteristics of soft computing.
•It does not require any mathematical modeling to solve a given problem
•It may give different solutions to a problem with the same input from time to time
•It uses biologically inspired methodologies such as genetics, evolution, particle swarming and the human nervous
system
•It is adaptive in nature.
Swarm intelligence refers to collective intelligence. Biologists and natural scientists have been studying the
behavior of social insects because of their efficiency in solving complex problems, such as finding the shortest path
between their nest and a food source, or organizing their nests. When working together as unified systems, swarms can
outperform the vast majority of their individual members in solving problems and making decisions. Scientists call
this "swarm intelligence", and it proves the old adage that many minds are better than one.
19. Soft computing is defined as a group of computational techniques based on artificial intelligence (human-like
decision making) and natural selection that provide quick and cost-effective solutions to very complex problems for which
analytical (hard computing) formulations do not exist.
The term soft computing was coined by Zadeh.
Soft computing aims at finding precise approximations that give robust, computationally efficient and
cost-effective solutions, saving computational time.
Most of these techniques are inspired by biological phenomena and social behavioural
patterns.
Soft computing is an important branch of computational intelligence, in which fuzzy logic, probability theory, neural
networks and genetic algorithms are used synergistically to mimic the reasoning and decision making of a human.
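As a small illustration of the fuzzy-logic component mentioned above, a fuzzy set assigns each input a *degree* of membership between 0 and 1 rather than a yes/no answer. The triangular membership function and the temperature cut-offs below are invustrative inventions, not part of this module.

```python
def triangular(x, a, b, c):
    """Triangular fuzzy membership function: the degree rises linearly
    from 0 at a to a peak of 1 at b, then falls back to 0 at c."""
    if x <= a or x >= c:
        return 0.0
    if x == b:
        return 1.0
    if x < b:
        return (x - a) / (b - a)
    return (c - x) / (c - b)

# fuzzy set "comfortable room temperature", peaking at 22 degrees C
for t in (15, 19, 22, 25, 30):
    print(t, "->", triangular(t, 16, 22, 28))
```

A crisp (hard computing) rule would classify 21.9 °C and 22.1 °C identically or oppositely at a sharp threshold; the fuzzy version grades the transition smoothly, which is what makes fuzzy controllers tolerant of imprecision.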
Soft computing contrasts with possibility theory, an approach used when there is not enough information available to
solve a problem. In contrast, soft computing is used where the problem is not adequately specified for the use of
conventional mathematical and computational techniques. Soft computing has numerous real-world applications in
domestic, commercial and industrial situations.
20. Soft computing is the use of approximate calculations to provide imprecise but usable solutions to complex
computational problems. The approach enables solutions for problems that may be either unsolvable or just too
time-consuming to solve with current hardware. Soft computing is sometimes referred to as computational
intelligence.
Soft computing provides an approach to problem-solving using means other than conventional precise computation. With the human mind
as a role model, soft computing is tolerant of partial truths, uncertainty, imprecision and approximation, unlike
traditional computing models. This tolerance allows researchers to approach some problems that
traditional computing cannot process.
Soft computing uses component fields of study in:
•Fuzzy logic
•Machine learning
•Probabilistic reasoning
•Evolutionary computation
•Perceptron
•Genetic algorithms
•Differential evolution
•Support vector machines
•Metaheuristics
•Swarm intelligence
•Ant colony optimization
•Particle swarm optimization
•Bayesian networks
•Artificial neural networks
21. Characteristics of Soft Computing:
•Soft computing provides an approximate yet usable solution for real-life problems.
•The algorithms of soft computing are adaptive, so the current process is not affected by any kind of change in the
environment.
•The concept of soft computing is based on learning from experimental data. It means that soft computing does
not require any mathematical model to solve the problem.
•Soft computing helps users solve real-world problems by providing approximate results where conventional and
analytical models fail.
•It is based on Fuzzy logic, genetic algorithms, machine learning, ANN, and expert systems.
22. Need of soft computing
Sometimes, conventional computing or analytical models do not provide a solution to some real-world problems. In
that case, we require another technique, such as soft computing, to obtain an approximate solution.
•Hard computing is used for solving mathematical problems that need a precise answer. It fails to provide solutions for
some real-life problems. Thereby for real-life problems whose precise solution does not exist, soft computing helps.
•When conventional mathematical and analytical models fail, soft computing helps, e.g., You can map even the
human mind using soft computing.
•Analytical models can be used for solving mathematical problems and valid for ideal cases. But the real-world
problems do not have an ideal case; these exist in a non-ideal environment.
•Soft computing is not only limited to theory; it also gives insights into real-life problems.
•For all the above reasons, soft computing helps to map the human mind, which is not possible with
conventional mathematical and analytical models.
23. Elements of soft computing
Soft computing is viewed as a foundational component of the emerging field of computational intelligence. Fuzzy Logic
(FL), Machine Learning (ML), Neural Networks (NN), Probabilistic Reasoning (PR), and Evolutionary Computation
(EC) are the constituents of soft computing; these are the techniques soft computing uses to resolve
complex problems.
24. Genetic Algorithm in Soft Computing
The genetic algorithm was introduced by Prof. John Holland in 1965. It is used to solve problems based on principles
of natural selection, and comes under the evolutionary algorithms. GAs are usually used for optimization problems such as
maximization and minimization of objective functions; related population-based approaches include ant colony and
particle swarm optimization. The genetic algorithm follows biological processes like genetics and evolution.
Functions of the Genetic Algorithm
The genetic algorithm can solve problems that cannot be solved in real time, also known as NP-hard
problems. Complicated problems which cannot be solved mathematically can be tackled by applying the
genetic algorithm. It is a heuristic or randomized search method, which starts from an initial set of solutions and
generates a solution to the problem efficiently and effectively.
25. Soft computing vs hard computing
Hard computing uses existing mathematical algorithms to solve certain problems. It provides a precise and exact
solution of the problem. Any numerical problem is an example of hard computing.
On the other hand, soft computing is a different approach from hard computing. In soft computing, we compute
solutions to existing complex problems. The results calculated by soft computing are not precise;
they are imprecise and fuzzy in nature.
26. Soft Computing vs Hard Computing:

Parameter        | Soft Computing                                              | Hard Computing
-----------------|-------------------------------------------------------------|---------------------------------------------------
Computation time | Takes less computation time.                                | Takes more computation time.
Dependency       | Based on approximation and dispositionality.                | Based mainly on binary logic and numerical systems.
Computation type | Parallel computation.                                       | Sequential computation.
Result/output    | Approximate result.                                         | Exact and precise result.
Example          | Neural networks such as Madaline, Adaline, ART networks.    | Any numerical problem, or traditional methods of solving using personal computers.
27. Advantages:
The advantages of soft computing are:
•Simple mathematical calculations are performed
•Good efficiency
•Applicable in real time
•Based on human reasoning.
Disadvantages
The disadvantages of soft computing are:
•It gives an approximate output value
•If a small error occurs, the entire system stops working; to overcome this, the entire system must be corrected from the
beginning, which is a time-consuming process.
28. Structure of GA
GAs are categorised under the umbrella term Evolutionary Algorithms, which describes
computer-based problem-solving systems that use computational models of evolutionary processes
as key elements in their design and implementation.
A variety of evolutionary algorithms have been proposed, of which the major ones are: GAs,
evolutionary programming, evolution strategies, classifier systems, and genetic programming.
They all share a common conceptual base of simulating the 'evolution' of individual structures via
processes of selection, mutation and reproduction.
The existing GAs are founded upon the following main principles:
1. Reproduction
2. Fitness
3. Crossover
4. Mutation
29. GA algorithms follow a standard procedure:
1. Start with a randomly generated population of n l-bit strings (candidate solutions to a problem).
(These "solutions" are not to be confused with "answers" to the problem; think of them as possible
characteristics that the system would employ in order to reach the answer.)
2. Calculate the fitness f(x) of each string in the population.
3. Repeat the following steps until n new strings have been created:
1. Select a pair of parent strings from the current population, the probability of selection being
an increasing function of fitness. Selection is done "with replacement" meaning that the same
string can be selected more than once to become a parent.
2. With the crossover probability, cross over the pair at a randomly chosen point to form two new
strings. If no crossover takes place, form two new strings that are exact copies of their
respective parents.
3. Mutate the two new strings at each locus with the mutation probability, and place the resulting
strings in the new population.
4. Replace the current population with the new population.
5. Go to step 2.
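The procedure above can be sketched as a minimal working GA. The sketch below applies it to the toy "onemax" problem (fitness = number of 1-bits, so the optimum is the all-ones string); all parameter values and names are illustrative choices, not prescribed by the procedure itself.

```python
import random

def genetic_algorithm(n=20, l=16, crossover_p=0.7, mutation_p=0.01,
                      generations=100, seed=1):
    """Minimal GA following the standard procedure, on the 'onemax'
    toy problem: the fitness of a bit string is the number of 1s."""
    rng = random.Random(seed)
    fitness = lambda s: sum(s)

    # step 1: random population of n l-bit strings
    pop = [[rng.randint(0, 1) for _ in range(l)] for _ in range(n)]

    for _ in range(generations):
        # step 2: fitness of each string
        scores = [fitness(s) for s in pop]
        total = sum(scores) or 1

        def select():
            # fitness-proportional selection, with replacement
            r = rng.uniform(0, total)
            acc = 0
            for s, f in zip(pop, scores):
                acc += f
                if acc >= r:
                    return s
            return pop[-1]

        # step 3: create n new strings
        new_pop = []
        while len(new_pop) < n:
            p1, p2 = select(), select()
            if rng.random() < crossover_p:           # single-point crossover
                cut = rng.randint(1, l - 1)
                c1 = p1[:cut] + p2[cut:]
                c2 = p2[:cut] + p1[cut:]
            else:                                    # exact copies
                c1, c2 = p1[:], p2[:]
            for child in (c1, c2):                   # mutate each locus
                new_pop.append([b ^ 1 if rng.random() < mutation_p else b
                                for b in child])
        pop = new_pop[:n]                            # step 4: replace

    return max(pop, key=fitness)                     # best string found

best = genetic_algorithm()
print(best, sum(best))
```

With these (arbitrary) settings the population reliably climbs towards the all-ones optimum, illustrating how selection, crossover and mutation cooperate.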
31. How the flowchart principles are used
Main parameters:
The characteristics of the algorithm that have a major impact on the outcome when processing data
with a GA are the implementation of crossover and the fitness function.
Crossover:
During crossover, a random group (varying in size) of nucleotides of two lined-up DNA strands is
swapped, as shown in the figure.
Figure: Crossover. The red and green lines are DNA strands (which are strings of bits in a
GA situation).
32. The algorithms code crossover either as a swap of one bit, which is more like a single-point mutation
than a 'real' crossover, or as a swap of several bits (used with genetic programming). A distinction is made
between two identical parents (bit strings, called chromosomes in GA terminology), two different parents,
and a single parent. In any case, the process follows this procedure:
1.Select two bit strings (chromosomes), or, in the case of genetic programming, select a branch of
each parent.
2.Cut the chromosomes (or branches) at a particular location.
3.Swap the bits/branches of the two parents.
Single-bit crossover
111111 ------------> 111011
000000 crossover 000100
Multiple-bit crossover
The first string 01001000 and the second string 11101111 could be crossed over after
the fifth bit (locus) in each to produce the two offspring:
01001111 and
11101000.
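The two examples above can be reproduced in a few lines of Python, with string slicing standing in for cutting and swapping the chromosomes:

```python
def single_point_crossover(parent1, parent2, locus):
    """Swap everything after `locus` between two equal-length bit strings."""
    assert len(parent1) == len(parent2)
    child1 = parent1[:locus] + parent2[locus:]
    child2 = parent2[:locus] + parent1[locus:]
    return child1, child2

# the multiple-bit example: crossover after the fifth bit (locus 5)
print(single_point_crossover("01001000", "11101111", 5))

# the single-bit example: exchange only the fourth bit (index 3)
a, b = "111111", "000000"
swapped1 = a[:3] + b[3] + a[4:]
swapped2 = b[:3] + a[3] + b[4:]
print(swapped1, swapped2)
```

Running this reproduces the offspring shown above: ('01001111', '11101000') and 111011 000100.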
33. What Is Simulated Annealing?
Simulated annealing is a method for solving unconstrained and bound-constrained optimization problems. The method models the physical
process of heating a material and then slowly lowering the temperature to decrease defects, thus minimizing the system energy.
Simulated Annealing (SA) is an effective and general form of optimization. It is useful in finding global optima in the presence
of large numbers of local optima. “Annealing” refers to an analogy with thermodynamics, specifically with the way that metals
cool and anneal. Simulated annealing uses the objective function of an optimization problem instead of the energy of a material.
The algorithm is basically hill-climbing, except that instead of picking the best move, it picks a random move. If the selected move
improves the solution, then it is always accepted. Otherwise, the algorithm makes the move anyway with some probability less
than 1. This probability decreases exponentially with the "badness" of the move, which is the amount deltaE by which the
solution is worsened (i.e., the energy is increased).
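The acceptance rule just described can be sketched as a generic SA loop. The objective function, neighbour move and cooling schedule below are invented for illustration; only the accept/reject logic comes from the description above.

```python
import math
import random

def simulated_annealing(energy, neighbour, state, t0=10.0, cooling=0.995,
                        steps=5000, seed=0):
    """Generic SA loop: always accept an improving move; accept a
    worsening move with probability exp(-deltaE / T), where the
    temperature T decreases geometrically each step."""
    rng = random.Random(seed)
    best, best_e = state, energy(state)
    e, t = best_e, t0
    for _ in range(steps):
        cand = neighbour(state, rng)
        delta = energy(cand) - e
        if delta <= 0 or rng.random() < math.exp(-delta / t):
            state, e = cand, e + delta
            if e < best_e:              # remember the best state seen
                best, best_e = state, e
        t *= cooling                    # cool the system
    return best, best_e

# toy objective: minimise f(x) = (x - 3)^2 + 2, starting far from the optimum
f = lambda x: (x - 3.0) ** 2 + 2.0
step = lambda x, rng: x + rng.uniform(-0.5, 0.5)
x, fx = simulated_annealing(f, step, state=-10.0)
print(round(x, 2), round(fx, 2))
```

Early on, the high temperature lets the search accept uphill moves and escape local minima; as T shrinks, the loop behaves more and more like plain hill-climbing.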