The human brain is the most valuable creation of God; man is intelligent because of the brain. "Blue Brain" is the name of the world's first virtual brain, meaning a machine that can function as a human brain. Today, scientists are researching how to create an artificial brain that can think, respond, take decisions, and keep anything in memory. The main aim is to upload the human brain into a machine, so that a man can think and take decisions without any effort. After the death of the body, the virtual brain will act as the man; so, even after a person's death, we will not lose the knowledge, intelligence, personality, feelings, and memories of that man, and these can be used for the development of human society.
Blue Brain: Bringing a Virtual Brain to Life (IJARIIT)
Man is intelligent because of the brain, but the brain, with all its knowledge and power, is destroyed after a man's death. BLUE BRAIN is the name of the world's first virtual brain: a machine that functions like a human brain. It can think, take decisions, respond, and store things in memory. The research involves studying slices of living brain tissue using microscopes and patch-clamp electrodes. Data is collected about the many different neuron types and used to build biologically realistic models of neurons and of networks of neurons in the cerebral cortex. The simulations are carried out on a Blue Gene supercomputer built by IBM.
In this paper, we concentrate on the application of Blue Brain to "cracking the neural code" as well as its use in human memory loss. The neural code refers to how the human brain builds images using electrical patterns, and cracking the neural code means finding the patterns and meaning in the noisy activity of cell ensembles. Human memory loss includes conditions such as Alzheimer's disease and short-term memory loss.
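The modeling workflow described above (measure single-neuron properties, fit a model, then simulate networks of such models) can be illustrated with a minimal sketch. The example below uses a leaky integrate-and-fire neuron, a drastic simplification of the biologically detailed models the Blue Brain Project actually builds; all parameter values here are generic illustrative assumptions, not project data.

```python
def simulate_lif(i_input, t_max=0.1, dt=1e-4, tau=0.02,
                 v_rest=-0.065, v_thresh=-0.050, v_reset=-0.065, r_m=1e7):
    """Euler integration of a leaky integrate-and-fire neuron.

    i_input : constant input current (A)
    tau     : membrane time constant (s)
    r_m     : membrane resistance (ohm)
    Returns the list of spike times (s).
    """
    v = v_rest
    spikes = []
    for step in range(int(t_max / dt)):
        t = step * dt
        # Membrane voltage decays toward rest, driven by the input current.
        dv = (-(v - v_rest) + r_m * i_input) / tau
        v += dv * dt
        if v >= v_thresh:
            # Threshold crossing: record a spike and reset the membrane.
            spikes.append(t)
            v = v_reset
    return spikes

# A 2 nA constant current drives the neuron above threshold repeatedly.
print(simulate_lif(2e-9))
```

In the full project, each neuron model is instead fitted to the patch-clamp recordings mentioned above and has detailed morphology and ion-channel dynamics; this sketch only shows the integrate-then-fire principle that such simulations iterate at scale.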
The Blue Brain Project, a Swiss national brain initiative, aims to create a digital reconstruction of the brain by reverse-engineering mammalian brain circuitry. Founded in May 2005 by the Brain and Mind Institute of the École Polytechnique Fédérale de Lausanne (EPFL) in Switzerland, the project's mission is to use biologically detailed digital reconstructions and simulations of the mammalian brain (brain simulation) to identify the fundamental principles of brain structure and function in health and disease.
It is said that within 30 years we will be able to scan ourselves into computers.
The Blue Brain is the first virtual brain in the world: a machine that can work like the human brain. At present, scientists are trying to make a virtual brain that will be able to make decisions and keep information in memory. The idea is to upload the human brain into the machine, so that a man can think without any effort. The main advantage of this project is that even after the death of a person, we can use the knowledge and intelligence of that person.
Blue Brain is the name of the world's first virtual brain: a machine that can function as a human brain. Today, scientists are researching how to create an artificial brain that can think, respond, take decisions, and keep anything in memory. The main aim is to upload the human brain into a machine, so that a man can think and take decisions without any effort. After the death of the body, the virtual brain will act as the man; so, even after a person's death, we will not lose the knowledge, intelligence, personality, feelings, and memories of that man, and these can be used for the development of human society. No one has ever fully understood the complexity of the human brain; it is more complex than any circuitry in the world. So the question may arise: "Is it really possible to create a human brain?" The answer is "Yes", because whatever man has created, he has always followed nature. When man did not yet have a device called the computer, it was a big question for all; but today it is possible thanks to technology, which is growing faster than anything else. IBM is now researching how to create a virtual brain, called the Blue Brain. If successful, this would be the first virtual brain in the world.
With the introduction of Blue Brain technology, a form of reverse engineering, we may be able to address brain disorders and diseases. Blue Brain is the name of the world's first virtual brain, which makes a machine function as a human brain. Even after a person's death, the complete functional attributes of a human brain can be stored in it and used for further development.
Blue Brain enables humans to give new dimensions to science and technology and to make enormous progress on the present scenario. Further details can be found in the accompanying PowerPoint presentation.
This is a complete report on Blue Brain technology. It is a very large and very costly project. IBM is working on it together with EPFL (École Polytechnique Fédérale de Lausanne), with funding provided by the Swiss government. A filmmaker is also making a movie about this technology, and a YouTube channel is available as well.
The main intention of this paper is to reframe the possibilities in healthcare with the aid of Blue Brain technology. In general, the blue brain is usually associated with preserving the intelligence of individuals for the future. This paper steps ahead by describing the other possible solutions that can be provided by implementing blue brain technology in the medical field. The possibilities for decreasing the death rates that occur due to complications in the brain are discussed. The blue brain can be used for monitoring the condition of the brain, on the basis of which brain diseases can be diagnosed and treated in advance. In this paper, the details of the blue brain, its functions, simulations, and the upgrading of the human brain are explored in depth. Future enhancements and predictions in the field of the blue brain that can benefit humanity are also discussed.
The Blue Brain Project is an attempt to reverse engineer the human brain and recreate it at the cellular level inside a computer simulation. The project was founded in May 2005 by Henry Markram at the EPFL in Lausanne, Switzerland. Goals of the project are to gain a complete understanding of the brain and to enable better and faster development of brain disease treatments.
Defending Reactive Jammers in WSN using a Trigger Identification Service.ijsrd.com
In the last decade, the greatest threat to the wireless sensor network has been Reactive Jamming Attack because it is difficult to be disclosed and defend as well as due to its mass destruction to legitimate sensor communications. As discussed above about the Reactive Jammers Nodes, a new scheme to deactivate them efficiently is by identifying all trigger nodes, where transmissions invoke the jammer nodes, which has been proposed and developed. Due to this identification mechanism, many existing reactive jamming defending schemes can be benefited. This Trigger Identification can also work as an application layer .In this paper, on one side we provide the several optimization problems to provide complete trigger identification service framework for unreliable wireless sensor networks and on the other side we also provide an improved algorithm with regard to two sophisticated jamming models, in order to enhance its robustness for various network scenarios.
Synthetic Fiber Construction in lab .pptxPavel ( NSTU)
Synthetic fiber production is a fascinating and complex field that blends chemistry, engineering, and environmental science. By understanding these aspects, students can gain a comprehensive view of synthetic fiber production, its impact on society and the environment, and the potential for future innovations. Synthetic fibers play a crucial role in modern society, impacting various aspects of daily life, industry, and the environment. ynthetic fibers are integral to modern life, offering a range of benefits from cost-effectiveness and versatility to innovative applications and performance characteristics. While they pose environmental challenges, ongoing research and development aim to create more sustainable and eco-friendly alternatives. Understanding the importance of synthetic fibers helps in appreciating their role in the economy, industry, and daily life, while also emphasizing the need for sustainable practices and innovation.
June 3, 2024 Anti-Semitism Letter Sent to MIT President Kornbluth and MIT Cor...Levi Shapiro
Letter from the Congress of the United States regarding Anti-Semitism sent June 3rd to MIT President Sally Kornbluth, MIT Corp Chair, Mark Gorenberg
Dear Dr. Kornbluth and Mr. Gorenberg,
The US House of Representatives is deeply concerned by ongoing and pervasive acts of antisemitic
harassment and intimidation at the Massachusetts Institute of Technology (MIT). Failing to act decisively to ensure a safe learning environment for all students would be a grave dereliction of your responsibilities as President of MIT and Chair of the MIT Corporation.
This Congress will not stand idly by and allow an environment hostile to Jewish students to persist. The House believes that your institution is in violation of Title VI of the Civil Rights Act, and the inability or
unwillingness to rectify this violation through action requires accountability.
Postsecondary education is a unique opportunity for students to learn and have their ideas and beliefs challenged. However, universities receiving hundreds of millions of federal funds annually have denied
students that opportunity and have been hijacked to become venues for the promotion of terrorism, antisemitic harassment and intimidation, unlawful encampments, and in some cases, assaults and riots.
The House of Representatives will not countenance the use of federal funds to indoctrinate students into hateful, antisemitic, anti-American supporters of terrorism. Investigations into campus antisemitism by the Committee on Education and the Workforce and the Committee on Ways and Means have been expanded into a Congress-wide probe across all relevant jurisdictions to address this national crisis. The undersigned Committees will conduct oversight into the use of federal funds at MIT and its learning environment under authorities granted to each Committee.
• The Committee on Education and the Workforce has been investigating your institution since December 7, 2023. The Committee has broad jurisdiction over postsecondary education, including its compliance with Title VI of the Civil Rights Act, campus safety concerns over disruptions to the learning environment, and the awarding of federal student aid under the Higher Education Act.
• The Committee on Oversight and Accountability is investigating the sources of funding and other support flowing to groups espousing pro-Hamas propaganda and engaged in antisemitic harassment and intimidation of students. The Committee on Oversight and Accountability is the principal oversight committee of the US House of Representatives and has broad authority to investigate “any matter” at “any time” under House Rule X.
• The Committee on Ways and Means has been investigating several universities since November 15, 2023, when the Committee held a hearing entitled From Ivory Towers to Dark Corners: Investigating the Nexus Between Antisemitism, Tax-Exempt Universities, and Terror Financing. The Committee followed the hearing with letters to those institutions on January 10, 202
Read| The latest issue of The Challenger is here! We are thrilled to announce that our school paper has qualified for the NATIONAL SCHOOLS PRESS CONFERENCE (NSPC) 2024. Thank you for your unwavering support and trust. Dive into the stories that made us stand out!
Embracing GenAI - A Strategic ImperativePeter Windle
Artificial Intelligence (AI) technologies such as Generative AI, Image Generators and Large Language Models have had a dramatic impact on teaching, learning and assessment over the past 18 months. The most immediate threat AI posed was to Academic Integrity with Higher Education Institutes (HEIs) focusing their efforts on combating the use of GenAI in assessment. Guidelines were developed for staff and students, policies put in place too. Innovative educators have forged paths in the use of Generative AI for teaching, learning and assessments leading to pockets of transformation springing up across HEIs, often with little or no top-down guidance, support or direction.
This Gasta posits a strategic approach to integrating AI into HEIs to prepare staff, students and the curriculum for an evolving world and workplace. We will highlight the advantages of working with these technologies beyond the realm of teaching, learning and assessment by considering prompt engineering skills, industry impact, curriculum changes, and the need for staff upskilling. In contrast, not engaging strategically with Generative AI poses risks, including falling behind peers, missed opportunities and failing to ensure our graduates remain employable. The rapid evolution of AI technologies necessitates a proactive and strategic approach if we are to remain relevant.
Pride Month Slides 2024 David Douglas School District
Blue Brain Technology
IJSRD - International Journal for Scientific Research & Development | Vol. 2, Issue 08, 2014 | ISSN (online): 2321-0613
All rights reserved by www.ijsrd.com 389
Amit Dhanvani
Shree Swaminarayan Naimisharanya College of Management & IT Sidsar Road, Bhavnagar, Gujarat,
India
Abstract— The human brain is the most valuable creation of God. Man is intelligent because of the brain. "Blue Brain" is the name of the world's first virtual brain: a machine that can function as a human brain. Today scientists are researching how to create an artificial brain that can think, respond, take decisions, and keep anything in memory. The main aim is to upload the human brain into a machine, so that man can think and take decisions without any effort. After the death of the body, the virtual brain will act as the man. So, even after the death of a person, we will not lose the knowledge, intelligence, personality, feelings, and memories of that man, which can be used for the development of human society.
Keywords: Neurons, Sensory System, Supercomputers,
RTNeuron, Neuroscience, Microscopy, Brain Modeling
I. INTRODUCTION
The Blue Brain Project is an attempt to reverse engineer the human brain and recreate it at the cellular level inside a computer simulation. The project was founded in May 2005 by Henry Markram at the EPFL in Lausanne, Switzerland. Its goals are to gain a complete understanding of the brain and to enable better and faster development of brain disease treatments. The research involves studying slices of living brain tissue using microscopes and patch clamp electrodes. Data is collected about all the many different neuron types. This data is used to build biologically realistic models of neurons and networks of neurons in the cerebral cortex. The simulations are carried out on a Blue Gene supercomputer built by IBM, hence the name "Blue Brain". The simulation software is based on Michael Hines's NEURON, together with other custom-built components. As of August 2012, the largest simulations are of microcircuits containing around 100 cortical columns; such simulations involve approximately 1 million neurons and 1 billion synapses. This is about the same scale as a honey bee brain. It is hoped that a rat brain neocortical simulation (~21 million neurons) will be achieved by the end of 2014. A full human brain simulation (86 billion neurons) should be possible by 2023, provided sufficient funding is received.
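As a rough sanity check, the scale figures above imply simple per-column and per-neuron averages (a sketch; the averages are back-of-the-envelope values implied by the quoted numbers, not data from the project):

```python
# Back-of-the-envelope scale check for the simulation sizes quoted above.
columns = 100                    # cortical columns in the 2012 microcircuit runs
neurons = 1_000_000              # approximate neurons simulated
synapses = 1_000_000_000         # approximate synapses simulated

neurons_per_column = neurons // columns
synapses_per_neuron = synapses // neurons

print(neurons_per_column)   # 10000 neurons per column
print(synapses_per_neuron)  # 1000 synapses per neuron on average

# How much larger the target simulations are than the 2012 runs:
rat_scale = 21_000_000 / neurons        # rat neocortex vs. 2012 microcircuit
human_scale = 86_000_000_000 / neurons  # full human brain vs. 2012 microcircuit
print(round(rat_scale))     # 21x
print(round(human_scale))   # 86000x
```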
II. WHAT IS BLUE BRAIN?
IBM is now developing a virtual brain known as the Blue Brain. It would be the world's first virtual brain. Within 30 years, we will be able to scan ourselves into computers. We can call it a virtual brain, i.e. an artificial brain which is not actually a natural brain but can act as one. It can think like a brain, take decisions based on past experience, and respond like a natural brain. This is possible using a supercomputer with a huge amount of storage capacity and processing power, together with an interface between the human brain and the artificial one. Through this interface, the data stored in the natural brain can be uploaded into the computer, so the knowledge and intelligence of anyone can be kept and used forever, even after the death of the person.
III. NEED OF VIRTUAL BRAIN
Today we are developed because of our intelligence. Intelligence is an inborn quality that cannot be created. Some people have this quality to such an extent that they can think where others cannot reach. Human society is always in need of such intelligence and such intelligent brains. But that intelligence is lost along with the body after death. The virtual brain is a solution: the brain and its intelligence will stay alive even after death. We often face difficulties in remembering things such as people's names, birthdays, spellings of words, proper grammar, important dates, and historical facts. In a busy life, everyone wants to be relaxed.
Can't we use a machine to assist with all of this? A virtual brain may be a better solution. What will happen if we upload ourselves into a computer and are simply aware of a computer, or if we live in a computer as a program?
IV. HOW IT IS POSSIBLE?
First, it is helpful to describe the basic ways in which a person might be uploaded into a computer. Raymond Kurzweil recently provided an interesting paper on this topic. In it, he describes both invasive and noninvasive techniques. The most promising is the use of very small robots, or nanobots. These robots would be small enough to travel through our circulatory systems. Traveling into the spine and brain, they would be able to monitor the activity and structure of our central nervous system. They would provide an interface with computers that is as close to our mind as possible while we still reside in our biological form. Nanobots could also carefully scan the structure of our brain, providing a complete readout of the connections between each neuron, and record the current state of the brain. This information, when entered into a computer, could then continue to function like us. All that is required is a computer with large enough storage space and processing power.
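The "complete readout of the connections between each neuron" described above is, in data terms, a weighted directed graph plus a per-neuron state snapshot. A minimal sketch (all neuron ids, weights, and voltages are illustrative, not from the project):

```python
# Toy representation of a scanned connectome: a weighted directed graph
# mapping each neuron id to the neurons it synapses onto.
connectome = {
    "n1": {"n2": 0.8, "n3": 0.2},  # n1 drives n2 strongly, n3 weakly
    "n2": {"n3": 0.5},
    "n3": {"n1": 0.1},
}
# Per-neuron state captured at scan time (membrane potential in mV, illustrative).
state = {"n1": -70.0, "n2": -65.0, "n3": -70.0}

def downstream(neuron):
    """Neurons that would receive signals from `neuron`."""
    return sorted(connectome.get(neuron, {}))

print(downstream("n1"))  # ['n2', 'n3']
```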
V. WORKING OF NATURAL BRAIN
A. Getting to know more about Human Brain
The brain essentially serves as the body's information processing centre. It receives signals from sensory neurons (nerve cell bodies and their axons and dendrites) in the central and peripheral nervous systems, and in response it generates and sends new signals that instruct the corresponding parts of the body to move or react in some way. It also integrates signals received from the body with signals from adjacent areas of the brain, giving rise to perception and consciousness. The brain weighs about
1,500 grams (3 pounds) and constitutes about 2 percent of total body weight. It consists of three major divisions:
 The massive paired hemispheres of the cerebrum
 The brainstem, consisting of the thalamus, hypothalamus, epithalamus, subthalamus, midbrain, pons, and medulla oblongata
 The cerebellum
The human ability to feel, interpret and even see is controlled, in computer-like calculations, by the magical nervous system. The nervous system is quite like magic because we cannot see it, yet it works through electric impulses through the body. One of the world's most "intricately organized" electronic mechanisms is the nervous system. Not even engineers have come close to making circuit boards and computers as delicate and precise as the nervous system. To understand this system, one has to know the three simple functions that it puts into action: sensory input, integration, and motor output.
B. Function of Human Brain
1) Sensory Input
When our eyes see something or our hands touch a warm surface, the sensory cells, also known as neurons, send a message straight to the brain. This action of getting information from the surrounding environment is called sensory input, because we are putting things into the brain by way of the senses.
2) Integration
Integration is best known as the interpretation of the things we have felt, tasted, and touched with our sensory cells, also known as neurons, into responses that the body recognizes. This process is all accomplished in the brain, where many, many neurons work together to understand the environment.
3) Motor Output
Once our brain has interpreted all that we have learned, either by touching, tasting, or using any other sense, the brain sends a message through neurons to effector cells (muscle or gland cells), which actually work to perform our requests and act upon our environment.
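The three functions above form a simple pipeline: sense, interpret, act. A toy sketch of that flow (the function names and the stimulus/response rules are illustrative placeholders):

```python
# Toy sensory input -> integration -> motor output pipeline.
def sensory_input(stimulus):
    """Neurons encode a raw stimulus as a signal sent to the brain."""
    return {"signal": stimulus}

def integration(signal):
    """The brain interprets the signal into a response the body recognizes."""
    if signal["signal"] == "hot surface":
        return "withdraw hand"
    return "no action"

def motor_output(response):
    """Effector cells (muscle/gland) carry out the brain's instruction."""
    return f"muscles: {response}"

print(motor_output(integration(sensory_input("hot surface"))))
# muscles: withdraw hand
```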
4) Nose
Once the smell of food has reached your nose, which is lined with hairs, it travels to an olfactory bulb, a set of sensory nerves. The nerve impulses travel through the olfactory tract, around the thalamus in a circular way, and finally to the smell sensory cortex of the brain, located between the eye and ear, where the smell is interpreted to be understood and memorized by the body.
5) Eye
Seeing is one of the most pleasing senses of the nervous system. This cherished action is primarily conducted by the lens, which magnifies a seen image, and the vitreous disc, which bends and rotates the image against the retina, which translates the image and light through a set of cells. The retina is at the back of the eyeball, where rods and cones, along with other cells and tissues, convert the image into nerve impulses that are transmitted along the optic nerve to the brain, where it is kept in memory.
6) Tongue
A set of microscopic buds on the tongue divides everything we eat and drink into four kinds of taste: bitter, sour, salty, and sweet. These buds have taste pores, which convert the taste into a nerve impulse and send the impulse to the brain by a sensory nerve fiber. Upon receiving the message, the brain classifies the different kinds of taste. This is how we can relate the taste of one kind of food to another.
7) Ear
Once a sound wave has entered the eardrum, it goes to a large structure called the cochlea. In this snail-like structure, the sound waves are divided into pitches. The vibrations of the pitches in the cochlea are measured by the organ of Corti. This organ transmits the vibration information to a nerve, which sends it to the brain for interpretation and memory.
VI. WHAT IS A PATCH CLAMP ELECTRODE?
The patch clamp technique is a laboratory technique in electrophysiology that allows the study of single or multiple ion channels in cells. The technique can be applied to a wide variety of cells, but is especially useful in the study of excitable cells such as neurons, cardiomyocytes, muscle fibers, and pancreatic beta cells. It can also be applied to the study of bacterial ion channels in specially prepared giant spheroplasts.
Patch clamp recording uses, as an electrode, a glass micropipette with an open tip diameter of about one micrometer, a size enclosing a membrane surface area or "patch" that often contains just one or a few ion channel molecules. This type of electrode is sealed onto the surface of the cell membrane, rather than inserted through it. In some experiments, the micropipette tip is heated in a microforge to produce a smooth surface that
assists in forming a high-resistance seal with the cell membrane.
The interior of the pipette is filled with a solution matching the ionic composition of the bath solution, in the case of cell-attached recording, or the cytoplasm, for whole-cell recording. A chlorided silver wire is placed in contact with this solution and conducts electric current to the amplifier. The investigator can change the composition of this solution or add drugs to study the ion channels under different conditions.
The micropipette is pressed against a cell membrane and suction is applied to assist in the formation of a high-resistance seal between the glass and the cell membrane (a "gigaohm seal" or "gigaseal", since the electrical resistance of that seal is in excess of a gigaohm). The high resistance of this seal makes it possible to electronically isolate the currents measured across the membrane patch with little competing noise, as well as providing some mechanical stability to the recording.
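To see why a gigaohm seal matters, Ohm's law gives the leak current through the seal for a given command voltage (a sketch; the 100 mV test potential and the two seal resistances are illustrative values, not from the paper):

```python
# Leak current through the pipette-membrane seal, I = V / R (Ohm's law).
V = 0.1            # command voltage: 100 mV, illustrative
R_giga = 10e9      # a 10 gigaohm seal
R_mega = 100e6     # a poor 100 megaohm seal, for comparison

leak_good = V / R_giga   # 1e-11 A = 10 pA
leak_poor = V / R_mega   # 1e-9 A  = 1000 pA

print(f"{leak_good * 1e12:.0f} pA")  # 10 pA: comparable to single-channel currents
print(f"{leak_poor * 1e12:.0f} pA")  # 1000 pA: swamps single-channel signals
```

The hundredfold drop in leak current is what lets picoampere-scale single-channel currents be resolved above the noise.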
VII. COMPUTER HARDWARE / SUPERCOMPUTERS
A. Blue Gene/P
The primary machine used by the Blue Brain Project is a Blue Gene supercomputer built by IBM; this is where the name "Blue Brain" originates. IBM agreed in June 2005 to supply EPFL with a Blue Gene/L as a "technology demonstrator"; the IBM press release did not disclose the terms of the deal. In June 2010 this machine was upgraded to a Blue Gene/P. The machine is installed on the EPFL campus in Lausanne and is managed by CADMOS (Center for Advanced Modelling Science). The computer is used by a number of different research groups, not exclusively by the Blue Brain Project. In mid-2012 the BBP was consuming about 20% of the compute time. The brain simulations generally run all day, one day per week (usually Thursdays). The rest of the week is used to prepare simulations and to analyze the resulting data. The supercomputer usage statistics and job history are publicly available online; look for the jobs labelled "C-BPP".
Blue Gene/P technical specifications:
 4,096 quad-core nodes (16,384 cores in total)
 Each core is a PowerPC 450, 850 MHz
 Total: 56 teraflops, 16 terabytes of memory
 4 racks, one row, wired as a 16x16x16 3D torus
 1 PB of disk space, GPFS parallel file system
 Operating system: Linux SuSE SLES 10
 Public front end: bluegene.epfl.ch
This machine peaked at 99th fastest supercomputer in the world in November 2009. By June 2011 it had dropped to 343rd, and it has since dropped out of the top 500. See the Blue Gene/P ranking on the TOP500 list.
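The headline figures in the specification are consistent with each other, assuming the PowerPC 450 retires 4 floating-point operations per cycle (a property of that core's dual-pipe FPU; treat the figure as an assumption here):

```python
# Cross-checking the Blue Gene/P spec sheet above.
nodes = 4096
cores_per_node = 4
clock_hz = 850e6          # 850 MHz PowerPC 450
flops_per_cycle = 4       # assumed: dual FPU pipes, fused multiply-add

cores = nodes * cores_per_node
peak_flops = cores * clock_hz * flops_per_cycle

print(cores)                          # 16384 cores, as stated
print(f"{peak_flops / 1e12:.1f} TF")  # ~55.7 TF, matching the quoted 56 teraflops
```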
B. Silicon Graphics
A 32-processor Silicon Graphics Inc. (SGI) system with 300 GB of shared memory is used for visualisation of results.
C. Commodity PC clusters
Clusters of commodity PCs have been used for visualisation tasks with the RTNeuron software. A research paper published by the BBP team in 2012 describes the following setup:
 11-node cluster, 3.47 GHz processors (Intel Xeon X5690)
 24 GB RAM, 3 Nvidia GeForce GTX 580 GPUs
 Full-HD passive stereo display connected to two GPUs on the head node
 1 Gbit/s and 10 Gbit/s Ethernet, 40 Gbit/s QDR InfiniBand
It is not known where this cluster is physically located: in the BBP lab itself, in an EPFL data center, or elsewhere.
VIII. INFRASTRUCTURE
A. Main components of the infrastructure
The Blue Brain workflow depends on a large-scale research infrastructure, providing:
 State-of-the-art technology for the acquisition of data on different levels of brain organization (multi-patch clamp set-ups for studies of the electrophysiological behavior of neural circuits; multi-electrode arrays (MEAs) allowing stimulation of and recording from brain slices; facilities for the creation and study of cell lines expressing particular ion channels; a variety of imaging systems; systems for the 3D reconstruction of neural morphologies);
 An IBM 65,536-core Blue Gene/Q supercomputer for modeling and simulation (hosted at CSCS), which has extended capabilities for data-intensive supercomputing (BlueGene Active Storage);
 A 40-node analysis and visualization cluster;
 A data center providing networked servers for use in data archiving and neuroinformatics.
B. Data acquisition infrastructure
The success of the Blue Brain project depends on very high volumes of standardized, high-quality data. The data acquisition infrastructure provides the physical equipment necessary for this work. Most of the experimental equipment is currently made available by the EPFL Laboratory of Neural Microcircuitry (LNMC). The planned Human Brain Project, if accepted, will massively increase the range of data sources.
C. High Performance Computing
The Blue Brain workflow creates enormous demands for computational power. In Blue Brain cellular-level models, the representation of the detailed electrophysiology and communication of a single neuron can require as many as 20,000 differential equations. No modern workstation is capable of solving this number of equations in biological real time. In other words, the only way for the project to achieve its goals is to use High Performance Computing (HPC). The Blue Brain project's simulation of the neocortical column incorporates detailed representations of 30,000 neurons. A simulation of a whole-brain rat model at the same level of detail would have to represent up to 200 million neurons and would require approximately 10,000 times more memory. Simulating the human brain would require yet another 1,000-fold increase in memory and computational power. Subcellular modeling, modeling of the neuro-glial vascular system, and the creation of virtual instruments (e.g. virtual EEG, virtual fMRI) will further expand these requirements.
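The scaling claims above can be checked against each other for order of magnitude (a sketch; it assumes, purely for comparison, that memory scales with neuron count):

```python
# Order-of-magnitude check on the memory scaling figures quoted above.
column_neurons = 30_000         # neocortical column simulation
rat_neurons = 200_000_000       # whole rat brain at the same detail
human_neurons = 86_000_000_000  # whole human brain

# If memory scaled purely with neuron count:
rat_factor = rat_neurons / column_neurons   # ~6,667x
human_factor = human_neurons / rat_neurons  # ~430x

print(round(rat_factor))    # ~6667, same order as the quoted 10,000x
print(round(human_factor))  # ~430, same order as the quoted 1,000x
# The quoted factors exceed pure neuron-count scaling, which is plausible:
# connectivity per neuron also grows as circuits get larger.
```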
In the initial phase of its work, the Blue Brain project used an IBM BlueGene/L supercomputer with 8,192 processors. It then used a 16,384-core IBM BlueGene/P supercomputer with almost 8 times more memory than its predecessor. Today, it uses an IBM BlueGene/Q supercomputer with 65,536 cores and extended memory capabilities, hosted by the Swiss National Supercomputing Center (CSCS) in Lugano.
D. Neuroinformatics
Neuroinformatics is the second step in the Blue Brain workflow. The goal is to extract the maximum possible information from the data acquired in the previous step. To achieve this goal, the project has designed a set of prototype workflows supporting the acquisition, curation, databasing, post-processing, and mining of data (protocols, experimental conditions, results) from Blue Brain experiments and from the literature.
One of the project's key strategies will be Predictive Reverse Engineering. Biological data at different -omics levels displays complex cross-level structures and dependencies, amenable to discovery by informatics-based tools. Together, these dependencies constrain the structure and functionality of neural circuits at many different levels. Predictive Reverse Engineering exploits these constraints to fill in gaps in the experimental data. It has already been successfully applied in several different areas (prediction of the spatial distribution of ion channels in 3D model neurons, prediction of neuronal firing properties from expression data for a selected set of ion channels, prediction of synaptic connectivity from neuronal morphology). In future work, the project will extend the use of Predictive Reverse Engineering to new domains, including the prediction of the transcriptome from limited expression data, and the prediction of data for one species (e.g. humans) from data collected in other species (rat, mouse, cat, primates).
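In spirit, Predictive Reverse Engineering fills gaps by learning cross-level dependencies from the data that does exist. A deliberately minimal illustration, predicting a missing value from a fitted linear dependency (the numbers are toy values, not project data):

```python
# Toy gap-filling: learn a linear dependency y = a*x + b from complete
# records, then predict y for a record where it is missing.
complete = [(1.0, 3.0), (2.0, 5.0), (3.0, 7.0)]  # (x, y) pairs, here y = 2x + 1

# Ordinary least squares for a single predictor.
n = len(complete)
sx = sum(x for x, _ in complete)
sy = sum(y for _, y in complete)
sxx = sum(x * x for x, _ in complete)
sxy = sum(x * y for x, y in complete)

a = (n * sxy - sx * sy) / (n * sxx - sx * sx)  # slope
b = (sy - a * sx) / n                          # intercept

missing_x = 4.0
predicted_y = a * missing_x + b
print(predicted_y)  # 9.0
```

The real technique operates on far richer constraints (morphology, expression data, circuit statistics), but the principle is the same: use measured relationships to predict what was not measured.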
E. Workflows
The architecture of the Blue Brain Facility takes the form of a network of workflows, in which each step in every workflow is supported by a set of dedicated software applications. The key steps are:
 Neuroscience: systematic, industrial-scale collection of experimental data, making it possible to describe all possible levels of structural and functional brain organization from the subcellular, through the cellular, to the microcircuit, mesocircuit, and macrocircuit levels;
 Neuroinformatics: automated curation and databasing of data; use of Predictive Reverse Engineering to predict unknown data from a smaller sample of known data or from data describing other levels of brain organization;
 Mathematical abstraction: definition of parameters, variables, equations, algorithms, and constraints representing the structure and functionality of the brain at different levels of organization;
 Modeling: building geometric and computational models representing different levels of structural and functional brain organization;
 Virtual experiments: use of models for virtual experiments and exploratory studies, requiring:
 Experiment configuration: configuration of the experiment to exactly define or replicate the stimulation and recording protocols, initial conditions, and protocols of a biological experiment;
 Simulation: simulation of the evolution of model states (firing dynamics, voltages, synaptic strengths, etc.); replication of previous in vivo experiments (application of specific patterns of stimulation, administration of a drug, etc.); design and implementation of new experiments;
 Visualization: use of advanced techniques to display the structure and dynamics of simulations and, in the medium to long term, to interactively "steer" and "navigate" the simulation;
 Analysis: analysis of simulation results, initially for model validation, subsequently for simulation-based investigations of brain function and dysfunction, diagnostic tools, and possible treatments.
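The chain of workflow steps above can be sketched as a pipeline of named stages, each consuming the previous stage's output (the stage names follow the list; the payloads are placeholders):

```python
# Minimal pipeline sketch of the Blue Brain workflow stages listed above.
def neuroscience(_):        return {"raw_data": "experimental recordings"}
def neuroinformatics(d):    return {**d, "curated": True, "gaps_filled": True}
def abstraction(d):         return {**d, "equations": "parameters and constraints"}
def modeling(d):            return {**d, "model": "geometric + computational"}
def virtual_experiment(d):  return {**d, "results": "simulated dynamics"}

stages = [neuroscience, neuroinformatics, abstraction, modeling, virtual_experiment]

state = None
for stage in stages:
    state = stage(state)

print(sorted(state))
# ['curated', 'equations', 'gaps_filled', 'model', 'raw_data', 'results']
```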
The Blue Brain Project (BBP) has implemented prototype versions of the software applications needed to support the different steps in the workflow. Each application or component is composed of a steadily expanding set of sub-components (e.g. a sub-component to collect a specific class of data). Sub-components are developed in a well-defined three-stage process, beginning with exploration (identification of required data, estimation of data volumes, study of data availability), continuing with research (definition of data representations and their integration in the model, automation of the data acquisition process), and concluding with prototype development (definition of software architecture, implementation of software, implementation of required workflows). In the rest of this report, we will clearly indicate the stage of development reached by different components and subcomponents.
The Blue Brain Project has adopted an incremental development strategy in which the capabilities of the facility are gradually enhanced through step-by-step addition of new subcomponents. The end result will be a completely transformed facility with the capability to model and simulate:
 The brain, or any region of the brain, of any species, at any stage in its development;
 Specific pathologies of the brain;
 Diagnostic tools and treatments for these pathologies.
The geometric and computational models of the brain produced by the facility will reproduce the structural and functional features of the biological brain with electron microscopic and molecular dynamic level accuracy.
F. Neuroscience
Data acquisition is the first step in the Blue Brain workflow
and involves different levels of effort and standardization,
from exploratory experiments, collecting preliminary data
and testing techniques, to industrial-scale efforts to collect
large volumes of standardized data.
The goal is to collect multiomics data describing
every different level in the functional and structural
organization of the brain. The project will collect structural
information which includes information on the genome, the
transcriptome, the proteome, the biochemicalome, the
metabolome, the organellome, the cellome, the synaptome,
extracellular space, microcircuits, mesocircuits,
macrocircuits, vasculature, blood, the blood brain barrier,
ventricles, cerebrospinal fluid, and the whole brain.
The information collected will be used to define
parameters and geometric models describing the structural
organization of the brain. Required functional informa¬tion
includes information on gene transcription, protein
translation, cell biology processes, signaling, receptor
functions, biochemical, biophysical and electrochemical
processes and properties, neuronal and synaptic information
processing, micro-meso-macrocircuit information
processing, whole brain information processing,
metabolism, development, adaptation, learning, perception,
cognition, and behavior.
This information will be used to define variables,
equations, computational models and algorithms
representing the brain’s functional organization. Together
the structural and functional information will make it
possible to describe all possible levels of brain organization
6. Blue Brain Technology
(IJSRD/Vol. 2/Issue 08/2014/089)
All rights reserved by www.ijsrd.com 394
from the subcellular, through the cellular, to the
microcircuit, mesocircuit and macro-circuit levels.
G. Virtual experiments
1) Goals
The ultimate goal of the Blue Brain Facility is to enable
neuroscientists to conduct virtual experiments and
exploratory studies testing hypotheses, diagnostic tools and
treatments. This step in the Blue Brain workflow supports
such studies by providing facilities for configuring the
models, setting up virtual experiments and high performance
simulations, visualization and analysis, all based on a single,
unifying data model.
2) Simulation environment
The Blue Brain Facility provides simulation through the
Simulation Environment, based on a Blue Gene/P
supercomputer. Any virtual experiment involves the simulation of
biophysical processes at different levels of brain
organization. In the Simulation Environment, these are
represented by models integrating the data acquired and
managed during the first two steps in the Blue Brain
workflow, with the mathematical abstractions developed in
the third step. The Simulation Environment generates
numerical solutions for these models, guaranteeing accurate
representation of causal relationships across different scales
in time and space and allowing users to interact with the
model.
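The numerical core of such a simulation can be illustrated with a toy example. The sketch below is purely illustrative (the project itself uses NEURON on Blue Gene hardware): it integrates a leaky integrate-and-fire neuron with forward Euler, the simplest case of generating a numerical solution for a biophysical model over time. All parameter values are assumptions.

```python
# Toy sketch, not Blue Brain code: forward-Euler integration of a
# leaky integrate-and-fire neuron. Voltage is dimensionless, time in
# ms; all parameter values are illustrative assumptions.

def simulate_lif(i_input=1.5, t_max=100.0, dt=0.1,
                 tau=10.0, v_rest=0.0, v_thresh=1.0, v_reset=0.0):
    """Return spike times (ms) for a constant input current."""
    v = v_rest
    spikes = []
    for step in range(int(t_max / dt)):
        # membrane equation: tau * dv/dt = -(v - v_rest) + i_input
        v += dt * (-(v - v_rest) + i_input) / tau
        if v >= v_thresh:
            spikes.append(step * dt)
            v = v_reset  # instantaneous reset after a spike
    return spikes

spikes = simulate_lif()
print(len(spikes), "spikes, first at", spikes[0], "ms")
```

A real virtual experiment solves millions of coupled compartment equations of this general form per time step, which is why the supercomputer is needed.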
3) Analytics
In silico studies require many different kinds of analytical
data, ranging from the time stamp for a single event to EEG
measurements covering the whole brain. The Analytics
Environment leverages the Data Model (see below) to
provide a reliable framework for the development of
reusable, potentially interactive tools for analytics. The
framework is designed to support cohosting of simulations
and analytics tools on the same machine.
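As a toy illustration of the single-event end of this range, the hypothetical helper below bins spike time stamps into a population firing-rate trace. The function name and bin width are assumptions, not Blue Brain code.

```python
# Illustrative sketch only: turning single-event time stamps (spike
# times, in ms) into a firing-rate trace, a minimal example of the
# analytics described in the text.

def firing_rate(spike_times_ms, t_max_ms, bin_ms=10.0):
    """Bin spike time stamps into a firing-rate histogram (spikes/s)."""
    n_bins = int(t_max_ms / bin_ms)
    counts = [0] * n_bins
    for t in spike_times_ms:
        b = int(t / bin_ms)
        if 0 <= b < n_bins:
            counts[b] += 1
    # convert counts per bin to spikes per second
    return [c * 1000.0 / bin_ms for c in counts]

rates = firing_rate([2.0, 5.0, 12.0, 15.0, 18.0, 31.0], t_max_ms=40.0)
print(rates)  # [200.0, 300.0, 0.0, 100.0]
```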
4) Visualization
The Visualization Environment provides the interface
between the user, modeling tools (Builders) and the
simulation and analytics environments, with which it shares
a common data model. The Visualization Environment
provides users with tools allowing them to display and
navigate the structural, transient and analysis data generated
by these tools and environments. These tools include virtual
instruments allowing users to display in silico brain
structures in the same way they appear during experiments
with biological samples.
RTNeuron:
RTNeuron is the primary application used by the BBP for
visualisation of neural simulations. The software was
developed internally by the BBP team. It is written
in C++ and OpenGL. RTNeuron is ad-hoc software written
specifically for neural simulations, i.e. it is not generalisable
to other types of simulation. RTNeuron takes the output
from Hodgkin-Huxley simulations in NEURON and renders
them in 3D. This allows researchers to watch as activation
potentials propagate through a neuron and between neurons.
The animations can be stopped, started and zoomed, thus
letting researchers interact with the model. The
visualisations are multi-scale, that is they can render
individual neurons or a whole cortical column. The
accompanying image was rendered in RTNeuron.
5) Data model
Modeling and simulating the brain requires the integration
of structural and functional information spanning multiple
orders of magnitude in space and time. The Blue Brain Data
Model, which evolves continuously, provides a unifying
“spatial scaffolding” for mathematical models representing
different aspects of the brain on different spatial and
temporal scales or serving different purposes (e.g.
visualization vs. simulation). All Blue Brain modeling,
simulation, analytics and visualization are based on
instances of the model.
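A hypothetical sketch of what such a unifying data model might look like (all class and field names here are illustrative assumptions, not the actual Blue Brain Data Model): model instances at different scales and for different purposes register against one shared coordinate frame, so any tool can retrieve every model covering a given brain region.

```python
# Hypothetical sketch of a "spatial scaffolding" registry; names are
# assumptions for illustration, not Blue Brain APIs.
from dataclasses import dataclass

@dataclass(frozen=True)
class BoundingBox:
    """Axis-aligned region in a shared brain coordinate frame (um)."""
    x_min: float
    y_min: float
    z_min: float
    x_max: float
    y_max: float
    z_max: float

    def overlaps(self, other: "BoundingBox") -> bool:
        return (self.x_min <= other.x_max and other.x_min <= self.x_max and
                self.y_min <= other.y_max and other.y_min <= self.y_max and
                self.z_min <= other.z_max and other.z_min <= self.z_max)

@dataclass
class ModelInstance:
    name: str
    scale: str          # e.g. "subcellular", "cellular", "mesocircuit"
    purpose: str        # e.g. "simulation", "visualization"
    region: BoundingBox
    dt_ms: float        # temporal resolution of the model

class DataModel:
    """Registry shared by all tools: the 'spatial scaffolding'."""
    def __init__(self):
        self._instances = []

    def register(self, inst: ModelInstance) -> None:
        self._instances.append(inst)

    def models_in(self, box: BoundingBox) -> list:
        return [i for i in self._instances if i.region.overlaps(box)]
```

A simulator and a visualization tool would then query the same registry, which is the sense in which all environments "share a common data model."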
IX. JUQUEEN
JuQUEEN is an IBM Blue Gene/Q supercomputer that was
installed at the Jülich Research Center in Germany in May
2012. It currently performs at 1.6 petaflops and was ranked
the world's 8th fastest supercomputer in June 2012. It is
likely that this machine will be used for BBP simulations
starting in 2013, provided funding is granted via the Human
Brain Project.
In October 2012 the supercomputer is due to be
expanded with additional racks. It is not known exactly how
many racks or what the final processing speed will be.
The JuQUEEN machine is also to be used by
the JuBrain (Jülich Brain Model) research initiative. This
aims to develop a three-dimensional, realistic model of the
human brain. This is currently separate from the Blue Brain
Project but it will become part of the Human Brain Project if
the latter is chosen for EU funding in late 2012.
X. DEEP - DYNAMICAL EXASCALE ENTRY PLATFORM
DEEP (deep-project.eu) is an exascale supercomputer to be
built at the Jülich Research Center in Germany. The project
started in December 2011 and is funded by the European
Union's 7th Framework Programme. The three-year prototype
phase of the project has received €8.5 million. A prototype
supercomputer performing at 100 petaflops is expected to be
built by the end of 2014.
The Blue Brain Project simulations will be
ported to the DEEP prototype to help test the system's
performance. If successful, a future exascale version of this
machine could provide the 1 exaflops of performance
required for a complete human brain simulation by the
2020s.
The DEEP prototype will be built using Intel
MIC (Many Integrated Cores) processors, each of which
contains over 50 cores fabricated with a 22 nm process.
These processors were codenamed Knights Corner during
development and subsequently rebranded as Xeon Phi in
June 2012. The processors will be publicly available in late
2012 or early 2013 and will offer just over 1 teraflop of
performance each.
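A back-of-envelope check of these figures (illustrative arithmetic only, combining the ~1 teraflop per processor, 100-petaflop prototype and 1-exaflop target quoted above):

```python
# Back-of-envelope arithmetic for the performance figures above.
TERA, PETA, EXA = 1e12, 1e15, 1e18

per_chip = 1 * TERA        # ~1 teraflop per Xeon Phi processor
prototype = 100 * PETA     # planned DEEP prototype performance
brain_target = 1 * EXA     # stated requirement for a whole human brain

chips_for_exascale = brain_target / per_chip    # processors needed
prototype_shortfall = brain_target / prototype  # factor short of target

print(int(chips_for_exascale))   # 1000000
print(prototype_shortfall)       # 10.0
```

So an exascale machine at these per-chip speeds would need on the order of a million processors, and the 100-petaflop prototype falls an order of magnitude short of the stated whole-brain target.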
XI. A TIMELINE OF THE BLUE BRAIN
2002 - Henry Markram founds the Brain Mind Institute
(BMI) at EPFL.
2005 - June - EPFL and IBM agree to launch the Blue
Brain Project; IBM installs Blue Gene; basic simulation of
single neurons achieved.
2007 - November - modelling and simulation of
first rat cortical column.
2008 - Cortical column construction and simulations;
neocortical column (10,000 cells); research on determining
the position and size of functional cortical columns.
2009 - June - BlueGene/L replaced by BlueGene/P,
doubling the number of processors; simulations of cortical
construction continue.
2013 - February - decision on Human Brain Project funding
of €1 billion over 10 years from the EU; simulations using
NEURON software ported to the Blue Gene/Q system in
Jülich.
2014 - Cellular-level simulation of the entire rat brain
neocortex, ~100 mesocircuits; NEURON simulation
software ported to the DEEP Cluster-Booster prototype
system in Jülich.
2023 - Cellular-level simulation of the entire human brain,
equivalent to 1,000x the size of the rat brain.
XII. ADVANTAGES AND DISADVANTAGES
A. Advantages
(1) We can remember things without any effort.
(2) Decisions can be made without the presence of a
person.
(3) Even after a person's death, his intelligence can
be used.
(4) The activity of different animals can be understood:
by interpreting the electrical impulses from an
animal's brain, its thinking can be understood.
(5) It would allow the deaf to hear via direct nerve
stimulation, and could also help with many
psychological disorders: by downloading the
contents of a brain previously uploaded to the
computer, such disorders might be treated.
B. Disadvantages
These technologies also open up many new dangers; we
will be susceptible to new forms of harm.
(1) We become dependent upon the computer systems.
(2) Others may use technical knowledge against us.
(3) Computer viruses will pose an increasingly critical
threat.
(4) The real threat, however, is the fear that people will
have of new technologies. That fear may
culminate in large-scale resistance. Clear evidence of
this type of fear is found today with respect to
human cloning.
C. Applications
(1) Gathering and Testing 100 Years of Data.
(2) Cracking the Neural Code
(3) Understanding Neocortical Information Processing
(4) A Novel Tool for Drug Discovery for Brain
Disorders
(5) A Global Facility
(6) A Foundation for Whole Brain Simulations
(7) A Foundation for Molecular Modeling of Brain
Function