Empowering active teaching and experimental research apr 2010

Explore how you, as researcher and teacher, can leverage LabVIEW Graphical System Design for hands-on engineering education as well as advanced research.

Upload Details

Uploaded as Microsoft PowerPoint

Usage Rights

© All Rights Reserved

  • 1:13 min: Thank you. Good morning. I’d like to join in extending a warm welcome to everyone to our first Research and Academic Days in . I joined National Instruments around 10 years ago and have had the opportunity to work with many companies across Europe in fields such as automotive and aerospace, which have always been at the forefront of innovation. Over the last decade, research and academia has become one of our most important and strategic areas of engagement. It’s been amazing to see how far you have come; a great help has been the advancements in commercially available technology such as fast computer buses, high-performance multicore processors, and reconfigurable silicon chips such as FPGAs, which have given more engineers and researchers access to measurements, computing power, and embedded platforms. Today we are here to talk about how you can take advantage of these technologies to empower your research and teaching. We’ll do that with the great help of local researchers and educators who will present along with us today. Before I talk more about some case studies and enabling technologies, I’d like to briefly introduce our company for those of you who are not familiar with what National Instruments does.
  • 0:34 min: In preparing my presentation, I came across this quote from Albert Einstein: “The problems that exist in the world today cannot be solved by the level of thinking that created them.” You as scientists and engineers are tasked with solving the most complex and pressing problems (not the politicians, and certainly not the bankers, will do that for us). To accomplish that, you are driven to always stay atop the latest technologies and tools. You as teachers equally need to stay atop of these to keep your courses up to date and attractive for students. This quote from Einstein captures the thought well: higher levels of thinking are required to solve many of today’s challenges.
  • 0:25 min: This is a list of what the US National Academy of Engineering calls the Engineering Grand Challenges. If you look at some of the big research topics Europe is working on, you will find that they are very similar. In fact, most of these are global topics: energy, medical, transport, safety, and security. National Instruments’ technology and products have been involved in solving engineering challenges in every single one of these areas around the globe. Source: http://www.engineeringchallenges.org/
  • 0:30 min: Today, whether we follow the model of developing a hypothesis that we then empirically investigate, or the approach of measuring and analyzing physical phenomena: in every scientific discipline, researchers are digitizing their knowledge and using computational methods. The challenges are often the missing access to tools for acquiring and processing huge amounts of ad hoc scientific data or, to make the experiment happen at all, the ability to design sophisticated custom control systems.
    =====================================================================================================
    One of the basic scientific principles is to pose questions that can be investigated empirically, so that scientific hypotheses can be viable. Traditionally, the scientific method (research) has been built around testable hypotheses for which models (conceptual and theoretical) are developed and then tested through experiments. Many of today’s scientific measurements and analyses are performed by inference (indirectly measuring and analyzing the physical phenomena), with the help of software-implemented algorithms, powerful computers, and high-performance, high-channel-count measurement hardware. The “hypothesize -> model -> test” approach is about performing research into questions posed by scientific theories and hypotheses by conducting physical experiments (tests) in the lab and/or in the field. An experiment is “a method of investigating particular types of research questions or solving particular types of problems.” Typically, the “hypothesize” and “model” phases are not tightly integrated with the “test” or experimentation phase. Furthermore, the “hypothesize-model-test” model is not the only approach to scientific research. It is increasingly challenged by a “measure -> model -> discover” approach, in which large datasets are created from measurements coming from sensors and data acquisition systems deployed everywhere, or from large channel-count systems. In this new approach, models and applied mathematics are used to find knowledge and meaningful information in large sets of data, leading to new scientific discovery and innovation. It is a process that starts with measurements and ends with new designs, and it can be described as “designing with measurements.” In both cases, new tools and technologies are required to close the “model-to-experiment” gap with fewer resources, less time, and less effort.
  • 0:20 min: From the perspective of acquiring large amounts of data, I’d like to use this example: to learn more about dark matter, dark energy, and the origin of our universe, the Large Synoptic Survey Telescope will record 30 terabytes of data every night over a broad range of wavelengths.
    ==================================================================
    Particular scientific goals of the LSST include: measuring weak gravitational lensing in the deep sky to detect signatures of dark energy and dark matter (high-precision measurements of the expansion of the universe are required to understand how the expansion rate changes over time; in general relativity, the evolution of the expansion rate is parameterized by the cosmological equation of state, and measuring the equation of state of dark energy is one of the biggest efforts in observational cosmology today); mapping small objects in the solar system, particularly near-Earth asteroids and Kuiper belt objects; detecting transient optical events such as novae and supernovae; and mapping the Milky Way. The LSST will produce up to 30 terabytes of data per night [13].
  • 0:25 min: From the perspective of processing data, I’d like to show you an example from medical research. In this case, in order to perform optical coherence tomography (OCT) research, a non-invasive form of early skin cancer detection, the requirement is to perform 1.4 million FFTs every second for real-time analysis of the reflected light frequencies of a super-luminescent diode projected into the tissue of the skin, to produce a 3D scan.
    ===================================================================
    For vision systems and imaging applications, the message is that multicore processors and the right software should soon enable some pretty interesting uses. An example is an optical coherence tomography system being built by Kohji Obayashi of Kitasato University in Japan. To achieve Obayashi’s goals, the system must do 1.5 million 1,000-point fast Fourier transforms a second. Right now, the best possible is about 1 million. But quad-core systems are due out soon, with Intel predicting 80-core chips by 2011, so the processing power should be there. Technology: this application uses a new broadband super-luminescent diode that produces bright white light containing all frequencies. This light is projected into tissue, and different frequencies penetrate to different depths. As these frequencies are reflected back, the intensity of each frequency varies based on the density of the tissue it encounters. The light is then separated into 256 individual frequencies by photo-demultiplexers. These are then scanned by 32 PXI-5105s at 60 MS/s and synchronized by NI-TClk to tens of picoseconds. Each scan of 256 frequencies represents a single optical scan penetrating 3 mm deep into the tissue (called an axial scan). These axial scans are taken at 60,000,000 per second. Each axial scan is then swept across a 3 mm width via a resonance scanner at a rate of 16 kHz to create a frame. These frames are then advanced in the third dimension by a galvo mirror. The result: this application produces the world’s fastest OCT axial scan. The scans are swept at 16,000 per second to produce a frame with 1,400 axial scans that is 3 mm x 3 mm. These frames are then swept to create a 3 mm cubic image. The cube image has a resolution of 23 µm and a 40 dB dynamic range at all frequencies.
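To get a rough sense of the compute load described above, here is a minimal batched-FFT sketch in Python. It is an illustration only: NumPy stands in for the real-time FFT implementation, and the batch size and processing function are assumptions, not details of the actual Kitasato system.

```python
import numpy as np

FFT_POINTS = 1000        # points per axial scan (from the talk)
BATCH = 10_000           # process scans in batches for throughput

def process_axial_scans(spectra: np.ndarray) -> np.ndarray:
    """Transform a batch of raw spectra (one per row) into depth profiles.

    Real-time OCT at the rates quoted above needs on the order of a million
    of these transforms per second, which is why multicore CPUs (or FPGAs)
    are required.
    """
    # Batched FFT: NumPy transforms all rows in one vectorized call.
    depth = np.fft.fft(spectra, axis=1)
    # Reflected-light intensity vs. depth is the magnitude of the spectrum.
    return np.abs(depth)

# Simulated batch of raw interference spectra.
spectra = np.random.rand(BATCH, FFT_POINTS)
profiles = process_axial_scans(spectra)
print(profiles.shape)  # (10000, 1000)
```

Timing this loop on a given machine (e.g. with `time.perf_counter`) shows how far a single core is from the 1.4 million FFTs/s target.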
  • 0:23 min: And we have an example of a really hard control challenge: controlling the world’s most powerful particle accelerator, the Large Hadron Collider at CERN. More than 100 collimators (devices that control the beam), positioned by more than 500 stepper motors, have to be tightly synchronized and controlled with an accuracy of 20 µm, a really hard motion control challenge.
    ==================================================================
    More than 100 collimators: “…a device that narrows a beam of particles or waves” [3], used to “protect the LHC from uncontrolled particle losses, and absorb energetic particles out of the nominal beam core and to reduce noise to the LHC experiments” [4]. More than 500 stepper motors. Precise synchronization between collimators. Motion control with an accuracy of 20 µm. [3] http://en.wikipedia.org/wiki/Collimator (*) to cause the directions of motion to become more aligned in a specific direction (i.e., collimated or parallel) or to cause the spatial cross section of the beam to become smaller. [4] LHC Collimators Low Level Control System, Alessandro Masi, Roberto Losito
  • 0:22 min: So how can we empower the researchers of today and tomorrow to work effectively and efficiently on solving the Engineering Grand Challenges? By providing access to virtually any measurement, scientific computing power, and rapid embedded platforms as a single, integrated, commercial off-the-shelf (COTS) platform.
  • 0:35 min: Our vision has evolved into what we call Graphical System Design, in which Virtual Instrumentation still plays a key role, with heavy investments in expanding measurement capabilities (especially in RF, digital testing, and protocol-aware test). We have also added real-time measurements, embedded monitoring, and HIL, which all require some design elements to complete. On the right-hand side you see systems for deployment, mainly FPGA based but also for C code generation. As we have designed, prototyped, and deployed test and measurement systems for years, this has been a natural evolution.
  • The key component of Graphical System Design is LabVIEW. LabVIEW has come to cover a tremendous breadth of applications, ranging from programming LEGO Mindstorms NXT (for which we developed a variant of LabVIEW) to the most sophisticated big-physics applications. Its graphical approach to designing test, measurement, and automation systems has been widely adopted among engineers and scientists, not least because it removes complexity so that domain experts can focus on their actual task instead of learning how to do parallel programming or how to leverage the latest computer buses.
  • LabVIEW has expanded its role further into systems design, adding high-level design models and becoming a very complete platform for building next-generation systems. LabVIEW can target a variety of platforms such as PC-based systems, real-time systems, FPGAs, and microprocessors such as ARM, all of which connect via I/O modules to the real world of sensors and actuators.
  • Our DAQ hardware platform for teaching ranges from simple, yet complete, USB DAQ devices (available at a very affordable price point starting at around 100 USD) to USB platforms that provide connectivity to virtually any signal and come with integrated signal conditioning, as well as products for teaching such as NI ELVIS that provide a complete platform for hands-on education in various fields of engineering. As far as industrial or research applications are concerned, we offer highly flexible industrial-grade systems (CompactRIO) for rapid design, prototyping, and deployment of embedded control and monitoring systems, as well as the high-performance PXI platform (real-time, high channel count, RF, mixed signals).
  • The Virtual Instrumentation concept lets you build custom instruments designed to accommodate the needs of your research. One example of what Virtual Instrumentation allows you to do is build what I’d like to call “unimaginable instruments”; in this case, a PXI-based acoustic camera: a 1,000+ channel microphone array that maps noise onto a 2D or 3D model of the object under test in the form of color shading. The research team could easily find the dominating noise sources in very complex situations (an invaluable tool for localizing which sources should be controlled) to achieve a substantial global noise level reduction.
  • Energy from fusion has been a heavily invested research topic for years. To do this research we use tokamak fusion test reactors. One challenge is the task of controlling the plasma’s position and shape in a 1 ms timed loop. In order to do that, the Grad-Shafranov equation has to be solved (a two-dimensional, nonlinear, elliptic partial differential equation) to maintain the equilibrium of the plasma field. The Grad-Shafranov equation (H. Grad and H. Rubin, 1958; Shafranov, 1966) is the equilibrium equation in ideal magnetohydrodynamics (MHD) for a two-dimensional plasma, for example the axisymmetric toroidal plasma in a tokamak. It is obtained from the reduction of the ideal MHD equations to two dimensions, often for the case of toroidal axisymmetry (the case relevant in a tokamak). Interestingly, the flux function ψ is both a dependent and an independent variable in this equation.
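For reference, the equation discussed above can be written in its standard form, where p(ψ) is the plasma pressure and F(ψ) = R B_φ the poloidal current function:

```latex
\Delta^{*}\psi \;=\; -\,\mu_{0} R^{2}\,\frac{dp}{d\psi} \;-\; F\,\frac{dF}{d\psi},
\qquad
\Delta^{*}\psi \;\equiv\; R\,\frac{\partial}{\partial R}\!\left(\frac{1}{R}\,\frac{\partial \psi}{\partial R}\right) \;+\; \frac{\partial^{2}\psi}{\partial Z^{2}}
```

Note how ψ appears both as the unknown of the PDE (on the left) and as the variable on which p and F depend (on the right), which is what makes solving it in a 1 ms control loop so demanding.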
  • Key point: industry leaders such as Microsoft and Intel are investing millions of dollars into research on how to program parallel systems. LabVIEW is already ahead and pushing forward. With the advent of multicore processors, industry leaders are recognizing a tremendous challenge in software: how to program hardware that is becoming increasingly parallel. In the past few months we have heard how Microsoft and Intel are investing tens of millions of dollars to fund parallel programming labs at UC Berkeley. Stanford has received million-dollar sponsorships from Intel, AMD, NVIDIA, Sun, and HP to research what they call the largest problem in computer science today. Recently, Steve Jobs was quoted as saying that Apple was going to hit the pause button on new operating system features as it tackled this challenge. This is important for industrial control because PC technology has become an integral part of industrial control and automation. Transition: while this is happening, NI is well prepared to meet this challenge. In fact, LabVIEW was designed to be parallel in nature from the ground up.
    Background notes on quotes:
    1. Microsoft and Intel are committing $20 million to launch parallel programming research labs at UC Berkeley and the University of Illinois at Urbana-Champaign. Press release: http://www.intel.com/pressroom/archive/releases/20080318corp.htm
    2. The Pervasive Parallelism Lab (PPL) pools the efforts of many leading Stanford computer scientists and electrical engineers with support from Sun Microsystems, Advanced Micro Devices, NVIDIA, IBM, Hewlett-Packard, and Intel. The center, with a budget of $6 million over three years, will research and develop a top-to-bottom parallel computing system, stretching from fundamental hardware to new user-friendly programming languages that will allow developers to exploit parallelism automatically. Press release: http://news-service.stanford.edu/pr/2008/pr-parallel-050708.html
    3. Steve Jobs quote in the NY Times: “We’ve added over a thousand features to Mac OS X in the last five years,” he said Monday in an interview after his presentation. “We’re going to hit the pause button on new features.” Instead, the company is going to focus on what he called “foundational features” that will be the basis for a future version of the operating system. “The way the processor industry is going is to add more and more cores, but nobody knows how to program those things,” he said. “I mean, two, yeah; four, not really; eight, forget it.” http://bits.blogs.nytimes.com/2008/06/10/apple-in-parallel-turning-the-pc-world-upside-down/
  • As we have shared in the past, LabVIEW was designed to be parallel in nature from the ground up. LabVIEW automatically distributes parallel diagrams to run on different threads, which can run on different CPU cores; in fact, LabVIEW has supported multithreading for over 10 years. As a LabVIEW programmer, today you have to follow just a few simple steps to build systems that take advantage of multicore processors: look for tasks, data, or both that can run in parallel within your code, as in this example, and architect your code to reflect this parallelism, which you can do inherently in LabVIEW. So while people in the industry are trying to figure out how to make software parallel, we continue to develop features in LabVIEW that take advantage of multicore, multithreaded systems.
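The two steps above (find independent tasks, then express the parallelism) can be sketched textually. In LabVIEW the two branches would simply be parallel wires on the diagram; this Python analogy makes the same structure explicit with an executor. Function names and workloads are illustrative assumptions, not NI APIs.

```python
from concurrent.futures import ThreadPoolExecutor
import numpy as np

def filter_channel(samples: np.ndarray) -> np.ndarray:
    # Independent task 1: a simple moving-average filter on one channel.
    kernel = np.ones(5) / 5
    return np.convolve(samples, kernel, mode="same")

def analyze_channel(samples: np.ndarray) -> float:
    # Independent task 2: an RMS measurement on another channel.
    return float(np.sqrt(np.mean(samples ** 2)))

ch1 = np.random.rand(1_000)
ch2 = np.random.rand(1_000)

# The two tasks share no data, so a dataflow scheduler (or LabVIEW's
# execution system) is free to run them on different cores.
with ThreadPoolExecutor(max_workers=2) as pool:
    f1 = pool.submit(filter_channel, ch1)
    f2 = pool.submit(analyze_channel, ch2)
    filtered, rms = f1.result(), f2.result()

print(filtered.shape, round(rms, 3))
```

The point is the structure, not the executor: once independent branches are identified, mapping them onto cores is mechanical, and LabVIEW's dataflow model does that mapping automatically.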
  • This is a slide our computer science experts at NI have created for us. In simple terms, LabVIEW eliminates the artificial complexity of programming parallel systems. A little interlude: if you asked me to write a LabVIEW manual in Chinese, you’d add a lot of artificial complexity to my task. I certainly would be able to write one in . It is similar with traditional programming languages and parallel programming. LabVIEW removes this artificial complexity through its graphical approach, which makes it inherently parallel!
  • When it comes to parallel programming we also have to talk about FPGAs, as those silicon chips are the ultimate implementation of parallel programming. An FPGA is a piece of silicon with an ocean of transistors, all connected together and grouped in logic blocks (ANDs, ORs, NOTs, …). Using LabVIEW you can define a pattern of how these logic blocks connect. This program runs at the speed at which the transistors propagate: fast, reliable, with no OS. It is like having a blank slate of custom circuitry that you can rewire to implement your application. FPGAs are parallel: you can burn as many different parallel paths into the chip as you can fit. With LabVIEW FPGA, you can develop applications using the graphical dataflow nature of LabVIEW and map your programs directly onto an FPGA chip. Imagine a PXIe chassis packed with R Series cards, each running multiple loops in parallel: the ultimate in parallel processing.
  • FPGA-based I/O can span a wide range of applications such as PWM, custom counters and timers, multiple scan rates, built-in IP processing blocks, and so forth.
  • FPGA technology is an important component of Graphical System Design. At the heart of our control platforms we find an FPGA chip that connects to the real world of sensors and actuators. This chip is fully programmable with the LabVIEW graphical development environment, which allows for rapid prototyping of embedded systems in software. This technology is available on all of our platforms, ranging from commercial PCs and industrial computers to small rugged embedded platforms such as CompactRIO.
  • With this platform we are capable of solving some tough real-time challenges, including plasma control in a tokamak fusion test reactor and controlling the particle beam of the LHC at CERN.
  • Message for slide: ESO, the European Southern Observatory, is supported by 13 European countries. Its current flagship is the VLT (Very Large Telescope), whose first image was of an extrasolar planet 173 light-years away and 4 billion times fainter than what we can see with our eyes. The next project is the ELT (Extremely Large Telescope). Its primary mirror, M1, is four times bigger than any other mirror (the size of half a soccer field). It consists of 984 segments, since one can’t build a single mirror of that size; each segment is 1.5 meters in diameter and weighs 150 kg, and all segments have to be perfectly aligned. Control problem: solving the 3k (actuators) x 6k (sensors) matrix computation 500 to 1,000 times per second (computing time: 1 to 2 ms) in real time. NI has already been working with ESO on data acquisition and synchronization for such a high-channel-count system, as well as on some data processing and control problems, and has also been recruited to help with this extraordinary control problem. With LabVIEW Real-Time running on a Dell Precision T7400, we currently solve the control problem, taking 2 ms to compute the 3k x 6k matrix using commercial off-the-shelf technology.
    +++++++++++++++++++++++++++++++++++++++++++++++++++
    Author(s): Jason Spyromilio, European Southern Observatory. Industry: Research, Aerospace/Avionics. Products: LabVIEW, Real-Time Module. The Challenge: using commercial off-the-shelf (COTS) solutions for high-performance computing (HPC) in active and adaptive optics real-time control in extremely large telescopes. The Solution: combining the NI LabVIEW graphical programming environment with multicore processors to develop a real-time control system and prove that COTS technology can control the optics in the European Extremely Large Telescope (E-ELT), which is currently in the design and prototyping phases. “NI engineers proved that we can, in fact, use LabVIEW and the LabVIEW Real-Time Module to implement a COTS-based solution and control multicore computation for real-time results.” The European Southern
Observatory (ESO) is an astronomical research organization supported by 13 European countries. We have experience developing and deploying some of the world’s most advanced telescopes. Our organization currently operates at three sites in the Chilean Andes: the La Silla, Paranal, and Chajnantor observatories. We have always commanded highly innovative technology, from the first common-user adaptive optics systems at the 3.6 m telescope on La Silla, to the deployment of active optics at La Silla’s 3.5 m New Technology Telescope (NTT), to the integrated operation of the Very Large Telescope (VLT) and the associated interferometer at Paranal. In addition, we are collaborating with our North American and East Asian partners in constructing the Atacama Large Millimeter Array (ALMA), a $1 billion (USD) 66-antenna submillimeter telescope scheduled for completion at the Llano de Chajnantor in 2012. The next project on our design board is the E-ELT. The design for this 42 m primary mirror diameter telescope is in phase B and received $100 million (USD) in funding for preliminary design and prototyping. After phase B, construction is expected to start in late 2010. Grand-scale active and adaptive optics: the 42 m telescope draws on the ESO and astronomical community experience with active and adaptive optics and segmented mirrors. Active optics incorporates a combination of sensors, actuators, and a control system so that the telescope can maintain the correct mirror shape, or collimation. We actively maintain the correct configuration for the telescope to reduce any residual aberrations in the optical design and increase efficiency and fault tolerance. These telescopes require active optics corrections every minute of the night, so the images are limited only by atmospheric effects. Adaptive optics uses a similar methodology to monitor the atmospheric effects at frequencies of hundreds of hertz and corrects them using a deformable, suitably configured thin mirror.
Turbulence scale length determines the number of actuators on these deformable mirrors. The wavefront sensors run fast to sample the atmosphere and transform any aberrations into mirror commands. This requires very fast hardware and software. Controlling the complex system requires an extreme amount of processing capability. To control systems deployed in the past, we developed proprietary control systems based on VME real-time control, which can be expensive and time-consuming. We are working with National Instruments engineers to benchmark the control system for the E-ELT primary segmented mirror, called M1, using COTS software and hardware. Together we are also exploring possible COTS-based solutions for the real-time control of the telescope’s adaptive mirror, called M4. M1 is a segmented mirror that consists of 984 hexagonal mirrors (Figure 1), each weighing nearly 330 lb with diameters between 1.5 and 2 m, for a total 42 m diameter. In comparison, the primary mirror of the Hubble Space Telescope has a 2.4 m diameter. The single primary mirror of the E-ELT alone will measure four times the size of any optical telescope on Earth, and the telescope will incorporate five mirrors (Figure 2). Defining the extreme computational requirements of the control system: in M1 operation, adjacent mirror segments may tilt with respect to the other segments. We monitor this deviation using edge sensors and actuator legs that can move each segment in three degrees of freedom when needed. The 984 mirror segments comprise 3,000 actuators and 6,000 sensors (Figure 3). The system, controlled by LabVIEW software, must read the sensors to determine the mirror segment locations and, if the segments move, use the actuators to realign them. LabVIEW computes a 3,000 by 6,000 matrix by 6,000-element vector product and must complete this computation 500 to 1,000 times per second to produce effective mirror adjustments. Sensors and actuators also control the M4 adaptive mirror.
However, M4 is a thin deformable mirror, 2.5 m in diameter and spread over 8,000 actuators (Figure 4). This problem is similar to the M1 active control, but instead of retaining the shape, we must adapt the shape based on measured wavefront image data. The wavefront data maps to a 14,000-value vector, and we must update the 8,000 actuators every few milliseconds, creating a matrix-vector multiply of an 8k by 14k control matrix by a 14k vector. Rounding the computational challenge up to 9k by 15k, this requires about 15 times the large segmented M1 control computation. We were already working with NI on a high-channel-count data acquisition and synchronization system when they began working on the math and control problem. NI engineers are simulating the layout and designing the control matrix and control loop. At the heart of all these operations is a very large LabVIEW matrix-vector function that executes the bulk of the computation. M1 and M4 control requires enormous computational ability, which we approached with multiple multicore systems. Because M4 control decomposes into fifteen 3k by 3k submatrix problems, we require 15 machines, each containing as many cores as possible. Therefore, the control system must command multicore processing. This is a capability that LabVIEW offers using COTS solutions, making it a very attractive proposition for this problem. Addressing the problem with LabVIEW and multicore HPC functionality: because we required the control system engineering before the actual E-ELT construction, the system configuration could affect some of the construction characteristics of the telescope. It was critical that we thoroughly test the solution as if it were running the actual telescope. To meet this challenge, NI engineers implemented not only the control system, but also a system that runs a real-time simulation of the M1 mirror to perform a hardware-in-the-loop (HIL) control system test.
HIL is a testing method commonly used in automotive and aerospace control design to validate a controller using an accurate, real-time system simulator. NI engineers created an M1 mirror simulator that responds to the control system outputs and validates its performance. The NI team developed the control system and mirror simulation using LabVIEW and deployed it to a multicore PC running the LabVIEW Real-Time Module for deterministic execution. In similar real-time HPC applications, communication and computation tasks are closely related. Failures in the communication system result in whole-system failures. Therefore, the entire application development process includes the design of the interplay between communication and computation. NI engineers needed a fast, deterministic data exchange at the core of the system and immediately determined that this application cannot rely on standard Ethernet for communication because the underlying network protocol is nondeterministic. They used the LabVIEW Real-Time Module time-triggered network feature to exchange data between the control system and the M1 mirror simulator, resulting in a network that moves 36 MB/s deterministically. NI developed the full M1 solution incorporating two Dell Precision T7400 workstations, each with eight cores, and a notebook that provides an operator interface. It also includes two networks: a standard network that connects both real-time targets to the notebook, and a gigabit time-triggered Ethernet network between the real-time targets for exchanging I/O data (Figure 5). As for system performance, we learned that the controller receives 6,000 sensor values, executes the control algorithm to align the segments, and outputs 3,000 actuator values during each loop.
The NI team created this control system to achieve these results and produced a real-time simulation of the telescope in actual operation, called “the mirror.” The mirror receives the 3,000 actuator outputs, adds a variable representing atmospheric disturbances such as wind, executes the mirror algorithm to simulate M1, and outputs 6,000 sensor values to complete the loop. The entire control loop is completed in less than 1 ms to adequately control the mirror (Figure 6). The benchmarks NI engineers established for their matrix-vector multiplications include the following: LabVIEW Real-Time Module on a machine with two quad-core processors, using four cores and single precision, at 0.7 ms; LabVIEW Real-Time Module on a machine with two quad-core processors, using eight cores and single precision, at 0.5 ms. The M4 compensates for measured atmospheric wavefront aberrations, and NI engineers determined the problem could only be solved using a state-of-the-art multicore blade system. Dell invited the team to test the solution on its M1000, a 16-blade system (Figure 7), and the test results were encouraging. Each of the M1000 blade machines features eight cores, which means engineers distributed the LabVIEW control problem onto 128 cores. NI engineers proved that we can, in fact, use LabVIEW and the LabVIEW Real-Time Module to implement a COTS-based solution and control multicore computation for real-time results. Because of this performance breakthrough, our team continues to set benchmarks for both computer science and astronomy in the E-ELT implementation, which will further scientific advancement as a whole.
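The scale of the M1 computation quoted above is easy to reproduce on a desktop. A minimal sketch, with NumPy standing in for the LabVIEW matrix-vector function (sizes from the text; the data is random and the measured time depends entirely on the machine):

```python
import time
import numpy as np

# A 3,000 x 6,000 control matrix applied to a 6,000-element sensor vector,
# which the real system must finish 500-1,000 times per second.
# Single precision, as in the NI benchmarks quoted above.
control_matrix = np.random.rand(3_000, 6_000).astype(np.float32)
sensor_values = np.random.rand(6_000).astype(np.float32)

start = time.perf_counter()
actuator_commands = control_matrix @ sensor_values
elapsed_ms = (time.perf_counter() - start) * 1_000

print(actuator_commands.shape)   # (3000,)
print(f"one control step: {elapsed_ms:.2f} ms")
```

Hitting the 1 ms loop budget reliably, with communication included, is the hard part; raw arithmetic throughput is only one piece of the real-time problem the case study describes.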
  • With regard to the Engineering Grand Challenges, robotics plays an important role in the area of security and safety; robots such as unmanned vehicles might be used in hazardous environments (such as under water or in environmental conditions that put human health at risk) or potentially in dangerous areas such as urban warfare. This is an example system architecture for robotics. A robot obviously has to sense (distances, colors, movements, sounds, …); those signals have to be processed, and decisions have to be made that then result in actions. All of this has to happen while the robot is operating, such as moving along to execute a specific task. Hence, like a human being, the robot has to perform all these things in parallel and in real time!
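The sense, process, decide, act cycle described above can be sketched as a fixed-rate loop. All names here are illustrative stubs; a real robot would read sensors and drive actuators instead.

```python
def sense() -> float:
    # Stub: distance to the nearest obstacle, in metres.
    return 0.4

def decide(distance: float) -> str:
    # Simple reactive policy: stop if an obstacle is close, else drive.
    return "stop" if distance < 0.5 else "forward"

def act(command: str) -> str:
    # Stub: a real robot would command the motors here.
    return command

log = []
for step in range(3):          # three iterations of the control loop
    d = sense()
    cmd = decide(d)
    log.append(act(cmd))

print(log)  # ['stop', 'stop', 'stop']
```

In a real system each stage runs concurrently at its own rate (sensing fast, deciding slower), which is exactly the parallel, real-time requirement the slide makes.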
  • I want to follow this up with a quote by Dr. David Barrett. He is one of the leading robotics experts globally. He currently serves as Director of the SCOPE program at Olin College near Boston, which engages students in a significant engineering project under realistic constraints for an actual client. Prior to this, he served as VP of Engineering at iRobot.
  • Inspiring students to be creative in designing robots is a good lead-in to the remaining part of my presentation. What is the roadmap for a sustainable future…preparing our next generation of engineers and scientists to solve engineering grand challenges……preparing them to be competitive on the market by teaching tools that will help them be just that…To do this we have to bring fun back into engineering education and inspire students by teaching with tools that are fun and that help them grasp complex topics more easily…
  • The world we are living in will require more from our engineers of tomorrow than it used to in the past. There is the technology aspect…systems are getting more complex – smaller, richer in features – and they require a high level of multi-disciplinary knowledge (such as mechatronics, biomedicine, green engineering). We have to inspire creativity. Remember, we can’t solve the problems of today with the same level of thinking that created them…Moreover, there is the aspect of becoming a world citizen. The world we are living in is getting smaller. In pretty much every big company or research group, one is required to cooperate in global teams, which requires communication skills as well as a solid multi-cultural background, and one has to be ethically grounded.
  • Where we can help is in addressing the lack of physical intuition and understanding of real-world constraints…that’s where hands-on teaching comes into play.
  • For example: National Instruments provides students with a complete platform to explore circuit concepts with National Instruments graphical system design and teaching tools. NI Multisim, LabVIEW, and prototyping platforms, such as the NI Educational Laboratory Virtual Instrumentation Suite (NI ELVIS), help your students make the transition from theory to prototype…
  • …around the ELVIS platform, third-party vendors such as QUANSER, EMONA, and others have helped create an ecosystem of ELVIS accessories (plug-in boards with lab experiments) for teaching various engineering disciplines, ranging from RF communications to newer topics such as biomedical and green engineering.
  • In past years, Graphical System Design has been instrumental in helping students succeed in various areas, ranging from medical assistive systems such as a mind-controlled wheelchair to green engineering applications such as fuel-cell-driven locomotives and the development of technologies for renewable energy production.
  • To keep inspiring students after they have left the laboratory, we have been working on a device that we consider the at-home companion to our ELVIS platform. I’m happy to present to you what will become the latest addition to our family of academic products – NI myDAQ. We are planning to release it in the near future; however, we have already brought a prototype. It will feature:
  • 2 AI and AO channels
  • Stereo audio connectivity
  • 8 DIO lines and 1 counter
  • Power supply
  • Integrated DMM
Come and take a look at it during the breaks…
  • FIRST – the most impactful program in the world at inspiring students to pursue science and engineering careers. FIRST’s mission is to inspire young people to be science and technology leaders by engaging them in exciting mentor-based programs that build science, engineering, and technology skills and that foster well-rounded life capabilities including self-confidence, communication, and leadership. FIRST has a huge impact not only in inspiring young people to pursue careers in science and engineering but also in changing the culture in our communities and schools to celebrate success in these areas the way they celebrate success in football (soccer) or Hollywood. National Instruments has provided mentors and sponsored teams for many years. Together with its technology partners, National Instruments has sponsored 1,800 teams around the world who have received NI CompactRIO. What was exciting for us is that 65% had already re-programmed their robots in 2009 with CompactRIO (up from 0% the year before).
  • By removing artificial complexity, LabVIEW unlocks the potential of engineers (young and old) by giving them access to the latest technologies in applications ranging from kindergarten to rocket science….
  • …and thereby enables researchers as well as students to keep solving Engineering Grand Challenges around the globe to make our world a better place. Thank you very much for your attention. I wish you a very informative day; take advantage of our technical experts and of the colleagues and partners presenting today, and feel free to approach us any time...
