NASA uses custom Eclipse RCP applications to control experiments on the International Space Station from the ground and from within the ISS. The document describes two experiments: 1) Using a smartphone connected to the SPHERES robot to simulate internal inspections of the ISS modules. 2) Using a K10 rover at NASA Ames to simulate deploying a telescope, controlled by an astronaut on the ISS. It provides details on the SPHERES and K10 rover, as well as screenshots and descriptions of the custom Eclipse RCP applications created for these experiments.
NASA VERVE: Interactive 3D Visualization within Eclipse (tamarmot)
NASA's Intelligent Robotics Group developed NASA VERVE, an interactive 3D visualization tool built on Eclipse. VERVE allows scientists and engineers to remotely visualize rover locations, sensor data, and terrain maps in 3D during field experiments. It integrates the Ardor3D rendering library with the Eclipse RCP framework. NASA uses VERVE for remote rover operations, including control from the International Space Station, giving distributed teams shared situational awareness.
This document describes ArduSat, an open-source nanosatellite based on the CubeSat standard that allows students to design their own satellite experiments using an Arduino sensor suite and collect real-world space data. Two undergraduate physics students tested seven sensors on a prototype "CubeSat" but were unable to upload their code to ArduSat satellites before completing the project. The extensive Arduino sensor suite on ArduSat gives students the opportunity to gain hands-on experience with space technologies and data collection.
This document describes the development of a digital cathode ray oscilloscope (CRO) using LabVIEW. The digital CRO replaces the hardware of a traditional CRO with software and virtual instrumentation. It discusses how LabVIEW was used to graphically program the virtual CRO and provide all the functionality of a physical CRO. The digital CRO allows students to learn about oscilloscope operation and electronic measurement techniques through interactive software instead of requiring physical equipment.
The document discusses SmartSPHERES, free-flying robots used on the International Space Station. The underlying SPHERES robots were originally developed at MIT with DARPA funding as testbeds for satellite and free-flyer control algorithms and have been aboard the ISS since 2006. The document describes how they are being upgraded with smartphone technology to add cameras, sensors, and WiFi capability. It outlines candidate use cases for SmartSPHERES such as interior surveys, ground-control telepresence, and inventory management. Schematics and photos show how the smartphones are being modified and certified to fly safely on the ISS.
Real-Time Hardware Simulation with Portable Hardware-in-the-Loop (PHIL-Rebooted) (Riley Waite)
PHIL-Rebooted is an updated version of NASA's Portable Hardware-in-the-Loop (PHIL) system that allows for real-time hardware simulation. It uses an open-source software framework called libSPRITE to simulate satellite instrumentation data and interface with hardware during testing. This makes the system cheaper and more user-friendly than the previous proprietary PHIL system. PHIL-Rebooted simulates spacecraft dynamics and environmental forces using numerical integration and SPICE ephemeris data. It publishes simulated sensor data in real-time for use in testing and hardware-in-the-loop applications.
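The dynamics-propagation idea described above can be sketched with a toy two-body propagator; PHIL-Rebooted's actual models (libSPRITE, SPICE ephemerides, environmental forces) are far richer, and the step size and orbit below are illustrative values, not taken from the system.

```python
import numpy as np

# Minimal sketch: point-mass gravity integrated with fixed-step RK4,
# the simplest instance of the numerical integration described above.

MU = 3.986004418e14                      # Earth GM, m^3/s^2

def deriv(state):
    r, v = state[:3], state[3:]
    a = -MU * r / np.linalg.norm(r)**3   # two-body acceleration
    return np.concatenate([v, a])

def rk4_step(state, dt):
    k1 = deriv(state)
    k2 = deriv(state + 0.5 * dt * k1)
    k3 = deriv(state + 0.5 * dt * k2)
    k4 = deriv(state + dt * k3)
    return state + dt / 6.0 * (k1 + 2 * k2 + 2 * k3 + k4)

r0 = 6_778_000.0                         # ~400 km altitude orbit radius
v0 = np.sqrt(MU / r0)                    # circular orbital speed
state = np.array([r0, 0.0, 0.0, 0.0, v0, 0.0])
for _ in range(600):                     # propagate 10 minutes at 1 s steps
    state = rk4_step(state, 1.0)
drift = abs(np.linalg.norm(state[:3]) - r0)   # radius stays nearly constant
```

A real hardware-in-the-loop loop would publish the resulting state as simulated sensor messages at each step rather than accumulating it silently.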
Conceptual Design of a Crewed Lunar Lander (guinness)
Conceptual design study performed at Johnson Space Center in summer 2006, in partial fulfillment of the requirements for the Master of Science in Space Studies at the International Space University.
This document outlines a cube satellite project involving an onboard computer (OBC). It discusses designing the OBC to control the satellite's systems from space using Python code. The code would read button presses on the satellite as inputs to trigger LED outputs, representing communication with different satellites. It also discusses building a ground station GUI to communicate with the satellites, display health data, and download payload information like images and sensor readings. The project aims to help students learn Linux commands, satellite systems, and interfacing technologies like LabVIEW and STK.
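The button-to-LED logic described above can be sketched as follows. The satellite and LED names, the mapping, and the pin-reading callable are all hypothetical; on real hardware a GPIO library (e.g. gpiozero) would supply the reads and writes.

```python
# Hypothetical mapping: a button press represents an incoming contact
# from a given satellite; the OBC answers by lighting the matching LED.
BUTTON_TO_LED = {
    "sat_alpha": "led_green",
    "sat_beta": "led_red",
}

def poll_buttons(read_pin):
    """Return the set of LEDs to light, given a pin-reading callable."""
    lit = set()
    for button, led in BUTTON_TO_LED.items():
        if read_pin(button):          # pressed = satellite "transmitting"
            lit.add(led)
    return lit

# Simulated input: only sat_alpha is transmitting.
state = {"sat_alpha": True, "sat_beta": False}
lit = poll_buttons(state.__getitem__)   # {'led_green'}
```

On the flight version this loop would run continuously, with the same mapping table driving real GPIO writes.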
This document describes the design of a low-cost radio receiver system for an unmanned aerial vehicle to track multiple wildlife radio collars from a distance. Software defined radios and small low-noise amplifiers were evaluated and found to improve the receiver's sensitivity and ability to detect collars from over 70 feet away for under $100. Initial field tests validated the potential of using this system as an alternative to manually tracking individual collars. Further field tests are still needed to fully evaluate the system.
This document discusses integrating the Swarmpulse mobile app and website with the Nervousnet framework. The key changes are:
1. The Swarmpulse mobile app will now push data to Nervousnet CORE servers instead of its own servers.
2. The Swarmpulse website will receive data from the Nervousnet CORE servers instead of the old Swarmpulse servers.
3. New features have been added to the Swarmpulse mobile app and website like measuring earthquake intensity using the mobile's accelerometer and showing other users' locations on a map.
Introduction to cosmology and numerical cosmology (with the Cactus code) (2/2) (SEENET-MTP)
This document discusses using the Cactus code to model cosmological simulations numerically. It introduces the Cosmo and RealSF thorns developed to solve Einstein's equations for cosmological models within the Cactus framework. The Cosmo thorn provides initial data and boundary conditions for the Friedmann-Robertson-Walker metric. The RealSF thorn evolves a scalar field by solving the Klein-Gordon equation. Examples are presented of simulations using these thorns to model pure FRW cosmologies and those with a cosmological constant or scalar field.
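For reference, in the homogeneous limit the equations these thorns solve reduce to the standard FRW and Klein-Gordon system below (a textbook form, not taken verbatim from the slides):

```latex
% Expansion rate and Friedmann constraint for an FRW metric with a
% homogeneous scalar field \phi(t) of potential V(\phi)
H \equiv \frac{\dot a}{a}, \qquad
H^2 = \frac{8\pi G}{3}\left(\tfrac{1}{2}\dot\phi^2 + V(\phi)\right)
      + \frac{\Lambda}{3},
\qquad
% Klein-Gordon equation with Hubble friction
\ddot\phi + 3H\dot\phi + \frac{dV}{d\phi} = 0 .
```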
This document describes a project to improve the processing and visualization of weather satellite telemetry data from the Ocean Surface Topography Mission (OSTM) satellites. The current tool (Cyclone) is slow, taking a long time to perform queries and generate graphs from the large dataset. The authors developed a new tool that ingests the data into Elasticsearch using Logstash, then allows users to quickly query and visualize the data in an interactive graph through a custom interface. Benchmarking showed their tool was over 5 times faster than Cyclone. While ingestion is still slow, the visualization component provides a quicker and easier experience to replace Cyclone. Future work could implement Apache Spark to improve the speed of data ingestion.
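The query side of such a tool might look like the sketch below: a date-histogram aggregation built as Elasticsearch query DSL. The index and field names ("timestamp", "parameter", "value") and the function are hypothetical, not taken from the authors' implementation.

```python
# Build an Elasticsearch aggregation query for one telemetry parameter,
# bucketed over time, suitable for driving an interactive graph.

def telemetry_histogram_query(parameter, start, end, interval="1h"):
    """Return query DSL for a time-series average of one parameter."""
    return {
        "query": {
            "bool": {
                "filter": [
                    {"term": {"parameter": parameter}},
                    {"range": {"timestamp": {"gte": start, "lte": end}}},
                ]
            }
        },
        "aggs": {
            "over_time": {
                "date_histogram": {"field": "timestamp",
                                   "fixed_interval": interval},
                "aggs": {"mean_value": {"avg": {"field": "value"}}},
            }
        },
        "size": 0,  # aggregations only; skip returning raw documents
    }

q = telemetry_histogram_query("battery_temp", "2016-01-01", "2016-01-02")
```

Setting `size` to 0 is what makes such queries fast for plotting: Elasticsearch returns only the bucketed aggregates, never the underlying telemetry rows.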
The CoNNeCT project faced several major challenges that threatened its schedule. Requirements were not fully defined, which led to rework. Structural analysis found weak margins, requiring a redesign with more testing. A heritage gimbal design was not suitable and needed significant redesign to meet safety requirements, causing cost growth and schedule delays. Solutions included workshops to finalize requirements, structural testing and redesign, and co-developing a redesigned gimbal with added simulators to prevent schedule impacts. These issues are common on projects and understanding them can help others face similar problems.
Honeybee Robotics is a 25-year old engineering company specializing in robotics for space exploration. It has expertise in sampling and excavation tools, spacecraft mechanisms, and field robotics. Honeybee leverages its experience across government agencies and commercial sectors to reduce costs. It develops technologies for NASA, DoD, and industry by adapting tools to different applications and environments like the moon, Mars, and hazardous sites on Earth.
The document describes the INSpIRe initiative, which aims to establish a remotely operable observatory to study geocoronal hydrogen and other near-space phenomena. The observatory will integrate two Fabry-Perot spectrometers and one spatial heterodyne spectrometer. Researchers from Embry-Riddle Aeronautical University, NASA Goddard Space Flight Center, and the University of Wisconsin-Madison are collaborating on the project. The observatory is currently under development at ERAU and will allow remote observations to investigate important questions about the distribution and variability of atomic hydrogen in the upper atmosphere and exosphere.
Unifying Space Mission Knowledge with NLP & Knowledge Graph (Vaticle)
Synopsis
The number of space missions being designed and launched worldwide is growing exponentially. Information on these missions, such as their objectives, orbit, or payload, is dispersed across various documents and datasets. Facilitating access to this information is key to accelerating the design of future missions, enabling experts to link an application to a mission, and following the activities of various stakeholders.
This presentation introduces recent research done at the ESA to combine the latest Language Models with Knowledge Graphs, unifying our knowledge on space missions. Language Models such as GPT-3 and BERT are trained to understand the patterns of human (natural) language. These models have revolutionised the field of NLP, the branch of AI enabling machines to understand human language in all its complexity. In this work, key information on a mission is parsed from documents with the GPT-3 model, and the parsed data is then migrated to a TypeDB Knowledge Graph to be easily queried. Although this work focuses on an application in the space sector, the method can be transferred to other engineering fields.
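The second half of that pipeline, turning extracted fields into a knowledge-graph insert, can be sketched as below. The schema (a `mission` entity owning `name` and `orbit` attributes) and the helper are hypothetical, not the presenters' actual TypeDB model.

```python
# Convert a dict of parsed mission fields (as a language model might
# extract them from a document) into a TypeQL insert statement.

def to_typeql_insert(mission):
    """Render one mission's parsed fields as a TypeQL insert query."""
    attrs = ", ".join(f'has {key} "{value}"' for key, value in mission.items())
    return f"insert $m isa mission, {attrs};"

parsed = {"name": "Sentinel-1", "orbit": "sun-synchronous"}
query = to_typeql_insert(parsed)
# insert $m isa mission, has name "Sentinel-1", has orbit "sun-synchronous";
```

Once loaded, the same graph can be queried by attribute, which is what makes the parsed information "easily queried" compared with the source documents.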
Presenters
Dr. Audrey Berquand is a Research Fellow at the ESA. Her research aims at enhancing space mission design and knowledge management with text mining, NLP, and Knowledge Graphs. She was awarded her PhD in 2021 from the University of Strathclyde (Scotland) for her thesis on “Text Mining and Natural Language Processing for the Early Stages of Space Mission Design”. Audrey has a background in space systems engineering: she holds an MSc in Aerospace Engineering from the Royal Institute of Technology KTH (Sweden) and a diplôme d'ingénieur from the EPF Graduate School of Engineering (France). Before diving into the world of AI, she spent 3 years at ESA involved in the early design phases of future Earth Observation missions.
Ana Victória Ladeira works with Knowledge Management at the ESA, using automated methods to exploit the information contained in the piles and piles of documents that ESA generates every day. With a Master's degree in Data Science from Maastricht University, Ana is particularly excited about how NLP methods can help large organizations connect different documents and highlight the bigger picture across a big universe of data sources, as well as using Knowledge Graphs to help connect people to the expertise and information they need.
This document provides a summary of the system design for an autonomous robot. It describes the hardware components available including Lego Mindstorms kits with sensors, motors and structural parts. It also describes the available software tools and programming languages. The document outlines the system model, software and hardware capabilities and compatibility. It details the reusable components from previous projects and the planned software and mechanical structures. It explains the coordinate system, localization, navigation and other methodologies to be used.
Compressed sensing allows for the recovery of sparse signals from fewer samples than the Nyquist rate requires. It works by finding the sparsest solution that is consistent with the observed samples, which is done using l1-norm optimization. The talk gave an overview of compressed sensing and presented several applications that use it, such as single-pixel cameras, fast MRI, and light-field photography. It concluded by discussing practical strategies for implementing compressed sensing using libraries like L1Magic.
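The l1 recovery idea can be sketched as a small basis-pursuit problem: minimize ||x||_1 subject to Ax = b, recast as a linear program over (x, t) with -t <= x <= t. The problem sizes are toy values, and this uses a generic LP solver rather than L1Magic.

```python
import numpy as np
from scipy.optimize import linprog

rng = np.random.default_rng(0)
n, m, k = 50, 20, 3                     # signal length, samples, nonzeros
x_true = np.zeros(n)
x_true[rng.choice(n, k, replace=False)] = rng.standard_normal(k)
A = rng.standard_normal((m, n))         # random sensing matrix
b = A @ x_true                          # m < n noiseless measurements

# LP variables: u = [x; t].  Minimize sum(t) s.t. Ax = b, |x| <= t.
c = np.concatenate([np.zeros(n), np.ones(n)])
A_eq = np.hstack([A, np.zeros((m, n))])
A_ub = np.vstack([
    np.hstack([np.eye(n), -np.eye(n)]),      #  x - t <= 0
    np.hstack([-np.eye(n), -np.eye(n)]),     # -x - t <= 0
])
res = linprog(c, A_ub=A_ub, b_ub=np.zeros(2 * n),
              A_eq=A_eq, b_eq=b,
              bounds=[(None, None)] * n + [(0, None)] * n)
x_hat = res.x[:n]                        # recovered sparse signal
```

With a random Gaussian sensing matrix and only 3 nonzeros in 50 entries, 20 measurements suffice for the LP to recover the signal essentially exactly, which is the core claim of compressed sensing.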
Scheduled Scientific Data Releases Using .backup Volumes (retsamedoc)
How the Mars Space Flight Facility uses (and abuses) the .backup snapshot feature of OpenAFS.
This presentation was given by Chris Kurtz and myself at the 2008 AFS and Kerberos Best Practices Workshop in Newark, NJ.
AstroAccelerate - GPU Accelerated Signal Processing on the Path to the Square... (inside-BigData.com)
In this deck from the GPU Technology Conference, Wes Armour from the Oxford eResearch Centre discusses the role of GPUs in processing large amounts of astronomical data collected by the Square Kilometre Array and how CUDA is the best suited option for their signal processing software.
During his session at GTC 2019, Armour talked about AstroAccelerate, a GPU enabled software package that uses CUDA and NVIDIA GPUs to achieve real-time processing of radio-astronomy data. He stated that “The massive computational power of modern day GPUs allows code to perform algorithms such as de-dispersion, single pulse searching and Fourier Domain Acceleration Searching in real-time on very large data-sets which are comparable to those which will be produced by next generation radio-telescopes such as the SKA.”
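The de-dispersion step quoted above can be illustrated in miniature: each frequency channel is shifted by its cold-plasma dispersion delay and the channels are summed, re-aligning a pulse that arrives later at lower frequencies. Channel count, sampling time, and dispersion measure below are toy values; AstroAccelerate does this on GPUs over many trial DMs in real time.

```python
import numpy as np

def dedisperse(data, freqs_mhz, dm, tsamp_s):
    """data: (n_chan, n_samp) dynamic spectrum -> de-dispersed series."""
    f_ref = freqs_mhz.max()
    # dispersion delay of each channel relative to the highest frequency
    delays_s = 4.148808e3 * dm * (freqs_mhz**-2.0 - f_ref**-2.0)
    shifts = np.round(delays_s / tsamp_s).astype(int)
    out = np.zeros(data.shape[1])
    for chan, shift in enumerate(shifts):
        out += np.roll(data[chan], -shift)   # undo that channel's delay
    return out

# Synthetic dispersed pulse: emitted at sample 50, smeared across channels.
freqs = np.array([1400.0, 1300.0, 1200.0, 1100.0])   # MHz
dm, tsamp = 100.0, 0.01                              # pc/cm^3, seconds
shifts = np.round(
    4.148808e3 * dm * (freqs**-2.0 - freqs.max()**-2.0) / tsamp
).astype(int)
data = np.zeros((4, 100))
data[np.arange(4), 50 + shifts] = 1.0
series = dedisperse(data, freqs, dm, tsamp)          # peak returns to 50
```

Searching over unknown DMs means repeating this shift-and-sum for thousands of trial values, which is why the workload maps so well onto GPUs.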
Watch the video: https://wp.me/p3RLHQ-kBv
Learn more: https://www.skatelescope.org/the-ska-project/
and
https://www.nvidia.com/en-us/gtc/
Sign up for our insideHPC Newsletter: http://insidehpc.com/newsletter
Development of image processing based human tracking and control algorithm fo... (adarsa lakshmi)
The document describes a proposed human detection and obstacle avoidance algorithm for a service robot. An ARM 11-based Raspberry Pi board will be used along with a USB camera and ultrasonic sensor. OpenCV will be used to detect and track humans based on body shape analysis of camera images. The ultrasonic sensor will detect obstacles and allow the robot to avoid them while continuing to track the detected human. The goal is to allow the service robot to follow and track a human while navigating around moving obstacles in an outdoor environment.
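The follow-and-avoid behaviour described above can be sketched as a priority rule: the ultrasonic range overrides the vision-based steering. The thresholds and command names are hypothetical; on the real robot the human's offset would come from OpenCV tracking and the range from the ultrasonic sensor.

```python
SAFE_DISTANCE_CM = 40      # assumed stop-and-avoid threshold
CENTER_DEADBAND = 0.1      # fraction of frame width treated as "centered"

def drive_command(human_offset, obstacle_cm):
    """Pick a motion command from the tracked human's horizontal offset
    (-1.0 = left edge .. +1.0 = right edge) and the ultrasonic range."""
    if obstacle_cm < SAFE_DISTANCE_CM:
        return "avoid"                   # obstacle always takes priority
    if human_offset < -CENTER_DEADBAND:
        return "turn_left"               # steer back toward the human
    if human_offset > CENTER_DEADBAND:
        return "turn_right"
    return "forward"

cmd_clear = drive_command(0.05, 120)     # human centered, path clear
cmd_blocked = drive_command(0.5, 25)     # obstacle closer than threshold
```

Keeping the decision logic separate from the sensing makes it easy to unit-test the behaviour before wiring in the camera and sensor.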
Flying to Jupiter with OSGi - Tony Walsh (ESA) & Hristo Indzhov (Telespazio V... (mfrancis)
OSGi Community Event 2018 Presentation by Tony Walsh (ESA) & Hristo Indzhov (Telespazio Vega)
Abstract: The European Space Operations Centre (ESOC) is the main operations centre for the European Space Agency (ESA), operating a number of Earth observation and scientific missions. Monitoring and control functions needed by spacecraft operators are provided by software systems which are reused across missions, but tailored and extended for mission-specific needs. The current generation of monitoring and control systems is becoming obsolete, and a Europe-wide initiative called the European Ground Systems Common Core (EGS-CC) (http://www.egscc.esa.int) has been started to develop the next generation.
This talk will explain why OSGi was chosen and how it is used in the development of next generation of monitoring and control software. It will describe how OSGi provides the necessary framework that enables the software to be extended for the different space systems it is expected to support. The overall software architecture will be discussed, some of the challenges faced and the benefits gained by using OSGi. The first target mission for the system is JUICE (http://sci.esa.int/juice) which will explore the moons of Jupiter and which is scheduled for launch in 2022.
The document discusses the Panel Processing of DebriSat project. DebriSat was a representative low earth orbit satellite that was destroyed in a hypervelocity impact test to generate debris fragments. The fragments were caught by foam panels in the test chamber. This summary discusses how the University of Florida processes these panels using archaeological techniques to maintain fragment integrity, tag locations, and organize large teams of workers. It also summarizes the extensive instrumentation used to document the catastrophic collision of DebriSat during the impact test.
Automatic Altitude Control of Quadrotor (isaac chang)
The document discusses testing various sensors for automatic altitude control of a quadrotor. It analyzes the built-in pressure sensor and finds it insufficient due to lack of stability and accuracy. Additional distance sensors are considered, including ultrasonic and infrared sensors. The pressure, ultrasonic, and infrared sensors are experimentally tested to evaluate their performance for automatic altitude control. The testing examines measurement accuracy, detection range, and sensitivity to object shape and surface for each sensor.
Want to know what's new with these topics? The PROSSA Team is developing projects and searching for new ideas. Take a look at our progress in space weather and these topics!
PSS (Portable Stimulus) is a modeling approach that uses graphs to comprehensively model complex functionality in an understandable way. It enables portable test scenarios for UVM sequence generation and C-based SoC tests across different environments. PSS models are modular, allowing smaller tests to be easily shared and reused. The document compares Cadence Perspec and Breker TrekSoC PSS tools on various parameters like modeling, constraint solving, test intent, libraries, debugging, coverage, and support for the Accellera PSS language standard. In general, Cadence Perspec provides more comprehensive functionality for modeling, constraint solving, test intent specification, debugging and coverage compared to Breker TrekSoC.
1. NASA uses Eclipse RCP applications for experiments on the International Space Station
Tamar Cohen
Intelligent Robotics Group
Eclipse RCP for ISS Experiments
Ames Research Center
NASA
2. In 2012–2013, the Intelligent Robotics Group at NASA Ames Research Center is conducting two experiments with the International Space Station (ISS).
Experiment 1: Simulate an internal inspection of a module of the ISS using
the free-flying SPHERES robot with an Android Smartphone connected to
it.
Experiment 2: Simulate deployment of a telescope by having an astronaut
on the ISS control the K10 Rover at NASA Ames.
For both of these experiments, the
astronauts will be using a custom
“Workbench” RCP application. These
are all based on Eclipse 3.7.2.
3. The ISS has a 450-page set of standards for software. This helps
maintain consistency between various software control systems and
helps astronauts with different native languages understand what
various icons mean.
We also have to deal with unique usability issues, such as the fact that
it is very difficult to click and point with a mouse when you are in
zero g.
The only operating system with a GUI currently on the ISS computers is
Windows XP, so that is our target development platform.
Internally we also use Linux and OSX, so we are doing cross-platform
development.
Since we are developing multiple RCP applications we put common
code into shared plugins. (This is one of the reasons we are still on
Eclipse 3.7.2)
4. Experiment 1: SPHERES
SPHERES are free-flying satellite robots
typically used for orbital experiments.
SPHERES have been on the ISS since 2006,
and were developed at MIT. (They do not
include a Smartphone).
They use a cold-gas CO2 thruster system
that is very similar to what is used on
paintball guns. The entire system is powered by AA batteries. A DSP microprocessor inside coordinates the mixing of the thrusters to produce the desired movement. The microprocessor also receives signals from five ultrasonic beacons, so each SPHERES satellite knows where it is.
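As a rough illustration of how beacon-based positioning works, the sketch below does a generic 2D trilateration from time-of-flight measurements. This is not the SPHERES estimator (which fuses many more measurements in 3D); beacon positions, the speed of sound, and all names here are illustrative assumptions.

```java
// Generic trilateration sketch (illustrative, not the SPHERES flight code):
// time-of-flight from beacons at known positions gives ranges; position
// follows from solving the linearized range equations.
public class Trilateration {
    static final double SPEED_OF_SOUND = 343.0; // m/s in air (assumption)

    // Range from a time-of-flight measurement in seconds.
    static double distance(double timeOfFlightSec) {
        return SPEED_OF_SOUND * timeOfFlightSec;
    }

    // 2D position from three beacons at (bx[i], by[i]) with measured ranges r[i].
    static double[] locate(double[] bx, double[] by, double[] r) {
        // Subtracting the first range equation from the others linearizes the system.
        double a11 = 2 * (bx[1] - bx[0]), a12 = 2 * (by[1] - by[0]);
        double a21 = 2 * (bx[2] - bx[0]), a22 = 2 * (by[2] - by[0]);
        double c1 = r[0] * r[0] - r[1] * r[1]
                  + bx[1] * bx[1] - bx[0] * bx[0] + by[1] * by[1] - by[0] * by[0];
        double c2 = r[0] * r[0] - r[2] * r[2]
                  + bx[2] * bx[2] - bx[0] * bx[0] + by[2] * by[2] - by[0] * by[0];
        double det = a11 * a22 - a12 * a21; // 2x2 Cramer's rule
        return new double[] { (c1 * a22 - a12 * c2) / det, (a11 * c2 - c1 * a21) / det };
    }

    public static void main(String[] args) {
        double[] bx = { 0, 4, 0 }, by = { 0, 0, 4 };
        double[] r = { Math.sqrt(5), Math.sqrt(13), Math.sqrt(5) }; // true point (1, 2)
        double[] p = locate(bx, by, r);
        System.out.println(p[0] + " " + p[1]);
    }
}
```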
The SPHERES microprocessor is already fully taxed with normal SPHERES operations;
we needed to add more processing power and a camera. We determined the most
efficient way to do this was to adapt a Smartphone to work with the SPHERES.
5. We had to remove the battery and power it with AA batteries, put Teflon tape over the screen, and remove the GPS chip, as well as put it through rigorous testing. Naturally, we use Velcro along with a custom USB cable to connect it to the SPHERES. We upmassed it on the last shuttle launch.
For the SPHERES experiments,
we first controlled the SPHERES
on the ISS from the SPHERES
Smartphone Workbench RCP
application running on Earth.
In the next iteration of this experiment, an astronaut on the ISS will control the
SPHERES using the SPHERES Smartphone Workbench.
6. SPHERES Satellite
[Diagram labels: thrusters (satellite body axes ±X, ±Y, ±Z), ultrasonic receivers, CO2 tank, pressure gauge, adjustable regulator]
Diameter: 8 in (0.2 m)
Mass: 7.85 lb (3.56 kg)
Thrust (single thruster): <1 oz (0.2 N)
CO2 capacity: 6 oz (170 g)
Payload Systems Inc
Slide Courtesy MIT and the Space Systems Laboratory
7. SPHERES Structural Elements
• Satellite is fully functional without shell
[Diagram labels: ultrasonic receiver, thruster, aluminum frame, pressure gauge, CO2 tank, battery pack]
Payload Systems Inc
Slide Courtesy MIT and the Space Systems Laboratory
8. When we are commanding and monitoring robots over such a long distance, there is a time delay between when commands are sent and when they are received; we have to design our software to account for this. We also have to support loss-of-signal (LOS) periods, when the ISS is unable to communicate with Earth.
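One way to tolerate LOS windows can be sketched as follows. This is a minimal hypothetical model (class and method names are illustrative, not the flight software): commands are buffered while out of contact and flushed, in order, once the link returns, so nothing is silently dropped.

```java
import java.util.ArrayDeque;
import java.util.Queue;

// Hypothetical sketch of LOS-tolerant command delivery (not the actual
// Workbench code): buffer commands during loss of signal, flush in order
// once contact with the ISS is reacquired.
public class LosTolerantSender {
    private final Queue<String> buffer = new ArrayDeque<>();
    private boolean inContact = false;
    private int sentCount = 0;

    // Called when the ground/ISS link goes up or down.
    public synchronized void setInContact(boolean contact) {
        inContact = contact;
        if (inContact) flush();
    }

    // Queue a command; it is transmitted immediately only if in contact.
    public synchronized void send(String command) {
        buffer.add(command);
        if (inContact) flush();
    }

    private void flush() {
        while (!buffer.isEmpty()) {
            transmit(buffer.poll());
        }
    }

    // Stand-in for the real uplink call.
    protected void transmit(String command) {
        sentCount++;
    }

    public synchronized int sentCount() { return sentCount; }
}
```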
We use DDS, the Data Distribution Service, to reliably send data. On computers, we use RTI's implementation of DDS, with our own standard, called RAPID, running on top of that. On the Android we licensed the CoreDX DDS libraries.
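DDS is a publish/subscribe middleware; RAPID rides on it as a set of agreed message and topic definitions. As a rough in-process stand-in for that topic model (this is not the RTI or CoreDX API, and the topic name is made up):

```java
import java.util.ArrayList;
import java.util.HashMap;
import java.util.List;
import java.util.Map;
import java.util.function.Consumer;

// Minimal in-process illustration of topic-based publish/subscribe,
// the pattern DDS provides; real DDS adds typed topics, QoS, discovery,
// and reliable delivery over the network.
public class MiniBus {
    private final Map<String, List<Consumer<Object>>> subscribers = new HashMap<>();

    // Register a listener for a named topic.
    public void subscribe(String topic, Consumer<Object> listener) {
        subscribers.computeIfAbsent(topic, t -> new ArrayList<>()).add(listener);
    }

    // Deliver a message to every subscriber of the topic.
    public void publish(String topic, Object message) {
        for (Consumer<Object> l : subscribers.getOrDefault(topic, List.of())) {
            l.accept(message);
        }
    }
}
```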
The SPHERES Smartphone Workbench (RCP application) talks to the Android Smartphone, which communicates over a USB cable to send commands to the SPHERES and report state back to the SPHERES Smartphone Workbench.
[Diagram: SPHERES Smartphone Workbench RCP application (Ground) ↔ DDS ↔ Android Smartphone (ISS) ↔ USB ↔ SPHERES]
10. Screenshot of the SPHERES Smartphone Workbench
11. Screenshot of the SPHERES Smartphone Workbench: Previewing Plans
12. Screenshot of the SPHERES Smartphone Workbench: Running Plans
13. Screenshot of the SPHERES Smartphone Workbench: Manual Control
14. How to change the layout of an Eclipse RCP application
Here we are controlling the layout of the RCP application, and constructing the
tabs (CTabFolder) which will control and respond to perspective switching.
public class IssApplicationWorkbenchWindowAdvisor extends WorkbenchWindowAdvisor {
    @Override
    public void createWindowContents(Shell shell) {
        IWorkbenchWindowConfigurer configurer = getWindowConfigurer();
        Menu menu = configurer.createMenuBar();
        shell.setMenuBar(menu);
        shell.setLayout(new FormLayout());
        m_topToolbar = configurer.createCoolBarControl(shell);
        m_perspectiveBar = createPerspectiveBarControl(shell);
        m_page = configurer.createPageComposite(m_cTabFolder);

        m_perspectiveRegistry = configurer.getWindow().getWorkbench().getPerspectiveRegistry();
        createPerspectiveBarTabs();

        m_rightToolbar = createRightToolbar(shell);
        m_statusline = new SimpleStatusLineManager().createControl(shell);

        // The layout method does the work of connecting the controls together.
        layoutNormal();
    }
}
15. Do the construction and customization of the CTabFolder
protected Control createPerspectiveBarControl(Composite parent) {
    m_cTabFolder = new CTabFolder(parent, SWT.TOP) {
        public int getBorderWidth() { return 10; }
    };
    setTabFolderFont(m_cTabFolder);
    m_cTabFolder.setMinimumCharacters(20);
    m_cTabFolder.setTabHeight(40);
    m_cTabFolder.setSimple(false);
    m_cTabFolder.setBorderVisible(true);
    m_cTabFolder.setBackground(ColorProvider.INSTANCE.WIDGET_BACKGROUND);
    return m_cTabFolder;
}

A method to create a tab

protected CTabItem createTabItem(CTabFolder tabFolder, String title,
        Control control, final String id) {
    CTabItem tabItem = new CTabItem(tabFolder, SWT.NONE);
    tabItem.setText(" " + title + " ");
    tabItem.setData(id);
    tabItem.setControl(control);
    return tabItem;
}

A method to select a perspective

protected void selectPerspective(String perspectiveID, SelectionEvent e) {
    IWorkbenchPage page = m_workbenchWindow.getActivePage();
    if (page != null) {
        IPerspectiveDescriptor descriptor =
            m_perspectiveRegistry.findPerspectiveWithId(perspectiveID);
        page.setPerspective(descriptor);
        page.getActivePart().setFocus();
    }
}
16. Set up the tabs based on defined perspectives
protected void createPerspectiveBarTabs() {
    for (String peID : getPerspectiveExtensionIds()) {
        // automagically read the perspectives contributed by plugin.xml
        IConfigurationElement[] config = Platform.getExtensionRegistry().
            getConfigurationElementsFor("org.eclipse.ui", "perspectives", peID);
        for (IConfigurationElement e : config) {
            CTabItem item = createTabItem(m_cTabFolder,
                e.getAttribute("name"), m_page, e.getAttribute("id"));
        }
    }

    // have the tabs listen for selection and change perspective
    final CTabFolder tabFolder = m_cTabFolder;
    m_cTabFolder.addSelectionListener(new SelectionListener() {
        public void widgetSelected(SelectionEvent e) {
            CTabItem tabItem = tabFolder.getSelection();
            String perspectiveID = (String) tabItem.getData();
            selectPerspective(perspectiveID, e);
            tabItem.getControl().setFocus();
        }
        public void widgetDefaultSelected(SelectionEvent e) { } // required by SelectionListener
    });

    // have the tabs autochange if the perspective changes
    m_workbenchWindow.addPerspectiveListener(new PerspectiveAdapter() {
        public void perspectiveActivated(IWorkbenchPage page,
                IPerspectiveDescriptor perspectiveDescriptor) {
            CTabItem foundTab = getTabForPerspective(perspectiveDescriptor.getId());
            if (foundTab != null) { m_cTabFolder.setSelection(foundTab); }
        }
    });

    m_cTabFolder.setSelection(0);
    populateTopRightButtons(m_cTabFolder); // this is how we contribute the Stop SPHERES button
    m_cTabFolder.pack();
}
17. Recording of the SPHERES experiment
18. Experiment 2: Surface Telerobotics
Surface Telerobotics will examine how astronauts on the ISS can remotely operate a surface robot (the K10 rover) across short time delays. We will be simulating an astronaut teleoperating a rover on the lunar farside to deploy a low-radio-frequency telescope.
The telescope comprises three arms made of Kapton polyimide film, which will be rolled out behind the rover.
19. The K10 Rover has been used extensively for robotic and geologic
field research. Our K10 rovers have been to numerous field sites on
Earth including the Haughton Crater on Devon Island, Canada; Black
Point Lava Flow, Arizona; and many sites in California.
K10 has four-wheel drive, all-wheel steering and a passive averaging suspension. The K10 rover's navigational sensors include a GPS system, a digital compass, stereo hazard cameras, and an inertial measurement unit. K10 rovers run Rover Software, which supports autonomous navigation and obstacle avoidance.
[Diagram: Surface Telerobotics Workbench RCP application (ISS) ↔ DDS ↔ K10 Rover (Ground)]
20. The K10 rover can be configured with different scientific instruments. For this experiment the instruments include a custom panoramic camera (GigaPan), a rear-facing inspection camera to observe telescope deployment, a Velodyne lidar to examine surface texture and assess terrain hazards, and of course the film deployer.
We control K10 rover operations with "route plans" – a sequence of stations and segments, with tasks to perform along the way. Rover Software does its best to achieve the goals of the route plan, though if there is an obstacle along the way it may not succeed.
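A route plan can be pictured with the following hypothetical data model. The record names and fields are illustrative only, not the actual Rover Software classes: stations hold tasks, and a segment is the drive between consecutive stations.

```java
import java.util.List;

// Hypothetical "route plan" data model (illustrative, not the actual
// Rover Software structures): an ordered list of stations with tasks,
// where segments connect consecutive stations.
public class RoutePlanDemo {
    record Task(String name) {}
    record Station(String id, List<Task> tasks) {}
    record Segment(Station from, Station to) {}

    record RoutePlan(List<Station> stations) {
        // Segments are derived: one drive between each pair of consecutive stations.
        List<Segment> segments() {
            return java.util.stream.IntStream.range(0, stations.size() - 1)
                .mapToObj(i -> new Segment(stations.get(i), stations.get(i + 1)))
                .toList();
        }
    }

    public static void main(String[] args) {
        Station a = new Station("A", List.of(new Task("panorama")));
        Station b = new Station("B", List.of(new Task("deploy-film")));
        RoutePlan plan = new RoutePlan(List.of(a, b));
        System.out.println(plan.segments().size()); // → 1
    }
}
```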
We initially developed VERVE
(discussed at EclipseCon 2011) to
allow rover engineers to visualize
rover status in 3D within an Eclipse
RCP application; the Surface
Telerobotics Workbench includes some
of the VERVE technology and plugins,
and extends it to comply with the ISS
standards.
22. Rover running a plan; panorama coming in
23. Manually moving the rover forward and inspecting the film
24. How to fake buttons
ISS standards require us to create unique rounded “command” buttons, so it is
clear which buttons send important commands. These buttons draw images to
a graphics context, and then render text over them.
public class CommandButton extends Composite {

    public CommandButton(Composite parent, int style) {
        super(parent, SWT.NONE);
        GridLayout gl = new GridLayout(1, false);
        gl.marginHeight = gl.marginWidth = gl.horizontalSpacing = gl.verticalSpacing = 0;
        setLayout(gl);
        m_gridData = new GridData(SWT.FILL, SWT.CENTER, true, false);
        m_gridData.widthHint = m_gridData.minimumWidth = m_width;
        m_gridData.heightHint = m_gridData.minimumHeight = m_height;
        setLayoutData(m_gridData);
        setSize(m_width, m_height);

        m_buttonLabel = new Canvas(this, SWT.NONE);
        m_buttonLabel.setSize(m_width, m_height);
        m_buttonLabel.setLayoutData(m_gridData);
        m_buttonLabel.addPaintListener(new PaintListener() {
            public void paintControl(PaintEvent e) { draw(e.gc); }
        });
    }
    // draw() is shown on slide 26
}
[Button state images: Enabled, Pressed, Disabled]
26. Draw the button
protected void draw(GC gc) {
    int imagey = Math.max(0, (m_buttonLabel.getSize().y - m_height) / 2);
    gc.drawImage(m_currentImage, 0, imagey);
    Color fg = isEnabled() ? ColorProvider.INSTANCE.black : ColorProvider.INSTANCE.darkGray;
    gc.setForeground(fg);
    Point size = gc.textExtent(m_textString);
    int x = Math.max(0, (m_width - size.x) / 2);
    int y = Math.max(0, (m_buttonLabel.getSize().y - size.y) / 2);
    if (m_pressed) {
        x += 3;
        y += 3;
    }
    gc.drawText(m_textString, x, y, true);
    // Note: the GC comes from the PaintEvent, so we do not dispose it here.
}
We could have gotten fancy with buttons made of multiple images which would
stretch depending on the length of the text, but we didn’t.
27. Including log4j log messages in the UI
The ISS standards require an error acknowledgement bar in a consistent place in
the upper left. When important errors or alerts come in, there is an “Ack” button
to the right that includes the number of unacknowledged messages.
Users can then pop up the “log” view which shows the time, ack state and
description of messages that came in.
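The acknowledgement behavior can be modeled roughly like this. The class and method names are hypothetical (this is not the actual Workbench code): incoming alerts increment the count shown on the "Ack" button, and acknowledging clears it while keeping the messages for the log view.

```java
import java.util.ArrayList;
import java.util.List;

// Hypothetical model of the error acknowledgement bar state (illustrative,
// not the actual Workbench class): alerts increment the unacknowledged
// count on the "Ack" button; acknowledging clears the count but keeps
// the messages so the log view can still show them.
public class AckTracker {
    private final List<String> messages = new ArrayList<>();
    private int unacknowledged = 0;

    // Called when an important error or alert arrives.
    public void onAlert(String description) {
        messages.add(description);
        unacknowledged++;
    }

    // The number displayed on the "Ack" button.
    public int unacknowledgedCount() { return unacknowledged; }

    // Pressing "Ack" clears the counter.
    public void acknowledgeAll() { unacknowledged = 0; }

    // All messages remain available for the log view.
    public List<String> messages() { return messages; }
}
```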
We use Apache's log4j framework to log messages, and OSGi's LogListener to read the messages in and display them in our Log View.
28. How we get log4j messages into our Log View
In our code, we just call logger.error("message").
public class IssLogView extends ViewPart implements LogListener {

    public IssLogView() {
        setReader(IssLogService.getInstance().getLogReader());
        m_comparator = new LogViewComparator(); // this lets us sort the way we want

        // add the logger appender to the Apache log4j framework
        Logger.getRootLogger().addAppender(new IssLoggerAppender());
        Logger.getLogger(IssLoggerAppender.class.getName()).setAdditivity(false);
    }

    protected class IssLoggerAppender extends AppenderSkeleton {
        protected void append(LoggingEvent event) {
            if (event.getLevel().isGreaterOrEqual(MIN_LEVEL)) {
                String status = getEventLevelString(event.getLevel()) +
                    event.getRenderedMessage();
                String ds = LogViewUtils.convertToCorrectDateFormat(event.getTimeStamp());

                // convert this log message to our IssLogEntry class,
                // and contribute it to the view to display in the table.
                processEntry(event.getLevel().toInt(), ds + " " + status);
                asyncRefresh(true);
            }
        }
    }
}

// simple class to hold log entries
public class IssLogEntry {
    protected String m_time;
    protected Level m_level;
    protected String m_description;
    protected boolean m_ack;
}
29. Intelligent Robotics Group
at NASA Ames Research Center
• K10 Rover among others
• SPHERES
• xGDS Ground Data Systems
• VERVE 3D within Eclipse
• Contributed the moon to Google Earth
• Mars-o-vision (mars.planetary.org)
• GigaPan robotic camera
• GeoCam disaster response
• Ames Stereo Pipeline
• Vision Workbench
• Tensegrity research
… and more!
http://irg.arc.nasa.gov