Setup Description 

The lab setup contains a Biclops Pan-Tilt unit with a Microsoft Kinect mounted on top of it. Check the documentation of the Biclops unit, in particular the following files in the Biclops/Documentation directory: Biclops Installation Manual.pdf, Biclops User Manual.pdf, PMD Configuration File Description.doc, PMD Programming Ref V1.7.pdf and PMD controller user's guide v1.4.pdf, and the following files in the Biclops/Software/Stable/libBiclops/samples directory: BiclopsBareBones.cpp, BiclopsHomingCalibration.cpp and Biclops_Demo.cpp, to understand its functionality and programming. The application stack contains two ROS packages: one for the Biclops unit and a vision package for the Kinect unit. In operation, the Kinect uses the state-of-the-art CMT algorithm to track an object in the RGB images, and the pan-tilt unit moves the Kinect so that tracking is continuous. 
 
Installation Instructions 

You can download the packages directly from GitHub, or follow the instructions below. 

Biclops Package 

Using the libraries provided for Biclops unit control, we can create our own Biclops ROS package for our application-specific needs. Creating a separate package makes the code more modular, which is one of the basic principles behind ROS. 
 
1) Create a ROS package with the name “Biclops” 
2) Create a Biclops_node.cpp file in the src directory and put in the following stub code 
int main() { 
return 0; 
} 
3) Copy the Biclops.cpp into the src directory of the Biclops package 
4) Copy the Biclops.h into the include/Biclops directory  
5) Copy the directories libPMD and libUtils into Biclops package 
6) Make the following changes in CMakeLists.txt 
 
a) Add the following lines after find_package(catkin REQUIRED COMPONENTS ) 
 
add_subdirectory(libUtils) 
add_subdirectory(libPMD) 
 
b) Edit the lines under # include_directories(include) to  
 
include_directories(include 
${catkin_INCLUDE_DIRS} ${libUtils_SOURCE_DIR}/include   
${libPMD_SOURCE_DIR}/include  
) 
 
c) Edit the lines under ## Declare a cpp library to  
 
add_library(biclops 
   src/Biclops_node.cpp src/Biclops.cpp 
) 
 
d) Edit the lines under ## Declare a cpp executable to  
 
add_executable(Biclops_node src/Biclops_node.cpp src/Biclops.cpp) 
 
e) Edit the lines under ## Specify libraries to link a library or executable target                           
against to  
 
 target_link_libraries(Biclops_node 
   PMD Utils ${catkin_LIBRARIES}  
 ) 
 
The above instructions will get you started with accessing the Biclops unit through ROS. To test that everything is configured correctly, run catkin_make in your catkin workspace and check that the build completes without errors. 
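Putting steps a) through e) together, the relevant portion of the Biclops package's CMakeLists.txt ends up looking roughly like the sketch below. The roscpp component is an assumption (your find_package line may list different components), and note that the target name passed to target_link_libraries must match the executable name declared with add_executable:

```cmake
cmake_minimum_required(VERSION 2.8.3)
project(Biclops)

# roscpp is assumed here; list whatever components your package uses
find_package(catkin REQUIRED COMPONENTS roscpp)

# Build the vendor libraries shipped with the Biclops software
add_subdirectory(libUtils)
add_subdirectory(libPMD)

include_directories(include
  ${catkin_INCLUDE_DIRS}
  ${libUtils_SOURCE_DIR}/include
  ${libPMD_SOURCE_DIR}/include
)

add_executable(Biclops_node src/Biclops_node.cpp src/Biclops.cpp)

# The first argument must match the add_executable target above
target_link_libraries(Biclops_node
  PMD Utils ${catkin_LIBRARIES}
)
```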
 
The Biclops unit needs a configuration file for its initialization. Copy the Biclops/Software/Stable/libBiclops/samples/BiclopsDefault.cfg file into the Biclops package. Also go through the other configuration files with the .cfg extension to better understand the parameters required for a specific application of the Biclops unit. To check that the Biclops unit can be used, copy the code from the biclops_node_simplehoming.cpp file into Biclops_node.cpp and run catkin_make in your catkin workspace. Then run the node using 
 
rosrun Biclops Biclops_node
 
You should now see the Biclops unit being homed. If you don't see any motion, or you see communication errors on the command line, make sure you set the correct port value in the .cfg file, 

e.g.: port /dev/ttyUSB0 
 
You can find your port value using the following command: 
 
ls /dev/ttyUSB*
 
If everything is configured correctly and the Biclops unit still reports communication errors, check whether the serial port connections are intact. 
 
 
ROS Services  
 
Check the ROS tutorials on ROS services and messages to understand the basic concepts. 
 
http://wiki.ros.org/ROS/Tutorials/CreatingMsgAndSrv 
 
http://wiki.ros.org/ROS/Tutorials/UnderstandingServicesParams 
 
 
A ROS service can be invoked from the command line using 
 
rosservice call /servicename {args}
 
This can be demonstrated with the Biclops unit. We can create a service for homing the Biclops unit from the command line whenever we want. Use the code from biclops_node_homingservice.cpp. 

This uses the Empty service type (the std_srvs/Empty.h header) for the service message, and the service callback simply runs the Biclops homing sequence. The service can be called using 

rosservice call /homing_sequence '{}'

Note the empty arguments: the Empty service type carries no request or response fields. 
 
URDF - Biclops Model 

URDF (Unified Robot Description Format) is an XML format for describing a robot in terms of its joints and links. Go through the following tutorials step by step to better understand the concept of URDF and its use: 
 
http://wiki.ros.org/urdf/Tutorials/ 
 
To put this into practice, we can visualize the Biclops model in RViz. 

To use the robot model you need to set the ROS parameter "robot_description". One way to do this is to use the following lines in a launch file: 
 
<param name="robot_description" command="cat $(find
Biclops)/urdf/model.urdf" />
 
Here Biclops is the package name and the model.urdf file is inside the urdf directory of the ROS package. 
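As an illustration, a minimal model.urdf for a pan-tilt unit could look like the sketch below. This is not the actual Biclops model: the link and joint names, axes and limits here are invented for illustration only.

```xml
<?xml version="1.0"?>
<robot name="biclops">
  <link name="base_link"/>
  <link name="pan_link"/>
  <link name="tilt_link"/>

  <!-- Pan joint: rotation about the vertical (z) axis -->
  <joint name="pan_joint" type="revolute">
    <parent link="base_link"/>
    <child link="pan_link"/>
    <axis xyz="0 0 1"/>
    <limit lower="-1.57" upper="1.57" effort="1.0" velocity="1.0"/>
  </joint>

  <!-- Tilt joint: rotation about the horizontal (y) axis -->
  <joint name="tilt_joint" type="revolute">
    <parent link="pan_link"/>
    <child link="tilt_link"/>
    <axis xyz="0 1 0"/>
    <limit lower="-0.78" upper="0.78" effort="1.0" velocity="1.0"/>
  </joint>
</robot>
```

The real Biclops model should use the measured joint limits from the PMD configuration files.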
 
You can also set this parameter from the command line. 
 
To visualize the model with joint movements, use the following command from the Biclops package directory: 

roslaunch urdf_tutorial display.launch model:=urdf/model.urdf gui:=True

Check the URDF tutorials in the ROS documentation to learn more about using URDF files with real robots. 
 
Biclops Teleop 
 
Check the following basic ROS tutorials to understand the turtlesim package and tele-operation of the turtle using the keyboard arrow keys: 
 
http://wiki.ros.org/ROS/Tutorials/UnderstandingNodes 
 
http://wiki.ros.org/ROS/Tutorials/UnderstandingTopics 
 
We can use a teleop node to move the Biclops unit: the Left and Right arrow keys control the PAN motion, and the Up and Down arrow keys control the TILT motion. To implement this, check the code in the biclops_node_teleop.cpp file. 
 
 
Kinect Setup in Linux 
 
To ensure the Kinect works properly in Linux, several drivers need to be installed. Please follow the instructions below for installing the drivers. 
 
Software 
To control the LED and the tilt motor, we will use the libfreenect library (open source, unofficial): 
■ http://openkinect.org/ 
■ Git: https://github.com/OpenKinect/libfreenect 
To control the video streams and get the depth map, we'll use the OpenNI drivers (open source, official): 
■ http://openni.org/ 
■ Git: https://github.com/OpenNI/OpenNI 
We also need the PrimeSense (the company behind the Kinect's depth-sensing technology) sensor module (open source): 
Warning: the official PrimeSense driver is not compatible with the Kinect; we need a modified version. 
■ http://www.primesense.com/ 
■ Git: https://github.com/avin2/SensorKinect (this is the modified version) 
 
Environment setup 
1. Download the needed libraries/software. 
mkdir KinectLibs; cd KinectLibs 
git clone https://github.com/OpenKinect/libfreenect 
git clone https://github.com/OpenNI/OpenNI 
git clone https://github.com/avin2/SensorKinect 
sudo apt-get install cmake libglut3-dev pkg-config build-essential libxmu-dev libxi-dev libusb-1.0-0-dev python 

● If you get "Unable to locate package libglut3-dev", use this command instead: 
sudo apt-get install cmake freeglut3-dev pkg-config build-essential libxmu-dev libxi-dev libusb-1.0-0-dev python 

sudo add-apt-repository "deb http://archive.canonical.com/ lucid partner" 
sudo apt-get update 
sudo apt-get install sun-java6-jdk 

sudo apt-get install doxygen mono-complete graphviz 
2. Install openKinect (libfreenect) 
# in the libfreenect directory, in the KinectLibs dir 
mkdir build 
cd build 
cmake .. 
make 
sudo make install 
sudo ldconfig /usr/local/lib64/ 
● Once libfreenect is installed, plug in the Kinect, then set read/write permissions on the USB devices (motor and camera): 
sudo chmod a+rw /dev/bus/usb// 
sudo chmod a+rw /dev/bus/usb// 
lsusb | grep Xbox 
Without these permissions you will see errors such as: 
libusb couldn't open USB device /dev/bus/usb/001/006: Permission denied. 
libusb requires write access to USB device nodes. 
● Now, to check that everything is set up correctly, just run glview. 
Tip: you can play a bit with the features with these keys: 
'w' - tilt up, 's' - level, 'x' - tilt down, '0'-'6' - select LED mode, 'f' - video format 
On the left there is an OpenGL representation of the depth map, where the pixel color is set according to each point's distance to the sensor; on the right you get the regular RGB camera view, or the infrared one (so you can see the infrared pattern; switch between them with 'f'). 
Let's now have a look at how to set up the gesture recognition libraries. 
3. Install OpenNI 
We just installed a perfectly fine working library that seems to handle all functions of the Kinect, so why would we need another one? 
It's because of the high-level library, NITE, which works only with the OpenNI drivers; and the OpenNI drivers (which are not Kinect specific) can't control the Kinect's motorized tilt or its LED. So we need both libraries to have full access to the Kinect. 
So basically: 
● we will use libfreenect to control the tilt and the LED (i.e. the Xbox NUI Motor device, which also handles the LED); 
● we will use OpenNI + the Sensor module to get the camera streams (the Xbox NUI Camera device); 
● we will use the NITE libraries together with OpenNI to get the high-level API (gesture recognition, hand/skeleton tracking and so on). 
Note: the Xbox NUI Audio device is handled by OpenNI, not libfreenect. 
# in the OpenNI directory, in the KinectLibs dir
cd Platform/Linux/CreateRedist
chmod +x ./RedistMaker
./RedistMaker
cd ../Redist/OpenNI-Bin-Dev-Linux-x64-v1.5.2.23/
sudo ./install.sh
Note: it's Sensor-Bin-Linux-x64-v5.1.0.25 for me, but it might be different for you; there is only one directory in Redist/ anyway, so just replace the name if it differs. 
4. Install NITE 
● Download the library according to your system, then just run install.sh as root. That's it. 
You're now all set for using the Kinect! 
Discover the Kinect potential with the examples 
Go into your NITE directory, then: 
cd Samples/Bin/x64-Release; ls Sample*
These are the available examples; they cover pretty much all the high-level recognition handled by NITE. 
You can find detailed documentation of these functions in the NITE/Documentation/ directory; here is just a "quick start" guide for each example. 
NOTE: In case of installation errors, please refer to the following page and its discussion section: 
http://www.kdab.com/setting-up-kinect-for-programming-in-linux-part-1/ 
 
 
 
Vision Package  
 
1) Create a ROS package with the name "vision", with opencv2 & cv_bridge as package dependencies. 
2) Create a vision_node.cpp file in the src directory and put in the following stub code 
int main() { 
return 0; 
} 
3) Copy the CMT.cpp into the src directory of the vision package 
4) Copy the CMT.h into the include/vision directory  
5) Make the following changes in CMakeLists.txt 
 
a) Add the following lines after find_package(catkin REQUIRED COMPONENTS ) 
 
find_package(OpenCV REQUIRED) 
find_package(Qt4 REQUIRED COMPONENTS 
  QtCore 
  QtGui 
) 
 
 
b) Edit the lines under # include_directories(include) to  
 
include_directories( include 
${catkin_INCLUDE_DIRS}  
) 
 
c) Edit the lines under ## Declare a cpp executable to  
 
add_executable(vision_node src/vision_node.cpp src/CMT.cpp) 
 
d) Edit the lines under ## Specify libraries to link a library or executable target against to  
 
 target_link_libraries(vision_node 
   ${catkin_LIBRARIES} ${OpenCV_LIBS} 
) 
 
 
The above instructions will get you started with building a vision ROS package that uses the CMT algorithm. To test that everything is configured correctly, run catkin_make in your catkin workspace and check that the build completes without errors. 
 
To understand the CMT algorithm please read the following publication  
 
http://www.gnebehay.com/publications/wacv_2014/wacv_2014.pdf 
 
To see the CMT algorithm working, copy the code from vision_node_simple.cpp into vision_node.cpp in the src directory. Run catkin_make in the catkin workspace; if there are no errors, the package builds successfully. 
 
It is assumed that the Kinect is connected and that it works, verified by running the examples mentioned in the Kinect installation part. Now run the following command to see the CMT algorithm in action: 

rosrun vision vision_node 
 
You will be prompted to say whether you want to select the region of interest manually. Enter "y" if you want to do it manually; by default the bounding box is at the center of the image frame, at 640 x 480 resolution. An image window then appears; press Enter when the object you want to track is in the field of view. Now use the mouse to draw a rectangular box around the object of interest and press Enter. The CMT algorithm starts tracking and gives you the center, scale and orientation values of the object being tracked. 
 
ROS Messages  
 
Check the ROS tutorials on ROS services and messages to understand the basic concepts. In our case we use messages to send out data (center, scale and orientation) over ROS topics. The two message types we will look at are in the msg directory of the vision package: 

center.msg - message type for sending the center value alone 
vision.msg - message type for sending the center, scale and orientation values together. 
 
Follow the ROS tutorials to configure the CMakeLists.txt and package.xml files so that the custom messages are built. 
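For illustration, the message definitions might look like the sketch below. The field names and types here are assumptions based on the center_point.CenterX usage later in this document; check the actual files in the msg directory for the real definitions.

```
# center.msg - the center value alone (field names assumed)
float32 CenterX
float32 CenterY

# vision.msg - center, scale and orientation together (fields assumed)
float32 CenterX
float32 CenterY
float32 Scale
float32 Orientation
```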
 
 
Now copy the code from the vision_node_withmsg.cpp file to vision_node.cpp in the src directory to see the center, scale and orientation values being published as ROS topics. 
 
 
Calibration Details of the Biclops - Kinect Assembly 

The field of view of the Kinect is: 

43° vertical 
57° horizontal 
 
Considering that the Kinect is operated at 640 x 480 resolution, the PAN joint calibration is done as follows. 

Assuming the tracking keeps the tracked point at the center of the Kinect's field of view, the following variables are declared: 

pan_max_step=320;  (half of the 640 pixel horizontal resolution) 
pan_max_angle=28.5;  (half of the 57° horizontal field of view) 
 
The tracking algorithm returns the X pixel value on a 640 pixel resolution scale; the variable used in the code is center_point.CenterX in the biclops_node.cpp file. 

The following code converts the returned pixel value into the corresponding PAN joint value: 

pan_pos_value = int(center_point.CenterX) - 320;  (shifting the origin to the image center) 

pan_pos_value = (pan_pos_value / pan_max_step) * pan_max_angle; 
 
 
A similar calibration is done for the TILT joint. For operating the Kinect at other resolutions, the corresponding variables must be changed to ensure correct tracking. 
 
 
For any further information, or suggestions about the documentation, kindly write to me at 
yeshasvitvs@gmail.com 
More Related Content

What's hot

Understanding and Using Git at Eclipse
Understanding and Using Git at EclipseUnderstanding and Using Git at Eclipse
Understanding and Using Git at EclipseChris Aniszczyk
 
Becoming A Plumber: Building Deployment Pipelines - LISA17
Becoming A Plumber: Building Deployment Pipelines - LISA17Becoming A Plumber: Building Deployment Pipelines - LISA17
Becoming A Plumber: Building Deployment Pipelines - LISA17Daniel Barker
 
Tracing Software Build Processes to Uncover License Compliance Inconsistencies
Tracing Software Build Processes to Uncover License Compliance InconsistenciesTracing Software Build Processes to Uncover License Compliance Inconsistencies
Tracing Software Build Processes to Uncover License Compliance InconsistenciesShane McIntosh
 
Open MPI State of the Union X SC'16 BOF
Open MPI State of the Union X SC'16 BOFOpen MPI State of the Union X SC'16 BOF
Open MPI State of the Union X SC'16 BOFJeff Squyres
 
Using Git Inside Eclipse, Pushing/Cloning from GitHub
Using Git Inside Eclipse, Pushing/Cloning from GitHubUsing Git Inside Eclipse, Pushing/Cloning from GitHub
Using Git Inside Eclipse, Pushing/Cloning from GitHubAboutHydrology Slides
 
Architecting the Future: Abstractions and Metadata - CodeStock
Architecting the Future: Abstractions and Metadata - CodeStockArchitecting the Future: Abstractions and Metadata - CodeStock
Architecting the Future: Abstractions and Metadata - CodeStockDaniel Barker
 
Inroduction to golang
Inroduction to golangInroduction to golang
Inroduction to golangYoni Davidson
 
Embedded Recipes 2019 - Remote update adventures with RAUC, Yocto and Barebox
Embedded Recipes 2019 - Remote update adventures with RAUC, Yocto and BareboxEmbedded Recipes 2019 - Remote update adventures with RAUC, Yocto and Barebox
Embedded Recipes 2019 - Remote update adventures with RAUC, Yocto and BareboxAnne Nicolas
 
EclipseCon 2010 tutorial: Understanding git at Eclipse
EclipseCon 2010 tutorial: Understanding git at EclipseEclipseCon 2010 tutorial: Understanding git at Eclipse
EclipseCon 2010 tutorial: Understanding git at Eclipsemsohn
 
차세대컴파일러, VM의미래: 애플 오픈소스 LLVM
차세대컴파일러, VM의미래: 애플 오픈소스 LLVM차세대컴파일러, VM의미래: 애플 오픈소스 LLVM
차세대컴파일러, VM의미래: 애플 오픈소스 LLVMJung Kim
 
Introduction to llvm
Introduction to llvmIntroduction to llvm
Introduction to llvmTao He
 
Ruby and Rails Packaging to Production
Ruby and Rails Packaging to ProductionRuby and Rails Packaging to Production
Ruby and Rails Packaging to ProductionFabio Kung
 
Architecting the Future: Abstractions and Metadata - STL SilverLinings
Architecting the Future: Abstractions and Metadata - STL SilverLiningsArchitecting the Future: Abstractions and Metadata - STL SilverLinings
Architecting the Future: Abstractions and Metadata - STL SilverLiningsDaniel Barker
 
Tutorial contributing to nf-core
Tutorial contributing to nf-coreTutorial contributing to nf-core
Tutorial contributing to nf-coreGisela Gabernet
 

What's hot (20)

Understanding and Using Git at Eclipse
Understanding and Using Git at EclipseUnderstanding and Using Git at Eclipse
Understanding and Using Git at Eclipse
 
Becoming A Plumber: Building Deployment Pipelines - LISA17
Becoming A Plumber: Building Deployment Pipelines - LISA17Becoming A Plumber: Building Deployment Pipelines - LISA17
Becoming A Plumber: Building Deployment Pipelines - LISA17
 
Tracing Software Build Processes to Uncover License Compliance Inconsistencies
Tracing Software Build Processes to Uncover License Compliance InconsistenciesTracing Software Build Processes to Uncover License Compliance Inconsistencies
Tracing Software Build Processes to Uncover License Compliance Inconsistencies
 
ICSE2011_SRC
ICSE2011_SRC ICSE2011_SRC
ICSE2011_SRC
 
Buildtechs
BuildtechsBuildtechs
Buildtechs
 
Open MPI State of the Union X SC'16 BOF
Open MPI State of the Union X SC'16 BOFOpen MPI State of the Union X SC'16 BOF
Open MPI State of the Union X SC'16 BOF
 
Using Git Inside Eclipse, Pushing/Cloning from GitHub
Using Git Inside Eclipse, Pushing/Cloning from GitHubUsing Git Inside Eclipse, Pushing/Cloning from GitHub
Using Git Inside Eclipse, Pushing/Cloning from GitHub
 
Architecting the Future: Abstractions and Metadata - CodeStock
Architecting the Future: Abstractions and Metadata - CodeStockArchitecting the Future: Abstractions and Metadata - CodeStock
Architecting the Future: Abstractions and Metadata - CodeStock
 
How to Build & Use OpenCL on OpenCV & Android NDK
How to Build & Use OpenCL on OpenCV & Android NDKHow to Build & Use OpenCL on OpenCV & Android NDK
How to Build & Use OpenCL on OpenCV & Android NDK
 
Inroduction to golang
Inroduction to golangInroduction to golang
Inroduction to golang
 
How to Build & Use OpenCL on Android Studio
How to Build & Use OpenCL on Android StudioHow to Build & Use OpenCL on Android Studio
How to Build & Use OpenCL on Android Studio
 
Embedded Recipes 2019 - Remote update adventures with RAUC, Yocto and Barebox
Embedded Recipes 2019 - Remote update adventures with RAUC, Yocto and BareboxEmbedded Recipes 2019 - Remote update adventures with RAUC, Yocto and Barebox
Embedded Recipes 2019 - Remote update adventures with RAUC, Yocto and Barebox
 
EclipseCon 2010 tutorial: Understanding git at Eclipse
EclipseCon 2010 tutorial: Understanding git at EclipseEclipseCon 2010 tutorial: Understanding git at Eclipse
EclipseCon 2010 tutorial: Understanding git at Eclipse
 
차세대컴파일러, VM의미래: 애플 오픈소스 LLVM
차세대컴파일러, VM의미래: 애플 오픈소스 LLVM차세대컴파일러, VM의미래: 애플 오픈소스 LLVM
차세대컴파일러, VM의미래: 애플 오픈소스 LLVM
 
Rogue bundles
Rogue bundlesRogue bundles
Rogue bundles
 
Introduction to llvm
Introduction to llvmIntroduction to llvm
Introduction to llvm
 
Ruby and Rails Packaging to Production
Ruby and Rails Packaging to ProductionRuby and Rails Packaging to Production
Ruby and Rails Packaging to Production
 
Architecting the Future: Abstractions and Metadata - STL SilverLinings
Architecting the Future: Abstractions and Metadata - STL SilverLiningsArchitecting the Future: Abstractions and Metadata - STL SilverLinings
Architecting the Future: Abstractions and Metadata - STL SilverLinings
 
Tutorial contributing to nf-core
Tutorial contributing to nf-coreTutorial contributing to nf-core
Tutorial contributing to nf-core
 
nf-core usage tutorial
nf-core usage tutorialnf-core usage tutorial
nf-core usage tutorial
 

Viewers also liked

Viewers also liked (6)

Amersfoortsestraatweg
AmersfoortsestraatwegAmersfoortsestraatweg
Amersfoortsestraatweg
 
Cause10
Cause10Cause10
Cause10
 
Stretch összefoglaló 2014
Stretch összefoglaló 2014Stretch összefoglaló 2014
Stretch összefoglaló 2014
 
Digitale Co-creatie voor Zorg
Digitale Co-creatie voor ZorgDigitale Co-creatie voor Zorg
Digitale Co-creatie voor Zorg
 
Digitale Co-creatie Verenigingen
Digitale Co-creatie VerenigingenDigitale Co-creatie Verenigingen
Digitale Co-creatie Verenigingen
 
Reklám Helyett. A marketing jövője a hálózatok világában
Reklám Helyett. A marketing jövője a hálózatok világábanReklám Helyett. A marketing jövője a hálózatok világában
Reklám Helyett. A marketing jövője a hálózatok világában
 

Similar to LabDocumentation

Os Grossupdated
Os GrossupdatedOs Grossupdated
Os Grossupdatedoscon2007
 
Shifter singularity - june 7, 2018 - bw symposium
Shifter  singularity - june 7, 2018 - bw symposiumShifter  singularity - june 7, 2018 - bw symposium
Shifter singularity - june 7, 2018 - bw symposiuminside-BigData.com
 
Building an Ionic hybrid mobile app with TypeScript
Building an Ionic hybrid mobile app with TypeScript Building an Ionic hybrid mobile app with TypeScript
Building an Ionic hybrid mobile app with TypeScript Serge van den Oever
 
An Overview of the IHK/McKernel Multi-kernel Operating System
An Overview of the IHK/McKernel Multi-kernel Operating SystemAn Overview of the IHK/McKernel Multi-kernel Operating System
An Overview of the IHK/McKernel Multi-kernel Operating SystemLinaro
 
Investigation report on 64 bit support and some of new features in aosp master
Investigation report on 64 bit support and some of new features in aosp masterInvestigation report on 64 bit support and some of new features in aosp master
Investigation report on 64 bit support and some of new features in aosp masterhidenorly
 
Setting up the hyperledger composer in ubuntu
Setting up the hyperledger composer in ubuntuSetting up the hyperledger composer in ubuntu
Setting up the hyperledger composer in ubuntukesavan N B
 
Cloud native buildpacks_collabnix
Cloud native buildpacks_collabnixCloud native buildpacks_collabnix
Cloud native buildpacks_collabnixSuman Chakraborty
 
Dev opsec dockerimage_patch_n_lifecyclemanagement_
Dev opsec dockerimage_patch_n_lifecyclemanagement_Dev opsec dockerimage_patch_n_lifecyclemanagement_
Dev opsec dockerimage_patch_n_lifecyclemanagement_kanedafromparis
 
Developing MIPS Exploits to Hack Routers
Developing MIPS Exploits to Hack RoutersDeveloping MIPS Exploits to Hack Routers
Developing MIPS Exploits to Hack RoutersOnur Alanbel
 
BBL Premiers pas avec Docker
BBL Premiers pas avec DockerBBL Premiers pas avec Docker
BBL Premiers pas avec Dockerkanedafromparis
 
OpenShift Commons - Adopting Podman, Skopeo and Buildah for Building and Mana...
OpenShift Commons - Adopting Podman, Skopeo and Buildah for Building and Mana...OpenShift Commons - Adopting Podman, Skopeo and Buildah for Building and Mana...
OpenShift Commons - Adopting Podman, Skopeo and Buildah for Building and Mana...Mihai Criveti
 
High Quality Symfony Bundles tutorial - Dutch PHP Conference 2014
High Quality Symfony Bundles tutorial - Dutch PHP Conference 2014High Quality Symfony Bundles tutorial - Dutch PHP Conference 2014
High Quality Symfony Bundles tutorial - Dutch PHP Conference 2014Matthias Noback
 
Architecting the Future: Abstractions and Metadata - BSidesKC
Architecting the Future: Abstractions and Metadata - BSidesKCArchitecting the Future: Abstractions and Metadata - BSidesKC
Architecting the Future: Abstractions and Metadata - BSidesKCDaniel Barker
 
Developing for Industrial IoT with Linux OS on DragonBoard™ 410c: Session 2
Developing for Industrial IoT with Linux OS on DragonBoard™ 410c: Session 2Developing for Industrial IoT with Linux OS on DragonBoard™ 410c: Session 2
Developing for Industrial IoT with Linux OS on DragonBoard™ 410c: Session 2Qualcomm Developer Network
 
Docker and IBM Integration Bus
Docker and IBM Integration BusDocker and IBM Integration Bus
Docker and IBM Integration BusGeza Geleji
 

Similar to LabDocumentation (20)

Os Grossupdated
Os GrossupdatedOs Grossupdated
Os Grossupdated
 
HPC_MPI_CICD.pptx
HPC_MPI_CICD.pptxHPC_MPI_CICD.pptx
HPC_MPI_CICD.pptx
 
Readme
ReadmeReadme
Readme
 
LIGGGHTS installation-guide
LIGGGHTS installation-guideLIGGGHTS installation-guide
LIGGGHTS installation-guide
 
Shifter singularity - june 7, 2018 - bw symposium
Shifter  singularity - june 7, 2018 - bw symposiumShifter  singularity - june 7, 2018 - bw symposium
Shifter singularity - june 7, 2018 - bw symposium
 
Building an Ionic hybrid mobile app with TypeScript
Building an Ionic hybrid mobile app with TypeScript Building an Ionic hybrid mobile app with TypeScript
Building an Ionic hybrid mobile app with TypeScript
 
An Overview of the IHK/McKernel Multi-kernel Operating System
An Overview of the IHK/McKernel Multi-kernel Operating SystemAn Overview of the IHK/McKernel Multi-kernel Operating System
An Overview of the IHK/McKernel Multi-kernel Operating System
 
Investigation report on 64 bit support and some of new features in aosp master
Investigation report on 64 bit support and some of new features in aosp masterInvestigation report on 64 bit support and some of new features in aosp master
Investigation report on 64 bit support and some of new features in aosp master
 
Setting up the hyperledger composer in ubuntu
Setting up the hyperledger composer in ubuntuSetting up the hyperledger composer in ubuntu
Setting up the hyperledger composer in ubuntu
 
Cloud native buildpacks_collabnix
Cloud native buildpacks_collabnixCloud native buildpacks_collabnix
Cloud native buildpacks_collabnix
 
Deployer in Pipelines
Deployer in PipelinesDeployer in Pipelines
Deployer in Pipelines
 
Dev opsec dockerimage_patch_n_lifecyclemanagement_
Dev opsec dockerimage_patch_n_lifecyclemanagement_Dev opsec dockerimage_patch_n_lifecyclemanagement_
Dev opsec dockerimage_patch_n_lifecyclemanagement_
 
Gitops Hands On
Gitops Hands OnGitops Hands On
Gitops Hands On
 
Developing MIPS Exploits to Hack Routers
Developing MIPS Exploits to Hack RoutersDeveloping MIPS Exploits to Hack Routers
Developing MIPS Exploits to Hack Routers
 
BBL Premiers pas avec Docker
BBL Premiers pas avec DockerBBL Premiers pas avec Docker
BBL Premiers pas avec Docker
 
OpenShift Commons - Adopting Podman, Skopeo and Buildah for Building and Mana...
OpenShift Commons - Adopting Podman, Skopeo and Buildah for Building and Mana...OpenShift Commons - Adopting Podman, Skopeo and Buildah for Building and Mana...
OpenShift Commons - Adopting Podman, Skopeo and Buildah for Building and Mana...
 
High Quality Symfony Bundles tutorial - Dutch PHP Conference 2014
High Quality Symfony Bundles tutorial - Dutch PHP Conference 2014High Quality Symfony Bundles tutorial - Dutch PHP Conference 2014
High Quality Symfony Bundles tutorial - Dutch PHP Conference 2014
 
Architecting the Future: Abstractions and Metadata - BSidesKC
Architecting the Future: Abstractions and Metadata - BSidesKCArchitecting the Future: Abstractions and Metadata - BSidesKC
Architecting the Future: Abstractions and Metadata - BSidesKC
 
Developing for Industrial IoT with Linux OS on DragonBoard™ 410c: Session 2
Developing for Industrial IoT with Linux OS on DragonBoard™ 410c: Session 2Developing for Industrial IoT with Linux OS on DragonBoard™ 410c: Session 2
Developing for Industrial IoT with Linux OS on DragonBoard™ 410c: Session 2
 
Docker and IBM Integration Bus
Docker and IBM Integration BusDocker and IBM Integration Bus
Docker and IBM Integration Bus
 

LabDocumentation

  • 1. Setup Description     The lab setup contains a Biclops Pan­ Tilt unit with a Microsoft Kinect Mounted on Top of it.                                    Check the documentation of Biclops unit, in particular the following files present in the                            Biclops/Documentation ​directory : ​Biclops Installation Manual.pdf , ​Biclops User                  Manual.pdf​, ​PMD Configuration File Description.doc​, ​PMD Programming Ref V1.7.pdf and                    PMD controller user's guide v1.4.pdf and the following files present in the                        Biclops​/​Software/Stable/libBiclops/samples directory : ​BiclopsBareBones.cpp​,        BiclopsHomingCalibration.cpp and ​Biclops_Demo.cpp ​to understand its functionality and                programming. The application stack contains two ROS packages one for the Biclops unit and                            a vision package for Kinect unit. The whole setup works as the Kinect uses a state of the art                                      CMT algorithm for tracking an object using RGB images and the pan tilt unit is used to move                                    the kinect so that the tracking is continuous.     Installation Instructions     You can download the packages as such from the github or follow the instructions    Biclops Package     Using the libraries given for Biclops unit control we can create our own Biclops ROS package                                for our application specific needs. Creating a separate package makes the code more                          modular, which is one of the basic principles behind ROS.    
1) Create a ROS package with the name "Biclops".

2) Create a Biclops_node.cpp file in the src directory and put in just the following stub:

int main() {
    return 0;
}

3) Copy Biclops.cpp into the src directory of the Biclops package.

4) Copy Biclops.h into the include/Biclops directory.

5) Copy the directories libPMD and libUtils into the Biclops package.

6) Make the following changes in CMakeLists.txt:

a) Add the following lines after find_package(catkin REQUIRED COMPONENTS):

add_subdirectory(libUtils)
add_subdirectory(libPMD)

b) Edit the lines under # include_directories(include) to:

include_directories(include
  ${catkin_INCLUDE_DIRS}
  ${libUtils_SOURCE_DIR}/include
  ${libPMD_SOURCE_DIR}/include
)

c) Edit the lines under ## Declare a cpp library to:

add_library(biclops
  src/Biclops_node.cpp
  src/Biclops.cpp
)

d) Edit the lines under ## Declare a cpp executable to:

add_executable(Biclops_node src/Biclops_node.cpp src/Biclops.cpp)

e) Edit the lines under ## Specify libraries to link a library or executable target against to (note that the target name must match the executable name, Biclops_node):

target_link_libraries(Biclops_node
  PMD Utils ${catkin_LIBRARIES}
)

The above instructions will get you started with accessing the Biclops using ROS packages. To test whether everything is configured correctly, run catkin_make in your catkin workspace and check that the build completes without errors.

The Biclops unit needs a configuration file for its initialization. Copy the Biclops/Software/Stable/libBiclops/samples/BiclopsDefault.cfg file into the Biclops package. Also go through the other configuration files with the .cfg extension to better understand the parameters required for a specific application of the Biclops unit. To check that the Biclops unit can be used, copy the code from the biclops_node_simplehoming.cpp file into Biclops_node.cpp and run catkin_make in your catkin workspace. Then run the node using:

rosrun Biclops Biclops_node

You should now see the Biclops unit being homed. If you do not see any motion and you see communication errors on the command line, make sure you gave the correct port value in the .cfg file, e.g.:

port /dev/ttyUSB0

You can find your port value using the following command:

ls /dev/ttyUSB*
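If you prefer to check for serial devices from code rather than from the shell, the `ls /dev/ttyUSB*` check can be mirrored with POSIX glob. This is a minimal sketch, independent of the Biclops API:

```cpp
#include <glob.h>
#include <string>
#include <vector>

// Return all paths matching a shell-style pattern, e.g. "/dev/ttyUSB*".
// An empty result means no matching serial device is present.
std::vector<std::string> list_ports(const std::string& pattern) {
    std::vector<std::string> ports;
    glob_t g{};
    if (glob(pattern.c_str(), 0, nullptr, &g) == 0) {
        for (size_t i = 0; i < g.gl_pathc; ++i)
            ports.push_back(g.gl_pathv[i]);
    }
    globfree(&g);
    return ports;
}
```

The port line in the .cfg file should then name one of the returned paths, typically /dev/ttyUSB0.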
If everything is configured correctly and the Biclops unit still has communication errors, check whether the serial port connections are intact or broken.

ROS Services

Check the ROS tutorials on services and messages to understand the basic concepts:

http://wiki.ros.org/ROS/Tutorials/CreatingMsgAndSrv

http://wiki.ros.org/ROS/Tutorials/UnderstandingServicesParams

A ROS service can be invoked from the command line using:

rosservice call /servicename {args}

This can be demonstrated with the Biclops unit. We can create a service for homing the Biclops unit from the command line whenever we want. Use the code from biclops_node_homingservice.cpp.

This uses the Empty.h header file for the service message type, and the service callback function just runs the homing sequence of the Biclops. This service can be called using:

rosservice call /homing_sequence '{}'

Note the empty arguments, since we used the Empty.h header file without any arguments.

URDF - Biclops Model

URDF (Unified Robot Description Format) is an XML format for describing a robot with its various joints and links. Go through the following tutorials step by step to better understand the concept of URDF and its implementation:

http://wiki.ros.org/urdf/Tutorials/

To understand this, we can visualize the model of the Biclops in RViz.

To use the robot model you need to set one ROS parameter, "robot_description". One way to do it is to use the following lines in a launch file:
<param name="robot_description" command="cat $(find Biclops)/urdf/model.urdf" />

where Biclops is the package name and the model.urdf file is inside the urdf directory of the ROS package. You can also set this parameter from the command line.

For visualizing the model with joint movements, use the following command from the Biclops package directory:

roslaunch urdf_tutorial display.launch model:=urdf/model.urdf gui:=True

Check the URDF tutorials in the ROS documentation to learn more about using URDF files with real robots.

Biclops Teleop

Check the following basic ROS tutorials to understand the turtlesim package and tele-operation of the turtle with the keyboard arrow keys:

http://wiki.ros.org/ROS/Tutorials/UnderstandingNodes

http://wiki.ros.org/ROS/Tutorials/UnderstandingTopics

We can use the teleop node to move the Biclops unit: the Left and Right arrow keys control the PAN motion, and the Up and Down arrow keys control the TILT motion. To implement this, check the code in the biclops_node_teleop.cpp file.

Kinect Setup in Linux

To ensure the Kinect works properly in Linux, several drivers need to be installed. Please follow the instructions below for installing the drivers.

Software

To control the LED and the tilt motor, we will use the freenect library (open source, unofficial):

■ http://openkinect.org/
■ Git: https://github.com/OpenKinect/libfreenect

To control the video stream and get the depth map, we'll use the OpenNI drivers (open source, official):

■ http://openni.org/
■ Git: https://github.com/OpenNI/OpenNI

We also need the PrimeSense (the company that makes the Kinect) sensor module (open source):

Warning: the official PrimeSense driver is not compatible with the Kinect; we need to use a modified version.

■ http://www.primesense.com/
■ Git: https://github.com/avin2/SensorKinect (this is the modified version)

Environment setup

1- Download the needed libraries/software:

mkdir KinectLibs; cd KinectLibs
git clone https://github.com/OpenKinect/libfreenect
git clone https://github.com/OpenNI/OpenNI
git clone https://github.com/avin2/SensorKinect
sudo apt-get install cmake libglut3-dev pkg-config build-essential libxmu-dev libxi-dev libusb-1.0-0-dev python

● If you get "Unable to locate package libglut3-dev", use this command instead:

sudo apt-get install cmake freeglut3-dev pkg-config build-essential libxmu-dev libxi-dev libusb-1.0-0-dev python

sudo add-apt-repository "deb http://archive.canonical.com/ lucid partner"
sudo apt-get update
sudo apt-get install sun-java6-jdk
sudo apt-get install doxygen mono-complete graphviz

2- Install openKinect (libfreenect)

# in the libfreenect directory, in the KinectLibs dir
mkdir build
cd build
cmake ..
make
sudo make install
sudo ldconfig /usr/local/lib64/

● Once libfreenect is installed, plug in the Kinect, then set R/W permission on the USB devices (motor and camera):

sudo chmod a+rw /dev/bus/usb//
sudo chmod a+rw /dev/bus/usb//
lsusb | grep Xbox

Without these permissions you will see errors like:

libusb couldn't open USB device /dev/bus/usb/001/006: Permission denied.
libusb requires write access to USB device nodes.

● Now, let's see if everything is correctly set up: just run glview, and you should get a window showing the depth map and the camera view.

Tip: you can play a bit with the features with these keys:

'w' - tilt up, 's' - level, 'x' - tilt down, '0'-'6' - select LED mode, 'f' - video format

On the left there is an OpenGL representation of the depth map, where the pixel color is set according to the point's distance to the sensor; on the right you get the regular RGB camera view, or the infrared one (so you can see the infrared pattern; switch between them with 'f').

Let's now have a look at how to set up the gesture recognition libraries.

3- Install OpenNI

We just installed a perfectly fine working library that seems to handle all functions of the Kinect, so why would we need another one?
It's because of the high-level library, NITE, which works only with the OpenNI drivers; but the OpenNI drivers (which are not Kinect specific) can't control the Kinect's motorized tilt or its LED. So we need both libraries to have full access to the Kinect.

So basically:

● we will use libfreenect to control the tilt and the LED (so the device Xbox NUI Motor, which also handles the LED).
● we will use OpenNI + the Sensor module to get the camera streams (the device Xbox NUI Camera).
● we will use the NITE libraries together with OpenNI to get the high-level API (gesture recognition, hand/skeleton tracking and so on).

Note: the Xbox NUI Audio device is handled by OpenNI, not libfreenect.

# in the OpenNI directory, in the KinectLibs dir
cd Platform/Linux/CreateRedist
chmod +x ./RedistMaker
./RedistMaker
cd ../Redist/OpenNI-Bin-Dev-Linux-x64-v1.5.2.23/
sudo ./install.sh

Note: it's Sensor-Bin-Linux-x64-v5.1.0.25 for me, but it might be different for you; there is only one directory in Redist/ anyway, so just replace the name in case it is wrong.

4- Install NITE

● Download the library according to your system, then just run install.sh as root. That's it.

You're now all set for using the Kinect!

Discover the Kinect potential with the examples

Go into your NITE directory, then:

cd Samples/Bin/x64-Release; ls Sample*

These are the available examples; they cover pretty much all the high-level recognition handled by NITE.
You can find detailed documentation of these functions in the NITE/Documentation/ directory; the above is just a "quick start" guide for each example.

NOTE: In case of installation errors, please refer to the following page and its discussion section:

http://www.kdab.com/setting-up-kinect-for-programming-in-linux-part-1/

Vision Package

1) Create a ROS package with the name "vision", with opencv2 and cv_bridge as package dependencies.

2) Create a vision_node.cpp file in the src directory and put in just the following stub:

int main() {
    return 0;
}

3) Copy CMT.cpp into the src directory of the vision package.

4) Copy CMT.h into the include/vision directory.

5) Make the following changes in CMakeLists.txt:

f) Add the following lines after find_package(catkin REQUIRED COMPONENTS):

find_package(OpenCV REQUIRED)
find_package(Qt4 REQUIRED COMPONENTS
  QtCore
  QtGui
)

g) Edit the lines under # include_directories(include) to:

include_directories(include
  ${catkin_INCLUDE_DIRS}
)

h) Edit the lines under ## Declare a cpp executable to:

add_executable(vision_node src/vision_node.cpp src/CMT.cpp)
i) Edit the lines under ## Specify libraries to link a library or executable target against to:

target_link_libraries(vision_node
  ${catkin_LIBRARIES} ${OpenCV_LIBS}
)

The above instructions will get you started with making a vision ROS package that uses the CMT algorithm. To test whether everything is configured correctly, run catkin_make in your catkin workspace and check that the build completes without errors.

To understand the CMT algorithm, please read the following publication:

http://www.gnebehay.com/publications/wacv_2014/wacv_2014.pdf

To see the CMT algorithm at work, copy the code from vision_node_simple.cpp into vision_node.cpp in the src directory. Run catkin_make in the catkin workspace; if there are no errors, the package will build successfully.

It is assumed that the Kinect is connected and has been verified to work by running the examples mentioned in the Kinect installation part above. Now run the following command to see the CMT algorithm:

rosrun vision vision_node

You will be prompted to say whether you want to select the region of interest manually. Enter "y" if you want to do it manually. By default, the bounding box is at the center of the image frame at 640 x 480 resolution. An image window then appears; press Enter when the object you want to track is in the field of view. Now use the mouse to draw a rectangular box around the object of interest and press Enter. The CMT algorithm starts tracking and gives you the center, scale and orientation values of the object being tracked.
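The default centered bounding box mentioned above is plain arithmetic; here is a sketch (the 100 x 100 default box size is an assumption for illustration, not a value taken from the package):

```cpp
// Simple rectangle: top-left corner plus size, as used for an ROI.
struct Rect { int x, y, w, h; };

// Center a w x h bounding box inside a frameW x frameH image.
// Used conceptually when the user declines manual ROI selection.
Rect centered_box(int frameW, int frameH, int w, int h) {
    return Rect{ (frameW - w) / 2, (frameH - h) / 2, w, h };
}
```

For the 640 x 480 Kinect frame and a 100 x 100 box, the top-left corner lands at (270, 190).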
ROS Messages

Check the ROS tutorials on services and messages to understand the basic concepts. In our case we use messages to send out data (center, scale and orientation) over ROS topics. The two message types we will look at are in the msg directory of the vision package, with the names:

center.msg - message type for sending the center value alone
vision.msg - message type for sending the center, scale and orientation values together.
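As a rough sketch, the two message definitions might look like the following. Apart from CenterX (which appears in the calibration code below in biclops_node.cpp), all field names are assumptions; check the actual files in the msg directory:

```
# center.msg (hypothetical layout)
float32 CenterX
float32 CenterY

# vision.msg (hypothetical layout)
float32 CenterX
float32 CenterY
float32 Scale
float32 Orientation
```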
Follow the ROS tutorials to configure the CMakeLists.txt and package.xml files to implement the custom messages.

Now copy the code from the vision_node_withmsg.cpp file into vision_node.cpp in the src directory to see the center, scale and orientation values being published as ROS topics.

Calibration Details of the Biclops-Kinect Assembly

The field of view of the Kinect is:

43° vertical
57° horizontal

Considering that the Kinect is operated at 640 x 480 resolution, the PAN joint calibration is done as follows.

Assuming the tracking keeps the tracked point at the center of the Kinect's field of view, the following variables are declared:

pan_max_step = 320;  (half of the 640-pixel horizontal resolution)
pan_max_angle = 28.5;  (half of the 57° horizontal field of view)

The tracking algorithm returns the X pixel value on a 640-pixel resolution scale; the variable used in the code is center_point.CenterX in the biclops_node.cpp file.

The following code converts the returned pixel value into the corresponding PAN joint value:

pan_pos_value = int(center_point.CenterX) - 320;  (offset from the image center)

pan_pos_value = (pan_pos_value / pan_max_step) * pan_max_angle;

A similar calibration is done for the TILT joint. For operating the Kinect at other resolutions, the corresponding variables must be changed to ensure correct tracking.

For any further information or suggestions on the documentation, kindly write to me at
yeshasvitvs@gmail.com
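The PAN calibration above can be collected into a small helper. Note that the division must be done in floating point: pan_pos_value / pan_max_step would truncate to zero in pure integer arithmetic. The TILT numbers (240 pixels, 21.5°) are derived the same way from the 480-pixel, 43° vertical field of view. This is a sketch under those assumptions, not the exact code from biclops_node.cpp:

```cpp
const double kPanMaxStep   = 320.0;  // half of the 640-pixel horizontal resolution
const double kPanMaxAngle  = 28.5;   // half of the 57-degree horizontal FOV
const double kTiltMaxStep  = 240.0;  // half of the 480-pixel vertical resolution
const double kTiltMaxAngle = 21.5;   // half of the 43-degree vertical FOV

// Map a tracked pixel coordinate to a joint angle in degrees,
// with 0 degrees at the image center.
double pixel_to_angle(double pixel, double maxStep, double maxAngle) {
    return (pixel - maxStep) / maxStep * maxAngle;
}

double pan_angle(double centerX)  { return pixel_to_angle(centerX, kPanMaxStep,  kPanMaxAngle); }
double tilt_angle(double centerY) { return pixel_to_angle(centerY, kTiltMaxStep, kTiltMaxAngle); }
```

For example, a tracked X value of 480 (a quarter frame right of center) maps to a PAN offset of (480 - 320) / 320 * 28.5 = 14.25°.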