This document discusses using skeleton tracking with the Kinect. It explains that skeleton tracking works by using machine learning on a large dataset of tagged images to extract body pose data from a live video stream. It then discusses several middleware options for implementing skeleton tracking applications with cameras like the Kinect, including the Kinect for Windows SDK, SoftKinetic IISU, and OpenNI. It also briefly introduces using Open Sound Control for networking between applications using skeleton tracking data.
First day of slides for the @GAFFTA workshop: http://www.gaffta.org/2012/07/24/hacking-the-kinect-with-openframeworks/
Part 1 of the live stream: http://www.youtube.com/watch?v=WXfy8Cuje-0&feature=plcp
Part 2 of the live stream: http://www.youtube.com/watch?v=I80FsOlMPj8&feature=plcp
2. Skeleton tracking with the Kinect
So far this class has used the Kinect in several ways:
OpenCV + contour finder for blobs
Point cloud to visualize the raw 3D data
Thursday, March 14, 13
3. Skeleton tracking with the Kinect
But one of the more exciting possibilities is user or skeleton
tracking.
4. How does it work?
Images are tagged and colored to represent portions of a person's body.
A very large data set (millions of tagged images) is fed into a large computing cluster.
The cluster then learns from this data (hence "machine learning"), and the resulting algorithm can be applied to a live video stream to extract body pose data.
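The published approach behind the Kinect's body-part classifier (Microsoft Research's decision-forest work) splits on simple depth-difference features computed per pixel. Below is a minimal sketch of one such feature, assuming a toy depth image; the struct, offsets, and values are illustrative, not the actual implementation:

```cpp
#include <cassert>
#include <cstdint>
#include <vector>

// Toy depth image: row-major, values in millimetres.
struct DepthImage {
    int width, height;
    std::vector<uint16_t> mm;
    uint16_t at(int x, int y) const {
        if (x < 0 || y < 0 || x >= width || y >= height)
            return 10000;  // out of bounds reads as "far background"
        return mm[y * width + x];
    }
};

// One depth-difference feature: compare the depth at two offsets
// (u and v) from the pixel, scaling the offsets by 1/depth so the
// feature is roughly invariant to how far the user stands from the
// camera. Training picks the offsets and thresholds that best
// separate body parts.
float depthFeature(const DepthImage& img, int px, int py,
                   float ux, float uy, float vx, float vy) {
    float d = static_cast<float>(img.at(px, py));
    if (d <= 0.0f) d = 1.0f;  // guard against invalid depth readings
    int x1 = px + static_cast<int>(ux / d), y1 = py + static_cast<int>(uy / d);
    int x2 = px + static_cast<int>(vx / d), y2 = py + static_cast<int>(vy / d);
    return static_cast<float>(img.at(x1, y1)) -
           static_cast<float>(img.at(x2, y2));
}
```

Millions of features like this, evaluated at every pixel, are what the computing cluster sifts through during training.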
5. How does it work?
This is why the user is always first removed or separated from the scene: the machine learning can then ignore unimportant background pixels.
6. How do I use skeleton tracking?
Skeleton tracking is often implemented by "middleware": a layer of software between the OS and your application that interfaces with your camera device (e.g. the Kinect).
Luckily there are lots of middleware options for implementing an application using skeleton tracking, depending on your camera.
7. Kinect for Windows SDK
One of the more stable, publicly available middlewares.
Works only with the Kinect, but supports both Kinect models.
Includes skeleton tracking, seated skeleton tracking, facial mesh tracking, and directional microphone access, but is Windows-only.
8. SoftKinetic IISU ("interface is you")
A very robust and versatile platform. Works with cameras from four different manufacturers, and offers an extensive toolbox plus low-level APIs for C#, C++, Unity3D, and Flash. Also the most expensive middleware.
Includes skeleton tracking, cursors, volumetric sensing, and close-range interaction (including fingers and grasping).
9. Omek
One of the less developed middlewares. Works with PMD and Panasonic devices. The "Grasp" close-range SDK, currently in beta, looks very promising.
Omek's "long range" SDK is geared more towards gaming and tracking one or two skeletons.
10. OpenNI
OpenNI is maintained by PrimeSense, who developed the technology behind the first Kinect. OpenNI is a much larger-scale project for modular gesture recognition systems; the parts we will focus on are OpenNI + NITE skeleton tracking. OpenNI is open source and is supported by all PrimeSense devices, the Kinect for Xbox, and even the Panasonic D-Imager.
Includes access to cursors (high-detail controllers) and full skeleton tracking.
11. Quick git setup (OS X)
In Xcode, make sure you have the Command Line Tools installed.
Then head over to Homebrew (http://mxcl.github.com/homebrew/) and use the terminal to install it:
ruby -e "$(curl -fsSL https://raw.github.com/mxcl/homebrew/go)"
Then run `brew install git` and you're done!
https://help.github.com/articles/set-up-git can help with the rest.
12. Open Sound Control Networking
Open Sound Control (OSC) is a standard developed to send data from one application or device to another. It's the new MIDI, but awesomer and wireless.
openFrameworks natively supports OSC with its core addon ofxOsc.
One of the benefits of OSC over raw TCP or UDP is that OSC is broadcast- and listener-based: your applications don't need to first establish a handshake. If both are on the same network and listening on the same port, they can communicate with each other.
OSC also natively wraps data types like strings and floats, so you don't have to parse raw bytes of data.
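ofxOsc does all of this wrapping for you, but it helps to see what an OSC message actually looks like on the wire. A minimal, self-contained sketch of the layout (address pattern, type tag string, then big-endian arguments, each section NUL-padded to a multiple of 4 bytes); the address name and values are made up for the example:

```cpp
#include <cassert>
#include <cstdint>
#include <cstring>
#include <string>
#include <vector>

// OSC strings are NUL-terminated and padded to a multiple of 4 bytes
// (always at least one NUL).
static void appendPadded(std::vector<uint8_t>& buf, const std::string& s) {
    buf.insert(buf.end(), s.begin(), s.end());
    size_t pad = 4 - (s.size() % 4);  // 1..4 NUL bytes
    buf.insert(buf.end(), pad, 0);
}

// OSC numeric arguments are big-endian; extract the float's bits
// portably and emit them most-significant byte first.
static void appendFloat(std::vector<uint8_t>& buf, float f) {
    uint32_t bits;
    std::memcpy(&bits, &f, sizeof bits);
    buf.push_back((bits >> 24) & 0xFF);
    buf.push_back((bits >> 16) & 0xFF);
    buf.push_back((bits >> 8) & 0xFF);
    buf.push_back(bits & 0xFF);
}

// Encode one OSC message: address pattern, type tags (",fff" means
// three float32 arguments), then the arguments themselves.
std::vector<uint8_t> encodeJointMessage(const std::string& address,
                                        float x, float y, float z) {
    std::vector<uint8_t> buf;
    appendPadded(buf, address);
    appendPadded(buf, ",fff");
    appendFloat(buf, x);
    appendFloat(buf, y);
    appendFloat(buf, z);
    return buf;
}
```

This is exactly the kind of byte-level bookkeeping ofxOsc hides behind `addFloatArg()` and friends.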
13. Sending Skeleton Data over OSC
Splits the computing load between apps. Makes it easy to send basic data sets between applications.
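Under the hood, the transport is just UDP datagrams aimed at an agreed port: the sender fires, the listener binds and receives, and no handshake ever happens. A self-contained sketch of that pattern using plain POSIX sockets on the loopback address (the port and payload are arbitrary, and no OSC encoding is shown; ofxOsc layers the message format on top of exactly this):

```cpp
#include <arpa/inet.h>
#include <cassert>
#include <cstring>
#include <netinet/in.h>
#include <string>
#include <sys/socket.h>
#include <unistd.h>

// Send one datagram to localhost:port and read it back on a second
// socket bound to that port -- the same no-handshake pattern an OSC
// sender and listener use (both sides just agree on a port).
std::string udpRoundTrip(const std::string& payload, uint16_t port) {
    // Listener: bind to the agreed port.
    int rx = socket(AF_INET, SOCK_DGRAM, 0);
    sockaddr_in addr{};
    addr.sin_family = AF_INET;
    addr.sin_addr.s_addr = htonl(INADDR_LOOPBACK);
    addr.sin_port = htons(port);
    bind(rx, reinterpret_cast<sockaddr*>(&addr), sizeof addr);

    // Sender: just fire a datagram at the port, no connection setup.
    int tx = socket(AF_INET, SOCK_DGRAM, 0);
    sendto(tx, payload.data(), payload.size(), 0,
           reinterpret_cast<sockaddr*>(&addr), sizeof addr);

    // The datagram is waiting in the listener's buffer.
    char buf[512];
    ssize_t n = recv(rx, buf, sizeof buf, 0);
    close(tx);
    close(rx);
    return std::string(buf, n > 0 ? static_cast<size_t>(n) : 0);
}
```

In practice the skeleton-tracking app would be one process sending joint positions per frame, and your visuals app another process listening: same machine or not, the pattern is identical.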