Intel has introduced Intel RealSense technology, which combines hardware and software to enable more natural human-computer interaction beyond mouse, keyboard, and touch. The technology uses capabilities like hand and finger tracking, facial analysis, background segmentation, and speech recognition. Intel has released the Intel RealSense SDK for Windows to allow developers to create applications that utilize these interaction methods.
Natural User Interface: Microsoft Kinect and Surface Computing, by Yuvaraj Ilangovan
The document discusses natural user interfaces (NUI) and provides an overview and agenda for NUI application development using Microsoft Kinect and Surface technologies. It describes the capabilities and components of Kinect and Surface, including gesture recognition, speech recognition, face tracking, touch interaction and multi-user support. Examples are given of business opportunities for these technologies in areas like retail, healthcare, education and more. A case study is presented on developing a touch-free Kinect banking application.
SoftKinetic is an independent company founded in 2007 that provides end-to-end natural gesture solutions. It offers depth cameras, middleware for whole body and hand tracking, and a software development kit to help developers create gesture-based applications. SoftKinetic's solutions can be used in areas like digital home media, exercise/training, automotive, and in-flight entertainment.
Designing Apps for Intel RealSense Technology, by Kevin Arthur
An overview of user experience guidelines for developing apps for the RealSense depth cameras (short-range F200 and long-range R200).
Joint presentation with Meghana Rao from IDF San Francisco 2015. Includes contributions from Chandrika Jayant, Lisa Mauney, and illustrations by Rachel Kennison.
Presentation on Microsoft Technologies in Teaching, Learning and Research, delivered at the Microsoft IT Academy Summit in October 2011. The presentation video is in low quality to allow upload.
The document discusses developing touch-enabled applications for Windows 8 Ultrabook systems. It covers using touch gestures like tap, pan, zoom and rotate in applications. It provides guidelines for touch interface design regarding touch target sizes, standard gestures, and portrait vs landscape modes. The document also discusses how to access touch and sensor data via the Windows Touch API and sensor manager. It provides code examples for handling touch gestures in a Windows application.
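The zoom and rotate gestures mentioned above reduce to simple two-point geometry: scale is the ratio of the distances between the two touch points across frames, and rotation is the change in the angle of the line joining them. A minimal Python sketch of that math, independent of the Windows Touch API (the function name is illustrative, not from any SDK):

```python
import math

def gesture_delta(p1_old, p2_old, p1_new, p2_new):
    """Compute zoom scale and rotation angle (degrees) from two touch
    points tracked across two frames, as a gesture recognizer might."""
    def dist(a, b):
        return math.hypot(b[0] - a[0], b[1] - a[1])
    def angle(a, b):
        return math.degrees(math.atan2(b[1] - a[1], b[0] - a[0]))
    scale = dist(p1_new, p2_new) / dist(p1_old, p2_old)
    rotation = angle(p1_new, p2_new) - angle(p1_old, p2_old)
    return scale, rotation

# Fingers move apart, doubling the distance: scale 2.0, no rotation
print(gesture_delta((0, 0), (10, 0), (0, 0), (20, 0)))  # (2.0, 0.0)
```

A real handler would feed this from WM_GESTURE or pointer events, but the underlying arithmetic is the same.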
The document is about the Intel Perceptual Computing SDK. It provides the following key details:
- The SDK contains algorithms and modules for hand and finger tracking, face recognition, voice recognition, and 3D object tracking. It includes both low and high-level APIs.
- The SDK can be used to create natural user interfaces using gestures and voice as well as for manipulating 3D content and games.
- An optional Creative camera is available for purchase to enable some of the SDK's capabilities like hand tracking.
- The SDK includes samples, documentation, libraries and headers to allow developers to integrate its capabilities into their applications.
This document discusses Ultrabook sensors and provides examples of apps that utilize those sensors. It begins with an overview of Ultrabook specifications, including thin and elegant designs, rapid startup, responsiveness, battery life, and recommended features like touchscreens and sensors. It then lists the sensors available on Ultrabooks, such as compass, accelerometer, gyroscope, GPS, ambient light, and NFC. Finally, it provides examples of Windows 8 apps for the Ultrabook Innovation Contest that make use of various sensors, like using touch and sensors for astronomy apps, data sharing between devices using NFC, and surveillance apps using the webcam and motion sensors.
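As a small illustration of what the motion sensors listed above enable, the accelerometer's gravity vector alone is enough to classify screen orientation. This is a hedged sketch: the axis sign conventions are assumed, and a real Ultrabook app would read the values through the Windows sensor APIs rather than receive them as arguments:

```python
def orientation_from_accel(ax, ay):
    """Classify device orientation from the accelerometer's x/y gravity
    components. Assumed axis convention: +x points to the right of the
    screen, +y toward the top; gravity is ~9.8 m/s^2 along one axis."""
    if abs(ax) > abs(ay):
        # Gravity mostly along the x axis: device is held sideways
        return "landscape-left" if ax > 0 else "landscape-right"
    # Gravity mostly along the y axis: device is upright or upside down
    return "portrait" if ay < 0 else "portrait-flipped"

print(orientation_from_accel(0.1, -9.8))  # portrait
```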
The document provides an agenda for an Intel Ultrabook AppLab event being held on September 3rd in Berlin. The day-long event includes technical presentations on Ultrabook development, Windows UI design, touch, sensors and tools. There will also be demonstrations of Ultrabook features and networking sessions for developers. The goal is to provide Ultrabook developers with resources and information to help develop applications.
This document provides information about Intel's Ultrabook developer resources. It begins with a legal disclaimer stating that no licenses are granted and Intel assumes no liability. It then discusses the Intel AppUp Center for distributing apps, the Intel Ultrabook Community for developers, and provides instructions for publishing apps in the Intel AppUp store. The overall summary is that the document outlines Intel's resources for Ultrabook developers to distribute, engage with other developers, and publish apps.
The document discusses Intel software development tools for developing applications to take advantage of advanced processor performance. It promotes tools that can boost performance, scale applications to future platforms, and ensure application quality and confidence. It highlights the Intel Parallel Studio XE as a complete software tools solution covering all phases of development.
The document summarizes an Intel presentation on RealSense technology. It includes an agenda with overview, demos from two companies (Chronosapien and Code Monkeys), and discussions on developing with RealSense. Chronosapien demonstrates hand tracking. Code Monkeys discusses using RealSense for gesture controls in existing and new games. The presentation promotes RealSense and Intel's software innovation program.
This document provides an introduction to developing applications for Ultrabook computers running Windows 8. It discusses key differences between Ultrabooks and notebooks, how Windows 8 supports both new Windows 8 apps and legacy desktop apps, and compatibility expectations. It also outlines several Ultrabook and Windows 8 features that can enhance applications, such as the touch-enabled user experience, sensors, and the new Windows 8 user interface.
The document discusses application development for MeeGo and the Intel AppUp store. It provides an overview of the MeeGo architecture and community, describes how to join the Intel AppUp developer program, create apps using the AppUp SDK, submit apps for validation and beta testing, and package apps for distribution. It also highlights opportunities for developers including worldwide application labs and funding.
This document discusses Intel's AppUp developer program. It provides an overview of Intel's global presence, the growth of the app economy, and Intel's vision for the AppUp program. The AppUp program currently has over 70,000 developers from 202 countries who have created over 5,000 apps, resulting in over 810,000 app downloads. The document outlines some of the key developer and consumer features of the AppUp program.
This document summarizes Mahamood Hussain's presentation on delivering compelling usages for imaging with Intel architecture-based platforms. The presentation discusses how imaging hardware and software alternately lead and follow each other in an evolutionary "yo-yo effect". It then outlines a process called the "3 W's" for developing breakthrough usages - defining the target user persona, understanding pain points, and rapidly prototyping solutions. Several example use cases that leverage dual camera capabilities or depth sensing are provided.
The document discusses the Microsoft Kinect technology. It provides an introduction and history, describing how Kinect was launched in 2010 and allows users to control an Xbox or Windows system without a controller. It covers the design and working of Kinect, including its ability to track 20 joints and skeletons at a distance of 0.5-6 meters. Differences between the Kinect for Xbox and Kinect for Windows are outlined. Applications and the future scope of Kinect are discussed.
This document provides an introduction to holographic development using the HoloLens. It discusses the different devices that can be used, how to set up the HoloLens SDK and build a first app. It covers input methods like gaze, tapping, and voice commands. The rest of the document demonstrates HoloLens development tools like the HoloToolkit and how to implement features like spatial mapping, gestures, and shaders. It concludes with a discussion of future mixed reality devices.
The document discusses the Intel AppUp SDK Suite 1.2 for MeeGo. It provides tools to help developers create, test, tune, and publish mobile applications. Specifically, it includes the Qt development environment for building apps, Qt Creator as an IDE, simulators and debuggers for testing apps, performance profiling tools like VTune for optimizing apps, and the Intel C++ Compiler for improving performance. The suite aims to help developers address priorities like porting existing apps, creating new apps quickly, and publishing apps faster.
This document proposes a virtual keyboard using image processing with a standard webcam. It describes how the virtual keyboard would work by taking a photo of the reference surface as the keyboard, segmenting it using thresholding, and detecting key presses on that surface in real time video by comparing frames to the reference image. The virtual keyboard would have no physical keys and allow custom layouts. It could enable full keyboards on small devices without more space or hardware.
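The detection step described here (compare live frames against a reference image of the empty surface and look for occluded key regions) can be sketched in a few lines. This is an illustrative NumPy sketch under stated assumptions, not the paper's implementation; the region layout, function name, and thresholds are all invented for the example:

```python
import numpy as np

def detect_press(reference, frame, key_regions, diff_thresh=40, area_frac=0.3):
    """Detect which virtual key (if any) is pressed by comparing the
    current grayscale frame against the reference image of the empty
    keyboard surface. key_regions maps key names to (y0, y1, x0, x1)
    pixel bounds. Thresholds here are illustrative assumptions."""
    # Pixels that changed significantly since the reference was captured
    changed = np.abs(frame.astype(int) - reference.astype(int)) > diff_thresh
    for key, (y0, y1, x0, x1) in key_regions.items():
        region = changed[y0:y1, x0:x1]
        # A key counts as pressed when enough of its area is occluded
        if region.mean() > area_frac:
            return key
    return None
```

A real system would add per-frame smoothing and lighting compensation, but the core idea is exactly this thresholded frame differencing.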
The document discusses Intel's AppUp application store coming to the MeeGo operating system. It provides an overview of the MeeGo architecture and ecosystem, describes Intel's AppUp developer program and SDK for creating apps for MeeGo, and encourages developers to join the program.
Kinect is a motion sensing input device created by Microsoft for the Xbox 360 video game console and Windows PCs. It enables users to control and interact with games and applications without the need for a game controller through full-body gestures and spoken commands. Kinect uses cameras and microphones to track body and voice movement. It was designed to broaden the Xbox 360's audience beyond typical gamers and has since been adapted for a variety of uses beyond gaming, including healthcare applications. Kinect sensors require no controllers and respond to full-body movement, voice commands, and facial recognition.
The document is a presentation about using RealSense technology to enhance gaming experiences through personalization and socialization. It discusses using RealSense for character, asset, and level personalization through 3D scanning. It also discusses using RealSense for social gaming features like segmented streaming of players as "floating heads" and depth-based background segmentation for green screen-style streaming without a large setup. The presentation encourages developers to explore these new types of gaming experiences with RealSense and work with Intel to integrate RealSense capabilities into games.
Intel Movidius Neural Compute Stick presentation at QCon San Francisco, by Darren Crews
The document discusses moving artificial intelligence capabilities from the cloud to edge devices using the Intel Movidius Neural Compute Stick. It describes barriers to moving AI to the edge like accuracy, available compute, and model efficiency. The stick contains a vision processing unit that can run popular deep learning frameworks and models efficiently at the edge. The document outlines the SDK workflow for converting, loading, and running models on the stick for applications like object detection and classification. It provides examples of computer vision tasks that can now run on edge devices.
Smartphone Behavior on a Featurephone Budget, by Gail Frederick
JavaOne 2009 BoF Presentation
Mobile application features typical in smartphones can also be implemented on mass-market featurephones using Java ME and Web 2.0 back-end services. In this presentation, we explore the multimodality and rich user interface of a search-driven portal application written in Java ME and broadly ported to mass-market featurephones. Multimodality enables the user to search, browse and discover using familiar activities on a mobile device - saying a phrase, entering text and snapping a photo.
Snippets of Java ME code used to implement voice recording, image capture, location awareness and advanced mapping are presented and analyzed.
Hololens: Primo Contatto (First Contact) - Marco Dal Pino - Codemotion Milan 2016, by Codemotion
The long-awaited device has finally arrived: we can immerse ourselves in augmented reality and interact with objects and characters in a natural way, integrated with the surrounding environment. Let's look at how we can develop applications for HoloLens, and which skills and technologies we need to get started.
This document provides an overview of mobile application development using Android. It discusses Android's architecture including the Linux kernel layer, libraries layer, Android runtime layer, application framework layer, and applications layer. It describes key Android components like activities, services, broadcast receivers, content providers, and intents. It also covers the Android development process, tools, requirements and versions.
A presentation from the JGJ48 sharing session, delivered by Firstman Marpaung from Intel. Mr. Firstman explained what Intel RealSense is and what benefits it can provide.
Augmented Reality with the Intel® RealSense™ SDK and R200 Camera: User Exper..., by Kevin Arthur
This document summarizes a presentation about augmented reality experiences using the Intel RealSense R200 depth camera and SDK. It provides an overview of the R200 camera and tablet use cases, demonstrates example AR applications for gaming, education and training, and visualization. The document also outlines several user experience guidelines for designing compelling and usable AR apps with the R200, such as giving users reasons to move the tablet camera, supporting both active and inactive camera modes, planning for the capture space, and understanding the camera's limitations.
Welcome to the world of Microsoft Kinect for Windows–enabled applications. This document is your roadmap to building exciting human-computer interaction solutions you once thought were impossible.
The document provides an overview of Intel's Perceptual Computing SDK capabilities including:
1) The SDK allows for perceptual computing including speech recognition, facial analysis, depth tracking, and augmented reality. It provides APIs for C++, C#, Unity and other frameworks.
2) The SDK enables capabilities like close-range finger tracking, facial tracking and analysis, voice recognition and synthesis, and augmented reality. It can retrieve data like images, geonodes, and gestures.
3) The document provides information on initializing and using the SDK within Unity including selecting capabilities, acquiring frames, retrieving image and gesture data, and cleaning up. It directs developers to additional resources for learning more.
This document provides guidelines for designing applications that utilize the Microsoft Kinect sensor and SDK. It acknowledges that the information is provided "as-is" and does not guarantee any legal rights. The document covers an introduction to the Kinect sensor and SDK capabilities, including skeleton tracking, audio input, and interaction ranges. It emphasizes considering sensor placement and environment during application design.
"AbstractThis work presents the design and implementation of an.docx", by bartholomeocoombs
Abstract
This work presents the design and implementation of an embedded augmented reality game called MarkerMatch. Augmented reality is a technology that directly contributes to the game interaction experience by enhancing the user's sense of immersion. Current research in embedded augmented reality enables the creation of dedicated hardware capable of executing augmented reality applications. This favors the insertion of augmented reality capabilities into small electronic devices, such as cell phones, handhelds and head-mounted displays, and even the development of new ones. The ARCam framework was used for game development, since it provides project designers with all the basic infrastructure needed by the game. User tests show that the tested subjects enjoyed the game experience, and it proves a point: it is possible to create an augmented reality game completely in hardware, with no software involved.

1. Introduction
Augmented Reality (AR) makes use of computer vision algorithms in order to superimpose virtual information (2D or 3D, textual or pictorial) onto real-world scenes in real time, enhancing the user's perception of and interaction with the environment [4]. Nowadays, augmented reality is applied in different fields, such as entertainment [23], medicine [5], manufacturing and repair [4], and training [19]. The technical challenges lie in determining, in real time, what should be shown, where, and how.
Traditionally, augmented reality systems place virtual objects in the real world using fiducial markers. Such artificial markers support camera position and orientation tracking by the system, and are intrusive to the environment. Figure 1 illustrates the use of such fiducial markers to place a virtual statue on a real table.

Figure 1. Marker-based augmented reality example
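The marker-based approach described above boils down to estimating a planar homography from the marker's detected corners and using it to project virtual content into the camera image. A minimal NumPy sketch of that math using the direct linear transform; the function names are our own for illustration, not from the ARCam framework or any AR library:

```python
import numpy as np

def homography(src, dst):
    """Estimate the 3x3 homography mapping src points to dst points
    (four correspondences, e.g. a marker's known corners and where they
    were detected in the image) via the direct linear transform."""
    A = []
    for (x, y), (u, v) in zip(src, dst):
        A.append([-x, -y, -1, 0, 0, 0, u * x, u * y, u])
        A.append([0, 0, 0, -x, -y, -1, v * x, v * y, v])
    # The homography is the null vector of A, found via SVD
    _, _, Vt = np.linalg.svd(np.array(A, dtype=float))
    H = Vt[-1].reshape(3, 3)
    return H / H[2, 2]

def project(H, pt):
    """Apply the homography to a 2D point (e.g. a virtual object vertex)."""
    x, y, w = H @ np.array([pt[0], pt[1], 1.0])
    return x / w, y / w
```

Production systems typically use a library routine such as OpenCV's findHomography with more than four points and outlier rejection, but the underlying estimation is this one.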
The concept of augmented reality is directly related to augmenting users' perception, specifically their vision; users therefore need to wear HMDs or similar devices in order to obtain the information enhancement mentioned previously. More importantly, many augmented reality applications are made to support users in their daily and common activities, so there has been an expanding tendency to seamlessly integrate everyday equipment into common platforms that support mobility. Continuous advances in device miniaturization, allied with the emergence of various wireless communication technologies, universal plug-and-play devices and powerful portable processing units, have opened the door for research on wearable platforms.
The evolution of augmented reality desktop platforms into something closer to the user is natural. The terms mobile and wearable must be considered part of this evolution, and for it to happen, devices must become smaller and more specialized. Embedded augmented reality [22] refers to the research area that aims to enable this evolution, investigating how augmented reality appli…
Metro Style Apps - What's there for DevelopersJitendra Soni
The document discusses the new Metro style apps platform in Windows 8. It introduces Metro apps, which are designed for touchscreens and provide an immersive full-screen experience. Metro apps can be developed with HTML5, CSS3, and JavaScript or C++, C#, and VB with XAML. The Visual Studio 11 tools allow developers to easily create Metro apps that can be distributed through the Windows Store and run across multiple device form factors. A demo of Metro apps in the Windows 8 developer preview is provided.
The document discusses Kinect and 3D motion sensing technology. It introduces the Kinect sensor device, the PrimeSense technology behind it, and the OpenNI and NITE libraries for developing applications using depth sensor data. It provides details on the Kinect sensor components and how it measures depth, and describes the various software options for Kinect development including OpenNI, OpenKinect, and Microsoft's Kinect SDK. It also summarizes the PrimeSense technology, OpenNI architecture and nodes, and NITE middleware for gesture and skeleton tracking.
We provide high-quality digital signage.
Israk Technology Sdn. Bhd., an MSC-status company, was established to help organizations take advantage of new channels via internet and video to facilitate marketing and corporate communications, as well as from the perspective of the media and entertainment markets.
Over the years, the use of video has grown and broadened significantly, not only in the media and entertainment market but also within enterprise environments such as corporate, government and public agencies, education, hospitality and political parties, among many others.
Philipp Nagele (Wikitude): Context Is for Kings: Putting Context in the Hands...AugmentedWorldExpo
A talk from the Develop Track at AWE USA 2018 - the World's #1 XR Conference & Expo in Santa Clara, California May 30- June 1, 2018.
Philipp Nagele (Wikitude): Context Is for Kings: Putting Context in the Hands of AR Developers
In this session, Philipp Nagele will explore why AR centers all around context and why contextual understanding is fundamental to any AR experience. He will show how Wikitude is trying to solve this problem for AR developers and provide technical details about the new release of the Wikitude SDK.
http://AugmentedWorldExpo.com
Gesture Gaming on the World Wide Web Using an Ordinary Web CameraIJERD Editor
- Gesture gaming is a method by which users with a laptop, PC, or Xbox play games using natural or bodily gestures. This paper presents a way of playing free flash games on the internet using an ordinary webcam with the help of open-source technologies. In human activity recognition, emphasis is placed on pose estimation and the consistency of the player's pose. These are estimated with an ordinary web camera at resolutions ranging from VGA to 20 megapixels. Our work involved showing the user a 10-second documentary on how to play a particular game using gestures and what kinds of gestures can be performed in front of the system. The initial RGB values for the gesture component are obtained by instructing the user to place the component in a red box for about 10 seconds, after the short documentary and before the game loads. The system then opens the chosen game on popular flash game sites such as Miniclip, Games Arcade, or GameStop, loads it by clicking at the appropriate places, and brings it to a state where the user only needs to perform gestures to start playing. At any point the user can call off the game by hitting the Esc key, and the program will release all of the controls and return to the desktop. It was noted that results obtained using an ordinary webcam matched those of the Kinect, and users could relive the gaming experience of free flash games on the net. Effective in-game advertising could therefore also be achieved, offering disruptive growth opportunities for advertising firms.
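The red-box calibration step described above amounts to sampling the RGB values of the player's gesture component and then tracking pixels near that color in each subsequent frame. A minimal sketch of that color-thresholded tracking, in pure Python over a nested-list frame (the tolerance value is an illustrative choice, not from the paper):

```python
# Sketch of color-based gesture tracking: after calibration captures
# the target RGB color, find the centroid of matching pixels in each
# frame. Frames are nested lists of (r, g, b) tuples standing in for
# real image buffers; the tolerance is an illustrative tuning value.

def track_component(frame, target_rgb, tolerance=40):
    """Return (row, col) centroid of pixels within tolerance of target_rgb."""
    tr, tg, tb = target_rgb
    matches = []
    for y, row in enumerate(frame):
        for x, (r, g, b) in enumerate(row):
            if (abs(r - tr) <= tolerance and
                    abs(g - tg) <= tolerance and
                    abs(b - tb) <= tolerance):
                matches.append((y, x))
    if not matches:
        return None                        # component not visible this frame
    return (sum(y for y, _ in matches) / len(matches),
            sum(x for _, x in matches) / len(matches))

red = (220, 30, 30)                        # color sampled during calibration
frame = [
    [(0, 0, 0), (210, 40, 20)],
    [(0, 0, 0), (230, 20, 40)],
]
print(track_component(frame, red))         # (0.5, 1.0): component in column 1
```

Frame-to-frame changes in this centroid are what the gesture recognizer would interpret as player movement.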
The document provides guidelines for designing applications for the Kinect for Windows sensor and SDK. It introduces the Kinect sensor and SDK, explaining that the sensor provides depth, color and audio data to the SDK which then provides capabilities like skeleton tracking. It outlines sensor capabilities like tracking 2 full skeletons, depth ranges, field of view, audio input directions. It emphasizes considering sensor placement and environment during design to ensure reliable interactions.
Bring Intelligence to the Edge with Intel® Movidius™ Neural Compute StickDESMOND YUEN
Motivation to move intelligence to the edge
Edge compute use cases
Barriers to moving intelligence to the edge
Deep learning algorithms – can they run on an edge device?
Movidius Neural Compute Stick (architecture, usage, etc.)
This document discusses developing location-aware applications for mobile devices indoors. It introduces location awareness technologies like GPS, RFID, Wi-Fi and Bluetooth. It covers development platforms like Symbian OS, J2ME and .NET and discusses user interface design considerations. The document demonstrates sample applications and concludes by discussing future developments like full databases, outdoor GPS, data analysis and mobile social networking for location-aware mobile applications.
Developing Multi-OS Native Mobile Applications with Intel INDEIntel® Software
Intel® INDE is a suite of tools that allows developers to write code once and deploy applications to multiple operating systems and architectures. It supports developing native Android and iOS applications using Java on Windows or Mac hosts. The suite provides tools for project setup, UI design, building, debugging on simulators or devices, and cloud builds. It aims to improve productivity by reducing redundant work across OS teams.
This document discusses mobile augmented reality technologies. It begins by defining augmented reality and how mobile AR overlays digital information onto the real world viewed through a camera. It then discusses the hardware capabilities of modern smartphones that enable AR applications like cameras, sensors, and high-resolution displays. It also reviews several open-source and proprietary AR software development kits (SDKs) and tools that facilitate creating AR applications. Examples are given of many existing AR applications across different domains.
Windows 8 & Aardvark University Gaming TourLee Stott
This document discusses opportunities for game developers on Windows 8 and strategies for creating engaging games. It outlines how Windows 8 features like the Windows Store, live tiles, cloud services, and sensors can be leveraged to drive player engagement and new business models for games. Specific ideas mentioned include using live tiles to provide updates and draw players back in, cloud services for cross-device syncing, and sensors for new control schemes. The document emphasizes designing games that take advantage of the Windows 8 experience across multiple PCs and provide a consistent experience for players.
The document discusses the Android Charting Component Library (ACCL) project sponsored by Netscout Systems. The project aims to develop a component library for charting in the native Android SDK to enhance charting performance and reduce application development load. It will include common chart types like pie, line, area, column, and bar charts. The library will benefit both application developers and Netscout in exploring the Android market. It is being developed in Java using the Android SDK and Eclipse IDE.
Similar to Intel Real Sense, Diversity Meetup by Jamie Tanna (20)
EPID is a digital signature scheme that allows for anonymity and privacy. It uses a single public key that corresponds to multiple private keys, so the private key used to generate a signature cannot be identified. EPID signatures can be verified using the group public key. EPID provides granular revocation mechanisms and has been shipping since 2008 in Intel processors. It is used for technologies like Intel Insider, Intel TXT, Intel IPT and Intel SGX.
This document is a presentation from Intel about their Internet of Things (IoT) developer platform and tools. It introduces Intel's IoT reference architecture and developer kit, which includes hardware boards, middleware libraries, cloud connectors, and integrated development environments. It provides examples of code samples and reference applications that developers can use to prototype and develop IoT solutions using Intel technologies.
Cisco Paris DevNet Hackathon slideshow - IntroBeMyApp
This document outlines an agenda for a 48-hour hackathon in Paris to invent the city of the future. It will include presentations from Cisco and various technology partners on topics like smart cities, Internet of Things, and cloud platforms. Participants will then pitch project ideas before forming teams and beginning development. Over the weekend, teams will work on their projects, receive mentoring, and do practice pitches. On the final day, projects will be presented to a jury for cash prizes. The goal is to generate innovative ideas for connecting people, data, and devices to improve city services.
Neuroendocrine Tumors: An OverviewBeMyApp
Dr. Olivier Dubreuil gives an overview of neuroendocrine tumors in a presentation given during the Ideation evening of the NET Patient Accelerator hackathon, held on April 30, 2016 at the Zalthabar.
Building your first game in Unity 3d by Sarah SextonBeMyApp
Learn to create a 3D isometric survival shooter called Nightmares using Unity 5 in this hands-on workshop. Although this workshop is beginner-friendly, you need to understand mathematical 3D concepts.
Using intel's real sense to create games with natural user interfaces justi...BeMyApp
As technology advances, more sophisticated ways of interfacing with it are emerging. Even though new tech strives to make our apps more intuitive and easy to use, designing interfaces for those apps is not quite as straightforward. We've learned a few rules and “gotchas” when working with gesture cameras that can help make apps that use them easy and fun to use.
In this talk Justin described:
1. Different data types you can get from Intel® RealSense™ and how to get them
2. Designing an interface for a gesture camera
3. Using your hands, face, and voice as an interface
Introduction to using the R200 camera & Realsense SDK in Unity3d - Jon CollinsBeMyApp
We used the Intel RealSense SDK in conjunction with the R200 camera to bring about the fun, interactive and tactile gameplay featured at GDC16. Using Intel® RealSense™ we were able to create a unique gameplay experience with every play-through: the players themselves sculpted the terrain out of sand, the camera monitored the sand, and our code translated its profile into an in-game terrain. In this talk we'll use one of the RealSense samples to step through how that depth data is translated into a usable game object in Unity3d.
In this talk, Jon described:
1. Features we used in the Magic and Magnums Tower Defense game
2. Using the Blob Sensing and tracking actions provided in the Unity Toolkit for Realsense
3. Using the Realsense Depth feed to manipulate meshes
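Turning a depth feed into terrain, as described in this talk, essentially inverts depth (closer sand = higher ground) and rescales it into heightmap values a game engine can apply to a mesh. A minimal sketch of that normalization step (the near/far clip values are illustrative; the talk's actual Unity pipeline is not reproduced here):

```python
# Sketch of converting a depth image into a terrain heightmap:
# closer pixels (smaller depth) become higher terrain. The sample
# depth values and the near/far clip range (in millimeters) are
# illustrative, not taken from the R200 pipeline itself.

def depth_to_heightmap(depth, near=200.0, far=1200.0, max_height=10.0):
    """Map each depth sample to a height in [0, max_height], nearest = highest."""
    heights = []
    for row in depth:
        out_row = []
        for d in row:
            d = min(max(d, near), far)        # clamp into the valid range
            t = (far - d) / (far - near)      # 1.0 at near, 0.0 at far
            out_row.append(max_height * t)
        heights.append(out_row)
    return heights

depth_mm = [[200.0, 700.0], [1200.0, 950.0]]
print(depth_to_heightmap(depth_mm))           # [[10.0, 5.0], [0.0, 2.5]]
```

In a real pipeline this grid would then be smoothed and fed to the engine's terrain or mesh-deformation API each frame.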
Unity 5 introduces a new Audio Mixer system to take your game’s music and sound effects to the next level! Andy will give a hands-on, in-editor demo of all the new Audio Features introduced in the latest version of the Unity game engine; covering in-game sound mixing, groups, applying effects, audio ducking, fading and more!
This talk is welcome to all but at least some Unity knowledge will be useful.
Shaders - Claudia Doppioslash - Unity With the BestBeMyApp
Shader programming is one of the things that most influences how good your game will look, yet it's perceived as a black art, hidden away and feared.
In this talk, Claudia described:
1. How shader programming works
2. How Unity lets you take almost full control of the shader subsystem
3. What you can achieve with that control
4. How to implement a custom Physically Based Lighting system and the logic behind every choice
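At its core, the lighting computation a shader runs per pixel is short: a Lambert diffuse term, for example, is just the clamped dot product of the surface normal and the direction to the light. As a plain-Python illustration of that math (a generic example, not code from the talk or from Unity's shader subsystem):

```python
# Illustration of the math inside a simple diffuse (Lambert) shader:
# brightness = max(0, normal . light_direction). Vectors are plain
# tuples here; a real shader runs this per-pixel on the GPU.
import math

def normalize(v):
    """Scale a vector to unit length."""
    length = math.sqrt(sum(c * c for c in v))
    return tuple(c / length for c in v)

def lambert(normal, light_dir):
    """Clamped dot product of unit normal and unit direction-to-light."""
    n, l = normalize(normal), normalize(light_dir)
    return max(0.0, sum(a * b for a, b in zip(n, l)))

print(lambert((0, 1, 0), (0, 1, 0)))   # light from directly above: 1.0
print(lambert((0, 1, 0), (0, -1, 0)))  # light from below: clamped to 0.0
```

Physically based lighting systems like the one the talk covers replace this single term with energy-conserving diffuse and specular terms, but the per-pixel structure is the same.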
[CISCO PARIS HACKATHON] Smart City workshop slideshowBeMyApp
Slideshow from the Smart City workshop held on Wednesday, March 23 at NUMA. The workshop was organized as part of the Smart City hackathon taking place April 1-3.
First-Year Special pre-event offered by BeMyApp on March 8, 2016 at École 42. All the advice, tips, and links on software and applications you need to succeed at a hackathon.
[E-résidents workshop] Presentation of Intent, Craft ai, Dalkia and the incubatorBeMyApp
Presentation slideshow shown during the ideation workshop / pitch evening at Usine io on Monday, February 1. This event was organized as part of the E-résidents program.
http://hackathon.dalkia.fr/
[E-résidents Webinar #1] Presentation of the different connected-building professions...BeMyApp
As part of the E-résidents hackathon, we are organizing two 30-minute webinars, on Tuesday January 19 and Tuesday January 26 at 6pm.
For this first immersion into the connected building, two super mentors, Jean-Yves Lépine and Patrick Quach, respectively Director of Client Relations at Dalkia and Product Manager at Intent Technologies, will explain in detail the different professions involved (property manager, operator, service providers, occupants, etc.). They will cover the challenges facing each stakeholder as well as ideas for applications to develop and/or existing ones. A Q&A session will follow.
6:00pm - Introduction by Alex from BeMyApp
6:05pm - Presentation of connected-building professions
6:25pm - Ideas and inspiration for the hackathon
6:30pm - Q&A
As a reminder, the E-résidents hackathon will take place February 5-7 at Usine io. The Ideation workshop, which complements this webinar, is scheduled for Tuesday, February 2, also at Usine io.
[IoT World Forum Webinar] Review of CMX Cisco technologyBeMyApp
Cisco's CMX provides location detection, visibility, and engagement capabilities through Wi-Fi, BLE, and video technologies. It uses the MSE for location calculation and WLC/APs to collect client RSSI data. The CMX Mobile App Server hosts applications and Notification Receivers subscribe to location events. CMX's REST API allows extracting real-time location data for various use cases like improving customer experience in retail, banking, hospitality, and healthcare industries.
This document provides an overview of user experience (UX) design. It discusses what UX is, how to deliver good UX by understanding users' needs and goals, envisioning key use cases, creating feature lists, prototyping solutions, testing prototypes with users, and following usability heuristics like ensuring visibility of system status and matching systems to the real world. The document aims to provide both high-level and practical tips for designing more usable solutions that solve problems and meet user needs.
HP Helion Webinar #5 - Security Beyond FirewallsBeMyApp
Giuseppe Paternò is an IT security expert who has worked with many large companies. He discussed security issues with OpenStack and cloud applications. Neutron provides software-defined networking and security groups for network segmentation. Keystone handles identity management. APIs must be secured to protect resources and authentication tokens. Continuous security practices like automated testing and monitoring help harden the cloud platform.
HP Helion Webinar #4 - Open stack the magic pillBeMyApp
We will go through a quick overview about the 5 years of OpenStack cloud computing platform. This webinar explains the short history of this fast growing open-source initiative, and try to answer the common questions about the place of infrastructure and platform services in the IT hierarchy.
The technology is ready, but are we ready for the cloud adoption? Does it really solve our business problems? Learn the basic terminology, get an insight about the IT operation and development transition steps required to win the efficiency race.
Introducing Milvus Lite: Easy-to-Install, Easy-to-Use vector database for you...Zilliz
Join us to introduce Milvus Lite, a vector database that can run on notebooks and laptops, share the same API with Milvus, and integrate with every popular GenAI framework. This webinar is perfect for developers seeking easy-to-use, well-integrated vector databases for their GenAI apps.
GraphSummit Singapore | The Art of the Possible with Graph - Q2 2024Neo4j
Neha Bajwa, Vice President of Product Marketing, Neo4j
Join us as we explore breakthrough innovations enabled by interconnected data and AI. Discover firsthand how organizations use relationships in data to uncover contextual insights and solve our most pressing challenges – from optimizing supply chains, detecting fraud, and improving customer experiences to accelerating drug discoveries.
Pushing the limits of ePRTC: 100ns holdover for 100 daysAdtran
At WSTS 2024, Alon Stern explored the topic of parametric holdover and explained how recent research findings can be implemented in real-world PNT networks to achieve 100 nanoseconds of accuracy for up to 100 days.
In his public lecture, Christian Timmerer provides insights into the fascinating history of video streaming, starting from its humble beginnings before YouTube to the groundbreaking technologies that now dominate platforms like Netflix and ORF ON. Timmerer also presents provocative contributions of his own that have significantly influenced the industry. He concludes by looking at future challenges and invites the audience to join in a discussion.
For the full video of this presentation, please visit: https://www.edge-ai-vision.com/2024/06/building-and-scaling-ai-applications-with-the-nx-ai-manager-a-presentation-from-network-optix/
Robin van Emden, Senior Director of Data Science at Network Optix, presents the “Building and Scaling AI Applications with the Nx AI Manager,” tutorial at the May 2024 Embedded Vision Summit.
In this presentation, van Emden covers the basics of scaling edge AI solutions using the Nx tool kit. He emphasizes the process of developing AI models and deploying them globally. He also showcases the conversion of AI models and the creation of effective edge AI pipelines, with a focus on pre-processing, model conversion, selecting the appropriate inference engine for the target hardware and post-processing.
van Emden shows how Nx can simplify the developer's life and facilitate a rapid transition from concept to production-ready applications. He provides valuable insights into developing scalable and efficient edge AI solutions, with a strong focus on practical implementation.
Full-RAG: A modern architecture for hyper-personalizationZilliz
Mike Del Balso, CEO & Co-Founder at Tecton, presents "Full RAG," a novel approach to AI recommendation systems, aiming to push beyond the limitations of traditional models through a deep integration of contextual insights and real-time data, leveraging the Retrieval-Augmented Generation architecture. This talk will outline Full RAG's potential to significantly enhance personalization, address engineering challenges such as data management and model training, and introduce data enrichment with reranking as a key solution. Attendees will gain crucial insights into the importance of hyperpersonalization in AI, the capabilities of Full RAG for advanced personalization, and strategies for managing complex data integrations for deploying cutting-edge AI solutions.
TrustArc Webinar - 2024 Global Privacy SurveyTrustArc
How does your privacy program stack up against your peers? What challenges are privacy teams tackling and prioritizing in 2024?
In the fifth annual Global Privacy Benchmarks Survey, we asked over 1,800 global privacy professionals and business executives to share their perspectives on the current state of privacy inside and outside of their organizations. This year’s report focused on emerging areas of importance for privacy and compliance professionals, including considerations and implications of Artificial Intelligence (AI) technologies, building brand trust, and different approaches for achieving higher privacy competence scores.
See how organizational priorities and strategic approaches to data security and privacy are evolving around the globe.
This webinar will review:
- The top 10 privacy insights from the fifth annual Global Privacy Benchmarks Survey
- The top challenges for privacy leaders, practitioners, and organizations in 2024
- Key themes to consider in developing and maintaining your privacy program
Goodbye Windows 11: Make Way for Nitrux Linux 3.5.0!SOFTTECHHUB
As the digital landscape continually evolves, operating systems play a critical role in shaping user experiences and productivity. The launch of Nitrux Linux 3.5.0 marks a significant milestone, offering a robust alternative to traditional systems such as Windows 11. This article delves into the essence of Nitrux Linux 3.5.0, exploring its unique features, advantages, and how it stands as a compelling choice for both casual users and tech enthusiasts.
Securing your Kubernetes cluster_ a step-by-step guide to success !KatiaHIMEUR1
Today, after several years of existence, an extremely active community and an ultra-dynamic ecosystem, Kubernetes has established itself as the de facto standard in container orchestration. Thanks to a wide range of managed services, it has never been so easy to set up a ready-to-use Kubernetes cluster.
However, this ease of use means that the subject of security in Kubernetes is often left for later, or even neglected. This exposes companies to significant risks.
In this talk, I'll show you step-by-step how to secure your Kubernetes cluster for greater peace of mind and reliability.
Unlock the Future of Search with MongoDB Atlas_ Vector Search Unleashed.pdfMalak Abu Hammad
Discover how MongoDB Atlas and vector search technology can revolutionize your application's search capabilities. This comprehensive presentation covers:
* What is Vector Search?
* Importance and benefits of vector search
* Practical use cases across various industries
* Step-by-step implementation guide
* Live demos with code snippets
* Enhancing LLM capabilities with vector search
* Best practices and optimization strategies
Perfect for developers, AI enthusiasts, and tech leaders. Learn how to leverage MongoDB Atlas to deliver highly relevant, context-aware search results, transforming your data retrieval process. Stay ahead in tech innovation and maximize the potential of your applications.
#MongoDB #VectorSearch #AI #SemanticSearch #TechInnovation #DataScience #LLM #MachineLearning #SearchTechnology
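The ranking at the heart of any vector search engine, including the one described above, is similarity scoring between embedding vectors. A generic cosine-similarity sketch (this is illustrative pure Python, not the MongoDB Atlas API; real embeddings have hundreds of dimensions):

```python
# Generic illustration of vector-search ranking: score candidate
# documents by cosine similarity to a query embedding and return
# the best matches. The toy 3-dimensional vectors are illustrative.
import math

def cosine(a, b):
    """Cosine similarity between two vectors of equal dimension."""
    dot = sum(x * y for x, y in zip(a, b))
    na = math.sqrt(sum(x * x for x in a))
    nb = math.sqrt(sum(y * y for y in b))
    return dot / (na * nb)

def search(query, docs, top_k=2):
    """Return the top_k (doc_id, score) pairs ranked by similarity."""
    scored = [(doc_id, cosine(query, vec)) for doc_id, vec in docs.items()]
    return sorted(scored, key=lambda p: p[1], reverse=True)[:top_k]

docs = {"a": (1.0, 0.0, 0.0), "b": (0.9, 0.1, 0.0), "c": (0.0, 1.0, 0.0)}
print(search((1.0, 0.0, 0.0), docs))   # "a" ranks first, then "b"
```

Production systems replace this exhaustive scan with an approximate nearest-neighbor index, but the scoring function is the same idea.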
Building RAG with self-deployed Milvus vector database and Snowpark Container...Zilliz
This talk will give hands-on advice on building RAG applications with an open-source Milvus database deployed as a docker container. We will also introduce the integration of Milvus with Snowpark Container Services.
GraphSummit Singapore | The Future of Agility: Supercharging Digital Transfor...Neo4j
Leonard Jayamohan, Partner & Generative AI Lead, Deloitte
This keynote will reveal how Deloitte leverages Neo4j’s graph power for groundbreaking digital twin solutions, achieving a staggering 100x performance boost. Discover the essential role knowledge graphs play in successful generative AI implementations. Plus, get an exclusive look at an innovative Neo4j + Generative AI solution Deloitte is developing in-house.
20 Comprehensive Checklist of Designing and Developing a WebsitePixlogix Infotech
Dive into the world of Website Designing and Developing with Pixlogix! Looking to create a stunning online presence? Look no further! Our comprehensive checklist covers everything you need to know to craft a website that stands out. From user-friendly design to seamless functionality, we've got you covered. Don't miss out on this invaluable resource! Check out our checklist now at Pixlogix and start your journey towards a captivating online presence today.
Intel Real Sense, Diversity Meetup by Jamie Tanna
1. Jamie Tanna and Phillip Zissimou, Application Engineers
Intel Software and Services Group, Developer Relations Division
*Other names and brands may be claimed as the property of others.
3. Introducing Intel® RealSense™ technology
A combination of hardware and software to enable human-computer interaction on a PC. Uses natural, intuitive user interfaces beyond the mouse, keyboard and touch, including:
• Hand and finger tracking
• Facial analysis
• Background segmentation
• Speech recognition
• Augmented reality
• 3D scanning and reconstruction (coming soon**)
**Roadmap Notice: All products, computer systems, dates and figures specified are preliminary based on current expectations, and are subject to change without notice.
4. Intel® RealSense™ SDK for Windows* (Version 2014, Gold) is now available
What's New in this SDK
• Improved quality and accuracy of core capabilities: hand/finger tracking, facial analysis, speech recognition and synthesis, background segmentation, and augmented reality
• Added localization for multiple languages
CALL TO ACTION for Developers
• Be first in line to bring a more immersive experience to your app and take full advantage of Intel® RealSense™ technologies as systems with Intel® RealSense™ 3D cameras become widespread in the market.
• Reserve your Intel® RealSense™ Developer Kit now (includes peripheral camera, SDK, and other updates): intel.com/realsense/sdk
Intel® RealSense™ technology will change the computing experience for users, creating a more natural and intuitive interface…
5. Intel® RealSense™ Developer Kit 3D camera
Get started developing quickly. Peripheral depth camera specifications:
– Full VGA depth resolution
– 1080p RGB camera
– 0.2 – 1.2 meter range1**
– USB 3.0 interface
– Requires Intel® Core™ processor
Reserve yours now.
What OS and hardware work with Intel® RealSense™ technologies? The Intel® RealSense™ SDK for Windows*, Version 2014, supports Windows 8.1 (64-bit); Windows 7 support is coming soon. Download the SDK: intel.com/realsense/sdk
1Specific algorithms may have different range and accuracy.
**Range and accuracy may vary based on application.
6. Intel® RealSense™ SDK for Windows*
Create applications for Intel® RealSense™ 3D cameras
• Integrated inside Ultrabook, 2in1 and AIO systems
• RGB up to 1080p@30FPS
• Depth up to VGA@60FPS, HVGA
• Intel® Core™ processor support
• SDK for Windows available now
• SDK add-on for Android* coming soon (for different cameras)**
7. Intel® RealSense™ 3D Camera for Windows*
Designed for close-range interactions (20 cm – 120 cm)
Currently available technologies focus on a living-room experience or a subset of Intel RealSense technology features, whereas Intel® RealSense™ technology focuses on close-range interactions.
The Intel® RealSense™ SDK for Windows* is your solution for developing close-range interactions.
8. The Intel® RealSense™ 3D Camera: Shipping on Multiple Form Factors
Ultrabooks™ and Notebooks, 2in1s/Convertibles, All-in-Ones
Integrated in systems shipping by end of 2014 and in 2015 from major OEMs
9. Intel® RealSense™ SDK for Windows*
Understands 4 basic types of input, standalone or in various permutations: Hands, Face, Speech, Environment
10. Intel® RealSense™ SDK for Windows*
Categories of input, capabilities, and features:
• Hands (Hand and Finger Tracking; Gesture Recognition): 22-point hand and finger tracking; static poses and dynamic gestures
• Face (Face Detection and Tracking): multiple face detection and tracking; 78-point landmark detection (facial features); face recognition and facial expressions; emotion detection; pulse estimator (coming soon**)
• Speech (Speech Recognition and Synthesis): command and control; dictation; text-to-speech
• Environment (Segmentation; 3D Scanning; Augmented Reality): background segmentation; 3D object/face/room scanning (coming soon**); 2D/3D object tracking; scene perception (coming soon**)
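The four input categories map naturally onto an event-dispatch pattern in application code. Below is a minimal, hypothetical sketch of routing recognized inputs to handlers; the event names, payload fields, and handler registry are illustrative, not identifiers from the RealSense SDK:

```python
# Hypothetical input-event dispatcher for the four input categories
# (hands, face, speech, environment). Event names and payload shapes
# are invented for illustration, not actual SDK identifiers.

class InputDispatcher:
    def __init__(self):
        self._handlers = {}  # (category, event) -> list of callbacks

    def on(self, category, event, callback):
        """Register a callback for e.g. ("hands", "thumbs_up")."""
        self._handlers.setdefault((category, event), []).append(callback)

    def dispatch(self, category, event, payload):
        """Invoke every handler registered for this (category, event)."""
        return [cb(payload)
                for cb in self._handlers.get((category, event), [])]

dispatcher = InputDispatcher()
dispatcher.on("hands", "thumbs_up", lambda p: f"liked by hand {p['hand_id']}")
dispatcher.on("speech", "command", lambda p: f"running: {p['text']}")

print(dispatcher.dispatch("hands", "thumbs_up", {"hand_id": 0}))
print(dispatcher.dispatch("speech", "command", {"text": "open menu"}))
```

Keeping categories separate like this makes it easy to combine modes ("standalone or various permutations") without the handlers knowing about each other.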
11. Intel® RealSense™ Technology
Enabling exciting, diverse categories of applications: Capture and Share, Immersive Collaboration/Creation, Interact Naturally, Gaming and Play, Learning and Edutainment
12. Immersive Collaboration/Creation
• Using the 3D camera, software can “segment” an environment, allowing an application to remove the background
• With the background removed, developers can create apps ideal for a combination of video chat and collaboration
• Make apps that allow presenters to be overlaid on top of applications, allow two people to share an experience on the web, or let a single person use this “digital green screen” to create compelling content
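Conceptually, background removal produces a per-pixel mask that an app can use to composite the foreground over any new backdrop. A minimal sketch of that compositing step, in pure Python, with a boolean mask standing in for the segmentation the camera pipeline would provide:

```python
# Sketch of "digital green screen" compositing: given a foreground
# frame and a per-pixel segmentation mask (True = person), overlay
# the person onto a replacement background. Frames are nested lists
# of RGB tuples standing in for real image buffers.

def composite(foreground, mask, background):
    """Take masked pixels from foreground, the rest from background."""
    out = []
    for fg_row, m_row, bg_row in zip(foreground, mask, background):
        out.append([fg if m else bg
                    for fg, m, bg in zip(fg_row, m_row, bg_row)])
    return out

person = [[(200, 180, 160), (10, 10, 10)]]   # 1x2 frame: skin, dark pixel
mask   = [[True, False]]                     # only the first pixel is the person
beach  = [[(80, 160, 255), (80, 160, 255)]]  # replacement backdrop

print(composite(person, mask, beach))
# first pixel kept from the person, second taken from the backdrop
```

A real implementation would do this on the GPU with a soft (alpha) mask rather than a hard boolean one, but the data flow is the same.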
13. Interact Naturally
• Use our extensive library of gestures, facial analysis and speech recognition algorithms to build interfaces that are truly natural
• Leverage our carefully developed user-experience guidelines document to ensure the interfaces you build will delight your customers
14. Gaming and Play
• Differentiate your game with gesture, speech, augmented reality enhanced with depth, facial analysis, and more
• Use the user's pulse to determine the stability of a scope. Allow a user to physically move their head to peek around a corner. Use gesture for a lock-picking challenge. The possibilities are tremendous.
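The pulse-to-scope-stability idea above can be modeled as a simple mapping from estimated heart rate to aim-sway amplitude. The sketch below is purely illustrative game logic; the resting/maximum rates, scaling, and function names are invented tuning choices, not part of the SDK:

```python
# Illustrative game mechanic: map an estimated pulse (beats per
# minute) to scope-sway amplitude, so a calmer player aims more
# steadily. The rate bounds and scaling are invented constants.
import math

RESTING_BPM, MAX_BPM = 60.0, 180.0

def sway_amplitude(pulse_bpm, max_sway_px=12.0):
    """Linearly scale sway from 0 px (at rest) to max_sway_px (at MAX_BPM)."""
    t = (pulse_bpm - RESTING_BPM) / (MAX_BPM - RESTING_BPM)
    t = min(max(t, 0.0), 1.0)              # clamp to [0, 1]
    return max_sway_px * t

def scope_offset(pulse_bpm, time_s):
    """Horizontal aim offset: a slow sine wave scaled by the sway amplitude."""
    return sway_amplitude(pulse_bpm) * math.sin(2 * math.pi * 0.5 * time_s)

print(sway_amplitude(60))    # calm: 0.0
print(sway_amplitude(120))   # elevated: 6.0
```

Each frame, the game would add `scope_offset` to the crosshair position, making the reticle drift more as the player's measured pulse rises.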
15. Capture and Share
• Collaborate with scanning/printing experts 3D Systems
• Rotate a small object in front of the 3D camera to create a 3D mesh plus color
• Developers can import scans from a proprietary 3DS* app or eventually build scanning into their own application
• The model workflow is scan, edit, share: upload scans to social media, or send them to a 3D printer**
16. Learning and ‘Edutainment’
• Use augmented reality, powered by Metaio, to track 2D or 3D objects and incorporate them into a learning experience
• Tracking is significantly improved by the 3D camera; enjoy both enhanced precision and accuracy
• Interact with stories using both facial/emotion recognition and gesture. Disguise learning as play.
17. Intel® RealSense™ SDK for Windows*: Supported Features
• Required hardware: a system with a minimum of a 4th generation Intel® Core™ processor including an Intel® RealSense™ 3D camera (or a peripheral camera)
• Required OS: Microsoft Windows* 8.1 64-bit Desktop Mode; Microsoft Windows 8.1 New UI (Metro) Mode and Windows 7 coming soon**; support for Android* coming soon**
• Supported programming languages: C++, C#, Java, JavaScript
• Supported IDE: Microsoft* Visual Studio 2010-2013 with Service Pack 1 or newer
• Supported development tools: Microsoft* .NET 4.0 Framework for C# development; Unity* PRO 4.1.0 or later for Unity game development; Processing* 2.1.2 or higher for Processing framework development; Java* JDK 1.7.0_11 or higher for Java development
The Intel® RealSense™ SDK for Windows does not support Intel® RealSense™ Snapshot; support for that is coming soon.**
18. 18
Intel® RealSense™ SDK for Windows*
Additional Features
Input Device Manager: multiple applications can access data from the camera simultaneously
Multi-Mode Support: multiple usage modes within a single app (e.g. finger tracking + speech + face tracking) or between apps
Power Management States: apps can manage battery life
Extensible Framework: developers can plug in their own algorithms; new usage modes can be added and new devices supported
Privacy Notification Tool: notifies the user when the camera is turned on by an app
*Other names and brands may be claimed as the property of others.
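A sketch of what Multi-Mode Support means for application code: one camera session fans events out to several usage-mode handlers at once. This models the concept only; the class, method names, and event shapes below are hypothetical, not the SDK's actual API.

```javascript
// Conceptual model of Multi-Mode Support: a single session delivers
// per-mode data (hand, face, speech) to every registered handler.
// Names and event shapes are invented for illustration.
class SenseSession {
  constructor() { this.handlers = {}; }
  enable(mode, handler) {
    (this.handlers[mode] = this.handlers[mode] || []).push(handler);
  }
  // Deliver one frame's worth of per-mode data to all subscribers.
  dispatch(frame) {
    for (const [mode, data] of Object.entries(frame)) {
      for (const h of this.handlers[mode] || []) h(data);
    }
  }
}

// Usage: enable three modes in the same app, as the slide describes.
const session = new SenseSession();
const seen = [];
session.enable("hand",   d => seen.push(`hand:${d.gesture}`));
session.enable("face",   d => seen.push(`face:${d.expression}`));
session.enable("speech", d => seen.push(`speech:${d.text}`));
session.dispatch({
  hand:   { gesture: "thumbs_up" },
  face:   { expression: "smile" },
  speech: { text: "open menu" },
});
```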
19. 19
Web / HTML5
Run Intel® RealSense™ technology in a browser
Architecture: the internet browser hosts either a Unity Web Player app or an HTML5/JavaScript page; both communicate over a WebSocket connection (127.0.0.1:4181) with a local SDK WebSocket server, which runs on the Intel® RealSense™ SDK runtime and the operating system.
• Gestures and Facial Analysis: trigger face and gesture events, just like in C# applications
• Unity* Web Player games: switch to the web player platform for deployment
• HTML5 / JavaScript: interactive web sites
*Other names and brands may be claimed as the property of others.
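A minimal browser-side sketch of the setup above: connect to the SDK's local WebSocket server and route incoming events to callbacks. The address comes from the slide; the `{ type, data }` JSON message shape and the function names are assumptions, not the documented wire protocol.

```javascript
// Route one JSON event from the local SDK WebSocket server to a callback.
// The { type, data } message shape is an assumption, not the documented
// wire protocol.
function routeEvent(message, callbacks) {
  const event = JSON.parse(message);
  const cb = callbacks[event.type];
  if (cb) cb(event.data);
  return event.type;
}

// In a browser, wire it to the server the slide names (127.0.0.1:4181).
// Guarded so the routing logic above also works outside a browser.
if (typeof window !== "undefined" && typeof WebSocket !== "undefined") {
  const ws = new WebSocket("ws://127.0.0.1:4181");
  ws.onmessage = e => routeEvent(e.data, {
    gesture: d => console.log("gesture:", d),
    face:    d => console.log("face:", d),
  });
}
```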
20. 20
Productivity extension for Unity* games
Add Intel® RealSense™ technology
to your games
• A set of scripts that provides
configurable actions/rules
based on the capabilities
provided in the SDK.
• Programming: Associate your
game objects with the action
scripts.
• Programming: Create rules.
Associate your game objects
with rules.
Drag and drop programming
Write less code!
*Other names and brands may be claimed as the property of others.
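The action/rule pattern the extension provides can be illustrated in miniature (in JavaScript here rather than Unity* C#, and with invented names): a rule tests an SDK event against a configurable condition, and an action script applies the result to a game object.

```javascript
// Miniature illustration of the extension's rule/action pattern.
// Names are invented; the real extension is a set of Unity* C# scripts.
// A rule maps an SDK event to true/false; an action mutates a game object.
function makeBinding(gameObject, rule, action) {
  return event => { if (rule(event)) action(gameObject, event); };
}

// "Drag and drop" configuration, expressed here as data instead of code:
const door = { name: "door", open: false };
const openOnThumbsUp = makeBinding(
  door,
  e => e.gesture === "thumbs_up",  // rule
  obj => { obj.open = true; }      // action script
);

openOnThumbsUp({ gesture: "wave" });      // rule fails, door stays shut
openOnThumbsUp({ gesture: "thumbs_up" }); // rule fires, action runs
```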
21. 21
Developer Resource Center Program
intel.com/realsense/sdk
A one-stop shop for all developer
needs for
Intel® RealSense™ technology
Developer Tools
Tutorials, Demos, Code Samples,
Documentation
Community Forums
Hacker Labs, Virtual AE Sessions,
Webinars
App Challenges
22. 22
Get Started Now!
Reserve Your Developer Kit now (includes peripheral camera, the
latest SDK, and other software updates)
Provided by Intel
Free Intel® RealSense™ SDK for Windows (Version 2014, Gold)
Intel® RealSense™ Developer Kit camera available for purchase
Huge opportunity to reach customers with integrated 3D camera
Works with languages/frameworks/engines developers already use
High-level APIs for NUI beginners. Low-level APIs for NUI experts
intel.com/realsense/sdk
Seize the Opportunity