Ronn Torossian sits down to discuss the future of the automotive industry and the burgeoning realities of the auto-piloted car – ultimately taking a closer look at the efforts of Google, Elon Musk & Tesla, and
Baidu World 2016 With NVIDIA CEO Jen-Hsun Huang (NVIDIA)
Jen-Hsun Huang, CEO of NVIDIA, gave a keynote speech at the 2016 Baidu World Conference. He discussed how NVIDIA GPUs have become the dominant platform for artificial intelligence research and deep learning, enabling breakthroughs such as superhuman image recognition in 2012 and voice recognition in 2015. NVIDIA's Pascal GPU architecture provides a 65x speedup for deep learning compared with four years earlier. Huang also outlined NVIDIA's work on self-driving cars through its DRIVE PX platform and its partnership with Baidu to apply AI to transportation and other domains.
A study conducted by the Federal Motor Carrier Safety Administration found that rear-end collisions could be reduced through the use of Forward Collision Warning Systems (FCWS). The analysis estimated that 18,013 rear-end crashes involving trucks and motor carriers from 2001-2005 could have been prevented by FCWS, and that Lane Departure Warning Systems (LDWS) could have prevented 8,120 crashes caused by lane departures. The estimated cost avoided per crash for FCWS and LDWS ranged from roughly $100,000 to $1,000,000, depending on whether the crash involved property damage only, injury, or fatality.
The document describes NVIDIA's DRIVE PX 2, an AI supercomputer purpose-built for self-driving cars. It has 12 CPU cores, a Pascal GPU providing 8 TFLOPS of processing power and 24 DL TOPS for deep learning. The DRIVE PX 2 features various interfaces to connect to sensors, displays and development tools. It also includes software like NVIDIA DRIVEWORKS and supports AUTOSAR for automotive software development. The DRIVE PX 2 is designed to help developers create self-driving applications and migrate them from testing to production vehicles.
OTOY founder and CEO, Jules Urbach, announces a colossal update to the OctaneRender ecosystem, including the pricing and availability of its highly anticipated OctaneRender 3 software and OctaneRender Cloud rendering service, and a detailed roadmap outlining the future of Octane’s development towards a 4.0 release in 2017 with full integration of OTOY’s advanced real-time path tracing engine, Brigade.
For the full video of this presentation, please visit:
http://www.embedded-vision.com/platinum-members/auvizsystems/embedded-vision-training/videos/pages/may-2015-embedded-vision-summit
For more information about embedded vision, please visit:
http://www.embedded-vision.com
Nagesh Gupta, CEO and Founder of Auviz Systems, presents the "Trade-offs in Implementing Deep Neural Networks on FPGAs" tutorial at the May 2015 Embedded Vision Summit.
Video and images are a key part of Internet traffic—think of all the data generated by social networking sites such as Facebook and Instagram—and this trend continues to grow. Extracting usable information from video and images is thus a growing requirement in the data center. For example, object and face recognition are valuable for a wide range of uses, from social applications to security applications. Convolutional neural networks (CNNs) are currently the most popular form of deep neural network used in data centers for such applications, and 3D convolutions are a core part of CNNs. Nagesh presents alternative implementations of 3D convolutions on FPGAs and discusses the trade-offs among them.
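To make the core operation concrete, here is a minimal, illustrative NumPy sketch of a 3D convolution (stride 1, no padding). It is only a reference model of the arithmetic: an FPGA implementation unrolls and pipelines exactly these multiply-accumulate loops in hardware, and the trade-offs in the talk concern how that unrolling is done.

```python
import numpy as np

def conv3d_naive(volume, kernel):
    """Naive 3D convolution (valid padding, stride 1).

    volume: (D, H, W) input array; kernel: (kd, kh, kw) filter.
    Returns an array of shape (D-kd+1, H-kh+1, W-kw+1).
    """
    kd, kh, kw = kernel.shape
    D, H, W = volume.shape
    out = np.zeros((D - kd + 1, H - kh + 1, W - kw + 1))
    for d in range(out.shape[0]):
        for h in range(out.shape[1]):
            for w in range(out.shape[2]):
                # One multiply-accumulate window per output element
                patch = volume[d:d + kd, h:h + kh, w:w + kw]
                out[d, h, w] = np.sum(patch * kernel)
    return out

# Example: 4x4x4 volume of ones, 2x2x2 averaging kernel
vol = np.ones((4, 4, 4))
k = np.full((2, 2, 2), 1 / 8)
res = conv3d_naive(vol, k)
print(res.shape)  # (3, 3, 3); every output value is 1.0
```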
At CES 2016, we made a series of announcements highlighting our work to advance the biggest trends in the industry — self-driving cars, artificial intelligence and virtual reality. The focus of our news was NVIDIA DRIVE, an end-to-end deep learning platform for self-driving cars.
See the superhuman breakthroughs in modern artificial intelligence powered by GPUs and the NVIDIA DGX-1, the world's first deep learning computer in a box. Deep learning is delivering revolutionary results across industries, and there has been 35x growth in the number of organizations engaged with NVIDIA to apply this technology.
This document highlights how deep learning and AI are accelerating innovation. It collects five stories from the week, covering topics such as using deep learning to predict and prevent sudden infant deaths, how AI is reshaping the chip market, using deep learning to help retail investors, and a profile of 10 promising deep learning applications and startups across industries. NVIDIA's CEO is quoted on the massive growth in AI startups using deep learning and how it will transform many industries.
At a press event kicking off CES 2016, we unveiled artificial intelligence technology that will let cars sense the world around them and pilot a safe route forward.
Dressed in his trademark black leather jacket, speaking to a crowd of some 400 automakers, media and analysts, NVIDIA CEO Jen-Hsun Huang revealed DRIVE PX 2, an automotive supercomputing platform that processes 24 trillion deep learning operations a second. That’s 10 times the performance of the first-generation DRIVE PX, now being used by more than 50 companies in the automotive world.
The new DRIVE PX 2 delivers 8 teraflops of processing power. It has the processing power of 150 MacBook Pros. And it’s the size of a lunchbox in contrast to other autonomous-driving technology being used today, which takes up the entire trunk of a mid-sized sedan.
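The headline figures can be cross-checked with simple arithmetic. This illustrative snippet derives what the numbers quoted above imply; note the 150-MacBook-Pro comparison is NVIDIA's own framing, not an independent benchmark.

```python
# Figures quoted in the announcement above
drive_px2_tflops = 8.0       # teraflops of processing power
drive_px2_dl_tops = 24.0     # trillion deep learning ops per second
first_gen_ratio = 10         # DRIVE PX 2 vs. first-generation DRIVE PX
macbook_equivalents = 150    # "processing power of 150 MacBook Pros"

# Implied throughput of one MacBook Pro under that comparison
tflops_per_macbook = drive_px2_tflops / macbook_equivalents
print(f"{tflops_per_macbook:.3f} TFLOPS per MacBook Pro")  # 0.053

# Implied deep learning rate of the first-generation DRIVE PX
first_gen_dl_tops = drive_px2_dl_tops / first_gen_ratio
print(f"{first_gen_dl_tops:.1f} DL TOPS, first-gen DRIVE PX")  # 2.4
```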
“Self-driving cars will revolutionize society,” Huang said at the beginning of his talk. “And NVIDIA’s vision is to enable them.”
For the full video of this presentation, please visit:
http://www.embedded-vision.com/platinum-members/altera/embedded-vision-training/videos/pages/may-2016-embedded-vision-summit
For more information about embedded vision, please visit:
http://www.embedded-vision.com
Bill Jenkins, Senior Product Specialist for High Level Design Tools at Intel, presents the "Accelerating Deep Learning Using Altera FPGAs" tutorial at the May 2016 Embedded Vision Summit.
While large strides have recently been made in the development of high-performance systems for neural networks based on multi-core technology, significant challenges in power, cost, and performance scaling remain. Field-programmable gate arrays (FPGAs) are a natural choice for implementing neural networks because they can combine computing, logic, and memory resources in a single device. Intel's Programmable Solutions Group has developed a scalable convolutional neural network reference design for deep learning systems, written in the OpenCL programming language and built with our SDK for OpenCL. The design's performance is being benchmarked using several popular CNN benchmarks: CIFAR-10, ImageNet and KITTI.
Building the CNN with OpenCL kernels allows true scaling of the design from smaller to larger devices and from one device generation to the next. New designs can be sized using different numbers of kernels at each layer. Performance scaling from one generation to the next also benefits from architectural advancements, such as floating-point engines and frequency scaling. Thus, you achieve greater-than-linear performance and performance-per-watt scaling with each new series of devices.
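A rough way to see why sizing a design by the number of kernels at each layer trades area for throughput is to count multiply-accumulate operations per layer. The sketch below is a back-of-the-envelope model with illustrative layer shapes (loosely CIFAR-10-sized), not figures from the reference design.

```python
def conv_layer_macs(out_h, out_w, num_kernels, kernel_h, kernel_w, in_channels):
    """Multiply-accumulate count for one convolutional layer (stride 1)."""
    macs_per_output = kernel_h * kernel_w * in_channels
    return out_h * out_w * num_kernels * macs_per_output

# Illustrative first layer: 32x32 output, 3 input channels, 5x5 kernels.
# Doubling the kernel count doubles the work — and the parallel FPGA
# resources needed to sustain the same latency.
small = conv_layer_macs(32, 32, 32, 5, 5, 3)
large = conv_layer_macs(32, 32, 64, 5, 5, 3)
print(small, large, large / small)  # 2457600 4915200 2.0
```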
Deep learning goes beyond the traditional machine learning of big data and analytics. In this session, we will review the AWS offering, Amazon Machine Learning, and the AWS GPU-intensive family of servers that run native machine learning and deep-learning algorithms. We will also cover some basic deep-learning algorithms using open source software. Session sponsored by Day1 Solutions.
The automotive industry is going through an innovation shift in which manufacturers try to reach new heights of technological and design innovation every day to lure customers. Tesla Motors, Inc. is an American automotive company well renowned in the market for manufacturing luxury electric vehicles. Tesla is swiftly pioneering in the automotive industry by leveraging information technology and information systems (IT/IS) through a combination of highly intelligent hardware and software systems (Newcomb, 2015, para. 3). The technology and IS integrated into Tesla cars permit Over-the-Air (OTA) software updates and Autopilot features. OTA was first launched in Tesla cars in 2012 with the Model S (Newcomb, 2015, para. 3). Since adopting OTA software update technology, Tesla has held a competitive advantage over other well-established automotive manufacturers, including BMW, GMC, and Ford, which are still trying to develop and integrate OTA technology into their cars (Zhang, 2016, para. 2).
This paper analyzes the implementation of the hands-free, feet-free travel experience known as Autopilot in Tesla cars, powered by Over-the-Air IS technology. First, the paper gives a brief background of Tesla Motors, Inc., current information system trends and their requirements in the automotive industry, and Tesla's competitors. Second, it analyzes the OTA technology used to push firmware updates and support Autopilot, and its influence on the automotive industry. Third, it presents Tesla's competitive environment, its trading partners, and how Tesla is leveraging OTA for sustainable competitive advantage. Next, it analyzes the economic feasibility of implementing Autopilot supported by OTA across all models, comparing the best, worst, and most likely cases and their value to the company's success. Lastly, it explains the organizational implementation of self-driving cars powered by OTA at a large scale.
As artificial intelligence sweeps across the technology landscape, NVIDIA unveiled today at its annual GPU Technology Conference a series of new products and technologies focused on deep learning, virtual reality and self-driving cars.
CES 2016 Recap: The Autonomous 4K VR 3D IoT Drone Awakens (David Berkowitz)
What were the most important trends, themes, and technologies at CES 2016? The Consumer Electronics Show this year featured massive partnership announcements from car brands, fast drones, immersive virtual reality experiences, and much more. See what matters most for technologists, marketers, and others in this roundup.
The document summarizes a Consumer Reports article from May 2013 that compares Tesla's all-electric Model S sedan favorably to the vehicle Marty McFly might have chosen instead of the DeLorean time machine in Back to the Future. The Model S is highlighted as a luxury electric vehicle offering outstanding performance and a package of services. The article presents the Model S as a car that could have met Marty McFly's futuristic transportation needs had it been available when the movie was made.
Some resources on how to navigate the hardware space and build your own workstation for training deep learning models.
Alternative download link: https://www.dropbox.com/s/o7cwla30xtf9r74/deepLearning_buildComputer.pdf?dl=0