Kim Stevenson, Intel's Chief Information Officer, discussed how big data and analytics are driving innovation through increased data volumes, lower computing costs, and new tools. Big data allows for improved customer experiences, more intelligent systems, and richer data analysis. Corporations are using analytics to increase efficiency, assist campaigns, and reduce costs. Intel's data platform aims to enable massive computing power, build an open ecosystem, and reduce complexity to fuel data-driven innovation. Stevenson highlighted opportunities from traffic optimization to personalized healthcare and ways analytics can provide operational efficiency, revenue growth, and cost reduction.
IT@Intel: Creating Smart Spaces with All-in-Ones (IT@Intel)
Intel IT explains how it used all-in-one devices as collaboration tools in both office and lab spaces. By providing efficient collaboration solutions, we help our employees be more productive and have greater job satisfaction.
Intel® Xeon® Processor E5-2600 v4 Product Family EAMG (Intel IT Center)
See why the new Intel® Xeon® processor E5-2600 v4 product family is ideal for next-generation application workloads and is the powerhouse for software-defined infrastructure (SDI) environments where automation and orchestration capabilities are foundational. Higher core counts, enhanced virtualization capabilities, and increased memory bandwidth provide the resources that are necessary to drive improvements in performance across a wide range of workloads. These processors also include technologies that can help IT organizations and cloud providers orchestrate resources more intelligently so they can optimize performance, agility, and efficiency. From 3-D data visualization and virtual prototyping, to personalized content delivery, new software capabilities provide the foundation for smarter, faster, and more agile business solutions.
Three Steps to Making a Digital Workplace a Reality (Intel IT Center)
The workplace is undergoing a dramatic evolution. Work styles are more mobile, changing the way we collaborate and share information, while a more mobile workforce means a greater need to thwart cyber-attacks. You'll learn about Intel's three-part approach to help IT leaders sustainably embrace mobility and strengthen their security posture.
Achieve Unconstrained Collaboration in a Digital World (Intel IT Center)
Technology is at the center of every digitally savvy workplace, yet organizations struggle to bridge current tools to more modern solutions. This session from the Gartner Digital Workplace Summit covers a new way to facilitate employee collaboration that is easy and engaging, and that gives IT an uncompromised security and management experience.
For the full video of this presentation, please visit:
https://www.edge-ai-vision.com/2021/07/accelerating-edge-ai-solution-development-with-pre-validated-hardware-software-kits-from-intel-partners-a-presentation-from-intel/
Daniel Tsui, Foundational Developer Kit Product Manager at Intel, presents the “Accelerating Edge AI Solution Development with Pre-validated Hardware-Software Kits from Intel Partners” tutorial at the May 2021 Embedded Vision Summit.
When developing a new edge AI solution, you want to focus on your system’s unique functionality. In this session, Tsui shares the different foundational developer kits available from Intel’s partners to help speed your edge AI solution development. These kits include industrial-grade hardware that’s ready to deploy and that can be purchased easily using a credit card.
These systems come with Intel’s Edge Insights for Vision software package, a set of pre-validated software modules for orchestration and cloud support, and the Intel Distribution of OpenVINO toolkit for computer vision and deep learning applications. Watch and learn how these robust, pre-validated hardware and software resources can accelerate the development of your edge AI solution.
Intel Gateway Solutions for the Internet of Things (Intel IoT)
Intel Gateway Solutions for the Internet of Things (IoT) is a family of platforms that enables companies to seamlessly interconnect industrial infrastructure devices and secure data flow between devices and the cloud. Intel Gateway Solutions for IoT enables customers to securely aggregate, share, and filter data for analysis.
Reducing Cost and Complexity with Industrial System Consolidation (Intel IoT)
In today’s highly competitive manufacturing environment, success requires a constant focus on cost cutting while maintaining production throughput and employee safety. For manufacturers, this includes finding new ways to lower operating expenses, a large part of which are the purchase and support of industrial systems. A significant cost stems from the inefficiencies created by the growing numbers and varieties of systems on the factory floor.
This white paper describes how virtualization technology running on multi-core Intel Core vPro processors can be used in industrial automation to consolidate computing devices for motion control, programmable logic control (PLC), human-machine interface (HMI), machine vision, data acquisition, functional safety, and so forth. This approach can help manufacturers reduce cost and complexity on the factory floor.
For the full video of this presentation, please visit:
https://www.edge-ai-vision.com/2020/11/acceleration-of-deep-learning-using-openvino-3d-seismic-case-study-a-presentation-from-intel/
For more information about edge AI and computer vision, please visit:
https://www.edge-ai-vision.com
Manas Pathak, Global AI Lead for Oil and Gas at Intel, presents the “Acceleration of Deep Learning Using OpenVINO: 3D Seismic Case Study” tutorial at the September 2020 Embedded Vision Summit.
The use of deep learning for automatic seismic data interpretation is gaining the attention of many researchers across the oil and gas industry. The integration of high-performance computing (HPC) AI workflows in seismic data interpretation brings the challenge of moving and processing large amounts of data from HPC to AI computing solutions and vice-versa.
In this presentation, Pathak illustrates this challenge via a case study using a public deep learning model for salt identification applied on a 3D seismic survey from the F3 Dutch block in the North Sea. He presents a workflow to address this challenge and perform accelerated AI on seismic data. The Intel Distribution of OpenVINO toolkit was used to increase the inference performance of a pre-trained model on an Intel CPU. OpenVINO allows CPU users to get significant improvement in AI inference performance for high memory capacity deep learning models used on large datasets without any significant loss in accuracy.
Gary Brown (Movidius, Intel): Deep Learning in AR: the 3 Year Horizon (AugmentedWorldExpo)
A talk from the Develop Track at AWE USA 2017, the largest conference for AR+VR, held in Santa Clara, California, May 31 - June 2, 2017.
Gary Brown (Movidius, Intel): Deep Learning in AR: the 3 Year Horizon
Deep learning techniques are gaining popularity in many facets of embedded vision, and this holds true for AR and VR. Will they soon dominate every facet of vision processing? This talk explores that question by examining the theory and practice of applying deep learning to real-world problems in augmented reality, with real examples describing how this shift is happening quickly in some areas and more slowly in others.
http://AugmentedWorldExpo.com
Caffelli is a full-service integrated branding agency in Portland, OR, founded in 2010. As a startup, Caffelli was challenged to scale its IT capabilities quickly to meet expanding needs. The agency chose to deploy an on-premises server based on the Intel® Xeon® Processor E3 family with an Intel® RAID Controller RMS2AF080 to serve as the foundation of its small-business data, while leveraging the availability, scalability, and pay-as-you-go model of business applications in the cloud.
Deploying Image Classifiers on Intel® Movidius™ Neural Compute Stick (Intel® Software)
In this webinar, Ashwin Vijayakumar walks through the process of profiling pre-trained neural networks designed for image classification, identifying a good balance between accuracy and real-time performance, and writing a simple Python* script to deploy these classifiers on the Intel® Movidius™ Neural Compute Stick.
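The model-selection step described above, balancing accuracy against real-time performance across profiled candidates, can be sketched in plain Python. The model names and accuracy/latency figures below are illustrative placeholders, not measured Neural Compute Stick results:

```python
# Hypothetical profiling results: (model name, top-1 accuracy, latency in ms).
# Real figures would come from profiling each network on the target device.
candidates = [
    ("squeezenet", 0.575, 48.0),
    ("googlenet", 0.687, 95.0),
    ("resnet50", 0.752, 210.0),
]

def pick_model(candidates, max_latency_ms):
    """Return the most accurate model that fits the latency budget."""
    feasible = [c for c in candidates if c[2] <= max_latency_ms]
    if not feasible:
        return None  # no model meets the real-time constraint
    return max(feasible, key=lambda c: c[1])

# A 10 FPS target leaves roughly 100 ms per frame.
best = pick_model(candidates, max_latency_ms=100.0)
print(best[0])  # googlenet
```

The same tradeoff logic applies whatever profiler produces the numbers; only the candidate table changes.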
AWS Summit Berlin 2013 - Big Data Analytics (AWS Germany)
Learn more about the tools, techniques, and technologies for working productively with data at any scale. This session will introduce the family of data analytics tools on AWS that you can use to collect, compute, and collaborate around data, from gigabytes to petabytes. We'll discuss Amazon Elastic MapReduce, Hadoop, structured and unstructured data, and the EC2 instance types that enable high-performance analytics.
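The MapReduce model underlying Hadoop and Amazon Elastic MapReduce can be illustrated with a minimal in-process word count. This is a sketch of the map, shuffle, and reduce phases, not EMR's actual API:

```python
from collections import defaultdict

def map_phase(lines):
    """Map: emit (word, 1) pairs, as a Hadoop mapper would."""
    for line in lines:
        for word in line.split():
            yield word.lower(), 1

def shuffle(pairs):
    """Shuffle: group all values by key across mapper outputs."""
    grouped = defaultdict(list)
    for key, value in pairs:
        grouped[key].append(value)
    return grouped

def reduce_phase(grouped):
    """Reduce: sum the counts for each word."""
    return {word: sum(counts) for word, counts in grouped.items()}

lines = ["big data on AWS", "big data at scale"]
counts = reduce_phase(shuffle(map_phase(lines)))
print(counts["big"])  # 2
```

On a real cluster, the map and reduce phases run in parallel across nodes and the shuffle moves data over the network, but the contract between the phases is the same.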
Building Confidence in Big Data - IBM Smarter Business 2013 (IBM Sverige)
Success with big data comes down to confidence. Without confidence in the underlying data, decision makers may not trust and act on analytic insight. You need confidence in your data – that it’s correct, trusted, and protected through automated integration, visual context, and agile governance. You need confidence in your ability to accelerate time to value, with fast deployments of big data appliances. Learn how clients have succeeded with big data by building confidence in their data, their ability to deploy, and their skills. Presenter: David Corrigan, Big Data specialist, IBM. More from the day at http://bit.ly/sb13se
AWS & Intel: A Partnership Dedicated to Cloud Innovations (Amazon Web Services)
Innovation is at the heart of the collaboration between Intel and AWS. Cloud adoption is fueling the next industrial revolution. This session explores the new opportunities offered through cloud adoption, and how Intel and AWS are bringing customers the latest technologies to help accelerate cloud adoption of big data, HPC, and IoT.
Craig Stires, Head of Big Data and Analytics, Amazon Web Services, APAC
(Singapore) Eddie Toh, Regional Director, Datacenter Platform Marketing, Asia Pacific & Japan, Intel Technology Asia Pte Ltd
Keynote Address at 2013 CloudCon: Future of Big Data by Richard McDougall (In... (exponential-inc)
Over the last few years we’ve seen a frenzy of interest and buzz around the area of Big Data. Beyond the hype, there is a solid base of growing use cases, which are becoming center stage to most businesses. 2012 was the year of awareness. There was a great amount of sharing from the early core developers of the analytic platforms – showing the rest of the world the capabilities of the tools and platforms that had been developed for special-purpose, high-scale analytics. The big names at the core of open source analytics development include Facebook, eBay, LinkedIn, and Twitter – all blazing the trail with new approaches. These companies brought along with them a new and expanding interest in leveraging the same technologies for commercial interest.
This talk focuses on how a growing number of enterprises are already heavily invested in these use cases; by volume, most customers now have some form of big data proof of concept underway. These proofs of concept typically start with a thesis of how competitive advantage can be gained through insight from the data. A proof of concept can quickly validate the theory and helps sell further investment in the analytics platform, and it snowballs from there.
Bridging the Last Mile: Getting Data to the People Who Need It (APAC) (Denodo)
Watch full webinar here: https://bit.ly/34iCruM
Many organizations are embarking on strategically important journeys to embrace data and analytics. The goal can be to improve internal efficiencies, improve the customer experience, drive new business models and revenue streams, or – in the public sector – provide better services. All of these goals require empowering employees to act on data and analytics and to make data-driven decisions. However, getting data – the right data at the right time – to these employees is a huge challenge and traditional technologies and data architectures are simply not up to this task. This webinar will look at how organizations are using Data Virtualization to quickly and efficiently get data to the people that need it.
Attend this session to learn:
- The challenges organizations face when trying to get data to the business users in a timely manner
- How Data Virtualization can accelerate time-to-value for an organization’s data assets
- Examples of leading companies that used data virtualization to get the right data to the users at the right time
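The core pattern behind data virtualization, a single logical view federated over heterogeneous sources without copying the data, can be sketched in a few lines. The two in-memory "sources" below are hypothetical stand-ins for real databases and APIs:

```python
# Two heterogeneous "sources": a CRM-style record store and a sales feed.
# (Both are hypothetical; real sources would be databases, files, or APIs.)
crm_source = {101: {"name": "Acme", "region": "EMEA"},
              102: {"name": "Globex", "region": "APAC"}}
sales_source = [(101, 50000), (102, 72000), (101, 13000)]

def virtual_view(customer_id):
    """Federate both sources on demand; nothing is copied or materialized."""
    customer = crm_source.get(customer_id)
    if customer is None:
        return None
    revenue = sum(amount for cid, amount in sales_source if cid == customer_id)
    return {"name": customer["name"],
            "region": customer["region"],
            "revenue": revenue}

print(virtual_view(101))  # {'name': 'Acme', 'region': 'EMEA', 'revenue': 63000}
```

A data virtualization platform adds query optimization, caching, and governance on top, but the essential idea is the same: the combined view is computed at query time from the live sources.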
This presentation covers the ins and outs of big data's growing trends: market potential, solutions provided by big data, and its advantages and disadvantages.
Cloud 2015: Connecting the Next Billion - Intel Keynote @ HP Discover 2011 (Intel IT Center)
Kirk Skaugen, VP & GM of the Intel Data Center Group, discusses Intel's vision for computing in 2015, mobile technologies, and Intel and HP's combined commitment to developing leading mission-critical solutions.
Tackling Retail Technology Management Challenges at the Edge (Rebekah Rodriguez)
As the adoption of intelligent applications in the Retail industry grows, so do their technology requirements. This creates challenges for store operators to navigate the deployment and maintenance of hardware, applications, and management tools at locations without a dedicated IT staff. Along with the complexity of solutions, these operators are dealing with a wide range of installation scenarios, specific to the products or services they offer.
Purpose-built edge systems, such as Supermicro’s fanless servers powered by Intel® Xeon® D processors, provide a secure and rugged platform that can be deployed where conventional servers cannot. These systems, along with our broad portfolio of short-depth rackmount systems, can be combined with the Reliant Platform by Acumera, a secure, cost-effective, centrally cloud-managed solution that automates delivery and management of applications, networking, and security controls, either in-store or in the cloud.
Join this webinar to hear how these solutions are currently employed across thousands of locations, simplifying edge IT for many major retailers today. Speakers include David Nielsen, Sr. System Product Manager for IoT and Edge Applications; Richard Newman and Brett Stewart, Technology Leaders at Acumera; and Craig Carter, Product Line Manager at Intel, who will discuss the latest generation of the Xeon D platform.
Cloud & Big Data - Digital Transformation in Banking (Sutedjo Tjahjadi)
Datacomm Cloud Business Overview
Making Indonesia 4.0
Digital Transformation in Banking Industry
Introduction to Cloud Computing
Big Data Analytics Introduction
Big Data Analytics Application in Banking
Accelerating Cloud Services and How to Match your Workload to the Right Intel... (Amazon Web Services)
This presentation will take you through the underlying Intel technologies available in AWS EC2 instances, the benefits these technologies can provide, and how to use them to maximise your use-case workloads. AWS EC2 provides a wide selection of instance types, optimised to fit a broad and diverse set of use cases, all supported by the latest Intel Xeon processors and technologies, giving you confidence to choose the EC2 instance type that best meets your performance needs for compute-intensive, memory-intensive, or IOPS-intensive applications.
So, if you are interested in maximising your existing code optimisations and investments, or are thinking about migrating workloads to AWS, come to this session.
Speaker: Peter Kerney, Lead Enterprise Architect, Cloud, SDI and NFV, Intel
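The workload-to-instance matching the session describes can be caricatured as a simple lookup. The instance families and categories below are an illustrative simplification, not AWS sizing guidance:

```python
def suggest_instance_family(dominant_demand):
    """Map a workload's dominant resource demand to an EC2 instance family.
    The mapping is illustrative only; consult AWS documentation for real sizing."""
    families = {
        "compute": "c5",  # compute-optimized
        "memory": "r5",   # memory-optimized
        "iops": "i3",     # storage/IOPS-optimized
    }
    return families.get(dominant_demand, "m5")  # default to general purpose

print(suggest_instance_family("memory"))  # r5
```

In practice the choice also weighs network bandwidth, accelerator needs, and cost, but identifying the dominant resource demand first narrows the field quickly.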
Intel and IT - key industry trends driving business transformation (IT@Intel)
A presentation by Intel CIO Kim Stevenson @kimsstevenson: "Intel and IT- key industry trends driving business transformation" to Metroplex Technology Business Council.
Includes examples of how Intel IT is embracing the SMAC stack - social, mobile, analytics, and cloud - to transform business.
Intel IT wanted to better serve Intel employees by improving the way we provided PC peripherals. We wanted a more accessible and, ideally, self-service method.
We created IT on the GO, a vending machine with consigned inventory that enables employees to get a variety of PC peripherals in less than a minute, any time of the day.
Enterprise Video Hosting: Introducing the Intel Video Portal (IT@Intel)
Intel IT developed an enterprise video hosting solution in order to meet the needs of employees who wanted to create and share videos in an easy-to-use and secure manner.
How to Self-Provision over WLAN with Intel(R) vPro(TM) Technology (IT@Intel)
Intel IT wanted to create a simple and efficient way for employees to be able to self-provision their PCs, without having to take the PC to an IT Service Center. We created a tool that allows our employees to self-provision their systems over the WLAN.
To help other enterprises easily do the same, this presentation provides the necessary instructions and a link to the downloadable batch file.
Intel IT recently conducted a 10-day crowdsourcing activity to gather ideas on how to increase organizational innovation. We asked our participants four easy questions and learned a wealth of information from the responses.
Accelerating Our Path to Multi Platform Benefits (IT@Intel)
This is a time of tremendous change for IT organizations everywhere.
Intel IT realized we need to enable enterprise applications to support the devices of today (touch) and also develop the applications so they are ready for the next big thing (voice and gesture). We’ve kicked off a new initiative that focuses on accelerating delivery of applications to our business partners and employees on their mobile platform(s) of choice.
Deploying Intel Architecture-based Tablets with Windows* 8 at Intel (IT@Intel)
Intel IT recently deployed Intel® Atom™ processor-based tablets with Microsoft Windows* 8 in our enterprise in a proof of concept.
Our participants were pleased with the experience, and reported greater productivity and flexibility.
How do you secure your most sensitive data in the cloud? How can you provide the right level of authentication controls or encryption services? These are some of the key challenges of virtualization in the cloud. Intel IT responded by creating an architecture called a High Trust Zone (HTZ). This architecture greatly increases flexibility and focuses on rapid detection of compromise and survivability. In particular, it uses zones of trust that provide more flexible, dynamic, and granular controls than do traditional enterprise security models.
Six Irrefutable Laws of Information Security (IT@Intel)
How can organizations balance business needs and growth with risk mitigation and security controls? These Six Irrefutable Laws of Information security can help you achieve balance.
Gather insights from Malcolm Harkins, Intel Chief Information Security Officer, on how to balance business growth with risk mitigation. This presentation links to a webinar on this topic.
State of ICS and IoT Cyber Threat Landscape Report 2024 preview (Prayukth K V)
The IoT and OT threat landscape report has been prepared by the Threat Research Team at Sectrio using data from Sectrio's cyber threat intelligence farming facilities spread across over 85 cities around the world. In addition, Sectrio runs AI-based advanced threat and payload engagement facilities that serve as sinks to attract and engage sophisticated threat actors and newer malware, including new variants and latent threats at an earlier stage of development.
The latest edition of the OT/ICS and IoT Security Threat Landscape Report 2024 also covers:
State of global ICS asset and network exposure
Sectoral targets and attacks as well as the cost of ransom
Global APT activity, AI usage, actor and tactic profiles, and implications
Rise in volumes of AI-powered cyberattacks
Major cyber events in 2024
Malware and malicious payload trends
Cyberattack types and targets
Vulnerability exploit attempts on CVEs
Attacks on counties – USA
Expansion of bot farms – how, where, and why
In-depth analysis of the cyber threat landscape across North America, South America, Europe, APAC, and the Middle East
Why are attacks on smart factories rising?
Cyber risk predictions
Axis of attacks – Europe
Systemic attacks in the Middle East
Download the full report from here:
https://sectrio.com/resources/ot-threat-landscape-reports/sectrio-releases-ot-ics-and-iot-security-threat-landscape-report-2024/
Encryption in Microsoft 365 - ExpertsLive Netherlands 2024 (Albert Hoitingh)
In this session I delve into the encryption technology used in Microsoft 365 and Microsoft Purview, including the concepts of Customer Key and Double Key Encryption.
The Art of the Pitch: WordPress Relationships and Sales (Laura Byrne)
Clients don’t know what they don’t know. What web solutions are right for them? How does WordPress come into the picture? How do you make sure you understand scope and timeline? What do you do if something changes?
All these questions and more will be explored as we talk about matching clients’ needs with what your agency offers, without pulling teeth or pulling your hair out. Practical tips and strategies for successful relationship building that leads to closing the deal.
Essentials of Automations: Optimizing FME Workflows with Parameters (Safe Software)
Are you looking to streamline your workflows and boost your projects’ efficiency? Do you find yourself searching for ways to add flexibility and control over your FME workflows? If so, you’re in the right place.
Join us for an insightful dive into the world of FME parameters, a critical element in optimizing workflow efficiency. This webinar marks the beginning of our three-part “Essentials of Automation” series. This first webinar is designed to equip you with the knowledge and skills to utilize parameters effectively: enhancing the flexibility, maintainability, and user control of your FME projects.
Here’s what you’ll gain:
- Essentials of FME Parameters: Understand the pivotal role of parameters, including Reader/Writer, Transformer, User, and FME Flow categories. Discover how they are the key to unlocking automation and optimization within your workflows.
- Practical Applications in FME Form: Delve into key user parameter types including choice, connections, and file URLs. Allow users to control how a workflow runs, making your workflows more reusable. Learn to import values and deliver the best user experience for your workflows while enhancing accuracy.
- Optimization Strategies in FME Flow: Explore the creation and strategic deployment of parameters in FME Flow, including the use of deployment and geometry parameters, to maximize workflow efficiency.
- Pro Tips for Success: Gain insights on parameterizing connections and leveraging new features like Conditional Visibility for clarity and simplicity.
We’ll wrap up with a glimpse into future webinars, followed by a Q&A session to address your specific questions surrounding this topic.
Don’t miss this opportunity to elevate your FME expertise and drive your projects to new heights of efficiency.
UiPath Test Automation using UiPath Test Suite series, part 4DianaGray10
Welcome to UiPath Test Automation using UiPath Test Suite series part 4. In this session, we will cover Test Manager overview along with SAP heatmap.
The UiPath Test Manager overview with SAP heatmap webinar offers a concise yet comprehensive exploration of the role of a Test Manager within SAP environments, coupled with the utilization of heatmaps for effective testing strategies.
Participants will gain insights into the responsibilities, challenges, and best practices associated with test management in SAP projects. Additionally, the webinar delves into the significance of heatmaps as a visual aid for identifying testing priorities, areas of risk, and resource allocation within SAP landscapes. Through this session, attendees can expect to enhance their understanding of test management principles while learning practical approaches to optimize testing processes in SAP environments using heatmap visualization techniques
What will you get from this session?
1. Insights into SAP testing best practices
2. Heatmap utilization for testing
3. Optimization of testing processes
4. Demo
Topics covered:
Execution from the test manager
Orchestrator execution result
Defect reporting
SAP heatmap example with demo
Speaker:
Deepak Rai, Automation Practice Lead, Boundaryless Group and UiPath MVP
A tale of scale & speed: How the US Navy is enabling software delivery from l...sonjaschweigert1
Rapid and secure feature delivery is a goal across every application team and every branch of the DoD. The Navy’s DevSecOps platform, Party Barge, has achieved:
- Reduction in onboarding time from 5 weeks to 1 day
- Improved developer experience and productivity through actionable findings and reduction of false positives
- Maintenance of superior security standards and inherent policy enforcement with Authorization to Operate (ATO)
Development teams can ship efficiently and ensure applications are cyber ready for Navy Authorizing Officials (AOs). In this webinar, Sigma Defense and Anchore will give attendees a look behind the scenes and demo secure pipeline automation and security artifacts that speed up application ATO and time to production.
We will cover:
- How to remove silos in DevSecOps
- How to build efficient development pipeline roles and component templates
- How to deliver security artifacts that matter for ATO’s (SBOMs, vulnerability reports, and policy evidence)
- How to streamline operations with automated policy checks on container images
Dev Dives: Train smarter, not harder – active learning and UiPath LLMs for do...UiPathCommunity
💥 Speed, accuracy, and scaling – discover the superpowers of GenAI in action with UiPath Document Understanding and Communications Mining™:
See how to accelerate model training and optimize model performance with active learning
Learn about the latest enhancements to out-of-the-box document processing – with little to no training required
Get an exclusive demo of the new family of UiPath LLMs – GenAI models specialized for processing different types of documents and messages
This is a hands-on session specifically designed for automation developers and AI enthusiasts seeking to enhance their knowledge in leveraging the latest intelligent document processing capabilities offered by UiPath.
Speakers:
👨🏫 Andras Palfi, Senior Product Manager, UiPath
👩🏫 Lenka Dulovicova, Product Program Manager, UiPath
Elevating Tactical DDD Patterns Through Object CalisthenicsDorra BARTAGUIZ
After immersing yourself in the blue book and its red counterpart, attending DDD-focused conferences, and applying tactical patterns, you're left with a crucial question: How do I ensure my design is effective? Tactical patterns within Domain-Driven Design (DDD) serve as guiding principles for creating clear and manageable domain models. However, achieving success with these patterns requires additional guidance. Interestingly, we've observed that a set of constraints initially designed for training purposes remarkably aligns with effective pattern implementation, offering a more ‘mechanical’ approach. Let's explore together how Object Calisthenics can elevate the design of your tactical DDD patterns, offering concrete help for those venturing into DDD for the first time!
GraphRAG is All You need? LLM & Knowledge GraphGuy Korland
Guy Korland, CEO and Co-founder of FalkorDB, will review two articles on the integration of language models with knowledge graphs.
1. Unifying Large Language Models and Knowledge Graphs: A Roadmap.
https://arxiv.org/abs/2306.08302
2. Microsoft Research's GraphRAG paper and a review paper on various uses of knowledge graphs:
https://www.microsoft.com/en-us/research/blog/graphrag-unlocking-llm-discovery-on-narrative-private-data/
4. What’s Driving Big Data?
- 10X volume and type of data: data growth of 10x by 2016, with 90% of it unstructured.1
- Lower cost of compute and storage (2002-2012): average server cost down 40%; storage cost per GB down 90%.2
- New investments in tools and services: $3B in 2010, $7B in 2012, reaching $17B by 2017.3
1: IDC Digital Universe Study December 2012
2: Intel Forecast
3: IDC WW Big Data Forecast
5. Every 60 seconds:
- 100,000+ tweets
- 695,000 status updates
- 698,445 Google searches
- 11 million instant messages
- 168 million+ emails sent
- 217 new mobile web users
- 1,820 TB of data created
The unit of data has grown with each computing era: Mainframe (kilobytes), Cloud/Server (megabytes), The Internet (gigabytes), Social/Mobile/Cloud/Analytics (zettabytes), and next, yottabytes?
6. Virtuous Cycle of Data-Driven Innovation
2.8 zettabytes of data were generated worldwide in 2012; 40 zettabytes will be generated worldwide by 2020.1
The cycle: clients deliver richer user experiences; richer data from devices flows to the cloud; richer data to analyze feeds intelligent systems.
(1) IDC Digital Universe 2020, (2) IDC
7. Corporations are deriving value from data
- UPS Package Flow: allows UPS to increase customer satisfaction, removes the need for manual route planning, improves accuracy and efficiency, and saves millions of miles and millions of gallons of fuel.
- Data and micro-targeting assists campaigns: built a voter file that included voter history, demographic data, and preferences to identify “persuadables” to target.
- Predictive analytics to reduce maintenance costs: predicted part failure rates and sent preventive maintenance crews to anticipate issues.
*Other brands and names are the property of their respective owners
8. Big Data: Big Opportunities
- Traffic Optimization
- Smart Energy Grid
- Personalized Preventive Care
- Location-Aware Ad Placement
- Claim Fraud Reduction
- Buyer Protection Program
12. Make Business Intelligence Your Competitive Advantage
The Intel® Xeon® processor E7-8800/4800/2800 v2 product families deliver in-memory performance, reliability and uptime, plus scalability:
- MEMORY: up to 3X capacity
- PERFORMANCE: up to 2X throughput improvement
- RELIABILITY: designed for >99.999% reliability
Results have been estimated based on internal Intel analysis and are provided for informational purposes only. Any difference in system hardware or software design or configuration may affect actual performance.
(E7 v2 memory capacity increase) On a 4-socket natively connected platform: the Intel® Xeon® processor E7 family supports 64 DIMMs at a maximum of 32GB per RDIMM; the Intel® Xeon® processor E7 v2 family supports 96 DIMMs at a maximum of 64GB per RDIMM. This enables a 3x increase in memory.
14. Value Creation Through Advanced Analytics
- HINDSIGHT (Descriptive): What happened?
- INSIGHT (Diagnostic): Why did it happen?
- FORESIGHT (Predictive): What will happen? (Prescriptive): How can we make it happen?
15. Business Value through Analytics at Intel
- Operational efficiency: post-silicon validation
- Revenue growth: channel reseller selection
- Cost reduction: fraud prevention
16. The Path to INSIGHTS
- Start the journey with quick wins
- Move from rear view to future projections
- Acquire and build skills
- Seek causation, not correlation
- The Intel Data Platform can help accelerate your value
18. Sharing Intel IT Best Practices with the World
Learn more about Intel IT’s initiatives at www.intel.com/IT
Download the Intel IT Business Review mobile app on your smartphone or tablet device: m.intel.com/IIBR
Advanced analytics on top of big data is a hot topic; some might even say an overused and abused one. But the reality we all deal with is that our organizations have experienced tremendous growth in data, whether it is your organization's data, your customers' data, citizen data, or event data. The question you have to ask is: do we really get the most value out of this data? The answer is… we're starting to. At NCAR, they are just finishing up an experiment in which they've collected atmospheric data 7 miles above the earth to help determine whether this will improve the forecasting of severe weather.
In fact, we have already seen great improvement in predictive models because of greater data and processing ability. The EF-5 tornado that hit Moore, Oklahoma, in May had winds exceeding 200 MPH and was on the ground for 17 miles. Thanks to the predictive model, the town had 36 minutes of warning before the tornado hit, where the prior average was 12 minutes. Those 24 minutes are the difference between life and death. Although this storm did horrific damage and many lost their lives, it would have been worse without the predictive capabilities, due in large part to data analysis. The NCAR experiment (MPEX) strives to make this warning period even greater by understanding the atmospheric data miles above the earth. Advanced-analytics-on-big-data use cases are abundant across commercial enterprises and government organizations. I'll discuss how to unlock the potential of big data and analytics through industry examples and specifics of what we are doing at Intel.
We talked about tornados, but let's get closer to home. While preparing for this keynote, we pulled the 10-day forecast for New Orleans on 2/20/14. How accurate is the forecast? How often is a 5- or 10-day weather forecast still inaccurate? What methods are being used to predict that weather? The business of weather prediction is changing, and opportunities abound for companies to use weather predictions to help them understand business projections. "A third of U.S. commerce is sensitive to the weather," according to Bill Pardue, head of Weather Analytics, a new firm providing predictive weather modeling. "With modeling based on truly global data, companies are in a position to make better decisions for business." These are not the types of informational models that would have been possible until recently. (Placeholder: update with the latest and play by ear.)
UPS: Optimized and automated their entire package flow system, including the "last mile" of the delivery network. UPS developed a suite of package flow technologies and business processes that use smart labels to capture information about a package before it reaches the center. Using historical, forecasted, and exception information, the package flow technologies create a dispatch plan for every driver working out of the package distribution center. The system helps package center management ensure that drivers are not over-dispatched and that last-minute load changes to a driver's package car are minimized. They use this system to avoid things like left-hand turns, red-light wait times, and traffic issues. Automated routing saves 20-30 million miles and roughly 3 million gallons of gas, and reduces CO2 by 30 metric tons.
Obama: Data and micro-targeting for the 2012 election. Many claim this was his differentiator. His team started on day 1 of his first election to create personalized voter profiles, including voter history, demographic data, and preferences, to identify "persuadables" to target. They linked voter files to zip codes and individuals within households to target people for early-voting engagements. They also used data to raise campaign funds and to determine where to spend campaign dollars (based on their target audiences).
GE: Vision to fix or replace products before they break. They predict part failure rates and send preventive maintenance crews out to anticipate issues. http://blogs.wsj.com/cio/2012/11/29/ge-ceo-jeff-immelt-says-analytics-next-holy-grail/ The products they are creating are targeted at airlines, railroads, hospitals, manufacturing, and energy companies, helping them operate more efficiently by analyzing data collected by networks of sensors. The company is investing $1 billion into these services and products. Analytics is not a new concept.
Analytics based on your company's structured data (supply and demand, customer order history, workforce statistics, etc.) is the foundation for running an effective business today. We've entered a new era where data has exploded and become accessible, creating new industries and competitive advantage for the early adopters. This era has a fundamentally different computing model: the SMAC stack.
As we look around, many examples are emanating from different walks of life, across public, private, and research opportunities, where the new paradigm of not just storing data but processing and analyzing it is solving big problems and thus providing big opportunities. As Tim O'Reilly famously said, "data is the source of competitive advantage" – you can see this in search or retail, broadly applied to many applications and opportunities. It represents another platform opportunity for Intel.
Transition: Our Strategy
NOTES: Tim's statement is that every significant internet application to date has been backed by a specialized database: Amazon's database of products, Google's web crawl, MapQuest's map database, Napster's song database.
Quotes: "In the internet era, one can already see a number of cases where control over the database has led to market control and outsized financial returns. The monopoly on domain name registry initially granted by government fiat to Network Solutions (later purchased by Verisign) was one of the first great moneymakers of the internet. While we've argued that business advantage via controlling software APIs is much more difficult in the age of the internet, control of key data sources is not, especially if those data sources are expensive to create or amenable to increasing returns via network effects.
The race is on to own certain classes of core data: location, identity, calendaring of public events, product identifiers and namespaces. In many cases, where there is significant cost to create the data, there may be an opportunity for an Intel Inside style play, with a single source for the data."
"In others, the winner will be the company that first reaches critical mass via user aggregation, and turns that aggregated data into a system service."
FULL TEXT – "Data is the Next Intel Inside" – Tim O'Reilly – 2009 (2005?)
Every significant internet application to date has been backed by a specialized database: Google's web crawl, Yahoo!'s directory (and web crawl), Amazon's database of products, eBay's database of products and sellers, MapQuest's map databases, Napster's distributed song database. As Hal Varian remarked in a personal conversation last year, "SQL is the new HTML." Database management is a core competency of Web 2.0 companies, so much so that we have sometimes referred to these applications as "infoware" rather than merely software. This fact leads to a key question: Who owns the data?
In the internet era, one can already see a number of cases where control over the database has led to market control and outsized financial returns. The monopoly on domain name registry initially granted by government fiat to Network Solutions (later purchased by Verisign) was one of the first great moneymakers of the internet. While we've argued that business advantage via controlling software APIs is much more difficult in the age of the internet, control of key data sources is not, especially if those data sources are expensive to create or amenable to increasing returns via network effects.
Look at the copyright notices at the base of every map served by MapQuest, maps.yahoo.com, maps.msn.com, or maps.google.com, and you'll see the line "Maps copyright NavTeq, TeleAtlas," or with the new satellite imagery services, "Images copyright Digital Globe." These companies made substantial investments in their databases (NavTeq alone reportedly invested $750 million to build their database of street addresses and directions; Digital Globe spent $500 million to launch their own satellite to improve on government-supplied imagery).
NavTeq has gone so far as to imitate Intel's familiar Intel Inside logo: cars with navigation systems bear the imprint "NavTeq Onboard." Data is indeed the Intel Inside of these applications, a sole source component in systems whose software infrastructure is largely open source or otherwise commodified.
The now hotly contested web mapping arena demonstrates how a failure to understand the importance of owning an application's core data will eventually undercut its competitive position. MapQuest pioneered the web mapping category in 1995, yet when Yahoo!, and then Microsoft, and most recently Google, decided to enter the market, they were easily able to offer a competing application simply by licensing the same data.
Contrast, however, the position of Amazon.com. Like competitors such as Barnesandnoble.com, its original database came from ISBN registry provider R.R. Bowker. But unlike MapQuest, Amazon relentlessly enhanced the data, adding publisher-supplied data such as cover images, table of contents, index, and sample material. Even more importantly, they harnessed their users to annotate the data, such that after ten years, Amazon, not Bowker, is the primary source for bibliographic data on books, a reference source for scholars and librarians as well as consumers. Amazon also introduced their own proprietary identifier, the ASIN, which corresponds to the ISBN where one is present, and creates an equivalent namespace for products without one. Effectively, Amazon "embraced and extended" their data suppliers.
Imagine if MapQuest had done the same thing, harnessing their users to annotate maps and directions, adding layers of value. It would have been much more difficult for competitors to enter the market just by licensing the base data.
The recent introduction of Google Maps provides a living laboratory for the competition between application vendors and their data suppliers.
Google's lightweight programming model has led to the creation of numerous value-added services in the form of mashups that link Google Maps with other internet-accessible data sources. Paul Rademacher's housingmaps.com, which combines Google Maps with Craigslist apartment rental and home purchase data to create an interactive housing search tool, is the pre-eminent example of such a mashup.
At present, these mashups are mostly innovative experiments, done by hackers. But entrepreneurial activity follows close behind. And already, one can see that for at least one class of developer, Google has taken the role of data source away from NavTeq and inserted themselves as a favored intermediary. We expect to see battles between data suppliers and application vendors in the next few years, as both realize just how important certain classes of data will become as building blocks for Web 2.0 applications.
The race is on to own certain classes of core data: location, identity, calendaring of public events, product identifiers and namespaces. In many cases, where there is significant cost to create the data, there may be an opportunity for an Intel Inside style play, with a single source for the data. In others, the winner will be the company that first reaches critical mass via user aggregation, and turns that aggregated data into a system service.
For example, in the area of identity, PayPal, Amazon's 1-click, and the millions of users of communications systems, may all be legitimate contenders to build a network-wide identity database. (In this regard, Google's recent attempt to use cell phone numbers as an identifier for Gmail accounts may be a step towards embracing and extending the phone system.) Meanwhile, startups like Sxip are exploring the potential of federated identity, in quest of a kind of "distributed 1-click" that will provide a seamless Web 2.0 identity subsystem.
In the area of calendaring, EVDB is an attempt to build the world's largest shared calendar via a wiki-style architecture of participation. While the jury's still out on the success of any particular startup or approach, it's clear that standards and solutions in these areas, effectively turning certain classes of data into reliable subsystems of the "internet operating system", will enable the next generation of applications.
A further point must be noted with regard to data, and that is user concerns about privacy and their rights to their own data. In many of the early web applications, copyright is only loosely enforced. For example, Amazon lays claim to any reviews submitted to the site, but in the absence of enforcement, people may repost the same review elsewhere. However, as companies begin to realize that control over data may be their chief source of competitive advantage, we may see heightened attempts at control.
Much as the rise of proprietary software led to the Free Software movement, we expect the rise of proprietary databases to result in a Free Data movement within the next decade. One can see early signs of this countervailing trend in open data projects such as Wikipedia, the Creative Commons, and in software projects like Greasemonkey, which allow users to take control of how data is displayed on their computer.
Key message: We've always had data, and we have always used tools to gain insights from it. So why so much focus on this now? What is driving the big data phenomenon? Primarily, three major elements are coming together to form this inflection point.
First, the volume of data has been growing at astronomical speed, while the nature of the data has also been shifting from mostly structured data to a multitude of unstructured formats. Per IDC, the amount of digital data created in 2010 was 1,200 exabytes; it is expected to grow to 40,000 exabytes by 2020 (IDC Digital Universe Study, December 2012).
Second, the cost of the technology needed to store and process this ever-growing data has come down so significantly that it is now economically feasible to apply it for significant value generation. As an example, server system pricing declined from $11.5K in 2002 to roughly $6.5K in 2012 (IDC WW Server Tracker), and the cost of storage declined from $24 per GB to $1.5 per GB over 2002-2012 (IDC Storage Tracker).
Adding to this is the third vector: significant new investments from several ecosystem players to build the required tools and services.
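The slide's rounded "40%" and "90%" declines follow directly from the IDC figures quoted above; a quick sketch of the arithmetic:

```python
# Cost declines cited in the notes (IDC trackers), 2002 -> 2012.
server_2002, server_2012 = 11_500, 6_500   # average server price, USD
storage_2002, storage_2012 = 24.0, 1.5     # storage cost per GB, USD

server_decline = 1 - server_2012 / server_2002
storage_decline = 1 - storage_2012 / storage_2002

print(f"Server cost decline:  {server_decline:.0%}")   # ~43%, slide rounds to 40%
print(f"Storage cost decline: {storage_decline:.0%}")  # ~94%, slide rounds to 90%
```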
Let us talk data. On a personal scale, not too long ago I was perfectly happy with a few MB of storage for all my important files. Now I have 4 TB of personal data, and it is still not enough. The same thing applies to enterprises.
Not only the volume but also the variety of data that we exchange and store has changed, and that is driving the rapid growth. Most corporations measured their data in kilobytes in the mainframe era; the advent of the Internet brought this to gigabytes. The new denominator in the wave of the mobile, social, big data, and cloud era is zettabytes, and no one is able to accurately predict the rate of growth, which is constantly being revised upwards.
Most of this was not possible before. The advent of new computing, storage, and networking technologies, coupled with others, is making it possible. The thirst never seems to subside, though!
It's no surprise that the amount of data being generated is growing at a phenomenal rate. In the next decade, the rate of information generated will grow nearly 15x, powered by a variety of inputs, from the cloud to the growth of clients to the increase in machine-to-machine communication.
Big data has always been with us, and it likely exists in many places in your environment. If you have a website that accepts comments, you have the potential to mine those comments to make your business better. If you have industrial controls in your manufacturing environment, the data from those could increase your profits. Your employees are also one of your best indicators of how the business runs, and the ability to use information to make their jobs easier could greatly benefit your company.
All of this data provides a gold mine of information and decision support, if you have the capability to mine that data to get those insights. As the volume, variety, and velocity of data continue to grow, the opportunities to draw on the data will also increase. And that means insight and value for your business.
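The "nearly 15x" claim is consistent with the IDC figures quoted on the earlier slide (2.8 ZB in 2012 to 40 ZB in 2020); a small sketch shows the implied growth multiple and annual rate:

```python
# Worldwide data generated (zettabytes), per the IDC figures in the deck.
zb_2012, zb_2020 = 2.8, 40.0
years = 2020 - 2012

growth = zb_2020 / zb_2012                 # total multiple over the period
cagr = (zb_2020 / zb_2012) ** (1 / years) - 1  # compound annual growth rate

print(f"Total growth 2012-2020: {growth:.1f}x")  # ~14.3x, the "nearly 15x"
print(f"Implied CAGR: {cagr:.0%}")               # roughly 39% per year
```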
SMAC equals Social, Mobile, Analytics & Cloud.
Cloud is a new service delivery model that creates economies of scale and time-to-market improvement. Thinking about cloud in the context of analytics: cloud enables enterprises to ingest and manage structured and unstructured data from numerous sources as inputs into analytics engines.
Mobile is not just how you connect with your customers, suppliers, and employees, but also how the "Internet of Things" connects to your enterprise. Analytics enhances the value proposition of mobility.
Social is the new feedback mechanism; it boils down to finding the right people and the right information, then driving engagement to a desired outcome.
Analytics: when we look at the data challenge, it breaks into three areas: volume, variety, and velocity. We now have enormous amounts of data, some of our own, some from external sources. Some of this data is structured and stored in our traditional relational databases, but much of it is unstructured: an email, a text message, a video file, a voicemail. The first step is to bring some structure to these unstructured data sources. Once that is done, the science of data can be applied to create observations, build predictive models, and answer previously unanswerable questions.
Figuring out which business problems need to be solved using advanced analytics is the key to unlocking the potential value. Individually, each component of the SMAC stack creates value. Place them in combination, and exponential value is unlocked. When it comes to the transformative opportunities with big data and analytics, SMAC in combination can't be underestimated.
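The "first step" mentioned above, bringing structure to unstructured data, can be sketched minimally: turn free-text messages into records with extracted fields. The messages and keyword list below are invented for illustration; real pipelines would use NLP tooling rather than regexes.

```python
import re

# Hypothetical unstructured inputs: customer emails, texts, voicemails (transcribed).
messages = [
    "Order #123 arrived damaged, very disappointed",
    "Love the new product, shipping was fast!",
    "Still waiting on my refund for order #456",
]

# Illustrative keyword lexicon; a real system would use a trained model.
NEGATIVE = {"damaged", "disappointed", "waiting", "refund", "broken"}

def structure(msg):
    """Extract structured fields from free text: the first analytics step."""
    words = re.findall(r"[a-z0-9#]+", msg.lower())
    return {
        "order_ids": re.findall(r"#(\d+)", msg),           # referenced orders
        "negative_hits": sum(w in NEGATIVE for w in words),  # crude sentiment signal
        "word_count": len(words),
    }

records = [structure(m) for m in messages]
print(records[0])  # now queryable like any relational row
```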
Collective intelligence and human creativity are our only limitation. True, but we need to put a little structure around our approach, so we've chosen four simple questions and are progressing through each phase systematically: What happened? Why did it happen? What will happen? And how do we make it happen? Simple to say, but a bit more complex when you apply them to a specific business question.
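The four questions can be made concrete on a toy series. Everything below (the sales numbers, the candidate actions and their lifts) is invented for illustration; the point is how each question maps to a different kind of computation.

```python
# Toy quarterly sales for one product line (hypothetical units).
sales = [100, 110, 90, 120]  # Q1..Q4

# What happened? (descriptive): summarize the past.
total = sum(sales)

# Why did it happen? (diagnostic): locate the weakest quarter.
worst_q = min(range(len(sales)), key=lambda q: sales[q]) + 1

# What will happen? (predictive): least-squares trend, extrapolated to Q5.
n = len(sales)
xbar, ybar = (n - 1) / 2, sum(sales) / n
slope = (sum((x - xbar) * (y - ybar) for x, y in zip(range(n), sales))
         / sum((x - xbar) ** 2 for x in range(n)))
q5_forecast = ybar + slope * (n - xbar)

# How can we make it happen? (prescriptive): pick the action with the
# largest assumed lift toward a sales target (lifts are made up).
actions = {"discount": 5, "ad_campaign": 12, "bundle": 8}
best_action = max(actions, key=actions.get)

print(total, worst_q, q5_forecast, best_action)
```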
I'll shift gears to what we're doing with big data at Intel. These use cases span internal operations, consumer behavior, and security and risk management. Over the next three years, through 2016, we expect to achieve cost savings and increased bottom-line revenue of nearly half a billion dollars through the use of analytics solutions.
Key message: Intel's big data value lies in massive compute capabilities and industry-enabling skill. Reinforce that use of our software will aid Intel in getting value from our hardware innovations and expanding solution sales opportunities.
We know there is pent-up demand for data analytics solutions. Corporations know they can grow revenue, reduce costs, and reduce risk by leveraging available data sources. We've all seen the stats: the big data market is sized at $35B in silicon, system hardware, software, and professional services by 2020, growing at a 30% CAGR.
On one hand, we are making Intel Xeon E7 v2 the platform of choice for in-memory workloads by ramping established solutions such as SAP HANA, Oracle 12c, SAS, etc. But today's deployments of big data solutions are constrained by cost and complexity; there aren't enough data scientists in the world to meet the needs of the enterprise.
Intel's Distribution of Hadoop is both optimized for Intel architecture and widely adopted. Our recent launch of the Hadoop version 3 framework and the Graph Builder visual analytics software are direct steps towards that goal. Further, Intel SSD and network-optimized products are targeted at big data workloads.
As your business and your data grow, Xeon E7 is built to handle the growth of your largest workloads. Not all workloads scale out, database workloads for example. Xeon E7 v2, with its ability to scale up, can handle those workloads.
Xeon E7 v2's unique combination of high performance, improved reliability, and a large memory footprint is ideal for today's data-heavy IT environments. By harnessing these capabilities, IT leaders can more quickly gain insight, make business decisions, and gain competitive advantage: real-time business intelligence that drives faster decision making, for greater competitiveness and profitability potential. It puts organizations in a position to solve today's business problems and even handle the data-related issues of tomorrow.
Basic BI is the cost of doing business today. Advanced BI and predictive models help keep Intel ahead of the competition with faster information analysis and decision making.
Identification of high-potential resellers: Advanced BI allows us to focus resources where they will generate the highest return. For example, we developed a solution to help Intel sales teams strategically focus on large-volume resellers to deliver greater revenue. This engine mines large sets of internal and external sales data to identify the most promising reseller partners in specific geographies. In the last two years, this solution identified 3x as many high-potential resellers compared to manual methods. We estimated up to USD 50 million in potentially new and incremental sales opportunities from our deployments worldwide in 2012 and 2013.
Rapid detection of information security threats: Early warning of malware and other cyber threats increases enterprise information security. (Need to obtain more speaker notes on this use case, with some numbers.)
IDH key differentiator, the open data platform: We believe Hadoop has the potential to evolve into the open data platform where a ton of innovation will happen. We want the momentum to continue, and almost everything we are doing in this space we are contributing back to open source. "Operationalizing Hadoop" is key to making it enterprise grade. There is tremendous work happening in the community, and we leverage that.
However, we have decided to invest where the community has not had much time to invest, focusing specifically on security, deployment, and performance to make Hadoop ready for mass enterprise deployments.
Security:
- Authentication, authorization, and auditing built in to Apache Hadoop
- Transparent encryption in Hive, Pig, MapReduce, HBase, and HDFS
- Up to 20x faster en/decryption with Intel AES-NI1
Performance:
- Up to 30x faster on Intel architecture than other hardware
- Up to 2.6x faster than other open source distributions
Manageability:
- Enterprise-grade cluster management console and APIs
- Automated configuration with Intel® Auto Tuner
- REST-based, fully configurable manager
All organizations have something to gain from big data analytics. The key is to get started.
The first step is to identify a few small quick wins. Think about what your logs can tell you (security logs, call center logs, transaction logs, etc.) and about uncovering the patterns in any form of claims you have (insurance, warranty, rebate); maybe it's fraud or duplicate payments, like we had. Then you have to begin to shift the questions from what happened to what WILL happen, and dedicate some percentage of time to the predictions.
On to the more difficult issue of skills. These skills are scarce, but you can acquire or build them, or even "rent" skills to get started. My view is that you'll need to do all of these things initially.
Focus on causation, not correlation: test tons of hypotheses using the scientific method. Correlation is not equal to causation.
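The warning that correlation is not causation can be demonstrated with a tiny simulation: a hidden confounder drives two otherwise unrelated series, producing a strong correlation with no causal link between them. The variables (temperature, ice cream sales, drownings) are the classic textbook example, not anything from the deck.

```python
import random

random.seed(42)

# Temperature (the confounder) drives both series; neither causes the other.
temps = [random.uniform(10, 35) for _ in range(200)]
ice_cream = [2.0 * t + random.gauss(0, 3) for t in temps]   # sales vs. temp
drownings = [0.5 * t + random.gauss(0, 2) for t in temps]   # incidents vs. temp

def pearson(xs, ys):
    """Pearson correlation coefficient, computed from scratch."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    vx = sum((x - mx) ** 2 for x in xs)
    vy = sum((y - my) ** 2 for y in ys)
    return cov / (vx * vy) ** 0.5

r = pearson(ice_cream, drownings)
print(f"correlation = {r:.2f}")  # high, despite no causal relationship
```

A hypothesis-testing workflow (hold out data, intervene, compare) is what separates a causal claim from a spurious one like this.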
IT is in the best position to be the organizational catalyst that captures the value of big data. At Intel IT, we are driving not only our own teams but all functions across the company to think about the opportunity. Are you up for the challenge in your organization? Thank you.
I would like to draw your attention to the IT@Intel program, through which Intel shares its best practices with you all; in turn, we bring the learnings back to Intel to influence our products in ways that can benefit everyone. You can download our mobile app onto your device, or visit the URL listed here, to access many resources, including white papers, case studies, how-to guides, radio shows, and webinars, among others.