Introduces Microsoft's Data Platform for on-premises and cloud environments, the challenges businesses face with data and its many sources, the evolution of database systems in the modern world, and what businesses are doing with their data as industry landscapes change.
Dive into the opportunities available to businesses and industry verticals: those already identified and those not yet explored.
Understand Microsoft's cloud vision and what the Microsoft Azure platform offers, as Infrastructure as a Service or Platform as a Service, for building your own offerings.
Introduce and demo real-world scenarios and case studies where businesses have used the cloud and Azure to create new and innovative solutions that unlock this potential.
This is a run-through at a 200 level of the Microsoft Azure Big Data Analytics for the Cloud data platform based on the Cortana Intelligence Suite offerings.
Big Data in the Cloud with Azure Marketplace Images - Mark Kromer
Here are some of the trends I'm seeing from customers looking to build Azure-based cloud Big Data solutions using images from the Azure Marketplace.
Building Modern Data Platform with Microsoft Azure - Dmitry Anoshin
This presentation covers cloud history and Microsoft Azure data analytics capabilities. It also includes a real-world example of DW modernization. Finally, we will look at an alternative solution on Azure using Snowflake and Matillion ETL.
The Hive Think Tank - The Microsoft Big Data Stack by Raghu Ramakrishnan, CTO... - The Hive
Until recently, data was gathered for well-defined objectives such as auditing, forensics, reporting and line-of-business operations; now, exploratory and predictive analysis is becoming ubiquitous, and the default increasingly is to capture and store any and all data, in anticipation of potential future strategic value. These differences in data heterogeneity, scale and usage are leading to a new generation of data management and analytic systems, where the emphasis is on supporting a wide range of very large datasets that are stored uniformly and analyzed seamlessly using whatever techniques are most appropriate, including traditional tools like SQL and BI and newer tools, e.g., for machine learning and stream analytics. These new systems are necessarily based on scale-out architectures for both storage and computation.
Hadoop has become a key building block in the new generation of scale-out systems. On the storage side, HDFS has provided a cost-effective and scalable substrate for storing large heterogeneous datasets. However, as key customer and systems touch points are instrumented to log data, and Internet of Things applications become common, data in the enterprise is growing at a staggering pace, and the need to leverage different storage tiers (ranging from tape to main memory) is posing new challenges, leading to caching technologies, such as Spark. On the analytics side, the emergence of resource managers such as YARN has opened the door for analytics tools to bypass the Map-Reduce layer and directly exploit shared system resources while computing close to data copies. This trend is especially significant for iterative computations such as graph analytics and machine learning, for which Map-Reduce is widely recognized to be a poor fit.
While Hadoop is widely recognized and used externally, Microsoft has long been at the forefront of Big Data analytics, with Cosmos and Scope supporting all internal customers. These internal services are a key part of our strategy going forward, and are enabling new state of the art external-facing services such as Azure Data Lake and more. I will examine these trends, and ground the talk by discussing the Microsoft Big Data stack.
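The point about iterative workloads can be made concrete with a toy example: each iteration consumes the state produced by the previous one, which a Map-Reduce job chain would write to and re-read from disk between jobs, while in-memory engines such as Spark keep it cached. A minimal sketch of an iterative PageRank-style computation in plain Python (the graph, damping factor, and iteration count are illustrative assumptions):

```python
# Toy iterative PageRank: each pass consumes the ranks produced by the
# previous pass. In Map-Reduce this intermediate state would be
# materialized to disk between jobs; in-memory engines keep it cached.

def pagerank(graph, damping=0.85, iterations=20):
    """graph: dict mapping node -> list of outgoing links."""
    n = len(graph)
    ranks = {node: 1.0 / n for node in graph}  # initial uniform ranks
    for _ in range(iterations):
        new_ranks = {node: (1.0 - damping) / n for node in graph}
        for node, links in graph.items():
            share = damping * ranks[node] / len(links)
            for target in links:
                new_ranks[target] += share  # distribute rank along edges
        ranks = new_ranks  # state carried into the next iteration
    return ranks

# Illustrative 3-node cycle graph
graph = {"a": ["b"], "b": ["c"], "c": ["a"]}
ranks = pagerank(graph)
print(round(sum(ranks.values()), 6))  # ranks sum to 1.0
```

The `ranks` dict that flows from one iteration to the next is exactly the state that makes iterative algorithms a poor fit for a framework that treats every pass as an independent batch job.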
Designing an Agile Fast Data Architecture for Big Data Ecosystem using Logica... - Denodo
Autodesk designed a modern data architecture that heavily uses data virtualization to integrate both legacy data sources and contemporary big data analytics like Spark into a single unified logical data warehouse. In this presentation, you will learn how to build a logical data warehouse using data virtualization and create a single, unified enterprise-wide access and governance point for any data used within the company.
This presentation is part of the Fast Data Strategy Conference, and you can watch the video here goo.gl/Ab4PDB.
Entity Resolution Service - Bringing Petabytes of Data Online for Instant Access - DataWorks Summit
2.5B+ IDs, 2 ms latency, 15K+ TPS, and petabytes of data. These numbers outline the challenges behind eBay's Entity Resolution Service (ERS). ERS provides a temporal map between any ID and any other ID. The ERS technology stack uses Hadoop as the batch layer, Couchbase as the cache layer, Spring Batch to load data into Couchbase, and a REST API at the service layer. In our presentation we will take you through the journey from concept to production release. It's a great story and we would like to share it with you!
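The Hadoop-plus-Couchbase layering described above is essentially a cache-aside pattern: low-latency reads are served from the cache, with the batch-built dataset as the fallback. A minimal sketch in Python, using plain dicts to stand in for Couchbase and the Hadoop batch layer (all keys and values are illustrative, not eBay's actual schema):

```python
# Cache-aside lookup: serve from the fast cache layer when possible,
# fall back to the batch-built store and populate the cache on a miss.

batch_store = {("u1", "2024-01"): "device-42"}  # stand-in for the Hadoop batch layer
cache = {}                                       # stand-in for Couchbase

def resolve(entity_id, period):
    key = (entity_id, period)
    if key in cache:                 # cache hit: millisecond-class lookup
        return cache[key]
    value = batch_store.get(key)     # miss: fall back to the batch layer
    if value is not None:
        cache[key] = value           # populate cache for subsequent reads
    return value

print(resolve("u1", "2024-01"))  # first call misses and loads the cache
print(resolve("u1", "2024-01"))  # second call is served from the cache
```

At ERS scale the cache layer is what makes the quoted 2 ms latency plausible; the batch layer only has to rebuild the full dataset, not serve point reads.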
Simplifying Cloud Architectures with Data Virtualization - Denodo
Watch here: https://bit.ly/2yxLo6f
Moving applications and data to the Cloud is a priority for many organizations. The benefits - in terms of flexibility, agility, and cost savings - are driving Cloud adoption. However, the journey to the Cloud is not as easy as many people think. The process of moving applications and data to the Cloud is challenging and can entail widespread disruption across the organization if not carefully managed. Even when systems are migrated to the Cloud, the resultant hybrid or multi-Cloud architecture is more complex for users to navigate, making it harder for them to get the data they need to do their jobs.
Data Virtualization can help organizations at all stages of their journey to the Cloud - during migration and also in the resultant hybrid or multi-Cloud architectures. Attend this webinar to learn how Data Virtualization can:
- Help organizations manage risk and minimize the disruption caused as systems are moved to the Cloud
- Provide a single point of access for data that is both on-premises and in the Cloud, making it easier for users to find and access the data that they need
- Provide a security layer to protect and manage your data when it's distributed across hybrid or multi-Cloud architectures
Cloud Modernization and Data as a Service Option - Denodo
Watch here: https://bit.ly/36tEThx
The current data landscape is fragmented, not just in location but also in terms of shape and processing paradigms. Cloud has become a key component of modern architecture design. Data lakes, IoT, NoSQL, SaaS, etc. coexist with relational databases to fuel the needs of modern analytics, ML and AI. Exploring and understanding the data available within your organization is a time-consuming task. Dealing with bureaucracy, different languages and protocols, and the definition of ingestion pipelines to load that data into your data lake can be complex. And all of this without even knowing if that data will be useful at all.
Attend this session to learn:
- How dynamic data challenges and the speed of change require a new approach to data architecture – one that is real-time, agile, and doesn't rely on physical data movement.
- How a logical data architecture can enable organizations to transition data to the cloud faster, with zero downtime, and ultimately deliver faster time to insight.
- How data as a service and other API management capabilities are a must in a hybrid cloud environment.
Cortana Analytics Workshop: Operationalizing Your End-to-End Analytics Solution - MSAdvAnalytics
Wee Hyong Tok. With Azure Data Factory (ADF), existing data movement and analytics processing services can be composed into data pipelines that are highly available and managed in the cloud. In this demo-driven session, you learn by example how to build, operationalize, and manage scalable analytics pipelines. Go to https://channel9.msdn.com/ to find the recording of this session.
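The idea of composing existing services into a managed pipeline can be sketched in a few lines: a pipeline is an ordered set of activities, each reading what the upstream activity produced. The sketch below is plain Python illustrating the composition pattern, not the actual ADF API; the activity names and context dict are illustrative assumptions.

```python
# A pipeline as an ordered composition of named activities, loosely
# mirroring how a data factory chains ingest/transform/publish steps.

def copy_from_source(ctx):
    ctx["raw"] = [1, 2, 3]                          # pretend ingest from a source store

def transform(ctx):
    ctx["curated"] = [x * 10 for x in ctx["raw"]]   # pretend processing step

def publish(ctx):
    ctx["published"] = sum(ctx["curated"])          # pretend load to a serving layer

pipeline = [copy_from_source, transform, publish]   # activities run in order

def run(pipeline):
    ctx = {}
    for activity in pipeline:    # each activity reads what upstream wrote
        activity(ctx)
    return ctx

result = run(pipeline)
print(result["published"])  # → 60
```

In a real data factory the activities would be declarative definitions scheduled by the service, but the dependency-ordered composition is the same.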
Data Mesh in Practice - How Europe's Leading Online Platform for Fashion Goes... - Dr. Arif Wider
A talk presented by Max Schultze from Zalando and Arif Wider from ThoughtWorks at NDC Oslo 2020.
Abstract:
The Data Lake paradigm is often considered the scalable successor of the more curated Data Warehouse approach when it comes to democratization of data. However, many who went out to build a centralized Data Lake came out with a data swamp of unclear responsibilities, a lack of data ownership, and sub-par data availability.
At Zalando - Europe's biggest online fashion retailer - we realised that accessibility and availability at scale can only be guaranteed by moving more responsibility to those who pick up the data and have the respective domain knowledge - the data owners - while keeping only data governance and metadata information central. Such a decentralized, domain-focused approach has recently been coined a Data Mesh.
The Data Mesh paradigm promotes the concept of Data Products which go beyond sharing of files and towards guarantees of quality and acknowledgement of data ownership.
This talk will take you on a journey of how we went from a centralized Data Lake to embrace a distributed Data Mesh architecture and will outline the ongoing efforts to make creation of data products as simple as applying a template.
Scaling Multi-Cloud Deployments with Denodo: Automated Infrastructure Management - Denodo
Watch full webinar here: https://bit.ly/3oWR1Bl
The future of infrastructure management lies in automation. In this session, a Denodo subject matter expert will discuss how, in a multi-cloud scenario, the infrastructure can be managed automatically and transparently via a web GUI. The audience will see this in action through a live demo.
Verizon Centralizes Data into a Data Lake in Real Time for Analytics - DataWorks Summit
Verizon – Global Technology Services (GTS) was challenged by a multi-tier, labor-intensive process when trying to migrate data from disparate sources into a data lake to create financial reports and business insights. Join this session to learn more about how Verizon:
• Easily accessed data from multiple sources including SAP data
• Ingested data into major targets including Hadoop
• Achieved real-time insights from data leveraging change data capture (CDC) technology
• Reduced costs and labor
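The change data capture approach mentioned in the bullets above works by replaying a stream of change events (inserts, updates, deletes captured at the source) against a target, rather than reloading entire tables. A minimal sketch of the replay step in Python; the event shapes and keys are illustrative assumptions, not Verizon's actual format:

```python
# Minimal change-data-capture replay: apply a stream of change events
# captured at a source to keep a target table in sync without full reloads.

def apply_changes(target, events):
    for event in events:
        op, key = event["op"], event["key"]
        if op in ("insert", "update"):
            target[key] = event["value"]   # upsert the changed row
        elif op == "delete":
            target.pop(key, None)          # remove the deleted row
    return target

target = {"1001": {"amount": 50}}          # current state of the lake table
events = [
    {"op": "update", "key": "1001", "value": {"amount": 75}},
    {"op": "insert", "key": "1002", "value": {"amount": 20}},
    {"op": "delete", "key": "1001"},
]
print(apply_changes(target, events))  # → {'1002': {'amount': 20}}
```

Because only deltas cross the wire, the target stays current in near real time and the labor-intensive bulk migration steps disappear.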
Microsoft and Hortonworks Deliver the Modern Data Architecture for Big Data - Hortonworks
Joint webinar with Microsoft and Hortonworks on the power of combining the Hortonworks Data Platform with Microsoft's ubiquitous Windows, Office, SQL Server, Parallel Data Warehouse, and Azure platform to build the Modern Data Architecture for Big Data.
Embarking on building a modern data warehouse in the cloud can be an overwhelming experience due to the sheer number of products that can be used, especially when the use cases for many products overlap others. In this talk I will cover the use cases of many of the Microsoft products that you can use when building a modern data warehouse, broken down into four areas: ingest, store, prep, and model & serve. It’s a complicated story that I will try to simplify, giving blunt opinions of when to use what products and the pros/cons of each.
Cortana Analytics Suite is a fully managed big data and advanced analytics suite that transforms your data into intelligent action. It is comprised of data storage, information management, machine learning, and business intelligence software in a single convenient monthly subscription. This presentation will cover all the products involved, how they work together, and use cases.
Where does Fast Data Strategy Fit within IT Projects - Denodo
Fast Data Strategy is a must for organizations to become and remain competitive. There are four use cases where Fast Data Strategy fits within IT projects - Agile BI, Big Data/Cloud, Data Services, and Single View. In this presentation, you will discover how four customers used data virtualization and Fast Data Strategy for these use cases.
This presentation is part of the Fast Data Strategy Conference, and you can watch the video here goo.gl/UxHMuJ.
This talk covers the Cortana Analytics Suite ecosystem, including the Azure Machine Learning predictive analytics service. The demo portion walks through sentiment analysis of social media posts.
A video of the talk and notes for the demo are available at http://0xcode.in/dev-camp
Internet Of Things What You Need To Know - TechFuse - Richard Harbridge
The Internet of Things (IoT) is here today in the devices, sensors, cloud services, and data your business uses. Microsoft delivers a flexible cloud-based approach that enables enterprises to capitalize on IoT by gathering, storing, and processing data centrally. When centrally connecting distributed LoB assets, the edge of an enterprise’s infrastructure can be redefined, and the breadth of the Microsoft data platform can be harnessed. Join Richard Harbridge as you learn about Microsoft’s position on IoT, and the technology and services being delivered from Microsoft to help you create the Internet of Your Things.
Microsoft cloud profitability scenarios - Medhy Sandjak
Customer demand for cloud services keeps growing.
Identify new opportunities to grow your profitability through example financial models and cloud scenarios via Microsoft CSP.
Microsoft Azure And The Competitive Cloud Industry - Collab365 - Richard Harbridge
Public cloud platforms are important for the future. Many organizations have made big bets, and are continuing to evaluate their options in the public and hybrid cloud arena. Microsoft has become a major player in the public cloud space, but it has plenty of competitors: Amazon, Google, OpenStack, Salesforce/Force.com and more. How do these providers compare to Azure? And what’s likely to happen in the industry as we move into the future? Join Richard Harbridge as he provides guidance and advice for how Azure measures up based on customer experience and industry insights.
Think of big data as all data, no matter what the volume, velocity, or variety. The simple truth is a traditional on-prem data warehouse will not handle big data. So what is Microsoft's strategy for building a big data solution? And why is it best to have this solution in the cloud? That is what this presentation will cover. Be prepared to discover all the various Microsoft technologies and products, from collecting data, transforming it, and storing it, to visualizing it. My goal is to help you not only understand each product but also understand how they all fit together, so you can be the hero who builds your company's big data solution.
So you have a handle on what Big Data is and how you can use it to find business value in your data. Now you need an understanding of the Microsoft products that can be used to create a Big Data solution. Microsoft has many pieces of the puzzle, and in this presentation I will show how they fit together. How does Microsoft enhance and add value to Big Data? From collecting data, transforming it, and storing it, to visualizing it, I will show you Microsoft's solutions for every step of the way.
Caserta Concepts, Datameer, and Microsoft shared their combined knowledge and a use case on big data, the cloud, and deep analytics. Attendees learned how a global leader in the test, measurement and control systems market reduced their big data implementations from 18 months to just a few.
Speakers shared how to provide a business user-friendly, self-service environment for data discovery and analytics, and focus on how to extend and optimize Hadoop based analytics, highlighting the advantages and practical applications of deploying on the cloud for enhanced performance, scalability and lower TCO.
Agenda included:
- Pizza and Networking
- Joe Caserta, President, Caserta Concepts - Why are we here?
- Nikhil Kumar, Sr. Solutions Engineer, Datameer - Solution use cases and technical demonstration
- Stefan Groschupf, CEO & Chairman, Datameer - The evolving Hadoop-based analytics trends and the role of cloud computing
- James Serra, Data Platform Solution Architect, Microsoft, Benefits of the Azure Cloud Service
- Q&A, Networking
For more information on Caserta Concepts, visit our website: http://casertaconcepts.com/
Microsoft® SQL Server® 2012 is a cloud-ready information platform that will help organizations unlock breakthrough insights across the organization and quickly build solutions to extend data across on-premises and public cloud, backed by mission critical confidence.
Modernizing to a Cloud Data Architecture - Databricks
Organizations with on-premises Hadoop infrastructure are bogged down by system complexity, unscalable infrastructure, and the increasing burden on DevOps to manage legacy architectures. Costs and resource utilization continue to go up while innovation has flatlined. In this session, you will learn why, now more than ever, enterprises are looking for cloud alternatives to Hadoop and are migrating off of the architecture in large numbers. You will also learn how the benefits of elastic compute models helped one customer scale their analytics and AI workloads, and best practices drawn from their successful migration of data and workloads to the cloud.
The cloud is all the rage. Does it live up to its hype? What are the benefits of the cloud? Join me as I discuss the reasons so many companies are moving to the cloud and demo how to get up and running with a VM (IaaS) and a database (PaaS) in Azure. See why the ability to scale easily, the speed with which you can create a VM, and the built-in redundancy are just some of the reasons that make moving to the cloud a "no brainer". And if you have an on-prem datacenter, learn how to get out of the air-conditioning business!
Modern apps and services are leveraging data to change the way we engage with users in a more personalized way. Skyla Loomis talks big data, analytics, NoSQL, SQL and how IBM Cloud is open for data.
Learn more by visiting our Bluemix Hybrid page: http://ibm.co/1PKN23h
Big Data Made Easy: A Simple, Scalable Solution for Getting Started with Hadoop - Precisely
With so many new, evolving frameworks, tools, and languages, a new big data project can lead to confusion and unwarranted risk.
Many organizations have found Data Warehouse Optimization with Hadoop to be a good starting point on their Big Data journey. Offloading ETL workloads from the enterprise data warehouse (EDW) into Hadoop is a well-defined use case that produces tangible results for driving more insights while lowering costs. You gain significant business agility, avoid costly EDW upgrades, and free up EDW capacity for faster queries. This quick win builds credibility and generates savings to reinvest in more Big Data projects.
A proven reference architecture that includes everything you need in a turnkey solution – the Hadoop distribution, data integration software, servers, networking and services – makes it even easier to get started.
Data lakes can scale with the cloud, break down integration barriers and data silos, and pave the way for new business opportunities. All of this gives management and employees a better basis for decision-making. Come and hear how.
David Bojsen, Architect, Microsoft
Streaming IBM i to Kafka for Next-Gen Use Cases - Precisely
Your team is always under pressure to accelerate the adoption of the most modern and powerful technologies. Simultaneously, your existing investments, such as IBM i, your organization’s most critical data asset, remain in a silo. The only practical path forward is to connect the new and existing with a streaming technology like Apache Kafka to feed real-time applications that power use cases ranging from marketing and order replenishment to fraud detection.
Join this Precisely webinar to learn how to unlock the potential of your IBM i data by creating data pipelines that integrate, transform, and deliver it to users when and where they need it. Additionally, hear how Stark Denmark uses Precisely Connect CDC to provide data to their organization in real time.
Join this webinar to:
- Understand the benefits and challenges of building data pipelines that access and integrate data from IBM i systems to modern data platforms
- Learn how Precisely can help you build real-time data pipelines
- Hear from Stark Denmark on how they are using Connect CDC from Precisely and the benefits they are getting
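The produce/consume flow behind a pipeline like the one described above can be sketched with an in-memory queue standing in for a Kafka topic. The record fields and topic structure below are illustrative assumptions; a real pipeline would use a Kafka client and broker rather than a local deque.

```python
# In-memory stand-in for a Kafka topic partition, sketching how captured
# IBM i change records flow from a producer to a downstream consumer.

from collections import deque

topic = deque()  # stand-in for a single topic partition

def produce(record):
    topic.append(record)             # source side: CDC publishes a change

def consume():
    while topic:
        yield topic.popleft()        # sink side: consumer reads in order

produce({"table": "ORDERS", "op": "insert", "order_id": 7})
produce({"table": "ORDERS", "op": "update", "order_id": 7})

consumed = list(consume())
print(len(consumed))  # → 2
```

The essential property illustrated here is ordered, decoupled delivery: the producer never waits for the consumer, yet changes for a given key arrive in the order they occurred.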
Fast Data Strategy Houston Roadshow Presentation - Denodo
Fast Data Strategy Houston Roadshow focused on the next industrial revolution on the horizon, driven by the application of big data, IoT and Cloud technologies.
• Denodo’s innovative customer, Anadarko, elaborated on how data virtualization serves as the key component in their prescriptive and predictive analytics initiatives, driven by multi-structured data ranging from customer data to equipment data.
• Denodo’s session, Unleashing the Power of Data, described the complexity of the modern data ecosystem and how to overcome challenges and successfully harness insights.
• Our Partner Noah Consulting, an expert analytics solutions provider in the energy industry, explained how your peers are innovating using new business models and reducing cost in areas such as Asset Management and Operations by leveraging Data Virtualization and Prescriptive and Predictive Analytics.
For more information on upcoming roadshows near you, follow this link: https://goo.gl/WBDHiE
Big Data, IoT, data lake, unstructured data, Hadoop, cloud, and massively parallel processing (MPP) are all just fancy words unless you can find uses cases for all this technology. Join me as I talk about the many use cases I have seen, from streaming data to advanced analytics, broken down by industry. I’ll show you how all this technology fits together by discussing various architectures and the most common approaches to solving data problems and hopefully set off light bulbs in your head on how big data can help your organization make better business decisions.
Data Virtualization. An Introduction (ASEAN)Denodo
Watch full webinar here: https://bit.ly/3uiXVoC
What is Data Virtualization and why do I care? In this webinar we intend to help you understand not only what Data Virtualization is but why it's a critical component of any organization's data fabric and how it fits. How data virtualization liberates and empowers your business users via data discovery, data wrangling to generation of reusable reporting objects and data services. Digital transformation demands that we empower all consumers of data within the organization, it also demands agility too. Data Virtualization gives you meaningful access to information that can be shared by a myriad of consumers.
Watch on-demand this session to learn:
- What is Data Virtualization?
- Why do I need Data Virtualization in my organization?
- How do I implement Data Virtualization in my enterprise? Where does it fit..?
SendGrid Improves Email Delivery with Hybrid Data WarehousingAmazon Web Services
When you received your Uber ‘Tuesday Evening Ride Receipt’ or Spotify’s ‘This Week’s New Music’ email, did you think about how they got there?
SendGrid’s reliable email platform delivers each month over 20 Billion transactional and marketing emails on behalf of many of your favorite brands, including Uber, Airbnb, Spotify, Foursquare and NextDoor.
SendGrid was looking to evolve its data warehouse architecture in order to improve decision making and optimize customer experience. They needed a scalable and reliable architecture that would allow them to move nimbly and efficiently with a relatively small IT organization, while supporting the needs of both business and technical users at SendGrid.
SendGrid’s Director of Enterprise Data Operations will be joining architects from Amazon Web Services (AWS) and Informatica to discuss SendGrid’s journey to a hybrid cloud architecture and how a hybrid data warehousing solution is optimized to support SendGrid’s analytics initiative. Speakers will also review common technologies and use cases being deployed in hybrid cloud today, common data management challenges in hybrid cloud and best practices for addressing these challenges.
Join us to learn:
• How to evolve to a hybrid data warehouse with Amazon Redshift for scalability, agility and cost efficiency with minimal IT resources
• Hybrid cloud data management use cases
• Best practices for addressing hybrid cloud data management challenges
Opportunity: Data, Analytic & Azure
5. Technological innovation accelerates value
(Chart of value over time, plotting: transactional systems, ETL, operational reporting, siloed data, spreadmarts, enterprise data warehouse, OLAP, dashboards, ad-hoc analysis, complex implementations, Hadoop experimentation, in-memory, any data, machine learning, predictive analytics, Internet of Things, innovation.)
6. Data sources
1. Increasing data volumes
2. Real-time data
3. New data sources and types
4. Cloud-born data
8. The opportunity is bigger than you may think
A $1.6T data dividend is available to businesses that embrace data over the next four years.
How? Put new data types and analytics capabilities in the hands of more people, faster.
Data source: Microsoft & IDC, April 2014
9. The opportunity is bigger than you may think
A $1.6T data dividend is available to businesses that embrace data over the next four years.
How? Speed, diverse data, new analytics, more people.
Data source: Microsoft & IDC, April 2014
13. Microsoft's approach delivers data dividends
An end-to-end platform for any data and everyone. The Microsoft data platform spans collect + manage data, transform + analyze, and visualize + decide.
14. The modern data & analytics solution
Data sources, including non-relational data
15. Cloud Characteristics
- Little or no requirement for capital investment to enable usage
- Variable pricing based on consumption; buyers "pay per use"
- Rapid acquisition and deployment
- Lower ongoing operating costs than IT owned and managed in-house
- Programmable and adaptable in use
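The "pay per use" characteristic can be made concrete with a small break-even calculation. All figures below are hypothetical and only illustrate why low, steady usage favors consumption pricing while very heavy usage can favor owned hardware.

```python
# Hypothetical cost model: compare an up-front server purchase with
# pay-per-use cloud pricing. None of these rates come from a real price list.

def fixed_cost(months, capex=12000.0, opex_per_month=300.0):
    """Owned hardware: capital expense up front plus steady operating cost."""
    return capex + opex_per_month * months

def cloud_cost(months, hours_per_month, rate_per_hour=0.50):
    """Pay per use: no capital expense, cost scales with actual consumption."""
    return rate_per_hour * hours_per_month * months

def cheaper_option(months, hours_per_month):
    """Which model costs less over a given horizon and usage level."""
    if cloud_cost(months, hours_per_month) < fixed_cost(months):
        return "cloud"
    return "owned"
```

For example, at 100 compute-hours per month over three years the pay-per-use model wins easily, while a machine busy around the clock tips the balance back toward ownership.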
16. Thinking about cloud services
- Infrastructure clouds (IaaS): "raw" infrastructure (CPU, memory, storage, network) available on an as-needed basis in public or private clouds
- Platform clouds (PaaS): virtualized development and runtime platform
- Application clouds (SaaS): business applications provided on a subscription basis
17. Changing Landscape
- Devices: More than half of information workers across 17 countries report using 3+ devices for work. Enable employees to work from anywhere.
- Apps: One quarter of external app implementation spending will be on mobility, cloud, analytics, and social by 2016. Evolve business apps to meet new needs.
- Big data/IoT: By 2020, IoT will include 26B units, creating large quantities of data and generating >$300B in revenue for IoT suppliers. Help employees make faster decisions.
- Elastic cloud: Nearly half of total IT spend will be cloud-related by 2020. Ensure infrastructure scales to meet demand.
Sources: Devices: "Info Workers Will Erase Boundary Between Enterprise And Consumer Technologies," Forrester Research, August 30, 2012. Apps: "Predicts 2013: Business Impact of Technology Drives the Future Application Services Market," Gartner, Nov. 21, 2012. Big data: paraphrased from "The Impact of the Internet of Things on Data Centers," Gartner Research, Feb. 27, 2014. Cloud: "Prepare For 2020: Transform Your IT Infrastructure And Operations Practice," Forrester Research, Oct. 24, 2012.
18. One Environment: the Cloud OS Vision
- Unlock insights on any data: Windows Server 2012 R2, Microsoft Azure, SQL Server 2014, Power BI
- Transform the datacenter: Windows Server 2012 R2, System Center 2012 R2, Microsoft Azure, SQL Server 2014, coming soon: WAP
- Enable modern business applications: Microsoft Azure, SQL Server 2014, Visual Studio
- Empower enterprise mobility: Windows Server 2012 R2, System Center Configuration Manager 2012 R2, Microsoft Azure, Windows Intune
19. To transform your datacenter, Microsoft gives you innovative solutions across compute, storage, and networking, in your datacenter and in the cloud.
- Reduced cost and complexity: take advantage of innovation in compute, storage, and networking
- Rapid response to the business: speed delivery of services with a flexible approach and unified management
- Cloud options on demand: extend to the cloud with hybrid capabilities that offer concrete benefits to your business
20. To empower enterprise mobility, Microsoft enables consumerization of IT without compromising compliance.
- Enable your users: give people access to the applications, data, and resources they need on devices they love
- Protect your data: protect company information by applying policy to application and data access, and selectively wiping devices
- Unify your environment: extend your existing on-premises investments in System Center and Active Directory to the cloud with integrated IT pro experiences
21. To enable application innovation, Microsoft helps you architect the future of business innovation.
- Rapid development: deliver innovation at the pace your business expects and the quality it needs
- A flexible platform: connect what you have today with where you need to go in the future
- Enterprise proven: capabilities you expect and a track record of delivering enterprise value
22. To unlock insights on any data, Microsoft solutions help everyone, not just the experts, uncover business insights faster.
- Easy access to data big and small: give everyone access to data, big and small, to drive the best decisions
- Powerful, familiar BI tools for everyone: engage everyone in BI by delivering powerful analytics and visualization capabilities in familiar tools, like Excel
- Unified and complete data platform: give IT a complete data platform that connects big and small data across clouds, datacenters, and devices
24. Cloud Computing Opportunities: Healthcare
- Departmental, niche applications: medical imaging archiving, personal health records (PHRs), analytics
- Core health IT systems: EMRs/HIEs, scheduling/practice management, clinical decision support, quality reporting
- Virtualized, integrated health networks: health plans; hospitals, clinics, and labs; pharmacies; patients and caregivers
- Seamless care delivery: anywhere, anytime access; personalized care plans; real-time visibility (cost, quality)
26. Cloud Computing Opportunities: Retail
Consumers' embrace of digital technology is disrupting retailers worldwide, making a strong case for cloud-computing responses across:
- Channel operations
- Merchandising & marketing
- Supply chain management
- Sales, service & support
27. Azure Offerings for Data Platform
- Data & Storage: SQL Database, DocumentDB, Redis Cache, Storage, StorSimple, Azure Search
- Backup: Backup, Site Recovery
- Analytics: HDInsight, Machine Learning, Stream Analytics, Data Factory
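Stream Analytics, listed above, computes aggregates over time windows of an event stream. The tumbling-window pattern it expresses in a SQL-like query language can be sketched locally; this is an illustration of the windowing idea, not the Stream Analytics service or its query syntax.

```python
from collections import defaultdict

def tumbling_window_avg(events, window_seconds):
    """Group (timestamp, value) events into fixed, non-overlapping windows
    and average each window -- the tumbling-window aggregation pattern used
    by stream processors such as Azure Stream Analytics."""
    buckets = defaultdict(list)
    for ts, value in events:
        # Each event belongs to exactly one window, keyed by its start time.
        window_start = ts // window_seconds * window_seconds
        buckets[window_start].append(value)
    return {start: sum(vals) / len(vals) for start, vals in sorted(buckets.items())}
```

With one-minute windows, events at seconds 0, 5, and 12 land in the first window and an event at second 61 opens the next one.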
28. SQL Database
Support elastic scale and more predictable performance. Streamline business continuity for your critical applications.
29. SQL Database
Enjoy near-zero maintenance through a self-managed service. Drive productivity through support for familiar tools and platforms.
30. SQL Database
Enable security and compliance-related tasks. Blend service tiers for innovative designs.
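One practical implication of business continuity with a cloud database is client-side handling of transient faults: connections can briefly fail during failovers or scaling events, and clients are expected to retry with backoff. The sketch below is a generic version of that pattern, not code from any Azure SDK; `operation` and `is_transient` are caller-supplied.

```python
import time

def with_retries(operation, is_transient, max_attempts=5, base_delay=0.1):
    """Retry an operation on transient faults with exponential backoff --
    the standard client-side pattern for cloud database connections.
    Non-transient errors, and the final failed attempt, are re-raised."""
    for attempt in range(1, max_attempts + 1):
        try:
            return operation()
        except Exception as exc:
            if attempt == max_attempts or not is_transient(exc):
                raise
            # Back off 1x, 2x, 4x, ... the base delay before retrying.
            time.sleep(base_delay * 2 ** (attempt - 1))
```

A caller would wrap its query function and a predicate that recognizes its driver's transient error codes.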
31. DocumentDB
A fully managed, highly scalable NoSQL document database service.
- Rich query and transactions over schema-free JSON data
- Delivers reliable and configurable performance
- Enables rapid development
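The "schema-free JSON" point is the key difference from a relational table: documents in one collection need not share a shape, and queries filter on whatever fields happen to exist. The sketch below illustrates that idea with plain Python over JSON strings; it is not the DocumentDB API, and the sample documents are invented.

```python
import json

# Hypothetical documents with different shapes -- no shared schema,
# as in a document store collection.
docs = [json.loads(s) for s in (
    '{"id": 1, "type": "order", "total": 42.5}',
    '{"id": 2, "type": "order", "total": 99.0, "rush": true}',
    '{"id": 3, "type": "customer", "name": "Contoso"}',
)]

def query(documents, predicate):
    """Filter schema-free documents with an arbitrary predicate, the way a
    document database evaluates a query over JSON fields that may be absent."""
    return [d for d in documents if predicate(d)]

# Fields missing from a document simply fail the predicate via .get defaults.
big_orders = query(docs, lambda d: d.get("type") == "order" and d.get("total", 0) > 50)
```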
32. Storage
Reliable, economical cloud storage for data big and small.
Blobs, Tables, Queues, and Files; highly scalable; durable and highly available; designed for developers; global reach; cost effective.
33. StorSimple
A unique hybrid cloud storage solution that lowers costs and improves data protection.
- Effectively manage data growth
- Simplify storage and data protection
- Reduce storage costs by up to 60%
- Significantly accelerate disaster recovery
- Improve compliance
34. HDInsight
Our 100% Apache Hadoop-based service in the cloud.
- HDInsight is Microsoft's 100% Apache-compatible Hadoop distribution
- Available as a Microsoft Azure service; scales elastically on demand
- Crunch all data: structured, semi-structured, and unstructured; develop in your favorite language
- Use Excel to visualize your Hadoop data; connect on-premises Hadoop clusters with the cloud
- Includes NoSQL transactional capabilities; provides real-time stream processing
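The canonical Hadoop workload for "crunching unstructured data" is word count: a map phase emits (word, 1) pairs from raw text, and a reduce phase sums the counts per word. The sketch below runs that map/shuffle/reduce flow locally in Python; on HDInsight the same logic would be distributed across a cluster (e.g. via Hadoop streaming), which this sketch does not do.

```python
from collections import Counter
from itertools import chain

def map_phase(line):
    """Map: emit a (word, 1) pair for each word in a line of raw text."""
    return [(word.lower(), 1) for word in line.split()]

def reduce_phase(pairs):
    """Shuffle + reduce: sum the counts emitted for each distinct word."""
    counts = Counter()
    for word, n in pairs:
        counts[word] += n
    return dict(counts)

def word_count(lines):
    """End-to-end local word count over an iterable of text lines."""
    return reduce_phase(chain.from_iterable(map_phase(line) for line in lines))
```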
35. Machine Learning
Powerful cloud-based predictive analytics.
- Drag, drop, predict
- R spoken here
- Integrated cloud offering
- Changes the game
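Behind "drag, drop, predict" sits ordinary model fitting: learn parameters from historical data, then score new inputs. The smallest instance of that workflow is a one-variable least-squares line, sketched below in pure Python; it stands in for the idea only and is not the Azure Machine Learning service or any of its modules.

```python
def fit_line(xs, ys):
    """Ordinary least squares for y = a*x + b -- the simplest predictive
    model behind many 'predict a number from history' scenarios."""
    n = len(xs)
    mean_x = sum(xs) / n
    mean_y = sum(ys) / n
    cov = sum((x - mean_x) * (y - mean_y) for x, y in zip(xs, ys))
    var = sum((x - mean_x) ** 2 for x in xs)
    a = cov / var            # slope
    b = mean_y - a * mean_x  # intercept
    return a, b

def predict(model, x):
    """Score a new input with the fitted (slope, intercept) pair."""
    a, b = model
    return a * x + b
```

Training on history and scoring new data are the same two steps whether the model is this line or a drag-and-drop experiment in a cloud ML studio.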
36. Azure Offerings for Data Platform
- Data & Storage: SQL Database, DocumentDB, Redis Cache, Storage, StorSimple, Azure Search
- Backup: Backup, Site Recovery
- Analytics: HDInsight, Machine Learning, Stream Analytics, Data Factory
38. Structured Data
Create resiliency in your data environment to minimize interruption to core business activities.
40. Structured Data
Using SQL Azure for business-critical applications, reducing time to market, and delivering affordable solutions.
41. The Relational Database Continuum
A spectrum from SQL Server on physical hardware, to SQL Server in a VM (IaaS), to SQL Database (PaaS): from hardware and software costs, admin costs, and full control at one end to pay-as-you-go, elastic scale, and zero admin at the other.
51. Shorten the development, testing, analysis and deployment cycles for data analysts through PowerPivot
54. Headquartered in Seoul, SK Telecom is a South Korean wireless telecommunications operator controlled by the SK Group, one of the country's largest conglomerates.
SK Telecom wanted to differentiate itself in the global indoor location-based services (ILBS) market and needed to develop and market a standard ILBS platform that could be easily deployed to customers. It developed a cloud-based ILBS solution called Wizturn that runs in the Microsoft Azure environment and uses StorSimple, a cloud-integrated data management solution.
Outcomes: a cloud-based solution improves marketability; a global ecosystem adds value; scalability reduces administration; takes advantage of the Internet of Things.
Technologies: Microsoft Azure, Microsoft Azure SQL Database, StorSimple, Azure Media Services.
55. Ascribe is one of the United Kingdom's leading suppliers of clinically focused IT solutions for the healthcare industry.
To help clinicians improve services, Ascribe wanted to provide rapid insight into large volumes of data from multiple sources. Ascribe and Two10Degrees designed an end-to-end Big Data solution with business intelligence (BI) tools based on Microsoft SQL Server 2012 and Windows Azure HDInsight Service.
Outcomes: transforms healthcare with near-real-time access to information; speeds response to health threats; provides actionable insight into large volumes of structured and unstructured data.
Technologies: Microsoft SQL Server 2012, Microsoft Azure, Cloud Service, data warehousing, virtualization.
56. Based in Sunnyvale, California, AMD designs and integrates technology that powers millions of intelligent devices, including personal computers, tablets, game consoles, and cloud servers.
With a global customer base and a presence in multiple markets, the company needed better tools for monitoring manufacturing processes and other business operations. It built a BI solution based on SQL Server 2014 Enterprise and SharePoint Server 2013; the data warehouse team wanted to take advantage of built-in, self-service BI tools such as Power View, an interactive data visualization feature of Microsoft SQL Server 2014 Reporting Services.
Outcomes: easier implementation and simplified BI; better business insight and operational support; more efficient and leaner IT.
Technologies: Microsoft SQL Server 2014, SQL Server 2014 PowerPivot for Microsoft Excel, Microsoft SharePoint Server 2013, SQL Server 2014 Reporting Services.
57. LG CNS is a global IT services provider with a wide range of offerings, including consulting, system integration, outsourcing, Enterprise Resource Planning/Business Intelligence, IT infrastructure solutions, and IT convergence.
LG CNS wanted to sell healthcare solutions to small nursing homes and assisted care facilities in the United States, but such small operations lack the technical staff and budget to support traditional IT infrastructures. It chose to host the infrastructure for its solutions in the cloud using Windows Azure infrastructure as a service (IaaS).
Outcomes: quick entrance to the US market; easier rollouts anywhere; more affordable solutions.
Technologies: Microsoft Azure.