In this presentation, all aspects of SMAC are covered in as much detail as possible. You will find some ideas worth sharing and also get acquainted with Social, Mobile, Analytics and Cloud.
ZStack for Datacenter as a Service - Product Deck (Ryo Ardian)
This slide deck includes information about:
- Cloud market overview in Indonesia, Australia, Russia and other developing countries in Asia
- Public cloud user problems
- Private cloud user problems
- Cloud platform competition mapping
- ZStack user profiling
- Business model
- Operational efficiency when using ZStack Cloud
- Market research data for cloud platforms
- Who ZStack's founder is
- ZStack's company journey
- ZStack partners in Indonesia and global markets
- ZStack's Gartner positioning as part of the Alibaba Cloud family
- ZStack solutions as a VMware replacement and private cloud solution
- ZStack solution scope vs OpenStack, Nutanix, Microsoft and VMware
- ZStack packages vs VMware packages
- TCO comparison: VMware vs ZStack
- Performance comparison: ZStack vs VMware
- TCO comparison: ZStack Cloud vs public cloud
- ZStack vs other virtualization products such as Microsoft, Nutanix, Red Hat, Citrix and VMware
- ZStack vs other cloud products such as Proxmox, OpenNebula, Virtuozzo and OpenStack
- ZStack quality of service commitment
- ZStack solutions landscape
- ZStack Cloud functions overview
- ZStack Ceph overview
- ZStack CMP overview
- ZStack Kubernetes solutions overview
Big data refers to datasets that are too large to be managed by traditional database tools. It is characterized by volume, velocity, and variety. Hadoop is an open-source software framework that allows distributed processing of large datasets across clusters of computers. It works by distributing storage across nodes as blocks and distributing computation via a MapReduce programming paradigm where nodes process data in parallel. Common uses of big data include analyzing social media, sensor data, and using machine learning on large datasets.
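As an illustration of the MapReduce paradigm described above, here is a minimal in-process sketch of a word count, the canonical MapReduce example. The function names and structure are invented for illustration; Hadoop's actual API is a Java framework that distributes these phases across nodes.

```python
from collections import defaultdict
from itertools import chain

def map_phase(document):
    # Map: emit a (word, 1) pair for every word in the input split.
    return [(word, 1) for word in document.split()]

def reduce_phase(pairs):
    # Shuffle/reduce: group pairs by key, then sum the counts per word.
    counts = defaultdict(int)
    for word, n in pairs:
        counts[word] += n
    return dict(counts)

# Each "node" maps its own split in parallel; results are merged in reduce.
splits = ["big data big deal", "big data tools"]
pairs = chain.from_iterable(map_phase(s) for s in splits)
print(reduce_phase(pairs))  # {'big': 3, 'data': 2, 'deal': 1, 'tools': 1}
```

In a real cluster, each map runs on the node holding that block of data, and the shuffle moves intermediate pairs to the reducers over the network.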
This document provides an overview of serverless computing using Azure Functions. It begins with an introduction to serverless computing and what it means for servers to be fully abstracted and for scaling to be event-driven. It then discusses the value of serverless computing in terms of availability, speed, and paying only for what you use. The remainder of the document discusses Logic Apps as a serverless integration technology, provides a list of Software as a Service (SaaS) applications that can be connected in Logic Apps, and concludes with an announcement about a new pricing model for Logic Apps.
This document describes Bubbles, a Python framework for data processing and quality probing. Bubbles focuses on representing data objects and defining operations that can be performed on those objects. Key aspects include:
- Data objects define the structure and representations of data without enforcing a specific storage format.
- Operations can be performed on data objects and are dispatched dynamically based on the objects' representations.
- A context stores available operations and handles dispatching.
- Stores provide interfaces to load and save objects from formats like SQL, CSV, etc.
- Pipelines allow sequencing operations to transform and process objects from source to target stores.
- The framework includes common operations for filtering, joining, and aggregating.
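The concepts above can be sketched in plain Python. This is an illustrative mock of the described design, not the actual Bubbles API; the `Context` class, `register`/`dispatch` methods, and the `"rows"` representation name are invented for the example.

```python
class Context:
    """Stores available operations and dispatches them by representation."""
    def __init__(self):
        self.operations = {}

    def register(self, name, representation, func):
        self.operations[(name, representation)] = func

    def dispatch(self, name, obj, *args):
        # Dynamic dispatch: pick the implementation matching the
        # data object's representation (e.g. "rows" vs "sql").
        func = self.operations[(name, obj["representation"])]
        return func(obj, *args)

def filter_rows(obj, predicate):
    # A "rows" implementation of the filter operation.
    return {"representation": "rows",
            "data": [r for r in obj["data"] if predicate(r)]}

context = Context()
context.register("filter", "rows", filter_rows)

source = {"representation": "rows", "data": [{"qty": 1}, {"qty": 5}]}
result = context.dispatch("filter", source, lambda r: r["qty"] > 2)
print(result["data"])  # [{'qty': 5}]
```

The point of this indirection is that a SQL-backed data object could register a different `filter` implementation that pushes the predicate down to the database, while callers keep using the same operation name.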
Getting Started with Azure Event Hubs and Stream Analytics Services (EastBanc Technologies)
Author: Vladimir Bychkov, www.eastbanctech.com
The total amount of data in the world almost doubles every 2 years. Storing data only for offline processing is no longer a viable business model. In the past few years, new technologies for real-time data processing have emerged, and Microsoft Azure offers a comprehensive set of tools to ingest and process data in motion. In this presentation we will learn how to collect data from devices, how to process data in real time using Azure Stream Analytics jobs, and how to produce and handle actionable insights.
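Stream Analytics jobs typically aggregate events over time windows. The Python sketch below mimics a tumbling-window average to illustrate the idea; the window size, field layout, and events are made up, and real jobs express this in the Stream Analytics SQL dialect rather than Python.

```python
from collections import defaultdict

def tumbling_window_avg(events, window_seconds):
    # Assign each event to a fixed, non-overlapping window by timestamp,
    # then average the readings within each window.
    windows = defaultdict(list)
    for ts, value in events:
        window_start = ts // window_seconds * window_seconds
        windows[window_start].append(value)
    return {start: sum(vals) / len(vals)
            for start, vals in sorted(windows.items())}

# (timestamp_seconds, temperature) readings arriving from devices
events = [(0, 20.0), (3, 22.0), (7, 30.0), (9, 34.0)]
print(tumbling_window_avg(events, 5))  # {0: 21.0, 5: 32.0}
```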
Cloud computing:
Accessibility: Cloud computing enables access to applications and data from any location worldwide, from any device with an internet connection.
Cost savings: Cloud computing offers businesses scalable computing resources, saving them the cost of acquiring and maintaining those resources themselves.
Security: Cloud providers, especially those offering private cloud services, have strived to implement the best security standards and procedures to protect clients' data stored in the cloud.
Disaster recovery: Cloud computing offers small, medium, and large enterprises an efficient means of backing up and restoring their data and applications quickly and reliably.
Big data is characterized by 3Vs - volume, velocity, and variety. Hadoop is a framework for distributed processing of large datasets across clusters of computers. It provides HDFS for storage, MapReduce for batch processing, and YARN for resource management. Additional tools like Spark, Mahout, and Zeppelin can be used for real-time processing, machine learning, and data visualization respectively on Hadoop. Benefits of Hadoop include ease of scaling to large data, high performance via parallel processing, reliability through data protection and failover.
This document provides an introduction to cloud computing. It discusses the benefits of cloud computing like pay-as-you-go models and operational expense instead of capital expense. It defines cloud computing and introduces its essential characteristics, service models of SaaS, PaaS and IaaS, and deployment models of private, public and hybrid clouds. It demonstrates using Amazon EC2 as an example of infrastructure as a service.
This document provides a summary of improvements made to Hive's performance through the use of Apache Tez and other optimizations. Some key points include:
- Hive was improved to use Apache Tez as its execution engine instead of MapReduce, reducing latency for interactive queries and improving throughput for batch queries.
- Statistics collection was optimized to gather column-level statistics from ORC file footers, speeding up statistics gathering.
- The cost-based optimizer Optiq was added to Hive, allowing it to choose better execution plans.
- Vectorized query processing, broadcast joins, dynamic partitioning, and other optimizations improved individual query performance by over 100x in some cases.
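To illustrate the idea behind vectorized query processing mentioned above (operating on batches of rows rather than one row at a time), here is a small Python sketch. The batch size and data are arbitrary; real engines such as Hive run a tight loop over a columnar batch, which amortizes per-row overhead and is friendlier to CPU caches.

```python
def process_row_at_a_time(values):
    # One iteration's worth of interpreter overhead per row.
    total = 0
    for v in values:
        total += v
    return total

def process_vectorized(values, batch_size=1024):
    # Handle a whole batch per iteration; the inner sum() runs
    # as a tight loop over the slice instead of row-by-row steps.
    total = 0
    for i in range(0, len(values), batch_size):
        total += sum(values[i:i + batch_size])
    return total

values = list(range(10_000))
assert process_row_at_a_time(values) == process_vectorized(values) == 49_995_000
```

Both functions compute the same aggregate; the difference is purely in how work is scheduled, which is exactly the kind of change that produced Hive's per-query speedups.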
A private cloud provides hosted computing services behind a company's firewall. It offers benefits like flexibility, mobility, confidentiality, availability, and cost savings over traditional IT. A private cloud gives a company direct control over its data and infrastructure while providing high availability, security, and efficiency through virtualization and elastic resources. It transforms IT from a cost center to a strategic enabler by reducing maintenance costs and allowing on-demand provisioning and reallocation of resources.
If you are wondering what Big Data is and what benefits big data analysis can provide, we at Renerald have prepared this presentation for you. Through big data analytics, you will be able to develop your strategies in the light of scientific data and add remarkable value to your company.
This document provides an overview of Apache HBase, including:
- Two presenters from Cloudera will discuss HBase's architecture, data model, and hands-on installation and usage.
- HBase is an open-source, distributed, scalable database built on Hadoop that allows for random, real-time read/write access to big data.
- The presentation will cover HBase fundamentals, demonstrate its usage, and discuss how companies apply it for large-scale analytics and real-time applications.
This document discusses Hadoop, an open-source software framework for distributed storage and processing of large datasets across clusters of computers. It describes key Hadoop components like HDFS for distributed file storage and MapReduce for distributed processing. Several companies that use Hadoop at large scale are mentioned, including Yahoo, Amazon and Facebook. Applications of Hadoop in healthcare for storing and analyzing large amounts of medical data are discussed. The document concludes that Hadoop is well-suited for big data applications due to its scalability, fault tolerance and cost effectiveness.
With the world moving to the cloud, the need to conduct testing has arisen alongside it. This PPT sheds light on the key factors in cloud computing and the types of testing performed for it. Learn more about cloud service models, key characteristics, cloud testing, functional testing, performance and benchmark testing, network testing, interoperability and compatibility testing, cloud testing tools, and cloud testing methodology through this PPT, and stay tuned for the upcoming ones.
This document discusses common use cases for MongoDB and why it is well-suited for them. It describes how MongoDB can handle high volumes of data feeds, operational intelligence and analytics, product data management, user data management, and content management. Its flexible data model, high performance, scalability through sharding and replication, and support for dynamic schemas make it a good fit for applications that need to store large amounts of data, handle high throughput of reads and writes, and have low latency requirements.
A seminar made to the Tennessee Department of Health in July 2015. An introduction to HL7 standards with a focus on HL7 v3 messaging and clinical document architecture standards.
Amazon Web Services (AWS) provides a low cost, reliable and secure foundation for you to use as you build and deliver Software as a Service (SaaS) solutions to customers. For ISVs, the process of transition from a traditional (license based) model to a Software as a Service (SaaS) is a challenge that concerns not only the technical aspect, but also financial and commercial strategy aspects. In this presentation you will find out how AWS can become the ideal partner to support the transformation.
Ceph Benchmarking Tool (CBT) is a Python framework for benchmarking Ceph clusters. It has client and monitor personalities for generating load and setting up the cluster. CBT includes benchmarks for RADOS operations, librbd, KRBD on EXT4, KVM with RBD volumes, and COSBench tests against RGW. Test plans are defined in YAML files and results are archived for later analysis using tools like awk, grep, and gnuplot.
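A CBT test plan pairs a `cluster` section with one or more `benchmarks`. The fragment below is an illustrative sketch only: the hostnames are placeholders and the exact keys vary across CBT versions, so treat it as an approximation rather than a verbatim plan.

```yaml
# Illustrative CBT test plan (hostnames and key names are examples)
cluster:
  user: ceph
  head: head.example.com
  clients: [client01.example.com]
  osds: [osd01.example.com, osd02.example.com]
  iterations: 3
benchmarks:
  radosbench:
    op_size: [4194304]      # 4 MiB objects
    concurrent_ops: [128]
    write_only: true
    time: 300               # seconds per run
```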
PowerPoint explaining cloud migration, benefits and risks of cloud migration as well as the legal and financial information associated with cloud migration
Estimating the Total Costs of Your Cloud Analytics Platform (DATAVERSITY)
Organizations today need a broad set of enterprise data cloud services with key data functionality to modernize applications and utilize machine learning. They need a platform designed to address multi-faceted needs by offering multi-function Data Management and analytics to solve the enterprise’s most pressing data and analytic challenges in a streamlined fashion. They need a worry-free experience with the architecture and its components.
A complete machine learning infrastructure cost for the first modern use case at a midsize to large enterprise will be anywhere from $2M to $14M. Get this data point as you take the next steps on your journey.
This document outlines Apache Flume, a distributed system for collecting large amounts of log data from various sources and transporting it to a centralized data store such as Hadoop. It describes the key components of Flume including agents, sources, sinks and flows. It explains how Flume provides reliable, scalable, extensible and manageable log aggregation capabilities through its node-based architecture and horizontal scalability. An example use case of using Flume for near real-time log aggregation is also briefly mentioned.
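The agent/source/channel/sink wiring described above is configured in a Java properties file. The snippet below follows the conventions of the Flume user guide, with example paths and names; the log file location and HDFS URL are placeholders.

```properties
# a1: a single Flume agent with one source, one channel, one sink
a1.sources = r1
a1.channels = c1
a1.sinks = k1

# Source: tail an application log (example path)
a1.sources.r1.type = exec
a1.sources.r1.command = tail -F /var/log/app/app.log
a1.sources.r1.channels = c1

# Channel: buffer events in memory between source and sink
a1.channels.c1.type = memory
a1.channels.c1.capacity = 10000

# Sink: write events into date-partitioned HDFS directories
a1.sinks.k1.type = hdfs
a1.sinks.k1.hdfs.path = hdfs://namenode/flume/logs/%Y-%m-%d
a1.sinks.k1.channel = c1
```

Flume scales horizontally by chaining agents: a sink on one agent can feed a source on another, fanning log traffic in toward the central store.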
Big Data, Big Deal: For Future Big Data Scientists (Way-Yen Lin)
Big Data, Big Deal is a document that discusses big data. It begins by defining big data as high-volume, high-velocity, and high-variety information that requires new processing methods. It then discusses the key drivers for big data, including technical drivers like increased data storage and social media, as well as business drivers like customer analytics and public opinion analysis. The document concludes by discussing challenges for big data like data quality, privacy, and the need for skilled data scientists with technical expertise, curiosity, storytelling abilities, and cleverness.
Modern apps and services are leveraging data to change the way we engage with users in a more personalized way. Skyla Loomis talks big data, analytics, NoSQL, SQL and how IBM Cloud is open for data.
Learn more by visiting our Bluemix Hybrid page: http://ibm.co/1PKN23h
Operating a Secure Big Data Platform in a Multi-Cloud Environment (DataWorks Summit)
The Health Cyberinfrastructure Division at the San Diego Supercomputer Center (SDSC) at the University of California, San Diego has been deploying and managing a number of big data platforms ranging from the traditional data warehouse to the more recent big data platforms leveraging Hadoop in a secure cloud platform, Sherlock Cloud, for nearly a decade. We understand the necessity to remain agile and visionary in this arena to grow with the ever-changing technological and customer requirements while simultaneously ensuring a compliant environment to secure data.
As such, during our presentation, we will speak to our more recent deployment, namely a multi-cloud, Hadoop-based data management platform and the mechanisms employed to marry best-of-breed big data technology solutions and cloud platforms to support large-scale data management and analytics within the highly secure and compliant (U.S. HIPAA-compliant) boundaries of our hybrid cloud that spans an on-premises cloud running at UC San Diego and another operating in AWS Cloud. We will further identify the challenges and lessons learned from deploying, and securely operating, a big data platform offering capabilities that include disaster recovery and business continuity across a hybrid cloud setup.
Speaker
Sandeep Chandra, Division Director, San Diego Supercomputer Center
Data Mesh in Azure Using Cloud Scale Analytics (WAF) (Nathan Bijnens)
This document discusses moving from a centralized data architecture to a distributed data mesh architecture. It describes how a data mesh shifts data management responsibilities to individual business domains, with each domain acting as both a provider and consumer of data products. Key aspects of the data mesh approach discussed include domain-driven design, domain zones to organize domains, treating data as products, and using this approach to enable analytics at enterprise scale on platforms like Azure.
While many enterprises consider cloud computing the savior of their data strategy, there is a process they should follow when looking to leverage database-as-a-service. This includes understanding their own data requirements, selecting the right cloud computing candidate, and then planning for the migration and operations. A huge number of issues and obstacles will inevitably arise, but fortunately best practices are emerging. This presentation will take you through the process of moving data to cloud computing providers.
This document discusses cloud computing, big data, Hadoop, and data analytics. It begins with an introduction to cloud computing, explaining its benefits like scalability, reliability, and low costs. It then covers big data concepts like the 3 Vs (volume, variety, velocity), Hadoop for processing large datasets, and MapReduce as a programming model. The document also discusses data analytics, describing different types like descriptive, diagnostic, predictive, and prescriptive analytics. It emphasizes that insights from analyzing big data are more valuable than raw data. Finally, it concludes that cloud computing can enhance business efficiency by enabling flexible access to computing resources for tasks like big data analytics.
This presentation contains a broad introduction to big data and its technologies.
Big data is a term that describes the large volume of data – both structured and unstructured – that inundates a business on a day-to-day basis.
Big Data is a phrase used to mean a massive volume of both structured and unstructured data that is so large it is difficult to process using traditional database and software techniques. In most enterprise scenarios the volume of data is too big or it moves too fast or it exceeds current processing capacity.
Introduces Microsoft's Data Platform for on-premises and cloud, along with the challenges businesses face with data and data sources. Understand the evolution of database systems in the modern world, what businesses are doing with their data, and their new needs in a changing industry landscape.
Dive into the opportunities available for businesses and industry verticals: the ones already identified and the ones not yet explored.
Understand Microsoft's cloud vision and what the Microsoft Azure platform offers, as Infrastructure as a Service or Platform as a Service, for you to build your own offerings.
Introduces and demos some real-world scenarios and case studies where businesses have used the cloud and Azure to create new and innovative solutions that unlock this potential.
Data Virtualization: Introduction and Business Value (UK) (Denodo)
This document provides an overview of a webinar on data virtualization and the Denodo platform. The webinar agenda includes an introduction to adaptive data architectures and data virtualization, benefits of data virtualization, a demo of the Denodo platform, and a question and answer session. Key takeaways are that traditional data integration technologies do not support today's complex, distributed data environments, while data virtualization provides a way to access and integrate data across multiple sources.
Developed by Google’s Artificial Intelligence division, the Sycamore quantum processor boasts 53 qubits1.
In 2019, it achieved a feat that would take a state-of-the-art supercomputer 10,000 years to accomplish: completing a specific task in just 200 seconds1
The document discusses Microsoft's approach to implementing a data mesh architecture using their Azure Data Fabric. It describes how the Fabric can provide a unified foundation for data governance, security, and compliance while also enabling business units to independently manage their own domain-specific data products and analytics using automated data services. The Fabric aims to overcome issues with centralized data architectures by empowering lines of business and reducing dependencies on central teams. It also discusses how domains, workspaces, and "shortcuts" can help virtualize and share data across business units and data platforms while maintaining appropriate access controls and governance.
The document discusses cloud computing, big data, and big data analytics. It defines cloud computing as an internet-based technology that provides on-demand access to computing resources and data storage. Big data is described as large and complex datasets that are difficult to process using traditional databases due to their size, variety, and speed of growth. Hadoop is presented as an open-source framework for distributed storage and processing of big data using MapReduce. The document outlines the importance of analyzing big data using descriptive, diagnostic, predictive, and prescriptive analytics to gain insights.
Enabling Next Gen Analytics with Azure Data Lake and StreamSetsStreamsets Inc.
This document discusses enabling next generation analytics with Azure Data Lake. It provides definitions of big data and discusses how big data is a cornerstone of Cortana Intelligence. It also discusses challenges with big data like obtaining skills and determining value. The document then discusses Azure HDInsight and how it provides a cloud Spark and Hadoop service. It also discusses StreamSets and how it can be used for data movement and deployment on Azure VM or local machine. Finally, it discusses a use case of StreamSets at a major bank to move data from on-premise to Azure Data Lake and consolidate migration tools.
IBM's Big Data platform provides tools for managing and analyzing large volumes of data from various sources. It allows users to cost effectively store and process structured, unstructured, and streaming data. The platform includes products like Hadoop for storage, MapReduce for processing large datasets, and InfoSphere Streams for analyzing real-time streaming data. Business users can start with critical needs and expand their use of big data over time by leveraging different products within the IBM Big Data platform.
IBM's Big Data platform provides tools for managing and analyzing large volumes of structured, unstructured, and streaming data. It includes Hadoop for storage and processing, InfoSphere Streams for real-time streaming analytics, InfoSphere BigInsights for analytics on data at rest, and PureData System for Analytics (formerly Netezza) for high performance data warehousing. The platform enables businesses to gain insights from all available data to capitalize on information resources and make data-driven decisions.
BAR360 open data platform presentation at DAMA, SydneySai Paravastu
Sai Paravastu discusses the benefits of using an open data platform (ODP) for enterprises. The ODP would provide a standardized core of open source Hadoop technologies like HDFS, YARN, and MapReduce. This would allow big data solution providers to build compatible solutions on a common platform, reducing costs and improving interoperability. The ODP would also simplify integration for customers and reduce fragmentation in the industry by coordinating development efforts.
The document discusses various cloud computing service models including SaaS, PaaS, IaaS, DaaS, SECaaS, TaaS, STaaS, and BPaaS. It defines each service, provides examples of providers, and outlines the benefits they provide like cost savings, automatic updates, global access, and an on-demand flexible model. All cloud services aim to deliver computing resources over the network as standardized services on a pay-per-use basis to improve efficiency and business processes.
The document discusses various cloud computing service models including Infrastructure as a Service (IaaS), Platform as a Service (PaaS), Software as a Service (SaaS), and other derived models such as Data as a Service (DaaS), Security as a Service (SECaaS), Testing as a Service (TaaS), Storage as a Service (STaaS), and Business Process as a Service (BPaaS). These services provide on-demand access to computing resources, data, applications, and business processes over the internet. The document outlines the definition, providers, benefits, and pricing models for each service type.
Sharing a presentation highlighting some key aspects to be taken into consideration while harnessing your Digital Transformation projects as a Digital Intelligence enabler for your enterprise
2. Cloud
Cloud computing is a model for enabling ubiquitous, convenient, on-demand network access to a shared pool of configurable computing resources (e.g., networks, servers, storage, applications, and services) that can be rapidly provisioned and released with minimal management effort or service provider interaction.
3. Big Data and Analytics
Big data analytics is the process of examining large amounts of data of a variety of types (big data) to uncover hidden patterns, unknown correlations and other useful information. Big data analysts basically want the knowledge that comes from analyzing the data.
4. Social Media
Social media is the collective of online communications channels dedicated to community-based input, interaction, content-sharing and collaboration. Websites and applications dedicated to forums, microblogging, social networking, social bookmarking, social curation, and wikis are among the different types of social media.
5. Mobility
• Enterprise mobility is the trend toward a shift in work habits, with more employees working out of the office and using mobile devices and cloud services to perform business tasks.
• The term refers not only to mobile workers and mobile devices, but also to the mobility of corporate data. An employee may upload a corporate presentation from his or her desktop PC to a cloud storage service, then access it from a personal iPad to show at a client site, for example.
6. What is Cloud Computing
Services provided over the Internet, such as e-mail, Web hosting and data storage. Cloud computing has evolved from these traditionally used services to more complex services like customer relationship management tools and marketing programs.
7. Why does it matter – 5 reasons
1. Reduced costs and increased scalability
2. Automatic updates
3. Remote access
4. Customization
5. More time to focus on your customers
11. Cloud Computing Characteristics
Common Characteristics:
• Low Cost Software
• Virtualization
• Service Orientation
• Advanced Security
• Homogeneity
• Massive Scale
• Resilient Computing
• Geographic Distribution
Essential Characteristics:
• Resource Pooling
• Broad Network Access
• Rapid Elasticity
• Measured Service
• On Demand Self-Service
Adapted from: Effectively and Securely Using the Cloud Computing Paradigm by Peter Mell, Tim Grance
12. Virtualization
• Virtual workspaces:
– An abstraction of an execution environment that can be made dynamically available to authorized clients by using well-defined protocols,
– Resource quota (e.g. CPU, memory share),
– Software configuration (e.g. O/S, provided services).
• Implemented on Virtual Machines (VMs):
– Abstraction of a physical host machine,
– Hypervisor intercepts and emulates instructions from VMs, and allows management of VMs,
– VMware, Xen, etc.
• Provide infrastructure API:
– Plug-ins to hardware/support structures
Virtualized stack (bottom to top): Hardware → Hypervisor → Guest OS per VM → Apps
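The resource-quota bullet above can be illustrated with a toy admission check: a hypervisor-style allocator refuses to start a VM whose CPU or memory share would exceed host capacity. The `ToyHypervisor` class and all numbers here are invented for illustration; this is not any real hypervisor API.

```python
# Toy model of hypervisor resource quotas: each VM declares a CPU and
# memory share, and the host refuses allocations that exceed capacity.
class ToyHypervisor:
    def __init__(self, cpus, mem_gb):
        self.cpus, self.mem_gb = cpus, mem_gb
        self.vms = {}  # name -> (cpus, mem_gb)

    def used(self):
        return (sum(c for c, _ in self.vms.values()),
                sum(m for _, m in self.vms.values()))

    def start_vm(self, name, cpus, mem_gb):
        used_cpu, used_mem = self.used()
        if used_cpu + cpus > self.cpus or used_mem + mem_gb > self.mem_gb:
            return False  # quota exceeded: VM is not started
        self.vms[name] = (cpus, mem_gb)
        return True

host = ToyHypervisor(cpus=8, mem_gb=32)
print(host.start_vm("web", 4, 16))   # True
print(host.start_vm("db", 4, 16))    # True
print(host.start_vm("batch", 1, 4))  # False: host is full
```

A real hypervisor enforces such shares dynamically (scheduling CPU time slices, ballooning memory) rather than as a one-shot admission test.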
13. Cloud Advantages
• Achieve economies of scale
• Reduce spending on technology
infrastructure
• Globalize your workforce on the cheap
• Streamline processes
• Reduce capital costs.
14. Cloud Advantages
• Improve accessibility
• Monitor projects more effectively
• Less personnel training is needed
• Minimize licensing new software
• Improve flexibility
17. Amazon Cloud Features
• Elastic Web-Scale Computing
• Completely Controlled
• Flexible Cloud Hosting Services
• Designed for use with other Amazon Web Services
• Reliable
• Secure
• Inexpensive
• Easy to Start
19. Google AppEngine Features
• Popular languages and frameworks
• Focus on your code
• Multiple storage options
• Powerful built-in services
• Familiar development tools
• Deploy at Google scale
22. IBM SmartCloud Features
• Expert Cloud Consulting
• Private and Hybrid Clouds
• IaaS, PaaS and SaaS
• Speed
• Empowerment
• Economics
28. Infrastructure as a Service - IaaS
Infrastructure as a Service is a provision model in which an organization outsources the equipment used to support operations, including storage, hardware, servers and networking components. The service provider owns the equipment and is responsible for housing, running and maintaining it. The client typically pays on a per-use basis.
29. Platform as a Service - PaaS
Platform as a Service (PaaS) is a way to rent hardware, operating systems, storage and network capacity over the Internet. The service delivery model allows the customer to rent virtualized servers and associated services for running existing applications or developing and testing new ones.
30. Software as a Service - SaaS
• Software as a Service (SaaS) is a software distribution model in which applications are hosted by a vendor or service provider and made available to customers over a network, typically the Internet.
• SaaS is becoming an increasingly prevalent delivery model as underlying technologies that support Web services and service-oriented architecture (SOA) mature and new developmental approaches, such as Ajax, become popular. Meanwhile, broadband service has become increasingly available to support user access from more areas around the world.
41. Analytics
Data analytics (DA) is the science of examining raw data with the purpose of drawing conclusions about that information. Data analytics is used in many industries to allow companies and organizations to make better business decisions, and in the sciences to verify or disprove existing models or theories.
Data analytics is distinguished from data mining by the scope, purpose and focus of the analysis. Data miners sort through huge data sets using sophisticated software to identify undiscovered patterns and establish hidden relationships. Data analytics focuses on inference, the process of deriving a conclusion based solely on what is already known by the researcher.
42. Big Data Analytics
• Big data analytics is the process of examining large amounts of data of a variety of types (big data) to uncover hidden patterns, unknown correlations and other useful information. Such information can provide competitive advantages over rival organizations and result in business benefits, such as more effective marketing and increased revenue.
• The primary goal of big data analytics is to help companies make better business decisions by enabling data scientists and other users to analyze huge volumes of transaction data as well as other data sources that may be left untapped by conventional business intelligence (BI) programs.
43. The Meaning of Big Data - 3 V’s
• Big Volume
– With simple (SQL) analytics
– With complex (non-SQL) analytics
• Big Velocity
– Drink from the fire hose
• Big Variety
– Large number of diverse data sources to integrate
47. Big Data 101: Hadoop is just a File System (HDFS)
Read optimised & failure tolerant.
(Diagram: a head node tracks a file that is stored as blocks spread across the data nodes.)
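The block distribution in the diagram can be sketched in a few lines: a file is cut into fixed-size blocks and each block is placed on several data nodes, so losing one node loses no data. The round-robin placement, tiny block size, and node names below are illustrative only; real HDFS defaults to 128 MB blocks and 3 replicas, with rack-aware placement.

```python
# Toy sketch of HDFS-style block placement: split a file into fixed-size
# blocks and replicate each block on several data nodes (round-robin here;
# real HDFS placement is rack-aware).
def place_blocks(file_bytes, block_size, data_nodes, replication=3):
    blocks = [file_bytes[i:i + block_size]
              for i in range(0, len(file_bytes), block_size)]
    placement = {}
    for b in range(len(blocks)):
        placement[b] = [data_nodes[(b + r) % len(data_nodes)]
                        for r in range(replication)]
    return placement

nodes = ["dn1", "dn2", "dn3", "dn4", "dn5"]
print(place_blocks(b"x" * 10, block_size=4, data_nodes=nodes))
# {0: ['dn1', 'dn2', 'dn3'], 1: ['dn2', 'dn3', 'dn4'], 2: ['dn3', 'dn4', 'dn5']}
```

The "failure tolerant" claim follows directly: any single node can vanish and every block still exists on at least two others.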
48. Big Data 101: Map + Reduce = Extract, Load + Transform
(Diagram: raw data partitions flow through parallel mappers; the mapped data is then combined by a reducer into the final output.)
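The mapper/reducer flow in the diagram is the classic word-count pattern. A minimal in-process sketch of the two phases (no Hadoop involved, just the programming model):

```python
from collections import defaultdict

# Minimal in-process MapReduce: mappers emit (key, value) pairs, the
# framework groups pairs by key (the "shuffle"), and a reducer folds
# each group into a final value.
def map_phase(records, mapper):
    pairs = []
    for record in records:
        pairs.extend(mapper(record))
    return pairs

def reduce_phase(pairs, reducer):
    groups = defaultdict(list)
    for key, value in pairs:  # shuffle: group values by key
        groups[key].append(value)
    return {k: reducer(k, vs) for k, vs in groups.items()}

# Word count: the canonical MapReduce example.
def mapper(line):
    return [(word, 1) for word in line.split()]

def reducer(word, counts):
    return sum(counts)

lines = ["big data big ideas", "big data tools"]
print(reduce_phase(map_phase(lines, mapper), reducer))
# {'big': 3, 'data': 2, 'ideas': 1, 'tools': 1}
```

On a real cluster the mappers run in parallel on the nodes holding each data block, and the shuffle moves pairs across the network to the reducers.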
49. Using Pig to Enrich the Data
• Pig is a query language which shares some concepts with SQL
• Invoked from the Hadoop command shell
• No GUI
• Does not do any work until it has to output a result set
• Under the hood, executes MapReduce jobs
50. HDInsight hands on: Outputting results to Hive
• Hive is a near-SQL-compliant language with a lot of similarities
• Again, under the hood, issues MapReduce queries
• Exposed via ODBC
51. BIG DATA is not just HADOOP
• Manage & store huge volumes of any data → Hadoop File System, MapReduce
• Manage streaming data → Stream Computing
• Analyze unstructured data → Text Analytics Engine
• Structure and control data → Data Warehousing
• Integrate and govern all data sources → Integration, Data Quality, Security, Lifecycle Management, MDM
• Understand and navigate federated big data sources → Federated Discovery and Navigation
52. Merging the Traditional and Big Data Approaches
Traditional Approach (structured & repeatable analysis): business users determine what question to ask, and IT structures the data to answer that question. Examples: monthly sales reports, profitability analysis, customer surveys.
Big Data Approach (iterative & exploratory analysis): IT delivers a platform to enable creative discovery, and the business explores what questions could be asked. Examples: brand sentiment, product strategy, maximum asset utilization.
53. The Value of Big Data for Customers
Big opportunities:
Improve operational effectiveness
• Machines/sensors: predict failures, network attacks
• Financial risk management: reduce fraud, increase security
Reduce data warehouse cost
• Integrate new data sources without increased database cost
• Provide online access to ‘dark data’
Drive incremental revenue
• Predict customer behavior across all channels
• Understand and monetize customer behavior
55. Main Big Data Technologies
Hadoop
• Low cost, reliable scale-out architecture
• Distributed computing
• Proven success in Fortune 500 companies
• Exploding interest
NoSQL Databases
• Huge horizontal scaling and high availability
• Highly optimized for retrieval and appending
• Types: document stores, key-value stores, graph databases
Analytic RDBMS
• Optimized for bulk-load and fast aggregate query workloads
• Types: column-oriented, MPP, in-memory
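The "optimized for retrieval and appending" point is easiest to see in a key-value model: every read and write addresses a single key, and values are schema-less documents. A toy in-memory sketch (the `ToyKVStore` class is invented for illustration; real stores add persistence, sharding, and replication):

```python
# Toy key-value store with the access pattern NoSQL databases optimize
# for: O(1) put/get by key, no joins, and no fixed schema. Values are
# arbitrary documents (dicts) whose fields may differ per record.
class ToyKVStore:
    def __init__(self):
        self._data = {}

    def put(self, key, document):
        self._data[key] = document

    def get(self, key, default=None):
        return self._data.get(key, default)

db = ToyKVStore()
db.put("user:1", {"name": "Ada", "tags": ["analytics"]})
db.put("user:2", {"name": "Lin", "country": "SG"})  # different fields: fine
print(db.get("user:1")["name"])  # Ada
```

The trade-off against an analytic RDBMS is visible even here: fetching one key is trivial, but an aggregate over all records requires scanning every value, which is exactly the workload column-oriented and MPP databases are built for.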
56. Major Hadoop Utilities
• Apache Hive: SQL-like language and metadata repository
• Apache Pig: high-level language for expressing data analysis programs
• Apache HBase: the Hadoop database; random, real-time read/write access
• Sqoop: integrating Hadoop with RDBMS
• Oozie: server-based workflow engine for Hadoop activities
• Hue: browser-based desktop interface for interacting with Hadoop
• Flume: distributed service for collecting and aggregating log and event data
• Apache Whirr: library for running Hadoop in the cloud
• Apache Zookeeper: highly reliable distributed coordination service
61. Business Users Want to…
• Report on operational data in near real time
– Example: What are my customer orders in my New York warehouse today?
• Query and analyze operational and historic information
– Example: How many of these customers had late deliveries in the past year?
• Analyze information across multiple dimensions
– Example: Summarize # of orders by these customers and revenue impact by sales territory; summarize whether delayed orders disproportionately affect a particular customer segment
• Compare information to financial plans and budgets
– Example: What is the potential effect of a revenue/profit shortfall associated with these segments?
• Analyze unstructured/social data to identify any correlations
– Example: Have the customers who are affected by delays complained in feedback forums, to customer service, or on social media about my product delays?
• Apply predictive models to determine future outcomes
– Example: Are these existing or related customer segments prone to future churn?
62. A Complete Analytics Platform Requires…
(Diagram: OLTP systems feed an operational data store, a warehouse or data mart, OLAP, and a planning system.)
• Operational reporting (OLTP, operational data store)
• Ad-hoc query & analysis (warehouse or data mart)
• Multi-dimensional OLAP
• Planning & budgeting (planning system)
• Unstructured analytics (VoC analytics, social data)
• Predictive analytics (statistical and other models, predictive outcomes)
64. Business Analytics Strategy: Innovation & Investment Priorities
• In-Memory
• Visualizations
• Mobile
• Unstructured Analytics
• Big Data
• Predictive Analytics
65. High Density Visualizations
• Timeline Analysis: represents key events over a particular period and surfaces supporting detail as needed
• Treemap: shows patterns in data by displaying hierarchical (tree-structured) data as sets of nested rectangles
• Thematic Map: focuses on specific themes to emphasize spatially-based variations in data
• Hierarchy Wheel: illustrates the relative impact of each contributing level on the distribution of values in a hierarchy
• Heatmap: shows distribution and reveals patterns via colored individual values displayed in a matrix
• Histogram/Chip Display: plots density and allows estimation by showing a visual impression of the distribution of data
66. Big Data Technologies and Tools
• Column-oriented databases
• Schema-less databases, or NoSQL databases
• MapReduce
• WibiData
• PLATFORA
• Storage Technologies
• SkyTree
68. 3 main deliverables in a Web analytics project
• Website Traffic Data - divided into total traffic, traffic from search engines, social media and other sources.
• Conversion Rate - define goals for the website, set them up, and report them regularly.
• Custom Reports - understand where the client's business lies, and report relevant metrics accordingly. For this, you will have to set up such metrics and get figures for them on a month-on-month basis. For example, if your client is a media website, tracking the number of people who sign up for a newsletter subscription is a good metric. It does not stop at that: custom reports will allow you to segment this audience into demographics and will reveal actionable insights for your client.
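Conversion rate itself is just goal completions divided by visits. A small sketch that also breaks it down by traffic source, matching the traffic-data split above (the source names and numbers are made up for illustration):

```python
# Conversion rate = goal completions / visits, computed per traffic
# source so each channel can be reported separately month on month.
def conversion_rates(visits_by_source, goals_by_source):
    return {src: round(goals_by_source.get(src, 0) / v, 4)
            for src, v in visits_by_source.items() if v}

visits = {"search": 5000, "social": 2000, "other": 1000}
goals = {"search": 150, "social": 30, "other": 10}
print(conversion_rates(visits, goals))
# {'search': 0.03, 'social': 0.015, 'other': 0.01}
```

Segmenting the same ratio by demographics instead of source is what the custom reports above add on top.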
70. End of Big Data and Analytics Session
Questions?
71. What is Social Media
Social media includes the various online technology tools that enable people to communicate easily via the internet to share information and resources. Social media can include text, audio, video, images, podcasts and other multimedia communications.
72. What is social media?
Social media is a conversation online.
Look who’s talking:
– your customers
– your donors
– your volunteers
– your employees
– your investors
– your critics
– your fans
– your competition....
– anyone who has internet access
and an opinion.
SOCIAL MEDIA DEFINED
73. The social media conversation
The conversation is not:
– controlled
– organized
– “on message”
The conversation is:
– organic
– complex
– speaks in a human voice
Social media is not a strategy or a tactic –
74. The conversation is powered by
• Social Networks
• News & Bookmarking
• Blogs
• Microblogging
• Video Sharing
• Photo Sharing
• Message boards
• Wikis
• Virtual Reality
• Social Gaming
• Related:
– Podcasts
– Really Simple Syndication (RSS)
75. The power to define and control a brand is shifting from corporations and institutions to individuals and communities.
76. “It is about putting the ‘public’ back in Public Relations and realizing that focusing on important markets and influencers will have a far greater impact than trying to reach the masses with any one message or tool.”
77. Word of Mouth is the Future of Marketing
Marketers can effectively use social media by influencing the conversation. One way to do this is by delivering great customer service experiences.
78. o 91% say consumer reviews are the #1 aid to buying decisions - JC Williams Group
o 87% trust a friend’s recommendation over a critic’s review - Marketing Sherpa
o 3 times more likely to trust peer opinions over advertising for purchasing decisions - Jupiter Research
79. Social media sites are the fastest-growing category on the web, doubling their traffic over the last year.
o 73% of active online users have read a blog
o 45% have started their own blog
o 57% have joined a social network
o 55% have uploaded photos
o 83% have watched video clips
81. Social media can help you in all stages of marketing, self-promotion, public relations, and customer service:
o research
o strategic planning
o implementation
o evaluation
82. o Learn what people are saying about you
o Create buzz for events & campaigns
o Increase brand exposure
o Identify and recruit influencers to spread your message
o Find new opportunities and customers
o Support your products and services
o Improve your search engine visibility
o Gain competitive intelligence
o Get your message out fast
o Retain clients by establishing a personal relationship
o Be an industry leader – not a follower
83. Reach
o Website visits / views
o Volume of reviews and comments
o Incoming links
Engagement & Influence
o Sentiment of reviews and comments
o Brand affinity
o Commenter authority/influence
o Time spent
o Favourites / Friends / Fans
o Viral forwards
o Number of downloads
Action & Insight
o Sales inquiries
o New business
o Customer satisfaction and loyalty
o Marketing efficiency
Source: The Digital Influence Group, Measuring the Influence of Social Media
84. Resources required for social media may include:
o Strategic consultation
o Training
o Creating content
o Integrating tools
o Distributing content
o Relationship management
o Measuring value
86. o Experiment personally before professionally
o Try a variety of social media tools
o Be yourself, make some friends, and share
KEYS TO SUCCESS
87. 1. Discovery (people, competition, and search engines)
2. Strategy (opportunities, objectives)
3. Skills (identify internal resources and gaps)
4. Execution (tools, integration, policies, and process)
5. Maintenance (monitor and adapt)
Source: 5 Phases of Social Media Marketing
http://socialcomputingjournal.com/viewcolumn.cfm?colid=789
88. CASE STUDY
o YouTube
o MySpace
o Facebook
o Twitter
o EHarmony
o “Digits” (their own online community)
o Virtual communities – Second Life
89. CASE STUDY
Leveraged core goals across all networks:
1. Reinforce their brand as tax experts
2. Deliver on advocacy positioning of the brand
3. Present the brand as being innovative
90. CASE STUDY
o Be community appropriate and relevant (interacting on Second Life is different than on YouTube)
o It’s not free: human capital increased as media buys decreased. Ask yourself: if this is successful, how do you scale it?
91. CASE STUDY
1. Brand Perception
o Evaluated brand metrics through a brand tracking study
2. Engagement
o 600,000 YouTube views
o 1 million unique visits to their community site
3. Word of Mouth
o Increased online mentions in blogs, forums, and other social media
92. o Find where your audience is participating and identify the influencers
o Read industry blogs (including comments)
o Google your company name & your competition
o Find tools that can help you listen
93. o Tap into the wisdom of the crowd to access a wider talent pool and gain customer insight
o Companies that use crowdsourcing include:
o Starbucks (MyStarbucks)
o Dell (IdeaStorm)
o DuPont
o Netflix
o Wikipedia
o iStockphoto.com
o Threadless.com
o Mechanical Turk (Amazon)
94. o Avoid puffery (people will ignore it)
o Avoid evasion and lying (people won’t ignore it)
o Companies have watched their biggest screw-ups rise to the top 10 of a Google search
o Admit your mistakes right away
95. o Don’t be afraid to share.
Corporations, like people, need to
share information to get the value
out of social media
o Make your content easy to share
o Incorporate tools that promote
sharing:
o Share This, RSS feeds, Email a
friend
KEYS TO SUCCESS
96. o Don't shout. Don't
broadcast. Don’t brag.
o Speak like yourself – not a
corporate marketing shill or
press secretary
o Personify your brand – give
people something they can
relate to.
KEYS TO SUCCESS
97. o Think like a contributor, not a
marketer
o Consider what is relevant to the
community before contributing
o Don’t promote your product on
every post
o Win friends by promoting other
people’s content if it interests you
KEYS TO SUCCESS
98. o Don’t try to delete or remove
criticism (it will just make it
worse)
o Listen to your detractors
o Admit your shortcomings
o Work openly towards an
explanation and legitimate
solution
KEYS TO SUCCESS
99. o Don’t wait until you have a
campaign to launch - start
planning and listening now
o Build relationships so
they’re ready when you
need them
KEYS TO SUCCESS
100. o You need buy in from everyone in the
organization
o Convince your CEO that social media
is relevant to your organization
o Get your communications team
together, discuss the options, then
divide and conquer
KEYS TO SUCCESS
101. 1. Experiment with social media
2. Make a plan
3. Listen
4. Be transparent & honest
5. Share your content
6. Be personal and act like a person
7. Contribute in a meaningful way
8. See criticism as an opportunity
9. Be proactive
10. Accept you can’t do it all yourself
KEYS TO SUCCESS
102. Social Media Tools (Free)
• Google Analytics
• Google Alerts
• Facebook Insights
• TweetDeck
• Social Mention
• Mention
• HootSuite
103.
104.
105. Some more Social Media Tools
• Mention: Google Alerts for the social web
• Buffer: Social media publishing plus powerful analytics
• Feedly: Content discovery
• Twitter Counter: Track Twitter progress
• Zapier: Link favorite social services
• Bottlenose: Intelligence for social networks
• Followerwonk: Follower analysis for Twitter
• Quintly: Social analytics for brands
106. Twitter
Twitter is a free social networking microblogging service that
allows registered members to broadcast short posts
called tweets. Twitter members can broadcast tweets and
follow other users' tweets by using multiple platforms and
devices. Tweets and replies to tweets can be sent by cell
phone text message, desktop client or by posting at
the Twitter.com website.
107. Facebook
Facebook is a popular free social networking website that allows registered users
to create profiles, upload photos and video, send messages and keep in touch with
friends, family and colleagues. The site, which is available in 37 different
languages, includes public features such as:
• Marketplace - allows members to post, read and respond to classified ads.
• Groups - allows members who have common interests to find each other and
interact.
• Events - allows members to publicize an event, invite guests and track who plans
to attend.
• Pages - allows members to create and promote a public page built around a
specific topic.
• Presence technology - allows members to see which contacts are online and chat.
108. LinkedIn
LinkedIn is a social networking site designed specifically for
the business community. The goal of the site is to allow
registered members to establish and document networks of
people they know and trust professionally.
109. Blogs, Microblogging and Social Networks
Blog
a personal website or web page on which an individual
records opinions, links to other sites, etc. on a regular basis
MicroBlog
A broadcast medium that exists in the form of blogging. A
microblog differs from a traditional blog in that its content is
typically smaller in both actual and aggregated file size
Social Network
A network of social interactions and personal relationships
112. 5 Pitfalls of social media
1. Social media can waste your time
2. Social media gives a platform to your every thought
3. Social media can fuel discontentment
4. Social media can fuel pride and keep you inwardly focused
5. Social media can distort our view of relationships
113. Using social media across various digital channels
Content marketing management at the heart of social media
Mobile, micro videos, employee advocacy and visual web for social selling
Evolving customer service
Focusing on social good
Digital Rebellion
Social laws, anonymity and business maturity as next steps in marketing
Interest-Based (Social) Networks Rise in Prominence in 2014
Don’t be a Goat: Build Your Author-ity
Yes for Content Marketing, but how?
Have a long term vision
Using social media for competitive intelligence
Governments will embrace social media
Mobile, social ads and videos becoming vital
115. Social Media ROI Models :
1.) The Amplification Model
How much would it cost to buy these impressions/actions via paid media?
2.) Value of Social Traffic versus Display
How much does it cost to get a visitor to your site via social versus display?
3.) Quality of Visitors from Social Media
How well do social-driven visitors perform on your site?
4.) Revenue From Facebook Fans
How much incremental revenue do your Facebook Fans create?
5.) Revenue from Social Media Marketing
How many sales can we attribute to your social media marketing programs?
6.) Social Promotions Sales ROI
How many sales can we attribute to specialized social media promotions?
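The Amplification Model above boils down to one calculation: price the earned impressions as if they had been bought. A minimal sketch — the impression count and the $5 CPM below are hypothetical inputs, not figures from this deck:

```python
def amplification_value(earned_impressions, paid_cpm_dollars):
    """Estimate what earned impressions would have cost as paid media.

    CPM is the cost per 1,000 impressions, so:
    value = (impressions / 1000) * CPM
    """
    return earned_impressions / 1000 * paid_cpm_dollars

# Hypothetical inputs: 600,000 earned views valued at an assumed $5 CPM.
print(f"${amplification_value(600_000, 5.00):,.2f}")  # -> $3,000.00
```

The same shape works for the other models: swap the CPM for a cost-per-visit (model 2) or a revenue-per-fan figure (model 4).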
116. How to Measure Success?
► Site Traffic
► Downloads
► eNewsletter Sign-Ups
► Blog Comments
► Questions
► Shared Links
► Re-Tweets
► Followers
► Who’s talking about you and how?
117. • Listen
• Share Relevant Content
• Industry Specific Content
• Marketing Information
• Consistently monitor for
customer service
• Continually post relevant
content
►Link all pages back to
corporate site
►DON’T talk about
yourself all the time
►Secure Branded URLs for
each Channel
►Promote Monthly
Newsletters
►Listen More
Best Practices
119. Mobile Computing
Mobile computing is a generic term used to refer to a variety of devices that
allow people to access data and information from wherever they are.
120.
121.
122.
123. The
Players
• Android – Open source mobile OS developed
by the Open Handset Alliance led by Google.
Based on Linux 2.6 kernel
• iOS – Apple’s proprietary mobile OS, iPhone,
iPod Touch, iPad. Derived from OS X, very
UNIX like
• Symbian – acquired by Nokia 2008
• Windows Phone 7 – Microsoft – Kin,
discontinued 6 weeks after initial launch
• Blackberry OS – RIM (Research in Motion),
proprietary OS
124. The Smartphone Platform
• With the iPhone being the first to the
marketplace, it set the configuration of the
Smartphone Platform:
– 3G/4G connectivity
– WiFi connectivity
– Bluetooth connectivity
– accelerometer w/compass
– ambient light sensor
– proximity sensor
– GPS
– gyroscope
125. What is
Android
Android is an open source operating system,
created by Google specifically for use on
mobile devices (cell phones and tablets)
Linux based (2.6 kernel)
Can be programmed in C/C++ but most app
development is done in Java (Java access to C
Libraries via JNI (Java Native Interface))
Supports Bluetooth, Wi-Fi, and 3G and 4G
networking
126. What is
iOS
Apple’s mobile OS for phones (iPhone),
tablets (iPad), handhelds (iPod),
based on BSD Unix
Application programming done in
Objective C
Supports Bluetooth, Wi-Fi, and 3G and
4G networking
127. Bluetooth
• Open wireless technology
– Developed by Ericsson (1994)
– Originally supposed to replace wired RS-232
– Short range via low-power radio
– Allows creation of personal area networks
• Mostly to connect wireless peripheral devices to a host computer
(mice, headsets, microphones, keyboards…)
– Can also be used to communicate between two host computers wirelessly
(replace serial cables)
128. Wi-Fi
• Used to brand certified products that belong to a class of wireless local area
network based on IEEE Standard 802.11
• Currently there are 3 versions of 802.11 in common use:
– B, 11 Mbits, about 150 feet indoors, 300 ft outdoors
– G, 54 Mbits, about 150 feet indoors, 300 ft outdoors
– N, 600 Mbits, about 1.5 miles in open air, uses MIMO (multiple input and
output antennas)
129. 3G ( 3rd
Generation
Network)
• Must allow simultaneous use of speech and
data services and provide a peak data rate of
at least 200 kbit/s
130. 4G
• Provides a comprehensive and secure IP
based solution for IP based telephony, ultra
broadband internet, gaming services and
streamed multimedia.
• Peak data rate of 100 Mbit/s for high mobility
devices and 1 Gbit/s for low mobility devices.
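The quoted peak rates make the generational gap concrete. A minimal sketch comparing ideal download times at the 3G minimum versus the 4G low-mobility peak — idealized figures, assuming a sustained peak rate with no protocol overhead:

```python
def transfer_seconds(size_mb, rate_bits_per_sec):
    """Ideal transfer time for a payload: sustained peak rate, no overhead."""
    bits = size_mb * 8 * 1_000_000  # decimal megabytes -> bits
    return bits / rate_bits_per_sec

THREE_G_MIN = 200_000                # 200 kbit/s, the 3G requirement above
FOUR_G_LOW_MOBILITY = 1_000_000_000  # 1 Gbit/s for low-mobility devices

print(transfer_seconds(50, THREE_G_MIN))          # 50 MB clip -> 2000.0 s
print(transfer_seconds(50, FOUR_G_LOW_MOBILITY))  # 50 MB clip -> 0.4 s
```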
131. Commonly
Used
Packages
User interface controls and widgets
User interface layout
Secure networking and web browsing
Structured storage and relational databases (SQLite
RDBMS)
2D and 3D Graphics SGL and OpenGL
Audio and visual media support
Access to optional hardware (GPS)
135. • 75% of all new phones in Asia
and Africa
• There are now twice as many
mobile money users in Africa
as there are Facebook users
in the region
• More than 30 million people
undertook 224.2 million
transactions in June 2012
• Nigeria is in the top 10 mobile
Emerging Markets
136. Running Ads on Mobile devices
• Mobile banners and display
• Mobile pay per click
• Contextual Mobile Ads
• Idle Screen Advertising
137. HTML5 features for the mobile web
• Offline Support
• Canvas Drawing
• Video and Audio Streaming Support
• GeoLocation API
• Advanced Forms
139. Location-based services: why, what and how
Location-based services (LBS) are a general class of
software services that use location and time data as
control features in computer programs. LBS are widely
used in social networking today as information and
entertainment services, accessible from mobile devices
through the mobile network and based on the
geographical position of the device. They have become
more and more important with the expansion of the
smartphone and tablet markets.
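At the core of most LBS features is a distance check between two device positions. A minimal sketch using the haversine great-circle formula — the coordinates and the 5 km "nearby" radius below are illustrative assumptions, not values from this deck:

```python
import math

def haversine_km(lat1, lon1, lat2, lon2):
    """Great-circle distance between two (lat, lon) points, in kilometres."""
    r = 6371.0  # mean Earth radius in km
    phi1, phi2 = math.radians(lat1), math.radians(lat2)
    dphi = math.radians(lat2 - lat1)
    dlmb = math.radians(lon2 - lon1)
    a = (math.sin(dphi / 2) ** 2
         + math.cos(phi1) * math.cos(phi2) * math.sin(dlmb / 2) ** 2)
    return 2 * r * math.asin(math.sqrt(a))

# Is a venue within a 5 km "nearby" radius of the user?
print(haversine_km(51.5007, -0.1246, 51.5055, -0.0754) < 5)  # -> True
```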
143. Enables applications to access data from other applications and to share their own data
Provides access to non-code resources
Enables all applications to display alerts in the status bar
Manages the lifecycle of applications
Application Framework
144. • Written in C/C++ – System C Library (libc)
• Display/Graphics (SGL)
• Media Libraries
• SQLite – lightweight relational database engine
• LibWebCore – web browser engine, embeddable web view
Libraries
145. • Linux Version 2.6
• Security, Memory & Process Management
• Proven driver model
• Efficient computing resource management
• Stable and proven OS for mobile platform
Linux Kernel
146. Includes a set of core libraries that provides most of the functionality of
the Java programming language
Every Android application runs in its own process
Dalvik VM executes files in the (.dex) format
Device can run multiple VMs efficiently
Android Runtime
147. Android Programming Steps
• Basic Layout
• Working with Lists
• Working with a SQLite Database
• Using Intents and passing information between Activities
• Calling, Emailing, and Texting an Employee
• Navigating Up and Down the Org Chart
148.
149. Prerequisites
for iOS
Development
Previous experience in another Object Oriented Programming(OOP)
language will be helpful
Understanding of OOP concepts.
Some understanding of C can be helpful, but is not required. Objective-C
builds on top of C. You will eventually run into pointers and other
fundamental “C” features
Previous experience with an Integrated Development Environment (IDE) is
helpful, but not required
Mac computer running OS X Lion
If you plan to submit to the App Store, you will need Apple devices to do
real testing on. The simulator is not good enough.
151. iOS Programming Steps
• The Structure of Your Source Code
• Import Statements
• The Main Function
• Working With Variables
• Building Your First Game
• Obtaining User Input
• Working With Conditionals
• Working With While Loops
152. Top 10 - Uses for iPads
• Surf The Internet
• Take Photos
• Take Notes
• Make Presentations
• Accounting
• Sales Transactions
• Maps & Deliveries
• Education & Training
• Research & Relationships
• Custom Apps
153. Limitation
of Mobile
Devices
• Screen size
• Touch screen
• No physical keyboard or trackball – a finger or
stylus is the primary interface to the device
• Memory
• Storage
• Battery Life
• Cell network
• Sometimes flaky networks
• Ergonomics
156. End of Presentation on
Social Media, Mobile, Analytics, Cloud (SMAC)
Thank You
Rajesh Menon
http://www.technospirituality.com/
Editor's Notes
I especially like this slide. It shows the positioning of the traditional and the big data approach. The traditional approach is shown in the left part of the slide: business users know very well what questions they want to ask, and IT simply prepares the infrastructure and the solution to satisfy those needs. This process is usually iterative, because business users are typically unable to define precisely enough to IT staff what they really need, so the first attempt doesn’t lead to the final solution – but it’s really just a question of a few iterations and it’s done.
On the other side – in the big data area – the business users don’t know what questions they would like to ask. Everybody is talking about social media analytics. Fine, but is it also worthwhile for my business? Is there enough information related to my area of interest? Am I able to join this social media data with data I have in my internal systems? And if all of those questions have a positive answer, I should be able to say exactly what has to be retrieved from the social network and how exactly it will then be joined with data in my internal systems.
So, in the big data area there’s always a phase ZERO where a different approach has to be applied. IT should deliver to business users a platform where they can analyze big data in an environment they are able to work with – like Excel spreadsheets – and based on this simple analysis the business users can clearly define the questions they weren’t able to define before.
--------------------------------------
The Big Data approach complements the traditional approach.
Traditional approach – Business users determine what questions to ask and IT structures the data to answer those questions. This is well suited to many common business processes, such as monitoring sales by geography, product or channel; extracting insight from customer surveys; cost and profitability analyses.
The Big Data approach – IT delivers a platform that consolidates data sources of interest and enables creative discovery. Then the business users use the platform to explore the data for ideas and questions to ask.
On the left, the traditional approach allows an organization to answer questions that will be asked time and time again . . . On the right, users have the ability to explore their data in a more creative way . . . Before finding the answer, they must first define the question. Are my customers starting to change their preferences? What is the best way to measure brand health?
* Not many companies have transactional data that classifies as Big Data. Credit card companies, and financial services companies are about it.
* With stock market data we are talking about every stock trade and the bid and ask prices between the transactions – for every stock on multiple markets for a significant time period.
For many other companies the Big Data is sub-transactional - it is the events that lead up to transactions
* Weblogs are semi/badly structured. Consider the number of weblog entries created as you look for a book online – researching 5-10 books, reading reviews and comments. You might generate 1,000 entries and may or may not buy a book – potentially lots of entries for no transaction. We also want to enrich this data with metadata about the URLs and information about the location of the user
* In an online game or world every interaction between participants and the system and between each other is logged. An individual participant might generate > 1 million events for their 1 monthly transaction
* A single phone call or text message generates many events within a telecoms company
TAKE-AWAYS
Pentaho provides complete integrated DI+BI for every leading big data platform.
4 types of databases in the NOSQL universe:
K-V Stores
Column Family Store
Document Databases
Graph Databases
Who here has worked with NOSQL stores before?
For the people that raised their hand how many used...
KV Stores?
Column Family?
Document DBs?
Graph DBs?
If you raised your hand for Graph DBs, then pat yourself on the back b/c that’s where I spend most of my time.
In the following graph we see that KV stores are the best at scaling due to their simplistic data model and Graph databases are the worst at scaling because of the complexity and interconnectedness of the data.
Even though Graphs DBs are the worst at scaling out of all of the NOSQL types, we’re still able to cover 90% of today’s use cases.
The Social Media tactics you see are supported by unseen strategy. We’re review both.
Marketing, Digital and PR are the three departments primarily responsible for social media at the moment. It is interesting to note that sales, product development and IT do have some action in social in some of the surveyed companies.
Carriers are being transformed into access providers, while other companies monetize voice and messaging services. Maybe there is an opportunity for such services.