In this session, we introduce Amazon Quantum Ledger Database (Amazon QLDB) and Amazon Managed Blockchain. We discuss what these services do, the problems they solve, and when you should use each one. We also dive into their features and explain how they work.
Amazon Managed Blockchain and Quantum Ledger Database (QLDB) - John Yeung
We introduce two new products, Amazon Managed Blockchain and Amazon Quantum Ledger Database (QLDB), explain the key differences between them, and show how to choose between them for different scenarios. Amazon Managed Blockchain is a managed service supporting two key blockchain frameworks, Hyperledger Fabric and Ethereum, while QLDB provides a managed database service with immutable and verifiable capabilities. Can they work together to deliver greater value? Read on to find out.
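To make "immutable and verifiable" concrete: QLDB's journal is an append-only, hash-chained log of revisions. The toy ledger below mimics that idea in pure Python; it is an illustrative sketch only, not the QLDB API, and all names in it are made up.

```python
import hashlib
import json

class ToyLedger:
    """Append-only log in which each entry's hash covers the previous hash,
    so any edit to history breaks the chain (the idea behind QLDB's journal)."""

    def __init__(self):
        self.entries = []  # each entry: {"data": ..., "prev": ..., "hash": ...}

    def append(self, data):
        prev = self.entries[-1]["hash"] if self.entries else "0" * 64
        payload = json.dumps({"data": data, "prev": prev}, sort_keys=True)
        digest = hashlib.sha256(payload.encode()).hexdigest()
        self.entries.append({"data": data, "prev": prev, "hash": digest})
        return digest

    def verify(self):
        # Recompute every hash from the start; tampering anywhere fails the check.
        prev = "0" * 64
        for e in self.entries:
            payload = json.dumps({"data": e["data"], "prev": prev}, sort_keys=True)
            if hashlib.sha256(payload.encode()).hexdigest() != e["hash"] or e["prev"] != prev:
                return False
            prev = e["hash"]
        return True

ledger = ToyLedger()
ledger.append({"vin": "ABC123", "owner": "alice"})
ledger.append({"vin": "ABC123", "owner": "bob"})
print(ledger.verify())  # True: chain intact
ledger.entries[0]["data"]["owner"] = "mallory"
print(ledger.verify())  # False: tampering detected
```

Managed Blockchain adds what this sketch lacks: multiple parties each holding a copy and agreeing on appends via consensus, rather than one trusted writer.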
Global services and rapidly growing web services are now a common sight in Korea as well. Services that start out on an RDBMS eventually face a crossroads between sharding and NoSQL as they scale. Amazon DynamoDB is a fully managed key-value NoSQL database that works at any scale, but key design remains one of its most difficult areas. In this session, we explore key design methods for large-scale services.
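A common key-design pattern the session alludes to is the composite partition/sort key, which lets one table serve several access patterns. The sketch below simulates it in plain Python (no DynamoDB calls); the table items and attribute values are hypothetical examples.

```python
# Single-table design sketch: PK groups an entity's items, SK orders them.
items = [
    {"PK": "USER#42", "SK": "PROFILE",          "name": "Jin"},
    {"PK": "USER#42", "SK": "ORDER#2024-01-03", "total": 120},
    {"PK": "USER#42", "SK": "ORDER#2024-02-17", "total": 45},
]

def query(items, pk, sk_prefix=""):
    # Mimics a DynamoDB Query with KeyConditionExpression:
    #   PK = :pk AND begins_with(SK, :prefix)
    return sorted(
        (i for i in items if i["PK"] == pk and i["SK"].startswith(sk_prefix)),
        key=lambda i: i["SK"],
    )

orders = query(items, "USER#42", "ORDER#")
print([o["SK"] for o in orders])
# ISO-8601 dates embedded in the sort key give chronological order for free.
```

The design choice to note: because the sort key is compared lexically, encoding timestamps as ISO-8601 strings makes range and prefix queries return items in time order without a secondary index.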
This document discusses AWS Lake Formation and provides an overview of its key capabilities. It describes how Lake Formation can help users build clean and secure data lakes in days by automating tasks like data loading, cleaning, and governance that traditionally take months. It also outlines recent innovations to Lake Formation, including new features that enable row-level security, ACID transactions on data lakes, and acceleration of analytics on data stored in Amazon S3.
This is Part 4 of the GoldenGate series on Data Mesh - a series of webinars helping customers understand how to move off of old-fashioned monolithic data integration architecture and get ready for more agile, cost-effective, event-driven solutions. The Data Mesh is a kind of Data Fabric that emphasizes business-led data products running on event-driven streaming architectures, serverless, and microservices based platforms. These emerging solutions are essential for enterprises that run data-driven services on multi-cloud, multi-vendor ecosystems.
Join this session to get a fresh look at Data Mesh; we'll start with core architecture principles (vendor agnostic) and transition into detailed examples of how Oracle's GoldenGate platform is providing capabilities today. We will discuss essential technical characteristics of a Data Mesh solution, and the benefits that business owners can expect by moving IT in this direction. For more background on Data Mesh, Part 1, 2, and 3 are on the GoldenGate YouTube channel: https://www.youtube.com/playlist?list=PLbqmhpwYrlZJ-583p3KQGDAd6038i1ywe
Webinar Speaker: Jeff Pollock, VP Product (https://www.linkedin.com/in/jtpollock/)
Mr. Pollock is an expert technology leader for data platforms, big data, data integration, and governance. Jeff has been CTO at California startups and a senior exec at Fortune 100 tech vendors. He is currently Oracle VP of Products and Cloud Services for Data Replication, Streaming Data, and Database Migrations. While at IBM, he was head of all Information Integration, Replication, and Governance products; previously Jeff was an independent architect for the US Defense Department, VP of Technology at Cerebra, and CTO of Modulant, and he has been engineering artificial-intelligence-based data platforms since 2001. As a business consultant, Mr. Pollock was a Head Architect at Ernst & Young's Center for Technology Enablement. Jeff is also the author of "Semantic Web for Dummies" and "Adaptive Information," a frequent keynote speaker at industry conferences, a contributor to books and industry journals, formerly a contributing member of W3C and OASIS, and an engineering instructor with UC Berkeley's Extension for object-oriented systems, software development process, and enterprise architecture.
A Work of Zhamak Dehghani
Principal consultant
ThoughtWorks
https://martinfowler.com/articles/data-monolith-to-mesh.html
How to Move Beyond a Monolithic Data Lake to a Distributed Data Mesh
Many enterprises are investing in their next-generation data lake, with the hope of democratizing data at scale to provide business insights and ultimately make automated intelligent decisions. Data platforms based on the data lake architecture have common failure modes that lead to unfulfilled promises at scale. To address these failure modes we need to shift from the centralized paradigm of a lake, or its predecessor the data warehouse, to a paradigm that draws from modern distributed architecture: considering domains as the first-class concern, applying platform thinking to create self-serve data infrastructure, and treating data as a product.
Amazon Managed Blockchain is a fully managed blockchain service that makes it easy for customers to create and manage scalable blockchain-based transaction networks (blockchain networks) using the popular open source blockchain frameworks Hyperledger Fabric and Ethereum. Blockchain technologies enable groups of organizations, oftentimes in financial services and manufacturing, to securely transact, run application code, and share data without a trusted central authority. We will explore the components of blockchain technology, discuss use cases, and do a deep dive into capabilities, performance, and key innovations in Amazon Managed Blockchain.
AWS Neptune - A Fast and Reliable Graph Database Built for the Cloud - Amazon Web Services
Dickson Yue, Solutions Architect, AWS
Amazon Neptune is a fully managed graph database service built from the ground up to handle rich, highly connected data. Come learn how to transform your business with Amazon Neptune and hear about diverse use cases such as recommendation engines, knowledge graphs, fraud detection, social networks, network management, and life sciences.
Simple cloud migration with OpenText Migrate - OpenText
Migrate any server workload to any target destination with the OpenText Migrate cloud migration platform. Learn about common migration challenges and how to choose the right cloud migration tool.
Snowflake: The Good, the Bad, and the Ugly - Tyler Wishnoff
Learn how to solve the top 3 challenges Snowflake customers face, and what you can do to ensure high-performance, intelligent analytics at any scale. Ideal for those currently using Snowflake and those considering it. Learn more at: https://kyligence.io/
The document discusses strategies for migrating IT workloads to the cloud. It describes common drivers for cloud migration like cost reduction and agility. Potential barriers are also outlined, such as existing investments and lack of cloud expertise. The main sections of the document are on migration planning, common migration strategies ranging from rehosting to rearchitecting, examples of migration patterns, and modernizing applications on AWS.
The document discusses building a data lake on AWS. It describes various AWS services that can be used to ingest, store, transform, analyze and visualize data in the data lake. These services include Amazon S3 for storage, AWS Glue for ETL/data cataloging, AWS Lake Formation for governance, Amazon Athena/EMR for analytics and Amazon QuickSight for visualization. The document also covers data movement options from on-premises to the data lake and real-time streaming of data using services like Kinesis. Machine learning workloads can leverage Amazon SageMaker for training and deployment.
Agile Integration with APIs and Containers Workshop - Nicole Maselli
The document provides an agenda for a Red Hat Agile Integration workshop. The agenda includes sessions on agile integration concepts and use cases, hands-on developer demos, and labs on contract-first API development. Participants can choose between an API design and management track or an API development and security track. The workshop aims to provide an introduction to agile integration using Red Hat products like OpenShift, Fuse, 3scale, Apicurio and Microcks.
In this presentation we offer an overview of the fundamental concepts of graph databases, data representation, and query technologies. We also focus on the AWS property graph model and Apache TinkerPop Gremlin, then walk through using Amazon Neptune: creating a cluster, loading data, and running some sample queries.
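To make the property-graph model concrete before the Neptune walkthrough: vertices and edges carry labels and properties, and a query is a traversal along labeled edges. The sketch below does this in pure Python; vertex names and labels are made up, and the `out` helper only loosely mirrors Gremlin's `g.V(id).out(label)` step.

```python
# Tiny in-memory property graph: vertices keyed by id, edges as (src, label, dst).
vertices = {
    "u1": {"label": "user",    "name": "Ana"},
    "p1": {"label": "product", "name": "book"},
    "p2": {"label": "product", "name": "lamp"},
}
edges = [
    ("u1", "bought", "p1"),
    ("u1", "viewed", "p2"),
]

def out(vertex_id, edge_label):
    # Follow outgoing edges with the given label, returning destination ids
    # (roughly g.V(vertex_id).out(edge_label) in Gremlin terms).
    return [dst for src, lbl, dst in edges if src == vertex_id and lbl == edge_label]

bought = [vertices[v]["name"] for v in out("u1", "bought")]
print(bought)  # ['book']
```

A recommendation engine, one of the use cases above, is just a longer traversal over the same structure: users who bought what you bought, then what else they bought.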
This document is a training presentation on Databricks fundamentals and the data lakehouse concept by Dalibor Wijas from November 2022. It introduces Wijas and his experience. It then discusses what Databricks is, why it is needed, what a data lakehouse is, how Databricks enables the data lakehouse concept using Apache Spark and Delta Lake. It also covers how Databricks supports data engineering, data warehousing, and offers tools for data ingestion, transformation, pipelines and more.
Scaling and Modernizing Data Platform with Databricks - Databricks
This document summarizes Atlassian's adoption of Databricks to manage their growing data pipelines and platforms. It discusses the challenges they faced with their previous architecture around development time, collaboration, and costs. With Databricks, Atlassian was able to build scalable data pipelines using notebooks and connectors, orchestrate workflows with Airflow, and provide self-service analytics and machine learning to teams while reducing infrastructure costs and data engineering dependencies. The key benefits included reduced development time by 30%, decreased infrastructure costs by 60%, and increased adoption of Databricks and self-service across teams.
This document provides information about Amazon QuickSight, a fully managed cloud business intelligence service. It discusses how QuickSight allows users to connect to data sources, create interactive dashboards, and publish them for sharing. QuickSight is serverless, scales from 10 users to 10,000, and uses a pay-per-session pricing model in which users pay only when they access dashboards.
This document provides an overview of AWS Lake Formation and related services for building a secure data lake. It discusses how Lake Formation provides a centralized management layer for data ingestion, cleaning, security and access. It also describes how Lake Formation integrates with services like AWS Glue, Amazon S3 and ML transforms to simplify and automate many data lake tasks. Finally, it provides an example workflow for using Lake Formation to deduplicate data from various sources and grant secure access for analysis.
Amazon Web Services (AWS) provides a low cost, reliable and secure foundation for you to use as you build and deliver Software as a Service (SaaS) solutions to customers. For ISVs, the process of transition from a traditional (license based) model to a Software as a Service (SaaS) is a challenge that concerns not only the technical aspect, but also financial and commercial strategy aspects. In this presentation you will find out how AWS can become the ideal partner to support the transformation.
AWS May Webinar Series - Getting Started: Storage with Amazon S3 and Amazon G... - Amazon Web Services
If you are interested to know more about AWS Chicago Summit, please use the following to register: http://amzn.to/1RooPPL
Amazon S3 and Amazon Glacier provide developers and IT teams with secure, durable, highly-scalable object storage with no minimum fees or setup costs. In this webcast, we will provide an introduction to each service, dive deep into key features of Amazon S3 and Amazon Glacier, and explore different use cases that these services optimize.
Learning Objectives:
- Business value of Amazon S3 and Amazon Glacier
- Leveraging S3 for web applications, media delivery, big data analytics, and backup
- Leveraging Amazon Glacier to build cost-effective archives
- Understand the life cycle management of AWS' storage services
Speaker: Bill Baldwin - Database Technical Evangelist, AWS
In this session we will introduce key ETL features of AWS Glue and cover common use cases ranging from scheduled nightly data warehouse loads to near real-time, event-driven ETL flows for your data lake. We will also discuss how to build scalable, efficient, and serverless ETL pipelines.
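The extract-transform-load flow a Glue job runs can be sketched at the concept level in plain Python (this is not the Glue or PySpark API; the field names and sample data are hypothetical): extract rows from a source, transform them by renaming fields, casting types, and dropping malformed records, then load the result into a target format.

```python
import csv
import io
import json

# Hypothetical CSV source standing in for data landed in a lake.
raw = io.StringIO("id,amount\n1,10.5\n2,not-a-number\n3,7\n")

def extract(fh):
    # Read the source into row dictionaries.
    return list(csv.DictReader(fh))

def transform(rows):
    out = []
    for r in rows:
        try:
            # Rename fields and cast types for the target schema.
            out.append({"order_id": int(r["id"]), "amount_usd": float(r["amount"])})
        except ValueError:
            continue  # drop malformed records (a real job might route these to an error path)
    return out

def load(rows):
    # Emit JSON lines, a common lake-friendly target format.
    return "\n".join(json.dumps(r) for r in rows)

print(load(transform(extract(raw))))
```

In Glue the same three stages map onto crawlers/catalog for extract, job scripts for transform, and sinks (S3, Redshift, and others) for load, with the service managing the serverless execution.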
The AWS Architecture Icons release provides updated icons and guidelines to improve accessibility and align with AWS branding. Key updates include:
- Icons were updated to have a minimum contrast ratio of 3:1 for lines and 4.5:1 for text for accessibility. Line weight and text size were increased.
- The color palette was updated to match the new AWS brand colors of White, Galaxy, Nebula, Cosmos, Mars, Smile, Endor, and Orbit.
- The general icons library was expanded with new icons for concepts like cold storage, data streams, credentials, and more.
- Guidance was provided for specific services like using a generic instance icon for EC2 instead
Master the Multi-Clustered Data Warehouse - Snowflake - Matillion
Snowflake is one of the most powerful, efficient data warehouses on the market today—and we joined forces with the Snowflake team to show you how it works!
In this webinar:
- Learn how to optimize Snowflake
- Hear insider tips and tricks on how to improve performance
- Get expert insights from Craig Collier, Technical Architect from Snowflake, and Kalyan Arangam, Solution Architect from Matillion
- Find out how leading brands like Converse, Duo Security, and Pets at Home use Snowflake and Matillion ETL to make data-driven decisions
- Discover how Matillion ETL and Snowflake work together to modernize your data world
- Learn how to utilize the impressive scalability of Snowflake and Matillion
Build real-time streaming data pipelines to AWS with Confluent - Confluent
Traditional data pipelines often face scalability issues and challenges related to cost, their monolithic design, and reliance on batch data processing. They also typically operate under the premise that all data needs to be stored in a single centralized data source before it's put to practical use. Confluent Cloud on Amazon Web Services (AWS) provides a fully managed cloud-native platform that helps you simplify the way you build real-time data flows using streaming data pipelines and Apache Kafka.
Cloud native is a new paradigm for developing, deploying, and running applications using containers, microservices, and container orchestration. The Cloud Native Computing Foundation (CNCF) drives adoption of this paradigm through open source projects like Kubernetes, Prometheus, and Envoy. Cloud native applications are packaged as lightweight containers, developed as loosely coupled microservices, and deployed on elastic cloud infrastructure to optimize resource utilization. CNCF seeks to make these innovations accessible to everyone.
Actionable Insights with AI - Snowflake for Data Science - Harald Erb
Talk @ ScaleUp 360° AI Infrastructures DACH, 2021: Data scientists spend 80% and more of their time searching for and preparing data. This talk explains Snowflake’s Platform capabilities like near-unlimited data storage and instant and near-infinite compute resources and how the platform can be used to seamlessly integrate and support the machine learning libraries and tools data scientists rely on.
Bulk data loading in Snowflake involves the following steps:
1. Creating file format objects to define file types and formats
2. Creating stage objects to hold the files to be loaded
3. Staging data files in the stages
4. Listing the staged files
5. Copying data from the stages into target tables
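The five numbered steps above map directly onto Snowflake SQL statements; a hedged sketch follows, with hypothetical object names (my_csv_format, my_stage, mytable) collected as Python strings so the sequence is visible in one place.

```python
# One Snowflake SQL statement per bulk-loading step; names are examples only.
steps = [
    "CREATE FILE FORMAT my_csv_format TYPE = CSV SKIP_HEADER = 1",  # 1. file format object
    "CREATE STAGE my_stage FILE_FORMAT = my_csv_format",            # 2. stage object
    "PUT file:///tmp/data.csv @my_stage",                           # 3. stage the data files
    "LIST @my_stage",                                               # 4. list staged files
    "COPY INTO mytable FROM @my_stage",                             # 5. copy into the target table
]
for sql in steps:
    print(sql)
```

In practice each statement would be run through a Snowflake session (for example, a connector cursor's execute call), with PUT issued from a client since it uploads local files.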
Building Enterprise Solutions with Blockchain and Ledger Technology - SVC202 ... - Amazon Web Services
Blockchain technology is rapidly evolving. Are you ready to take advantage of blockchain's use cases for the enterprise? In this session, learn how AWS views blockchain and ledger technology, discover our new services, Amazon Managed Blockchain and Amazon QLDB, and understand how to use their features.
Building enterprise solutions with blockchain and ledger technology - SVC202 ... - Amazon Web Services
Blockchain technology is evolving rapidly. Are you ready to take advantage of blockchain’s use cases for the enterprise? In this session, learn how AWS views blockchain and ledger technology, get introduced to our services, and watch a demo of our new managed blockchain service, Amazon Managed Blockchain, which makes it easy to build and manage scalable Hyperledger Fabric networks on AWS.
Simple cloud migration with OpenText MigrateOpenText
Migrate any server workload to any target destination with the OpenText Migrate cloud migration platform. Learn about common migration challenges and how to choose the right cloud migration tool.
Snowflake: The Good, the Bad, and the UglyTyler Wishnoff
Learn how to solve the top 3 challenges Snowflake customers face, and what you can do to ensure high-performance, intelligent analytics at any scale. Ideal for those currently using Snowflake and those considering it. Learn more at: https://kyligence.io/
The document discusses strategies for migrating IT workloads to the cloud. It describes common drivers for cloud migration like cost reduction and agility. Potential barriers are also outlined, such as existing investments and lack of cloud expertise. The main sections of the document are on migration planning, common migration strategies ranging from rehosting to rearchitecting, examples of migration patterns, and modernizing applications on AWS.
The document discusses building a data lake on AWS. It describes various AWS services that can be used to ingest, store, transform, analyze and visualize data in the data lake. These services include Amazon S3 for storage, AWS Glue for ETL/data cataloging, AWS Lake Formation for governance, Amazon Athena/EMR for analytics and Amazon QuickSight for visualization. The document also covers data movement options from on-premises to the data lake and real-time streaming of data using services like Kinesis. Machine learning workloads can leverage Amazon SageMaker for training and deployment.
Agile Integration with APIs and Containers Workshop Nicole Maselli
The document provides an agenda for a Red Hat Agile Integration workshop. The agenda includes sessions on agile integration concepts and use cases, hands-on developer demos, and labs on contract-first API development. Participants can choose between an API design and management track or an API development and security track. The workshop aims to provide an introduction to agile integration using Red Hat products like OpenShift, Fuse, 3scale, Apicurio and Microcks.
In this presentation we will offer an overview of the fundamental concepts of graph databases and data representation and query technologies;
We will also focus on AWS Property Graph and Apache TinkerPop Gremlin. Then we'll talk about the use of Amazon Neptune: creating a cluster, loading data and running some sample queries;
This document is a training presentation on Databricks fundamentals and the data lakehouse concept by Dalibor Wijas from November 2022. It introduces Wijas and his experience. It then discusses what Databricks is, why it is needed, what a data lakehouse is, how Databricks enables the data lakehouse concept using Apache Spark and Delta Lake. It also covers how Databricks supports data engineering, data warehousing, and offers tools for data ingestion, transformation, pipelines and more.
Scaling and Modernizing Data Platform with DatabricksDatabricks
This document summarizes Atlassian's adoption of Databricks to manage their growing data pipelines and platforms. It discusses the challenges they faced with their previous architecture around development time, collaboration, and costs. With Databricks, Atlassian was able to build scalable data pipelines using notebooks and connectors, orchestrate workflows with Airflow, and provide self-service analytics and machine learning to teams while reducing infrastructure costs and data engineering dependencies. The key benefits included reduced development time by 30%, decreased infrastructure costs by 60%, and increased adoption of Databricks and self-service across teams.
This document provides information about Amazon QuickSight, a fully managed cloud business intelligence system. It discusses how QuickSight allows users to connect to data sources, create interactive dashboards, and publish them for sharing. QuickSight is serverless, scalable from 10 to 10,000 users, and uses a pay-per-session pricing model where users only pay when accessing dashboards.
This document provides an overview of AWS Lake Formation and related services for building a secure data lake. It discusses how Lake Formation provides a centralized management layer for data ingestion, cleaning, security and access. It also describes how Lake Formation integrates with services like AWS Glue, Amazon S3 and ML transforms to simplify and automate many data lake tasks. Finally, it provides an example workflow for using Lake Formation to deduplicate data from various sources and grant secure access for analysis.
Amazon Web Services (AWS) provides a low cost, reliable and secure foundation for you to use as you build and deliver Software as a Service (SaaS) solutions to customers. For ISVs, the process of transition from a traditional (license based) model to a Software as a Service (SaaS) is a challenge that concerns not only the technical aspect, but also financial and commercial strategy aspects. In this presentation you will find out how AWS can become the ideal partner to support the transformation.
AWS May Webinar Series - Getting Started: Storage with Amazon S3 and Amazon G...Amazon Web Services
If you are interested to know more about AWS Chicago Summit, please use the following to register: http://amzn.to/1RooPPL
Amazon S3 and Amazon Glacier provide developers and IT teams with secure, durable, highly-scalable object storage with no minimum fees or setup costs. In this webcast, we will provide an introduction to each service, dive deep into key features of Amazon S3 and Amazon Glacier, and explore different use cases that these services optimize.
Learning Objectives: • Business value of Amazon S3 and Amazon Glacier • Leveraging S3 for web applications, media delivery, big data analytics and backup • Leveraging Amazon Glacier to build cost effective archives • Understand the life cycle management of AWS' storage services
Amazon Managed Blockchain is a fully managed blockchain service that makes it easy for customers to create and manage scalable blockchain-based transaction networks (blockchain networks) using the popular open source blockchain frameworks Hyperledger Fabric and Ethereum. Blockchain technologies enable groups of organizations, oftentimes in financial services and manufacturing, to securely transact, run application code, and share data without a trusted central authority. We will explore the components of blockchain technology, discuss use cases, and do a deep dive into capabilities, performance, and key innovations in Amazon Managed Blockchain.
Speaker: Bill Baldwin - Database Technical Evangelist, AWS
In this session we will introduce key ETL features of AWS Glue and cover common use cases ranging from scheduled nightly data warehouse loads to near real-time, event-driven ETL flows for your data lake. We will also discuss how to build scalable, efficient, and serverless ETL pipelines.
The AWS Architecture Icons release provides updated icons and guidelines to improve accessibility and align with AWS branding. Key updates include:
- Icons were updated to have a minimum contrast ratio of 3:1 for lines and 4.5:1 for text for accessibility. Line weight and text size were increased.
- The color palette was updated to match the new AWS brand colors of White, Galaxy, Nebula, Cosmos, Mars, Smile, Endor, and Orbit.
- The general icons library was expanded with new icons for concepts like cold storage, data streams, credentials, and more.
- Guidance was provided for specific services like using a generic instance icon for EC2 instead
Master the Multi-Clustered Data Warehouse - SnowflakeMatillion
Snowflake is one of the most powerful, efficient data warehouses on the market today—and we joined forces with the Snowflake team to show you how it works!
In this webinar:
- Learn how to optimize Snowflake
- Hear insider tips and tricks on how to improve performance
- Get expert insights from Craig Collier, Technical Architect from Snowflake, and Kalyan Arangam, Solution Architect from Matillion
- Find out how leading brands like Converse, Duo Security, and Pets at Home use Snowflake and Matillion ETL to make data-driven decisions
- Discover how Matillion ETL and Snowflake work together to modernize your data world
- Learn how to utilize the impressive scalability of Snowflake and Matillion
Build real-time streaming data pipelines to AWS with Confluentconfluent
Traditional data pipelines often face scalability issues and challenges related to cost, their monolithic design, and reliance on batch data processing. They also typically operate under the premise that all data needs to be stored in a single centralized data source before it's put to practical use. Confluent Cloud on Amazon Web Services (AWS) provides a fully managed cloud-native platform that helps you simplify the way you build real-time data flows using streaming data pipelines and Apache Kafka.
Cloud native is a new paradigm for developing, deploying, and running applications using containers, microservices, and container orchestration. The Cloud Native Computing Foundation (CNCF) drives adoption of this paradigm through open source projects like Kubernetes, Prometheus, and Envoy. Cloud native applications are packaged as lightweight containers, developed as loosely coupled microservices, and deployed on elastic cloud infrastructure to optimize resource utilization. CNCF seeks to make these innovations accessible to everyone.
Actionable Insights with AI - Snowflake for Data Science (Harald Erb)
Talk @ ScaleUp 360° AI Infrastructures DACH, 2021: Data scientists spend 80% or more of their time searching for and preparing data. This talk explains Snowflake’s platform capabilities, such as near-unlimited data storage and instant, near-infinite compute resources, and how the platform can be used to seamlessly integrate and support the machine learning libraries and tools data scientists rely on.
Bulk data loading in Snowflake involves the following steps:
1. Creating file format objects to define file types and formats
2. Creating stage objects to store loaded files
3. Staging data files in the stages
4. Listing the staged files
5. Copying data from the stages into target tables
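As a sketch, the five steps above map onto Snowflake SQL statements roughly as follows. All object names (`my_csv_format`, `my_stage`, `target_table`) and the file path are hypothetical, and the exact options depend on your data; the statements are generated here from Python templates so the sequence can be checked end to end.

```python
# Sketch of the five bulk-loading steps as Snowflake SQL statements.
# All object names and the file path are hypothetical examples.

def bulk_load_statements(fmt, stage, table, local_path):
    """Return the SQL for each step of a Snowflake bulk load, in order."""
    return [
        # 1. File format object: describes the type and layout of the files.
        f"CREATE OR REPLACE FILE FORMAT {fmt} TYPE = 'CSV' SKIP_HEADER = 1;",
        # 2. Stage object: a named location that will hold the staged files.
        f"CREATE OR REPLACE STAGE {stage} FILE_FORMAT = {fmt};",
        # 3. Stage the local data files (PUT uploads them to the stage).
        f"PUT file://{local_path} @{stage};",
        # 4. List what landed in the stage as a sanity check.
        f"LIST @{stage};",
        # 5. Copy the staged data into the target table.
        f"COPY INTO {table} FROM @{stage};",
    ]

for stmt in bulk_load_statements("my_csv_format", "my_stage",
                                 "target_table", "/tmp/data.csv"):
    print(stmt)
```

In a real session these statements would be run one by one in a Snowflake worksheet or via a connector, with `LIST` used to confirm the files staged correctly before the `COPY INTO`.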
Building Enterprise Solutions with Blockchain and Ledger Technology - SVC202 ... (Amazon Web Services)
Blockchain technology is rapidly evolving. Are you ready to take advantage of blockchain's use cases for the enterprise? In this session, learn how AWS views blockchain and ledger technology, discover our new services, Amazon Managed Blockchain and Amazon QLDB, and understand how to use their features.
Building enterprise solutions with blockchain and ledger technology - SVC202 ... (Amazon Web Services)
Blockchain technology is evolving rapidly. Are you ready to take advantage of blockchain’s use cases for the enterprise? In this session, learn how AWS views blockchain and ledger technology, get introduced to our services, and watch a demo of our new managed blockchain service, Amazon Managed Blockchain, which makes it easy to build and manage scalable Hyperledger Fabric networks on AWS.
Building enterprise solutions with blockchain technology - SVC217 - New York ... (Amazon Web Services)
This document discusses building enterprise solutions with blockchain technology. It begins by explaining the need for ledgers with centralized trust in various industries. It then discusses the challenges of using traditional databases to manage these ledgers. Amazon Quantum Ledger Database (QLDB) and Amazon Managed Blockchain are introduced as new blockchain services that address these challenges by providing fully managed ledgers and blockchain networks. Common use cases for these services are also outlined.
Do you need a ledger database or a blockchain? - SVC310 - Chicago AWS Summit (Amazon Web Services)
This session introduces two new AWS services for blockchain and ledger technology: Amazon Quantum Ledger Database (Amazon QLDB) and Amazon Managed Blockchain. We discuss what these services do, the problems they solve, and when each should be used. We also dive deep into details about service features and how the services work; explain key concepts such as immutability and centralized vs. decentralized trust; and review use cases.
Are ledger databases interesting, or are they just part of the hype...? (javier ramirez)
Traditional databases have a poor memory. How many times has a value been updated? Has something been deleted? No idea.
Sometimes you need to record everything that happens to your data, and you want to be sure that nobody can tamper with that record.
Ledger databases solve exactly that problem. And if you want one that supports transactions at massive scale with no servers to manage, Amazon Quantum Ledger Database is worth a look.
In my talk we will see how to use QLDB to solve real problems. Demo included.
I will also cover the options AWS offers for working with Blockchain.
The document discusses blockchain technology and Amazon Web Services' blockchain services. It provides an overview of Amazon Quantum Ledger Database (QLDB) and Amazon Managed Blockchain, which allow users to build applications using blockchain ledgers without having to manage the underlying blockchain infrastructure. It also summarizes use cases such as trade finance platforms, bancassurance, and know-your-customer processes. Deloitte's Asia Pacific Blockchain Lab is mentioned as providing blockchain consulting services and prototypes.
re:Invent Round-up: Timestream, Quantum Ledger Database and Managed Blockchain (Amazon Web Services)
This document provides an overview of Amazon Web Services database services, including Amazon DynamoDB, Amazon RDS, Amazon ElastiCache, Amazon Neptune, and Amazon QLDB. It discusses the different types of databases, common use cases, and new features like Amazon Managed Blockchain and Amazon Timestream. The pricing examples show how Amazon QLDB and Amazon Managed Blockchain can provide ledger databases and blockchain networks at lower costs than traditional options.
Building system-of-record applications with Amazon QLDB - SVC218 - New York A... (Amazon Web Services)
Many organizations build system-of-record applications with ledger-like functionality because they want to maintain an accurate history of their application data. However, these ledger applications are usually implemented on relational databases, which makes building audit functionality time-consuming, error-prone, and dependent on custom development. This led us to build the world’s first fully managed ledger database, Amazon Quantum Ledger Database (Amazon QLDB). Amazon QLDB is a new class of database that provides a transparent, immutable, and cryptographically verifiable transaction log. Come to this session to learn about the features and functionality of Amazon QLDB, and see a live demo.
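To make the "cryptographically verifiable transaction log" idea concrete, here is a minimal hash-chain sketch in Python. It illustrates the concept only: QLDB's actual journal format and digest scheme are different, and the documents below are made up.

```python
import hashlib
import json

# Conceptual sketch of a verifiable ledger journal: each entry's digest
# covers the previous digest, so tampering with any historical entry
# changes every digest after it. Illustration only, not QLDB's format.

def append(journal, document):
    """Append a document, chaining its digest to the previous entry's."""
    prev = journal[-1]["digest"] if journal else "0" * 64
    payload = json.dumps(document, sort_keys=True) + prev
    digest = hashlib.sha256(payload.encode()).hexdigest()
    journal.append({"doc": document, "digest": digest})

def verify(journal):
    """Recompute the whole chain and confirm no entry was altered."""
    prev = "0" * 64
    for entry in journal:
        payload = json.dumps(entry["doc"], sort_keys=True) + prev
        if hashlib.sha256(payload.encode()).hexdigest() != entry["digest"]:
            return False
        prev = entry["digest"]
    return True

journal = []
append(journal, {"vin": "VIN-001", "owner": "alice"})
append(journal, {"vin": "VIN-001", "owner": "bob"})
assert verify(journal)

journal[0]["doc"]["owner"] = "mallory"   # tamper with history...
assert not verify(journal)               # ...and verification fails
```

The key property is that verification needs only the journal itself plus a trusted latest digest, which is essentially what QLDB's digest API exposes.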
Bonus Session: Interledger DvP Settlement on Amazon Managed Blockchain (Amazon Web Services)
Project Ubin is a collaborative project first launched in 2016 by the Monetary Authority of Singapore (MAS) and the Association of Banks in Singapore (ABS). In August 2018, MAS and SGX announced partnership with Anquan, Deloitte and Nasdaq to harness blockchain technology for settlement of tokenised assets. In order to explore the possible DvP settlement models, six blockchains of varied capabilities and features were used in prototypes. In this session, Peter from SGX will offer deeper insights into a DvP prototype developed with Deloitte that was hosted on Amazon Managed Blockchain, and will also discuss potential future applications of DLT (Distributed Ledger Technology) for capital markets, while Michael from AWS will discuss some of the common patterns seen in blockchain networks as well as features that control the privacy of data on shared blockchain networks.
Deep Dive on Amazon Managed Blockchain: re:Invent 2018 Recap at the AWS Loft ... (Amazon Web Services)
The document discusses Amazon Managed Blockchain, a fully managed service that makes it easy to create and manage scalable blockchain networks using popular open source frameworks like Hyperledger Fabric and Ethereum. It introduces blockchain concepts and components. It then explains how Amazon Managed Blockchain works, including how to create networks, its pricing model, and security features. The document also provides an overview of how transactions flow in Hyperledger Fabric.
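The Fabric transaction flow mentioned above (endorse, then order, then validate and commit) can be sketched as a toy simulation. The organization names and the two-endorsement policy are illustrative; real Fabric uses signed proposals, channels, and MSP identities rather than plain strings.

```python
# Toy simulation of the Hyperledger Fabric transaction flow: a client
# collects endorsements from peers, the ordering service sequences the
# transactions into a block, and committing peers check the endorsement
# policy before writing to the ledger. Names and policy are illustrative.

REQUIRED_ENDORSEMENTS = 2  # e.g. an "any 2 organizations" endorsement policy

def endorse(peers, tx):
    # Each endorsing peer simulates the chaincode and signs the result.
    return [f"{peer}:signed:{tx}" for peer in peers]

def order(endorsed_txs):
    # The ordering service fixes a global sequence of transactions (a block).
    return list(enumerate(endorsed_txs))

def validate_and_commit(block):
    # Committing peers verify the endorsement policy before writing.
    ledger = []
    for seq, (tx, endorsements) in block:
        if len(endorsements) >= REQUIRED_ENDORSEMENTS:
            ledger.append((seq, tx))
    return ledger

txs = [
    ("transfer A->B", endorse(["org1", "org2"], "transfer A->B")),
    ("transfer B->C", endorse(["org1"], "transfer B->C")),  # under-endorsed
]
ledger = validate_and_commit(order(txs))
print(ledger)  # only the fully endorsed transaction is committed
```

The separation of endorsement from ordering is what lets Fabric scale: execution happens in parallel on a few peers, while consensus only has to agree on transaction order.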
Automated Frameworks to Deliver DevOps at Speed and Scale on AWS (Amazon Web Services)
Global technical consultancy Contino discovered they were solving the same DevOps problems for their enterprise financial services clients again and again – so they created automated frameworks to help financial services clients accelerate their journey to DevOps.
In this webinar, you will learn how Continuum, an automated framework for creating an end-to-end DevOps pipeline in minutes on Amazon Web Services (AWS), and Momentum, a data-driven, five-phase roadmap for accelerated enterprise DevOps adoption, are helping enterprises to move faster and innovate more quickly.
Ben Wootton, Contino Co-founder and CTO, will demonstrate how Contino used Continuum to create an automated, secure continuous integration and continuous delivery (CI/CD) pipeline on AWS in under 10 minutes.
Build your first blockchain application with Amazon Managed Blockchain - SVC2... (Amazon Web Services)
Learn how to set up a blockchain network and deploy your first application using Amazon Managed Blockchain. In this hands-on workshop, attendees build a blockchain network for a nonprofit organization, enabling it to distribute funds without an intermediary while ensuring immutable transactions and full transparency to donors about how their donations are used. In addition, donors can view all of the donations received by the organization and how these donations have been spent. You must have an active AWS account to participate in this workshop.
AWS Summit Singapore 2019 | Big Data Analytics Architectural Patterns and Bes... (AWS Summits)
Speaker: Renee Lo, Head of Big Data, Analytics, and AI, ASEAN, AWS
Customer Speaker: Natalia Kozyura, Head of Innovation Center, FWD Group
We discuss architectural principles that simplify big data analytics. We'll apply these principles to various stages of big data processing: collect, store, process, analyse, and visualise. We'll discuss how to choose the right technology in each stage based on criteria such as data structure, query latency, cost, request rate, item size, data volume, durability, and so on. Finally, we provide reference architectures, design patterns, and best practices for assembling these technologies to solve your big data problems at the right cost.
Track 1 Session 6: Building a Secure and Efficient Data Analytics Platform to Accelerate Financial Innovation - HC+EMQ Cliff (Amazon Web Services)
This document discusses building a secure and efficient data analytics platform on AWS to accelerate financial innovation. It covers moving data to AWS services like S3 for storage, using AWS Glue for ETL processing, and Amazon Athena and QuickSight for analytics. AWS services provide comprehensive and open data services, a secure infrastructure, and make it easy to build a modern data platform. The document also shares how a fintech company EMQ migrated their data pipeline to serverless AWS services like Glue and Athena, improving query performance, reducing costs, and simplifying maintenance.
Cloud Smart is today’s IT modernization strategy designed to help Federal agencies adopt cloud solutions that streamline transformation and embrace modern capabilities. We will review the key aspects of the Cloud Smart strategy that agencies can focus on to meet those objectives. We will dive into the Security aspect of Cloud Smart as we focus on its impact on Trusted Internet Connections. We'll see how the new horizon of security services in AWS can help agencies implement Zero Trust Networking and we'll look at ways in which Government agencies can utilize AWS tools and services for architecture decisions that may not require TIC routing, while still meeting government-wide requirements.
The document outlines Matt Jordan's presentation on enabling cloud smart, zero-trust networking, and trusted internet connections (TIC) at a public sector summit. It discusses the concepts of cloud smart, focusing on making informed technology decisions aligned with agency missions. Zero-trust networking principles of identity assurance, least privilege access, and auditing are described. Iterating TIC to leverage commercial innovation through collaboration and automated verification is presented, with benefits including efficient deployment, relying on continuous research and development, and multilayered security.
Initial investigations into the use of blockchain often result in questions such as, “What can this technology do for us?” or “Can’t I just use a database?” rather than a more data-centric approach that can help define an effective blockchain strategy and create a competitive advantage. In this chalk talk, we review a number of practical uses of blockchain within retail, from supply chain to inventory management, and from customer service conflict resolution to a customer maintaining virtual, transferable warranty wallets. The AWS Retail team presents some of the standard applications of blockchain with AWS in this interactive session. Bring your use cases to the whiteboard!
Similar to Do you need a ledger database or a blockchain? - SVC208 - Atlanta AWS Summit
How to build Forecasting services using ML and deep learn... algorithms (Amazon Web Services)
Forecasting is an important process for many companies and is used in many contexts to accurately predict the growth and distribution of a product, the resources needed on production lines, financial projections, and much more. Amazon uses advanced forecasting techniques, and some of these services have been made available to all AWS customers.
In this session we will show how to pre-process data that contains a temporal component, and then use an algorithm that, starting from the type of data analyzed, produces an accurate forecast.
Big Data for Startups: how to create Big Data applications in Server... mode (Amazon Web Services)
The variety and volume of data created every day keeps accelerating and represents an unrepeatable opportunity to innovate and create new startups.
Yet managing large amounts of data can seem complex: building large-scale Big Data clusters looks like an investment only established companies can afford. But the elasticity of the Cloud and, in particular, Serverless services let us break through these limits.
Let's see how we can develop Big Data applications quickly, without worrying about the infrastructure, and instead dedicate all our resources to developing our ideas and creating innovative products.
You can now use Amazon Elastic Kubernetes Service (EKS) to run Kubernetes pods on AWS Fargate, the serverless compute engine built for containers on AWS. This makes it easier than ever to build and run your Kubernetes applications in the AWS cloud. In this session we will present the main features of the service and how to deploy your application in a few steps.
Twenty years ago Amazon went through a radical transformation aimed at increasing its pace of innovation. Over this period we learned how changing our approach to application development allowed us to greatly increase agility and release velocity and, ultimately, to build more reliable and scalable applications. In this session we will explain how we define modern applications and how building modern apps affects not only application architecture, but also organizational structure, development release pipelines, and even the operating model. We will also describe common approaches to modernization, including the approach used by Amazon.com itself.
How to spend up to 90% less with containers and Spot Instances (Amazon Web Services)
The use of containers keeps growing.
When properly designed, container-based applications are very often stateless and flexible.
AWS ECS, EKS, and Kubernetes on EC2 can all take advantage of Spot Instances, yielding an average saving of 70% compared to On-Demand instances. In this session we will look at the characteristics of Spot Instances and how they can easily be used on AWS. We will also learn how Spreaker uses Spot Instances to run applications of various kinds, in production, at a fraction of the on-demand cost!
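As a back-of-the-envelope check of the figures above, here is a minimal sketch. The On-Demand price is an assumption for illustration, and real Spot discounts vary by instance type, region, and time.

```python
# Illustrative Spot vs On-Demand cost comparison. The hourly price is
# an assumed example value, not a real AWS price.

on_demand_hourly = 0.10   # assumed On-Demand price, $/hour
spot_discount = 0.70      # the ~70% average discount cited above

spot_hourly = on_demand_hourly * (1 - spot_discount)
hours_per_month = 730     # average hours in a month

monthly_on_demand = on_demand_hourly * hours_per_month
monthly_spot = spot_hourly * hours_per_month
savings = 1 - monthly_spot / monthly_on_demand

print(f"On-Demand: ${monthly_on_demand:.2f}/mo, "
      f"Spot: ${monthly_spot:.2f}/mo, saving {savings:.0%}")
```

The "up to 90%" headline figure comes from workloads and instance types where the Spot discount is deeper than the 70% average used here.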
In recent months, many customers have been asking us how to monetise Open APIs, simplify Fintech integrations, and accelerate adoption of various Open Banking business models. AWS and FinConecta would therefore like to invite you to the Open Finance marketplace presentation on October 20th.
Event Agenda:
Open banking so far (short recap)
• PSD2, OB UK, OB Australia, OB LATAM, OB Israel
Intro to Open Finance marketplace
• Scope
• Features
• Tech overview and Demo
The role of the Cloud
The Future of APIs
• Complying with regulation
• Monetizing data / APIs
• Business models
• Time to market
One platform for all: a Strategic approach
Q&A
Make your startup's offering stand out in the market with Machine Lea... services (Amazon Web Services)
To create value and build a differentiated, recognizable offering, successful startups know how to combine established technologies with innovative, purpose-built components.
AWS provides ready-to-use services and, at the same time, lets you customize and build the differentiating elements of your own offering.
Focusing on Machine Learning technologies, we will see how to select the artificial intelligence services offered by AWS and, with the help of a demo, how to build custom Machine Learning models using SageMaker Studio.
OpsWorks Configuration Management: automate the management and deployments of... (Amazon Web Services)
With the traditional approach to IT, implementing DevOps practices was difficult for many years: they often involved manual activities, occasionally causing application downtime and interrupting users' work. With the advent of the cloud, DevOps techniques are now within everyone's reach, at low cost, for any kind of workload, guaranteeing greater system reliability and delivering significant improvements in business continuity.
AWS offers AWS OpsWorks as a Configuration Management tool that aims to automate and simplify the management and deployment of EC2 instances by means of Chef and Puppet.
Find out how to leverage AWS OpsWorks to guarantee the reliability of your application running on EC2 instances.
Microsoft Active Directory on AWS to support your Windows Workloads (Amazon Web Services)
Want to know your options for running Microsoft Active Directory on AWS? When moving Microsoft workloads to AWS, it is important to consider how to deploy Microsoft Active Directory to support group policy management, authentication, and authorization. In this session, we discuss the options for deploying Microsoft Active Directory on AWS, including AWS Directory Service for Microsoft Active Directory and running Active Directory on Windows on Amazon Elastic Compute Cloud (Amazon EC2). We cover topics such as integrating your on-premises Microsoft Active Directory environment into the cloud and using SaaS applications, such as Office 365, with AWS Single Sign-On.
From facial recognition to detecting fraud or manufacturing defects, image and video analysis powered by artificial intelligence techniques is evolving and improving at a rapid pace. In this webinar we will explore the possibilities offered by AWS services for applying state-of-the-art computer vision techniques to real-world scenarios.
Amazon Web Services and VMware are hosting a free virtual event next Wednesday, October 14th, from 12:00 to 13:00, dedicated to VMware Cloud™ on AWS, the on-demand service that lets you run applications in VMware vSphere®-based cloud environments and access a broad range of AWS services, taking full advantage of the AWS cloud while protecting your existing VMware investments.
Create your first serverless ledger-based app with QLDB and NodeJS (Amazon Web Services)
Many companies today build applications with ledger-like functionality, for example to verify the history of credits and debits in banking transactions, or to track the flow of their products through the supply chain.
At the heart of these solutions are ledger databases, which provide a transparent, immutable, and cryptographically verifiable log of transactions, but which are complex and costly tools to manage.
Amazon QLDB removes the need to build complex custom systems by providing a fully managed, serverless ledger database.
In this session we will see how to build a complete serverless application that uses QLDB's capabilities.
With the rise of microservice architectures and rich mobile and web applications, APIs are more important than ever for delivering a great user experience. In this session we will learn how to tackle modern API design challenges with GraphQL, an open source API query language used by Facebook, Amazon, and others, and how to use AWS AppSync, a managed serverless GraphQL service on AWS. We will dig into several scenarios, understanding how AppSync can help solve these use cases by building modern APIs with real-time and offline data update capabilities.
We will also learn how Sky Italia uses AWS AppSync to deliver real-time sports updates to users of its web portal.
Oracle Databases and VMware Cloud™ on AWS: myths to debunk (Amazon Web Services)
Many organizations take advantage of the cloud by migrating their Oracle workloads, securing significant gains in agility and cost efficiency.
Migrating these workloads can create complexity when modernizing and refactoring applications, along with performance risks that can arise when moving applications out of on-premises data centers.
In these slides, AWS and VMware experts present simple, practical tips to ease and simplify the migration of Oracle workloads while accelerating the transformation to the cloud; they dive into the architecture and demonstrate how to take full advantage of VMware Cloud™ on AWS.
1) The document discusses building a minimum viable product (MVP) using Amazon Web Services (AWS).
2) It provides an example of an MVP for an omni-channel messenger platform, built starting in 2017, that connects ecommerce stores to customers via web chat, Facebook Messenger, WhatsApp, and other channels.
3) The founder discusses how they started with an MVP in 2017 with 200 ecommerce stores in Hong Kong and Taiwan, and have since expanded to over 5000 clients across Southeast Asia using AWS for scaling.
This document discusses pitch decks and fundraising materials. It explains that venture capitalists will typically spend only 3 minutes and 44 seconds reviewing a pitch deck. Therefore, the deck needs to tell a compelling story to grab their attention. It also provides tips on tailoring different types of decks for different purposes, such as creating a concise 1-2 page teaser, a presentation deck for pitching in-person, and a more detailed read-only or fundraising deck. The document stresses the importance of including key information like the problem, solution, product, traction, market size, plans, team, and ask.
This document discusses building serverless web applications using AWS services like API Gateway, Lambda, DynamoDB, S3 and Amplify. It provides an overview of each service and how they can work together to create a scalable, secure and cost-effective serverless application stack without having to manage servers or infrastructure. Key services covered include API Gateway for hosting APIs, Lambda for backend logic, DynamoDB for database needs, S3 for static content, and Amplify for frontend hosting and continuous deployment.
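A minimal sketch of the API Gateway + Lambda part of that stack, assuming Python for the function. The route, the payload, and the in-memory stand-in for DynamoDB are hypothetical, but the event/response shape follows the standard API Gateway Lambda proxy-integration contract, so the handler can be invoked locally with a fake event.

```python
import json

# Sketch of a Lambda function behind API Gateway (proxy integration).
# FAKE_TABLE stands in for a DynamoDB table so the sketch runs locally;
# the item shape and route parameter are hypothetical examples.

FAKE_TABLE = {"42": {"id": "42", "name": "widget"}}

def handler(event, context=None):
    # API Gateway passes path parameters in the event under pathParameters.
    item_id = (event.get("pathParameters") or {}).get("id")
    item = FAKE_TABLE.get(item_id)
    if item is None:
        return {"statusCode": 404, "body": json.dumps({"error": "not found"})}
    return {
        "statusCode": 200,
        "headers": {"Content-Type": "application/json"},
        "body": json.dumps(item),
    }

# Local invocation with a fake API Gateway event:
response = handler({"pathParameters": {"id": "42"}})
print(response["statusCode"])  # 200
```

In the full stack described above, the `FAKE_TABLE` lookup would become a DynamoDB `get_item` call, S3 and Amplify would serve the frontend, and API Gateway would route requests to this handler.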
This document provides tips for fundraising from startup founders Roland Yau and Sze Lok Chan. It discusses generating competition to create urgency for investors, fundraising in parallel rather than sequentially, having a clear fundraising narrative focused on what you do and why it's compelling, and prioritizing relationships with people over firms. It also notes how the pandemic has changed fundraising, with examples of deals done virtually during this time. The tips emphasize being fully prepared before fundraising and cultivating connections with investors in advance.
AWS HK Startup Day: Building Interactive websites while automating for efficien... (AWS Summits)
This document discusses Amazon's machine learning services for building conversational interfaces and extracting insights from unstructured text and audio. It describes Amazon Lex for creating chatbots, Amazon Comprehend for natural language processing tasks like entity extraction and sentiment analysis, and how they can be used together for applications like intelligent call centers and content analysis. Pre-trained APIs simplify adding machine learning to apps without requiring ML expertise.
Amazon Elastic Container Service (Amazon ECS) is a highly scalable container management service that simplifies the management of Docker containers through an orchestration layer controlling deployment and lifecycle. In this session we will present the service's main features, reference architectures for different workloads, and the simple steps needed to quickly migrate one or more of your containers.