The document surveys MongoDB and data management. It covers how MongoDB can help with data integrity, confidentiality, correctness, and reliability; how it supports dynamic schemas, replication for high availability, and security features; and how it fits into a modern enterprise technology stack, including integration with Hadoop. MongoDB can also be deployed on Azure as a fully managed service.
Best Practices for MongoDB in Today's Telecommunications Market - MongoDB
It is a challenging time for telecommunications providers: landline voice is in decline, mobile voice margins are falling, and the high-speed Internet market is saturated. To win in this increasingly competitive landscape, operators must dramatically increase their pace of innovation, focus on the customer experience and, most importantly, bring new applications to market.
In this webinar, find out how MongoDB is enabling telecommunications operators worldwide to:
Improve their current offerings with faster time to market
Roll out new services such as M2M, unified messaging, cloud, and OTT video
Increase customer satisfaction
Operators have tremendous assets in their networks, billing relationships, and knowledge of subscriber behavior across devices and applications. Those that leverage these strengths with greater agility will succeed in the market. The use cases are specific to Telecommunications, but the patterns of usage - agility, scale, global distribution - will be applicable across many industries.
Webinar: How Financial Firms Create a Single Customer View with MongoDB - MongoDB
Learn why a tier 1 bank, top 5 insurance provider and other global financial services companies are flocking to MongoDB. This webinar focuses on how firms use MongoDB to generate a single customer view not only to comply with KYC and other regulations, but also to engage customers efficiently, which helps reduce churn and increase wallet share while still reducing costs. We will focus on how MongoDB's dynamic schema, real-time replication and auto-scaling make it possible to create a global, unified data hub aggregating disparate data sources, which can be made available to customers, customer service representatives (CSRs), and relationship managers (RMs).
Webinar: Making A Single View of the Customer Real with MongoDB - MongoDB
Tier 1 banks, top insurance providers and other global financial services institutions have discovered that with the use of MongoDB, they are able to achieve a single view of the customer. This allows them not only to comply with KYC and other regulations, but also to engage customers efficiently, which helps reduce churn and increase wallet share while reducing costs. We will focus on how MongoDB's dynamic schema, real-time replication and auto-scaling make it possible to create a global, unified data hub aggregating disparate data sources, which can be made available to customers, customer service representatives (CSRs), and relationship managers (RMs).
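The single-view pattern these webinars describe can be sketched at the document level. The following Python snippet is only an illustration: the source-system names and fields are invented, and the database round-trips are omitted. It shows one way per-system records might be folded into a unified customer document:

```python
# Hypothetical sketch of a single-customer-view document build.
# System names ("crm", "core_banking") and fields are invented.

def build_single_view(customer_id, sources):
    """Fold per-system records into one unified customer document."""
    unified = {"_id": customer_id, "sources": {}}
    for system_name, record in sources.items():
        # Keep the raw record for lineage, and promote common fields
        # the first time they are seen.
        unified["sources"][system_name] = record
        for field in ("name", "email", "phone"):
            if field in record and field not in unified:
                unified[field] = record[field]
    return unified

crm = {"name": "A. Smith", "email": "a.smith@example.com"}
core_banking = {"name": "Alice Smith", "phone": "+1-555-0100"}

doc = build_single_view("cust-42", {"crm": crm, "core_banking": core_banking})
```

In a real hub, the per-source records are typically kept alongside the promoted fields, as this sketch does with its "sources" subdocument, so that lineage and conflict resolution remain possible.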
Webinar: How Financial Services Organizations Use MongoDB - MongoDB
The finance industry is facing major strain on existing IT infrastructure, systems, and design practices:
New pressures and industry regulation have meant increased volume, consolidation & reconciliation, and variability of data
Mobile and other channels demand significantly more flexible programming and data design environments
Pressure to improve operational efficiency and contain costs is ever increasing
MongoDB is an alternative that lets you create and consume data rapidly and securely, no matter how it is structured across channels and products. It also makes it easy to aggregate data from multiple systems, while lowering TCO and delivering applications faster.
In this session, we will present common MongoDB use cases including, but not limited to:
Risk Analytics & Reporting
Tick Data Capture & Analysis
Product Catalogues
Cross-Asset Class Trade Stores
Reference Data Management
Private DBaaS
Real World MongoDB: Use Cases from Financial Services by Daniel Roberts - MongoDB
Huge upheaval in the finance industry has led to a major strain on existing IT infrastructure and systems. New finance industry regulation has meant increased volume, velocity and variability of data. This coupled with cost pressures from the business has led these institutions to seek alternatives. In this session learn how FS companies are using MongoDB to solve their problems. The use cases are specific to FS but the patterns of usage - agility, scale, global distribution - will be applicable across many industries.
Operationalizing the Value of MongoDB: The MetLife Experience - MongoDB
It was a lot of fun bringing exciting emerging technology into the rigid enterprise infrastructure ecosystem. And then the real work began: how do you make the new technology operational? Learn from MetLife’s journey of operationalizing MongoDB to a level compliant with large-enterprise requirements in high availability, recoverability, security, monitoring, alerting, workload management, and automation.
A presentation on general use cases of MongoDB in the financial services industry. In this presentation we discuss why MongoDB is ideal for large-dataset analytics, real-time processing, quantitative analysis, and other aspects that make it a good fit for FS projects.
Webinar: How Banks Use MongoDB as a Tick Database - MongoDB
Learn why MongoDB is spreading like wildfire across capital markets (and really every industry) and then focus in particular on how financial firms are enjoying the developer productivity, low TCO, and unlimited scale of MongoDB as a tick database for capturing, analyzing, and taking advantage of opportunities in tick data. This webinar illustrates how MongoDB can easily and quickly store variable data formats, like top and depth of book, multiple asset classes, and even news and social networking feeds. It will explore aggregating and analyzing tick data in real-time for automated trading or in batch for research and analysis and how auto-sharding enables MongoDB to scale with commodity hardware to satisfy unlimited storage and performance requirements.
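As a rough illustration of the batch analysis described above, the sketch below reduces a handful of variably shaped tick documents to an OHLC bar in plain Python. The document shapes are invented for the example; server-side, the same reduction would be a $group stage with $first, $max, $min, $last, and $sum accumulators:

```python
# Invented tick documents: extra fields on some ticks are fine,
# which is the "variable data formats" point the webinar makes.
ticks = [
    {"symbol": "XYZ", "price": 101.00, "size": 200},
    {"symbol": "XYZ", "price": 102.50, "size": 50, "venue": "ARCA"},
    {"symbol": "XYZ", "price": 100.75, "size": 300},
    {"symbol": "XYZ", "price": 101.25, "size": 100},
]

def ohlc(ticks):
    """Reduce a time-ordered list of ticks to one OHLC bar."""
    prices = [t["price"] for t in ticks]
    return {
        "open": prices[0],
        "high": max(prices),
        "low": min(prices),
        "close": prices[-1],
        "volume": sum(t["size"] for t in ticks),
    }

bar = ohlc(ticks)
```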
A Customer-Centric Banking Platform Powered by MongoDB - MongoDB
Speaker: Alan Reyes Vilchis, Technical Lead, Banco Azteca
Level: 200 (Intermediate)
Track: Developer
Business apps powered by single customer views (SCVs) are one of the most prominent uses of MongoDB. However, each view presents unique challenges, such as distilling data from providers, removing duplication, matching records across different systems, and more.
The team at Banco Azteca (part of Grupo Salinas, a holding company of enterprises in media, telecommunications and financial services) in Mexico City has launched various customer-centric banking services that are powered by an SCV built on MongoDB, which has not only allowed the expansion into mobile-first consumer markets, but has also helped in identifying and preventing fraud across the group’s enterprises. This talk will explore the overall initiative, and place emphasis on the technical innovations regarding design, serialization and transactions across multiple systems.
What You Will Learn:
- “Serialization magic” using the MongoDB Java driver and Jackson
- Implementing transactional-like logic across different systems
- Conceptual and physical design for building a Single-View
Webinar: How to Drive Business Value in Financial Services with MongoDB - MongoDB
Huge upheaval in the finance industry has led to a major strain on existing IT infrastructure and systems. New finance industry regulation has meant increased volume, velocity and variability of data. This coupled with cost pressures from the business has led these institutions to seek alternatives. Top tier institutions like MetLife have turned to MongoDB because of the enormous business value it enables.
In this session, hear how MongoDB enabled these successful real world examples:
Single View of a Customer - 3 months and $2M for a single view of a customer across 50 source systems
Reference Data Management - $40M in cost savings from migrating to MongoDB for reference data management
Private cloud - MongoDB as a PaaS across a tier 1 bank for enabling agility for operations, not just the developer
The use cases are specific to financial services but the patterns of usage - agility, scale, global distribution - will be applicable across many industries.
Webinar: How MongoDB is Used to Manage Reference Data - May 2014 - MongoDB
Managing and distributing reference data globally has always been a challenge for financial institutions. Managing and maintaining database schemas while integrating and replicating that data across geographies is costly and time consuming. MongoDB's native replication capabilities and partitioned architecture make it simple to distribute and synchronize data efficiently across the globe. MongoDB’s dynamic schema dramatically reduces database maintenance for schema migrations – data structure changes can be applied with no down time, and with no impact to existing applications. For example, by migrating its reference data management application to MongoDB, a Tier 1 bank dramatically reduced the license and hardware costs associated with the proprietary relational database it previously ran.
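The zero-downtime schema change mentioned above is often handled with a lazy, read-time migration. The following is a hedged sketch of that pattern, not the webinar's actual code; the schema_version field and document shapes are assumptions for illustration:

```python
# Old- and new-shape documents coexist in one collection; the
# application upgrades v1 documents on the fly as it reads them.

def read_security(doc):
    """Return a document in the current (v2) shape, upgrading v1 lazily."""
    if doc.get("schema_version", 1) == 1:
        # v1 stored a flat issuer string; v2 nests it with a country code.
        doc = {
            "schema_version": 2,
            "isin": doc["isin"],
            "issuer": {"name": doc["issuer"], "country": None},
        }
    return doc

old = {"isin": "US0378331005", "issuer": "Apple Inc."}          # v1 shape
new = {"schema_version": 2, "isin": "US5949181045",
       "issuer": {"name": "Microsoft Corp.", "country": "US"}}  # v2 shape

upgraded = read_security(old)
```

Because both shapes can coexist in the same collection, the application can be deployed first and documents upgraded gradually, with no migration window and no impact on existing readers.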
Webinar: Real-time Risk Management and Regulatory Reporting with MongoDB - MongoDB
Real-time risk management coupled with the requirements for regulatory reporting are top of mind for many heads of risk. In this webinar, we will cover how MongoDB can help with:
Implementing proactive risk controls
Aggregating risk on demand
Creating an adaptive regulatory reporting platform
Running cost-effective risk calculations
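As a toy illustration of aggregating risk on demand, the snippet below sums exposure by counterparty in plain Python. The trade documents are invented; server-side, this would be a single $group/$sum aggregation stage:

```python
from collections import defaultdict

# Invented trade documents for illustration only.
trades = [
    {"counterparty": "BANK_A", "exposure": 1_500_000},
    {"counterparty": "BANK_B", "exposure": 750_000},
    {"counterparty": "BANK_A", "exposure": 250_000},
]

# The equivalent MongoDB pipeline would be:
# [{"$group": {"_id": "$counterparty", "total": {"$sum": "$exposure"}}}]
def aggregate_exposure(trades):
    """Sum exposure per counterparty, mirroring a $group/$sum stage."""
    totals = defaultdict(int)
    for t in trades:
        totals[t["counterparty"]] += t["exposure"]
    return dict(totals)

totals = aggregate_exposure(trades)
```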
The Connected Consumer – Real-time Customer 360 - Capgemini
With Business Data Lake technologies based on EMC’s Big Data portfolio, it becomes possible to move away from channel-specific analytics towards a 360° customer view.
This presentation will show how technologies like Spark, Hadoop, and Kafka help companies gain a real-time view of everything their customers do and make changes to customer touch points whether mobile, web, in-store, direct marketing or existing transactional systems.
Presented by Steve Jones, Vice President, Insights & Data, Capgemini at EMC World 2016
http://www.capgemini.com/emc
The concept of a 360° view, especially of customers (though it potentially applies to other things too), has been around for a long time. The idea behind the 360° view of customers is that the more you know about your customers, the easier it will be to meet their needs, both in terms of products and aftersales care, and to market additional goods and services to them in the most efficient fashion. Thus a 360° view helps with customer retention and acquisition, as well as up-sell and cross-sell.
In this presentation, which complements the Bloor whitepaper on the "Extended 360 degree view", we will discuss why we believe that extending the traditional 360° view makes sense, and we will give some use cases that demonstrate why the extended 360° view represents an opportunity, both for those that have already implemented a 360° view and for those that have not.
Customer-Centric Data Management for Better Customer Experiences - Informatica
With consumer and business buyer expectations growing exponentially, more businesses are competing on the basis of customer experience. But executing preferred customer experiences requires data about who your customers are today and what they will likely need in the future. Every business can benefit from an AI-powered master data management platform that supplies this information to line-of-business owners so they can execute great experiences at scale. The same need exists from an internal business process perspective: for example, many businesses require better data management practices to deliver preferred employee experiences. Informatica provides an MDM platform to solve for these examples and more.
The role of Big Data and Modern Data Management in Driving a Customer 360 fro... - Cloudera, Inc.
Organizations spanning all industries are in pursuit of Customer 360, which aims to integrate and enrich customer information across multiple channels, systems, devices and products in order to improve the interaction experience and maximize the value delivered. Achieving this real-time integration requires a modern approach to working with data, and the cloud provides a differentiating strategic platform for many organisations. Discover how you can strategically structure your data environment, leveraging the cloud to empower analytical deployment and create next-generation customer applications, whilst also saving costs and realising greater efficiencies.
Partner Webinar: Deliver Big Data Apps Faster With Informatica & MongoDB - MongoDB
Informatica + MongoDB is a powerful combination that increases developer productivity by up to 5x, enabling developers to build and deploy big data applications much faster. Informatica provides access to virtually all types of data from modern and legacy systems at any latency, processes and integrates data at scale, and delivers it directly into MongoDB.
With Informatica, companies can unlock the data in MongoDB for downstream analytics to improve decision making and business operations. Using the Informatica PowerCenter Big Data Edition with the PowerExchange for MongoDB adapter, users can read and write data in MongoDB, parse the JSON-based documents, and then transform the data and combine it with other information for big data analytics - all without having to write a single line of code.
In this webinar you will see a live demo and learn how to:
- Discover insights from Big Data faster
- Run better applications with better data
- Lower costs of data integration
- Deliver business impact with rapid deployment
Enabling Telco to Build and Run Modern Applications - Tugdual Grall
See how new databases like MongoDB enable telco enterprises to build and run modern applications.
This presentation was delivered in Tel Aviv in January 2015 during a telco round table organized by Matrix.
New to MongoDB? We'll provide an overview of installation, high availability through replication, scale out through sharding, and options for monitoring and backup. No prior knowledge of MongoDB is assumed. This session will jumpstart your knowledge of MongoDB operations, providing you with context for the rest of the day's content.
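As a concrete (and purely illustrative) taste of the replication topic, this is the shape of a minimal three-member replica-set configuration document; in practice it is passed to rs.initiate() in the shell, and the host names here are placeholders:

```python
# Shape of a minimal replica-set configuration document.
# Host names are placeholders, not real servers.
replica_set_config = {
    "_id": "rs0",
    "members": [
        {"_id": 0, "host": "node1.example.com:27017"},
        {"_id": 1, "host": "node2.example.com:27017"},
        {"_id": 2, "host": "node3.example.com:27017"},
    ],
}

# A replica set needs a voting majority to elect a primary, so an odd
# member count (here 3) tolerates one node failure.
majority = len(replica_set_config["members"]) // 2 + 1
```

With three members the majority is two, which is why a three-node set stays available through a single node failure.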
Choosing technologies for a big data solution in the cloudJames Serra
Has your company been building data warehouses for years using SQL Server? And are you now tasked with creating or moving your data warehouse to the cloud and modernizing it to support “Big Data”? What technologies and tools should you use? That is what this presentation will help you answer. First, we will cover what questions to ask concerning data (type, size, frequency), reporting, performance needs, on-prem vs cloud, staff technology skills, OSS requirements, cost, and MDM needs. Then we will show you common big data architecture solutions and help you to answer questions such as: Where do I store the data? Should I use a data lake? Do I still need a cube? What about Hadoop/NoSQL? Do I need the power of MPP? Should I build a "logical data warehouse"? What is this lambda architecture? Can I use Hadoop for my DW? Finally, we’ll show some architectures of real-world customer big data solutions. Come to this session to get started down the path to making the proper technology choices in moving to the cloud.
MongoDB is the leading NoSQL database for many reasons: it is an open-source, general-purpose, document-oriented database supported by a large community and an educational platform. Its horizontal scalability features allow it to fit operational big data scenarios where the business needs point to real-time analytics and ever-increasing data sets. This talk will focus on the use of MongoDB for operational big data purposes and why it is ideal for such scenarios, as well as integration with other notable big data technologies such as Hadoop and BI tools.
Norberto Leite - Senior Solutions Architect, @MongoDB.
Mongo DB presentation during the Pentaho & Big Data Ecosystem - Live Seminar 2013
MongoDB: The Operational Big Data by Norberto Leite at Big Data Spain 2014 - Big Data Spain
When one starts analysing the big data technology spectrum, one finds several different solutions for several different purposes. This can cause confusion, uncertainty and doubt about what to choose and what for, among both technical and business decision makers. This talk sheds some light on where you should consider MongoDB in your big data strategy and how to make the most of the dominant technologies in the field.
Operational Analytics Using Spark and NoSQL Data StoresDATAVERSITY
NoSQL data stores have emerged for scalable capture and real-time analysis of data. Apache Spark and Hadoop provide additional scalable analytics processing. This session looks at these technologies and how they can be used to support operational analytics to improve operational effectiveness. It also looks at an example of how operational analytics can be implemented in NoSQL environments using the Basho Data Platform with Apache Spark:
•The emergence of NoSQL, Hadoop and Apache Spark
•NoSQL Use Cases
•The need for operational analytics
•Types of operational analysis
•Key requirements for operational analytics
•Operational analytics using the Basho Data Platform with Apache Spark.
MongoDB Evenings Toronto - Monolithic to Microservices with MongoDB - MongoDB
Monolithic to Microservices with MongoDB: Building Highly Available Services
Shawn McCarthy, Senior Solutions Architect, MongoDB
MongoDB Evenings Toronto
Infusion Offices
September 27, 2016
Modern apps and services are leveraging data to engage with users in a more personalized way. Skyla Loomis talks big data, analytics, NoSQL, SQL, and how IBM Cloud is open for data.
Learn more by visiting our Bluemix Hybrid page: http://ibm.co/1PKN23h
Data Modelling for MongoDB - MongoDB.local Tel Aviv - Norberto Leite
At this point, you may be familiar with MongoDB and its Document Model.
However, what are the methods you can use to create an efficient database schema quickly and effectively?
This presentation will explore the different phases of a methodology to create a database schema. This methodology covers the description of your workload, the identification of the relationships between the elements (one-to-one, one-to-many and many-to-many) and an introduction to design patterns. Those patterns present practical solutions to different problems observed while helping our customers over the last 10 years.
In this session, you will learn about:
The differences between modeling for MongoDB versus a relational database.
A flexible methodology to model for MongoDB, which can be applied to simple projects, agile ones or more complex ones.
Overview of some common design patterns that help improve the performance of systems.
Query performance can either be a constant headache or the unsung hero of an application. MongoDB provides extremely powerful querying capabilities when used properly. As a member of the support team I will share common mistakes observed as well as tips and tricks to avoiding them.
The MongoDB Spark Connector integrates MongoDB and Apache Spark, providing users with the ability to process data in MongoDB with the massive parallelism of Spark. The connector gives users access to Spark's streaming capabilities, machine learning libraries, and interactive processing through the Spark shell, Dataframes and Datasets. We'll take a tour of the connector with a focus on practical use of the connector, and run a demo using both Spark and MongoDB for data processing.
A technical review of features introduced in MongoDB 3.4: graph capabilities, the MongoDB UI tool Compass, improvements to replication and to aggregation framework stages and utilities, and operational improvements in Ops Manager and MongoDB Atlas.
MongoDB Certification Study Group - May 2016Norberto Leite
Study group session to review the certification exam: material covered, exam structure and technical requirements. Both the DBA and Developer tracks are covered to ensure technical expertise on subject matter topics specific to MongoDB.
This talk is a quick reference for all the different queryability options that MongoDB offers to developers who want to build mobile and geospatially referenced applications. We review the basic functionality as well as recent improvements in the query and indexing engine for MongoDB's geospatial features.
Slide deck presented at http://devternity.com/ around MongoDB internals. We review the usage patterns of MongoDB, the different storage engines and persistency models, as well as the definition of documents and general data structures.
This presentation contains a preview of MongoDB 3.2 upcoming release where we explore the new storage engines, aggregation framework enhancements and utility features like document validation and partial indexes.
This presentation reviews the integration details of the springframework and MongoDB. We approach some of the most popular projects of the Spring stack, spring data, spring boot, spring batch ... and how we can easily build applications with MongoDB as backend. This presentation was produced for a webinar hosted by Pivotal.
MongoDB was conceived for the cloud age. Making sure that MongoDB is compatible and performant across cloud providers is mandatory to achieve complete integration with platforms and systems. Azure is one of the biggest IaaS platforms available and very popular among developers who work on the Microsoft stack.
Agile software development is becoming the de facto way of building software these days. More and more enterprises, from large Fortune 500 companies to small start-up shops, are adopting agile development methodologies. But agile software development is more than just a methodology or a practice. It is also a combined set of tools and platforms at our disposal that allow us to iterate faster, get to market sooner and also fail faster. This set of tools augments our development cycles by a few orders of magnitude and allows developers to be much more productive.
When dealing with infrastructure we often go through the process of determining the different resources needed to meet our application's requirements. This talk looks into the way resources are used by MongoDB and which aspects should be considered to determine the sizing, capacity and deployment of a MongoDB cluster, given the different scenarios, sets of operations and storage engines available.
Modern architectures are moving away from a "one size fits all" approach. We are well aware that we need to use the best tools for the job. Given the large selection of options available today, chances are that you will end up managing data in MongoDB for your operational workload and with Spark for your high speed data processing needs.
Description: When we model documents or data structures there are some key aspects that need to be examined not only for functional and architectural purposes but also to take into consideration the distribution of data nodes, streaming capabilities, aggregation and queryability options and how we can integrate the different data processing software, like Spark, that can benefit from subtle but substantial model changes. A clear example is when embedding or referencing documents and their implications on high speed processing.
Over the course of this talk we will detail the benefits of a good document model for the operational workload, as well as the types of transformations we should incorporate into our document model to adjust for the high-speed processing capabilities of Spark.
We will look into the different options that we have to connect these two different systems, how to model according to different workloads, what kind of operators we need to be aware of for top performance and what kind of design and architectures we should put in place to make sure that all of these systems work well together.
Over the course of the talk we will showcase different libraries that enable the integration between spark and MongoDB, such as MongoDB Hadoop Connector, Stratio Connector and MongoDB Spark Native Connector.
By the end of the talk I expect the attendees to have an understanding of:
How they connect their MongoDB clusters with Spark
Which use cases show a net benefit for connecting these two systems
What kind of architecture design should be considered for making the most of Spark + MongoDB
How documents can be modeled for better performance and operational process, while processing these data sets stored in MongoDB.
The talk is suitable for:
Developers that want to understand how to leverage Spark
Architects that want to integrate their existing MongoDB cluster and have real time high speed processing needs
Data scientists that know about Spark, are playing with Spark and want to integrate with MongoDB for their persistency layer
Talk about schema design practices and techniques that can be put into play in big data scenarios as well as in very flexible data scenarios. This talk was presented at PyConUK15.
Strongly Typed Languages and Flexible SchemasNorberto Leite
We like strongly typed languages and use them alongside flexible-schema databases. What challenges do we face in keeping data coherent and validating formats, and what strategies and tools (ODMs, versioning, migrations, and the like) can help? We also review the tradeoffs of these strategies.
This presentation touches on the basics of MongoMK, the Jackrabbit Oak persistency layer implementation on MongoDB, and how to deploy, operate, manage and size your MongoDB cluster in AEM environments.
Introductory talk on how MongoDB can enable new-age software, taking into account expected growth rates, the constant availability of services and the new business models that appear on a daily basis.
Presentation on MongoDB and Node.JS. We describe how to do basic CRUD operations (insert, remove, update, find) how to aggregate using node.js. We also discuss a bit of Meteor, MEAN Stack and other ODMs and projects on Javascript and MongoDB
7. THE LARGEST ECOSYSTEM
9,000,000+
MongoDB Downloads
250,000+
Online Education Registrants
35,000+
MongoDB User Group Members
40,000+
MongoDB Management Service (MMS) Users
700+
Technology and Services Partners
1,000+
Customers Across All Industries
8. MONGODB BUSINESS VALUE
Enabling New Apps
Better Customer Experience
Faster Time to Market
Lower TCO
9. FORTUNE 500 & GLOBAL 500
10 of the Top Financial Services Institutions
10 of the Top Electronics Companies
10 of the Top Media and Entertainment Companies
10 of the Top Retailers
10 of the Top Telcos
8 of the Top Technology Companies
6 of the Top Healthcare Companies
12. Data Treatment
• Data Treatment in IT is important for several reasons
– Integrity
– Confidentiality
– Correctness & Reliability
– Value
On Big Data and flexible databases it's even more important!
14. Integrity
• Several Factors can influence data integrity
– Application Data Corruption
– Migrations
– "Fat Finger" events
• Different Strategies for dealing with those
– Backups
– Delayed Replicas
– Database Decoupling Architecture
– User Roles and Grants
15. Replica Sets
Replica Set – two or more copies
Self-healing shard
Addresses availability considerations:
High Availability
Disaster Recovery
Maintenance
Deployment Flexibility
Data locality to users
Workload isolation: operational & analytics
Delayed Replicas
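A delayed replica is declared as an ordinary member in the replica set configuration document. A minimal, hedged sketch of such a member entry (the hostname is a placeholder; the delay field is named slaveDelay in older servers and secondaryDelaySecs from MongoDB 5.0 onward):

```python
# Sketch of one delayed, hidden member in a replica set configuration.
# Hostname is hypothetical; apply via rs.reconfig() / the replSetReconfig command.
delayed_member = {
    "_id": 3,
    "host": "mongodb3.example.net:27017",  # placeholder host
    "priority": 0,        # can never be elected primary
    "hidden": True,       # invisible to client reads
    "slaveDelay": 3600,   # lags the primary by 1 hour ("secondaryDelaySecs" in 5.0+)
}
```

Because the member lags the primary by a fixed window, a "fat finger" delete can be recovered from it before the bad write replicates.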
20. Confidentiality
• This is probably one of the biggest issues for some Big Data
technology
For me, the nearly non-existent response to the security issue is shocking. Can it be that people believe Hadoop is secure? Because it certainly is not. At every layer of the stack, vulnerabilities exist, and at the level of the data itself there are numerous concerns.
Merv Adrian – Gartner Research VP
http://blogs.gartner.com/merv-adrian/2014/01/21/security-for-hadoop-dont-look-now/
28. Correctness & Reliability
• Different systems use different approaches
– Protocol buffers
– Column-family data types in columnar databases
– Thrift
• MongoDB uses BSON for Everything!
– JSON ?
• Not really – binary JSON
• http://bsonspec.org/
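The "binary JSON" point is concrete: every BSON document is serialized as a length-prefixed sequence of typed elements. A minimal pure-Python sketch that hand-encodes a document with a single string field; the resulting bytes match the {"hello": "world"} example on bsonspec.org:

```python
import struct

def bson_encode_string_doc(key: str, value: str) -> bytes:
    """Encode a one-field document {key: value} (string value) as BSON."""
    k = key.encode() + b"\x00"                 # element name as a NUL-terminated cstring
    v = value.encode() + b"\x00"               # string payload plus its trailing NUL
    element = b"\x02" + k + struct.pack("<i", len(v)) + v  # 0x02 = UTF-8 string type
    body = element + b"\x00"                   # 0x00 terminates the document
    return struct.pack("<i", 4 + len(body)) + body  # leading int32 = total byte size

doc = bson_encode_string_doc("hello", "world")
```

The fixed type tags and length prefixes are what give MongoDB fast traversal and unambiguous field types, unlike plain-text JSON.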
29. Documents are Rich Data Structures
{
  first_name: 'Paul',
  surname: 'Miller',
  cell: '+447557505611',
  city: 'London',
  location: [45.123, 47.232],
  profession: ['banking', 'finance', 'trader'],
  cars: [
    { model: 'Bentley',
      year: 1973,
      value: 100000, … },
    { model: 'Rolls Royce',
      year: 1965,
      value: 330000, … }
  ]
}
Fields
Typed field values
Fields can contain arrays
Fields can contain an array of sub-documents
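Because sub-documents and arrays are first-class values, application code traverses a retrieved document directly, with no joins. A small pure-Python sketch (sample data mirrors the slide's document):

```python
# A rich document: typed fields, arrays, and an array of sub-documents.
person = {
    "first_name": "Paul",
    "surname": "Miller",
    "city": "London",
    "cars": [
        {"model": "Bentley", "year": 1973, "value": 100000},
        {"model": "Rolls Royce", "year": 1965, "value": 330000},
    ],
}

# Traverse the embedded array directly; no join is needed.
total_value = sum(car["value"] for car in person["cars"])
models = [car["model"] for car in person["cars"]]
```

In MongoDB the same shape is queried with dot notation, e.g. a filter on "cars.model".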
37. MongoDB@ Azure
• MongoDB was designed to run everywhere
– Cloud
– Virtual
– Bare Metal
• On any Platform
– MacOSX
– Linux
– Solaris
– Windows
38. MongoDB@ Azure
• Setting up MongoDB using Azure IaaS
– Build your instances
• Windows
• Linux
• …
• MongoDB as fully Managed Service
– Available on Azure Marketplace
– Peace of mind
43. Data Treatment
• It's important that you assure good governance of your data!
– Integrity and Consistency
– Confidentiality and Security
– Correctness and Reliability
– Value!
• MongoDB offers all of that out of the box
– No crazy setups
– Simple, Scalable, Sophisticated
– Well integrated!
45. MongoDB & Hadoop
Applications powered by MongoDB:
Low latency
Rich, fast querying
Flexible indexing
Ad hoc aggregations in database
Known data relationships
Great at looking at any subset of data
Analysis powered by Hadoop:
Longer jobs and queries
Analytical processing
Often highly partitionable
Unknown data relationships
Great at looking at all of the data
The MongoDB Connector for Hadoop bridges the two.
46. Analytics Landscape
Batch / Predictive / Ad Hoc (mins – hours)
Real-Time Dashboards / Scoring (<30 ms)
Planned Reporting (secs – mins)
Experimental
Legacy
47. MONGODB FEATURES
JSON Document Model with Dynamic Schemas
Auto-Sharding for Horizontal Scalability
Text Search
Aggregation Framework and MapReduce
Full, Flexible Index Support and Rich Queries
Built-In Replication for High Availability
Advanced Security
Large Media Storage with GridFS
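For the Aggregation Framework, a query is expressed as an ordered pipeline of stage documents. A hedged sketch (collection and field names are hypothetical) that filters policies and counts them per holder:

```python
# Hypothetical aggregation pipeline: filter first, then group and count.
# Run with db.policies.aggregate(pipeline) in the shell or via a driver.
pipeline = [
    {"$match": {"type": "PPO"}},                          # filter stage
    {"$group": {"_id": "$last", "count": {"$sum": 1}}},   # group by surname
]
```

Each stage consumes the previous stage's output, so reordering stages (e.g. matching before grouping) directly affects how much data flows through the pipeline.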
48. MongoDB Use Cases
Single View
Internet of Things
Mobile
Real-Time Analytics
Catalog
Personalization
Content Management
49. Enterprise Architecture
Customer-side Applications: MOBILE, WEB and CMS, IoT, SaaS / High-scale Online Services
Business Operations Applications
Business Management Software: CRM, ERP, ITSM
Operational Tools: MONITORING, SYSLOG
Core Business Specific Systems
Operational Data Hub
Analytics: REAL-TIME, DATA WAREHOUSE, COMPUTE CLUSTER, BUSINESS INTELLIGENCE
50. Common Applications
Customer-side Applications: MOBILE, WEB and CMS
Development Productivity
• End-to-End JSON = Web Development Productivity
• Asset Catalog Management: managing frequently evolving schemas
• Native search functionality
Geo-Aware Topology
• Multi-Active Data Center Support
Web-scale
• Scale out economically without vertical scaling limitations
51. Strategic Initiatives
Customer-side Applications: IoT, SaaS / High-scale Online Services
SaaS: Moving to the Cloud
• Need for High Availability (e.g. 99.999% uptime SLA)
• No-downtime schema migrations
• Native cross-data-center replication and fault tolerance
• Scale for multi-tenancy
• Operability at scale: tools to manage hundreds of nodes
IoT: Connecting the World
• Data model that can support a variety of data types
• Managing a volatile schema for ever-growing and changing data sources
• Scale big for high throughput and data volumes
• Geospatial support
52. Operational Data Hub
Operational Data Hub
An Example: The MetLife Wall
A 360° customer view, aggregating 70 systems, delivered in 3 months
53. Relational Model Challenges
70 different policy schemas: how can we translate this into a customer view?
ETL 70 applications into a dimensional model? Integrating even a few is hard…
54. MetLife Wall
Strategy: All documents can have variable schemas
db.policies.find({
  first: "Dylan",
  last: "Young",
  type: { $in: ["Healthcare", "PPO", "HMO", "Auto"] }
})
Collection of Policies
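The $in semantics can be mirrored in plain Python to show what this query matches across documents with varying shapes (the sample policies below are hypothetical):

```python
# Hypothetical policy documents; note that each has a different schema.
policies = [
    {"first": "Dylan", "last": "Young", "type": "PPO", "copay": 20},
    {"first": "Dylan", "last": "Young", "type": "Auto", "vehicle": "sedan"},
    {"first": "Dylan", "last": "Young", "type": "Life", "beneficiary": "spouse"},
    {"first": "Ana",   "last": "Cruz",  "type": "HMO"},
]

# Equivalent of the filter {first: "Dylan", last: "Young", type: {$in: [...]}}.
wanted = {"Healthcare", "PPO", "HMO", "Auto"}
wall = [p for p in policies
        if p["first"] == "Dylan" and p["last"] == "Young" and p["type"] in wanted]
```

The variable fields (copay, vehicle, beneficiary) never block the match; only the filtered fields matter, which is what lets one collection hold 70 different policy shapes.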
55. Data Hub: Master Data Distribution
[Diagram: one primary replicating in real time to multiple secondaries, distributing master data.]
56. For More Information
Resource Location
Case Studies mongodb.com/customers
Presentations mongodb.com/presentations
Free Online Training education.mongodb.com
Webinars and Events mongodb.com/events
Documentation docs.mongodb.org
MongoDB Downloads mongodb.com/download
Additional Info info@mongodb.com
Editor's Notes
High Availability – Ensure application availability during many types of failures
Disaster Recovery – Address the RTO and RPO goals for business continuity
Maintenance – Perform upgrades and other maintenance operations with no application downtime
Secondaries can be used for a variety of applications – failover, hot backup, rolling upgrades, data locality and privacy and workload isolation
What kinds of tasks?
Provisioning. Any topology, at scale, with the click of a button.
Upgrades. In minutes, with no downtime.
Scale. Add capacity without taking your application offline.
Continuous Backup. Customize to meet your recovery goals.
Point-in-time Recovery. Restore to any point in time, because disasters aren’t scheduled.
Performance Alerts. Monitor 100+ system metrics and get custom alerts before your system degrades.
MongoDB provides agility, scalability, and performance without sacrificing the functionality of relational databases, like full index support and rich queries. Indexes: secondary, compound, text search, geospatial, and more.