Presentation on general use cases of MongoDB in the Financial Services industry. In this presentation we discuss why MongoDB is well suited to large-dataset analytics, real-time processing, quantitative analysis, and other aspects that make it a strong fit for FS projects.
Webinar: How Financial Services Organizations Use MongoDB (MongoDB)
The finance industry is facing major strain on existing IT infrastructure, systems, and design practices:
New pressures and industry regulation have meant increased volume, consolidation & reconciliation, and variability of data
Mobile and other channels demand significantly more flexible programming and data design environments
Pressure to improve operational efficiency and contain costs is ever increasing
MongoDB is the alternative that allows you to efficiently create and consume data, rapidly and securely, no matter how it is structured across channels and products, and makes it easy to aggregate data from multiple systems, while lowering TCO and delivering applications faster.
In this session, we will present on common MongoDB use cases including, but not limited to:
Risk Analytics & Reporting
Tick Data Capture & Analysis
Product Catalogues
Cross-Asset Class Trade Stores
Reference Data Management
Private DBaaS
Real World MongoDB: Use Cases from Financial Services by Daniel Roberts (MongoDB)
Huge upheaval in the finance industry has led to a major strain on existing IT infrastructure and systems. New finance industry regulation has meant increased volume, velocity and variability of data. This coupled with cost pressures from the business has led these institutions to seek alternatives. In this session learn how FS companies are using MongoDB to solve their problems. The use cases are specific to FS but the patterns of usage - agility, scale, global distribution - will be applicable across many industries.
Webinar: Real-time Risk Management and Regulatory Reporting with MongoDB (MongoDB)
Real-time risk management, coupled with the requirements of regulatory reporting, is top of mind for many heads of risk. In this webinar, we will cover how MongoDB can help with:
Implementing proactive risk controls
Aggregated Risk on Demand
Creating an Adaptive Regulatory Reporting Platform
Cost Effective Risk Calculations
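As a rough sketch of what "aggregated risk on demand" could look like in practice, the snippet below builds a MongoDB-style aggregation pipeline (as plain Python dicts) and a pure-Python equivalent that runs on sample data. The field names (`counterparty`, `status`, `mtm_usd`) are illustrative assumptions, not taken from the webinar.

```python
# Hypothetical pipeline a risk service might send to MongoDB to roll up
# exposure by counterparty. Field names are assumptions for illustration.
risk_pipeline = [
    {"$match": {"status": "live"}},                  # keep only live trades
    {"$group": {"_id": "$counterparty",              # bucket by counterparty
                "exposure": {"$sum": "$mtm_usd"}}},  # sum mark-to-market
    {"$sort": {"exposure": -1}},                     # largest exposure first
]

def aggregate_exposure(trades):
    """Pure-Python equivalent of the pipeline above, for illustration."""
    totals = {}
    for t in trades:
        if t["status"] != "live":
            continue
        totals[t["counterparty"]] = totals.get(t["counterparty"], 0) + t["mtm_usd"]
    return sorted(totals.items(), key=lambda kv: -kv[1])

sample = [
    {"counterparty": "BANK-A", "status": "live", "mtm_usd": 1200.0},
    {"counterparty": "BANK-B", "status": "live", "mtm_usd": 300.0},
    {"counterparty": "BANK-A", "status": "closed", "mtm_usd": 999.0},
]
print(aggregate_exposure(sample))  # [('BANK-A', 1200.0), ('BANK-B', 300.0)]
```

Running the `$group` inside the database rather than in application code is what makes the "on demand" part feasible at scale.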
How Financial Services Organizations Use MongoDB (MongoDB)
MongoDB is the alternative that allows you to efficiently create and consume data, rapidly and securely, no matter how it is structured across channels and products, and makes it easy to aggregate data from multiple systems, while lowering TCO and delivering applications faster.
Learn how Financial Services Organizations are Using MongoDB with this presentation.
Webinar: Position and Trade Management with MongoDB (MongoDB)
Learn how leading investment banks are bringing complex financial products to market quickly and effectively with MongoDB, where in the past rigid relational schemas inhibited time to market. Delegates attending this webinar will see how MongoDB can be used to create complex new products, capture new trades, and calculate values and exposures.
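To make the schema-flexibility point concrete: in a cross-asset trade store, trades with very different shapes can live in one collection because MongoDB documents need not share a schema. The sketch below is illustrative only; the product and field names are invented, not from the webinar.

```python
# Two trades of different asset classes, modeled as documents that could
# sit in the same collection despite having different fields.
ir_swap = {
    "trade_id": "T-1001",
    "asset_class": "rates",
    "product": "ir_swap",
    "notional": 10_000_000,
    "legs": [  # swap-specific: two legs, embedded rather than joined
        {"pay": "fixed", "rate": 0.0325},
        {"pay": "float", "index": "SOFR", "spread": 0.0010},
    ],
}

fx_option = {
    "trade_id": "T-1002",
    "asset_class": "fx",
    "product": "fx_option",
    "notional": 5_000_000,
    "pair": "EURUSD",  # option-specific fields the swap simply lacks
    "strike": 1.0850,
    "expiry": "2025-12-31",
}

def notional_by_asset_class(trades):
    """Roll up notional per asset class across heterogeneous documents."""
    out = {}
    for t in trades:
        out[t["asset_class"]] = out.get(t["asset_class"], 0) + t["notional"]
    return out

print(notional_by_asset_class([ir_swap, fx_option]))
```

Adding a new product type is then a matter of inserting documents with new fields, not of migrating a relational schema.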
Best Practices for MongoDB in Today's Telecommunications Market (MongoDB)
It is a challenging time for telecommunications providers: landline voice is in decline, mobile voice margins are falling, and the high-speed Internet market is saturated. In order to win in this increasingly competitive landscape, operators must dramatically increase their pace of innovation, focus on the customer experience and, most importantly, bring new applications to market.
In this webinar find out how MongoDB is enabling Telecommunications operators worldwide to:
Improve their current offerings with faster time to market
Roll out new services such as M2M, unified messaging, cloud, and OTT video
Increase customer satisfaction
Operators have tremendous assets in their networks, billing relationships, and knowledge of subscriber behavior across devices and applications. Those that leverage these strengths with greater agility will succeed in the market. The use cases are specific to Telecommunications, but the patterns of usage - agility, scale, global distribution - will be applicable across many industries.
Operationalizing the Value of MongoDB: The MetLife Experience (MongoDB)
It was a lot of fun bringing exciting emerging technology into the rigid enterprise infrastructure ecosystem. And then the real work began. How do you make the new technology operational? Learn from MetLife’s journey of operationalizing MongoDB to a level compliant with large enterprise requirements in high availability, recoverability, security, monitoring, alerting, workload management, and automation.
This is a quick overview of the challenges that big data and flexible-schema databases like MongoDB pose for data treatment, and strategies to overcome them.
Webinar: How Financial Firms Create a Single Customer View with MongoDB (MongoDB)
Learn why a tier 1 bank, top 5 insurance provider and other global financial services companies are flocking to MongoDB. This webinar focuses on how firms use MongoDB to generate a single customer view not only to comply with KYC and other regulations, but also to engage customers efficiently, which helps reduce churn and increase wallet share while still reducing costs. We will focus on how MongoDB's dynamic schema, real-time replication and auto-scaling make it possible to create a global, unified data hub aggregating disparate data sources, which can be made available to customers, customer service representatives (CSRs), and relationship managers (RMs).
Webinar: How Banks Use MongoDB as a Tick Database (MongoDB)
Learn why MongoDB is spreading like wildfire across capital markets (and really every industry) and then focus in particular on how financial firms are enjoying the developer productivity, low TCO, and unlimited scale of MongoDB as a tick database for capturing, analyzing, and taking advantage of opportunities in tick data. This webinar illustrates how MongoDB can easily and quickly store variable data formats, like top and depth of book, multiple asset classes, and even news and social networking feeds. It will explore aggregating and analyzing tick data in real-time for automated trading or in batch for research and analysis and how auto-sharding enables MongoDB to scale with commodity hardware to satisfy unlimited storage and performance requirements.
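One common modeling technique for tick stores is the "bucket" pattern: instead of one document per tick, ticks for a symbol are grouped into one document per time window, which cuts index size and speeds range scans. The sketch below is a minimal, hedged illustration; the field names (`symbol`, `ts`, `px`, `qty`) are assumptions, not from the webinar.

```python
# Minimal sketch of the bucket pattern for tick data: group raw ticks
# into one document per symbol per minute.
def bucket_ticks(ticks):
    """Group raw ticks into per-symbol, per-minute bucket documents."""
    buckets = {}
    for tick in ticks:
        minute = tick["ts"][:16]  # "2014-05-06T09:30" from an ISO timestamp
        key = (tick["symbol"], minute)
        doc = buckets.setdefault(key, {
            "symbol": tick["symbol"],
            "minute": minute,
            "ticks": [],  # embedded array of raw ticks for this minute
        })
        doc["ticks"].append({"ts": tick["ts"], "px": tick["px"], "qty": tick["qty"]})
    return list(buckets.values())

raw = [
    {"symbol": "AAPL", "ts": "2014-05-06T09:30:01", "px": 101.2, "qty": 100},
    {"symbol": "AAPL", "ts": "2014-05-06T09:30:59", "px": 101.3, "qty": 200},
    {"symbol": "AAPL", "ts": "2014-05-06T09:31:00", "px": 101.1, "qty": 50},
]
docs = bucket_ticks(raw)
print(len(docs))  # 2 buckets: one for 09:30, one for 09:31
```

Each bucket document would then be inserted (or upserted) into the tick collection, with an index on `(symbol, minute)` to serve time-range queries.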
Webinar: Making A Single View of the Customer Real with MongoDB (MongoDB)
Tier 1 banks, top insurance providers and other global financial services institutions have discovered that with the use of MongoDB, they are able to achieve a single view of the customer. This allows them not only to comply with KYC and other regulations, but also to engage customers efficiently, which helps reduce churn and increase wallet share while reducing costs. We will focus on how MongoDB's dynamic schema, real-time replication and auto-scaling make it possible to create a global, unified data hub aggregating disparate data sources, which can be made available to customers, customer service representatives (CSRs), and relationship managers (RMs).
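As a hedged sketch of what the aggregation step of a single customer view might look like: fold per-system records into one unified document while keeping lineage of which systems contributed. The source-system names and fields below are invented for illustration; real pipelines would be driven by each firm's data-governance and survivorship rules.

```python
# Fold records from multiple source systems into one customer document.
def build_single_view(customer_id, source_docs):
    """Merge per-system records; the first system to supply a field wins."""
    view = {"customer_id": customer_id, "sources": [], "profile": {}}
    for doc in source_docs:
        view["sources"].append(doc["system"])         # keep lineage
        for field, value in doc["attrs"].items():
            view["profile"].setdefault(field, value)  # first system wins
    return view

crm = {"system": "crm", "attrs": {"name": "Ada Lovelace", "email": "ada@example.com"}}
core = {"system": "core_banking", "attrs": {"name": "A. Lovelace", "accounts": 3}}

view = build_single_view("C-42", [crm, core])
print(view["profile"]["name"])  # "Ada Lovelace" (CRM listed first, so it wins)
```

The "first system wins" rule here is just one survivorship policy; the point is that the merged document lands in one collection that CSRs and RMs can query directly.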
Webinar: How Financial Organizations use MongoDB for Real-time Risk Managemen... (MongoDB)
Real-time risk management, coupled with the requirements of regulatory reporting, is top of mind for many heads of risk; to meet the demands of new regulation, financial organizations must have technology that enables the business to easily calculate and analyze risk across products and channels. In this webinar, we will cover how organizations use MongoDB for:
* Implementing proactive risk controls
* Aggregated Risk on Demand
* Creating an Adaptive Regulatory Reporting Platform
* Cost Effective Risk Calculations
Webinar: MongoDB and Analytics: Building Solutions with the MongoDB BI Connector (MongoDB)
MongoDB is known for being a developer's database of choice, but what about data analysts? MongoDB 3.2 introduced the MongoDB BI Connector, which allows users to connect to an instance using their analytics tool of choice. Now users of Tableau, QlikView, Excel, Cognos, and countless others can connect to MongoDB and immediately begin building reporting solutions. In this webinar, we will cover the architecture needed to use the BI Connector with MongoDB. We will also demonstrate how to build reports with your data.
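To give a rough sense of the translation involved (this is a conceptual illustration, not the BI Connector's actual output): a SQL query such as `SELECT region, SUM(amount) FROM sales GROUP BY region` corresponds to an aggregation pipeline like the one below. The collection and field names are invented for illustration.

```python
# Conceptual MongoDB pipeline equivalent of a SQL GROUP BY with SUM.
sales_pipeline = [
    {"$group": {"_id": "$region", "total": {"$sum": "$amount"}}},
]

def run_group_sum(rows):
    """Tiny in-memory stand-in for the pipeline above."""
    out = {}
    for r in rows:
        out[r["region"]] = out.get(r["region"], 0) + r["amount"]
    return out

rows = [{"region": "EMEA", "amount": 10}, {"region": "EMEA", "amount": 5},
        {"region": "APAC", "amount": 7}]
print(run_group_sum(rows))  # {'EMEA': 15, 'APAC': 7}
```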
How MongoDB is Transforming Healthcare Technology (MongoDB)
Healthcare providers continue to feel increased margin pressure, due to both macro-economic factors as well as significant regulatory change. In response to these pressures, leading healthcare organizations are leveraging new technologies to increase quality of care while simultaneously reducing costs.
In this session, we'll cover:
- How MongoDB has enabled successful real world projects with EHR / EMR in the healthcare industry
- How MongoDB allows providers to create a single view in order to collect patient information from multiple systems
- The challenges with healthcare data collection and how MongoDB handles various data types, HIPAA/PII and hybrid deployments
Webinar: How to Drive Business Value in Financial Services with MongoDB (MongoDB)
Huge upheaval in the finance industry has led to a major strain on existing IT infrastructure and systems. New finance industry regulation has meant increased volume, velocity and variability of data. This coupled with cost pressures from the business has led these institutions to seek alternatives. Top tier institutions like MetLife have turned to MongoDB because of the enormous business value it enables.
In this session, hear how MongoDB enabled these successful real world examples:
Single View of a Customer - 3 months and $2M for a single view of a customer across 50 source systems
Reference Data Management - $40M in cost savings from migrating to MongoDB for reference data management
Private cloud - MongoDB as a PaaS across a tier 1 bank for enabling agility for operations, not just the developer
The use cases are specific to financial services but the patterns of usage - agility, scale, global distribution - will be applicable across many industries.
The Next-Generation Enterprise-Class Architecture - Massimo Brignoli (Data Driven Innovation)
The birth of data lakes - Companies today are drowning in data, and the classic data warehouse struggles to process it, in both volume and variety. Many have started looking at architectures called data lakes, with Hadoop as the reference technology. But is this solution right for everything? Come learn how to operationalize data lakes to create modern data management architectures.
MongoDB .local Chicago 2019: MongoDB – Powering the new age data demands (MongoDB)
To successfully implement our clients' unique use cases and data patterns, it is mandatory that we unlearn many relational concepts while designing and rapidly developing efficient applications in NoSQL.
In this session, we will talk about some of our client use cases and the strategies we adopted using features of MongoDB.
During this presentation, Infusion and MongoDB shared their mainframe optimization experiences and best practices. These have been gained from working with a variety of organizations, including a case study from one of the world’s largest banks. MongoDB and Infusion bring a tested approach that provides a new way of modernizing mainframe applications, while keeping pace with the demand for new digital services.
MongoDB London 2013: Real World MongoDB: Use Cases from Financial Services pr... (MongoDB)
Huge upheaval in the finance industry has led to a major strain on existing IT infrastructure and systems. New finance industry regulation has meant increased volume, velocity and variability of data. This coupled with cost pressures from the business has led these institutions to seek alternatives. In this session learn how FS companies are using MongoDB to solve their problems. The use cases are specific to FS but the patterns of usage - agility, scale, global distribution - will be applicable across many industries.
MongoDB is one of the most popular databases these days and there are a few reasons for such popularity. One of these reasons is the excellent integration with different programming languages and development frameworks.
In the case of Python we take it a few notches up: native use of dictionaries, integration with asynchronous libraries (Twisted, gevent), and good support for web frameworks like Django, Flask, and Bottle (MongoEngine, anyone?).
This talk is about the several different projects that we support, how to use Python and MongoDB together effectively, and a few other improvements and announcements.
MongoDB Certification Study Group - May 2016 (Norberto Leite)
Study group session to review the certification exam: material covered, exam structure, and technical requirements. Both the DBA and Developer tracks are covered, to ensure technical expertise on MongoDB-specific subject matter.
Talk about schema design practices and techniques that can be put into play in big data scenarios as well as in very flexible data scenarios. This talk was presented at PyConUK15.
Introductory talk on how MongoDB can enable new-age software, taking into account expected growth rates, the constant availability of services, and the new business models that appear on a daily basis.
This covers some key concepts and techniques when one needs to distribute data across many nodes cutting across products ranging from caches to databases.
CAVEAT: If you haven't seen me present this in person, slides 7 and 12 won't make much sense. Will be uploading a video version before long.
How To Get Hadoop App Intelligence with Driven (Cascading)
You built Cascading/Scalding apps to mine all that data you collected in Hadoop. But just when you were seeing results, something went wrong — the app broke, data flows stopped, and business came to a halt.
So what do you do next? How do you find out what went wrong in the shortest time possible? How do you pinpoint the line of code where the error occurred? How do you know which SLA is going to be impacted? How do you view the lineage of data to adhere to compliance requirements?
In this presentation, we show you how to easily find the answers with Driven, the most comprehensive Big Data App Performance Management Platform.
Furthermore, this presentation describes how Driven can help you build higher quality big data apps; run big data apps more reliably; and manage big data apps more effectively.
Who should view this PPT: Any person or organization that is currently involved in planning, deploying or managing a Hadoop application infrastructure.
Rapid Development and Performance By Transitioning from RDBMSs to MongoDB
Modern day application requirements demand rich & dynamic data structures, fast response times, easy scaling, and low TCO to match the rapidly changing customer & business requirements plus the powerful programming languages used in today's software landscape.
Traditional approaches to solutions development with RDBMSs increasingly expose the gap between the modern development languages and the relational data model, and between scaling up vs. scaling horizontally on commodity hardware. Development time is wasted as the bulk of the work has shifted from adding business features to struggling with the RDBMSs.
MongoDB, the premier NoSQL database, offers a flexible and scalable solution to focus on quickly adding business value again.
In this session, we will provide:
- Overview of MongoDB's capabilities
- Code-level exploration of the MongoDB programming model and APIs and how they transform the way developers interact with a database
- Update of the exciting features in MongoDB 3.0
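To illustrate the modeling shift the session describes (a hedged sketch with invented names, not the session's own code): what would be an `orders` table plus an `order_items` table in an RDBMS becomes one document, so the application reads and writes it in a single operation with no join.

```python
# An order with its line items embedded as one document, instead of two
# joined relational tables.
order = {
    "_id": "ORD-7",
    "customer": "ACME Corp",
    "items": [  # embedded, no join needed at read time
        {"sku": "WIDGET", "qty": 2, "unit_price": 9.99},
        {"sku": "GADGET", "qty": 1, "unit_price": 24.50},
    ],
}

def order_total(doc):
    """Total an order straight from its embedded line items."""
    return sum(item["qty"] * item["unit_price"] for item in doc["items"])

print(round(order_total(order), 2))  # 44.48
```

Because the document mirrors the object the application already works with, there is no object-relational mapping layer to maintain, which is a large part of the productivity argument.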
Cassandra-Based Image Processing: Two Case Studies (Kerry Koitzsch, Kildane) ... (DataStax)
In this presentation, we will detail two image processing applications which rely on a Cassandra-centric architecture to achieve distributed, high-accuracy analysis of a variety of image formats, types, and quality, and which require different kinds of metadata processing as well as feature extraction from the images themselves. We will outline the architecture choices made for the two case studies, and how we found Cassandra to be the ideal choice for the persistence layer implementation technology. In conclusion we will discuss extensions to the two use cases and some of the 'lessons learned' from the two implementation projects.
About the Speaker
Kerry Koitzsch Project Lead, Kildane Software Technologies, Inc
Kerry Koitzsch is a software engineer and architect specializing in big data applications, NoSQL databases, and image processing. He currently works for Correlli Software Systems, a big data analytics company in Sunnyvale CA.
The integration between the Spring Framework and MongoDB tends to be somewhat unknown. This presentation shows the different projects that compose the Spring ecosystem (Spring Data, Spring Boot, Spring IO, etc.) and how to move from pure Java projects to massive enterprise systems that require these components to interact.
This talk is a quick reference to all the different queryability options that MongoDB offers to developers who want to build mobile and geospatially referenced applications. We reviewed the basic functionality as well as recent improvements in the query and indexing engine behind MongoDB's geospatial features.
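The query shapes the talk covers can be sketched as plain documents. The operators (`$near`, `$geoWithin`, the `2dsphere` index type) are real MongoDB features; the collection and field names below are assumptions for illustration.

```python
# Index spec a driver would pass to create_index(): a 2dsphere index on
# a GeoJSON "location" field.
geo_index = [("location", "2dsphere")]

# Find places near a point (GeoJSON order is [longitude, latitude]):
near_query = {
    "location": {
        "$near": {
            "$geometry": {"type": "Point", "coordinates": [-73.99, 40.73]},
            "$maxDistance": 500,  # metres
        }
    }
}

# Find places inside a polygon:
within_query = {
    "location": {
        "$geoWithin": {
            "$geometry": {
                "type": "Polygon",
                "coordinates": [[[-74.0, 40.7], [-73.9, 40.7],
                                 [-73.9, 40.8], [-74.0, 40.8],
                                 [-74.0, 40.7]]],  # ring closes on itself
            }
        }
    }
}

print(sorted(near_query["location"].keys()))
```

These filter documents would be passed to a driver's `find()` against a collection carrying the `2dsphere` index.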
Presentation on MongoDB and Node.js. We describe how to do basic CRUD operations (insert, remove, update, find) and how to aggregate using Node.js. We also discuss a bit of Meteor, the MEAN stack, and other ODMs and projects involving JavaScript and MongoDB.
From Monolithic to Microservices in 45 Minutes (MongoDB)
Presented by Norberto Leite, Developer Advocate, MongoDB
In this session you will learn how to leverage both Python and MongoDB to build highly scalable, asynchronous applications based on a microservices architecture. We will review how to connect several different “exotic” services, using a variety of datasets, that we can then mash up into a consolidated application.
We will start by introducing the technologies we will be using (e.g. Python, Flask, MongoDB, AngularJS) and take a ten-thousand-foot overview of microservices architecture. At the end of the talk you will have a better understanding of how to decouple and implement microservices with MongoDB.
MongoDB 3.4: Deep Dive on Views, Zones, and MongoDB Compass (MongoDB)
Thomas Boyd, Principal Solutions Architect, MongoDB
MongoDB Evenings San Francisco
March 21, 2017
MongoDB 3.4 was released in November 2016 and contains a wealth of new features that allow developers, DBAs, architects, and data scientists to tackle a wide variety of use cases. After an overview of 3.4, Thomas will provide a deep dive on using MongoDB views to encapsulate complex aggregation logic and to enhance MongoDB security, using zones to create a cross-continent, multi-master MongoDB cluster, and using MongoDB Compass to browse and interact with the data stored in your cluster.
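As a hedged sketch of the view feature mentioned above: a MongoDB view is defined by a name, a source collection, and an aggregation pipeline, expressed through the `create` command with `viewOn` and `pipeline` fields. The collection and field names below are invented for illustration; a driver would send this document via a database command call.

```python
# Command document defining a read-only view that both encapsulates
# aggregation logic and hides a sensitive field, per the security use
# case described above. Names and fields are illustrative assumptions.
create_view_cmd = {
    "create": "us_customers",   # name of the read-only view
    "viewOn": "customers",      # underlying collection
    "pipeline": [
        {"$match": {"country": "US"}},  # restrict which documents appear...
        {"$project": {"ssn": 0}},       # ...and suppress a sensitive field
    ],
}

print(create_view_cmd["create"])
```

Granting users access to the view but not the underlying collection is what makes this an access-control tool as well as a convenience.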
Webinar: Faster Big Data Analytics with MongoDB (MongoDB)
Learn how to leverage MongoDB and Big Data technologies to derive rich business insight and build high performance business intelligence platforms. This presentation includes:
- Uncovering Opportunities with Big Data analytics
- Challenges of real-time data processing
- Best practices for performance optimization
- Real world case study
This presentation was given in partnership with CIGNEX Datamatics.
MongoDB Days Silicon Valley: Jumpstart: The Right and Wrong Use Cases for Mon...MongoDB
Presented by Sigfrido Narvaez, Senior Solutions Architect, MongoDB
Experience level: Introductory
When it comes time to select database software for your project, there are a bewildering number of choices. How do you know if your project is a good fit for a relational database, or whether one of the many NoSQL options is a better choice? In this session you will learn when to use MongoDB and how to evaluate if MongoDB is a fit for your project. You will see how MongoDB's flexible document model is solving business problems in ways that were not previously possible, and how MongoDB's built-in features allow running at scale.
MongoDB .local Toronto 2019: MongoDB – Powering the new age data demandsMongoDB
To successfully implement our clients' unique use cases and data patterns, it is mandatory that we unlearn many relational concepts while designing and rapidly developing efficient applications in NoSQL.
In this session, we will talk about some of our client use cases and the strategies we adopted using features of MongoDB.
Webinar: An Enterprise Architect’s View of MongoDBMongoDB
In the world of big data, legacy modernization, siloed organizations, empowered customers, and mobile devices, making informed choices about your enterprise infrastructure has become more important than ever. The alternatives are abundant, and the successful Enterprise Architect must constantly discern which new technology is just a shiny object and which will add true business value.
MongoDB is more than just a great application database for developers; it gives Enterprise Architects new capabilities to solve previously difficult architectural requirements much more easily. Take, for example, the challenge of many siloed systems at MetLife: with MongoDB, the MetLife team was able to provide a single view into those 70 systems in only 3 months.
In this webinar, we will:
Explore real life challenges enterprises face with case studies of their solutions
Consider how best to introduce MongoDB in the enterprise
Give an overview of how to optimize the use of MongoDB
Data Modelling for MongoDB - MongoDB.local Tel AvivNorberto Leite
At this point, you may be familiar with MongoDB and its Document Model.
However, what are the methods you can use to create an efficient database schema quickly and effectively?
This presentation will explore the different phases of a methodology to create a database schema. This methodology covers the description of your workload, the identification of the relationships between the elements (one-to-one, one-to-many and many-to-many) and an introduction to design patterns. Those patterns present practical solutions to different problems observed while helping our customers over the last 10 years.
In this session, you will learn about:
The differences between modeling for MongoDB versus a relational database.
A flexible methodology to model for MongoDB, which can be applied to simple projects, agile ones or more complex ones.
Overview of some common design patterns that help improve the performance of systems.
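As an illustration of the design patterns mentioned above, the following sketch applies the bucket pattern to tick data (a recurring financial-services workload in this deck): one document holds a minute's worth of ticks per symbol instead of one document per tick. All field names are illustrative.

```python
# Sketch of the "bucket" design pattern applied to tick data: batching many
# small measurements into one document per symbol per minute reduces the
# document count and index size for high-frequency inserts.
from datetime import datetime

def bucket_id(symbol: str, ts: datetime) -> str:
    # One bucket per symbol per minute
    return f"{symbol}:{ts:%Y-%m-%dT%H:%M}"

def add_tick(buckets: dict, symbol: str, ts: datetime, price: float) -> None:
    key = bucket_id(symbol, ts)
    doc = buckets.setdefault(key, {"_id": key, "symbol": symbol, "ticks": []})
    # In MongoDB this would be an upsert with $push on the "ticks" array
    doc["ticks"].append({"ts": ts.isoformat(), "price": price})

buckets = {}
add_tick(buckets, "ACME", datetime(2015, 3, 2, 9, 30, 1), 101.2)
add_tick(buckets, "ACME", datetime(2015, 3, 2, 9, 30, 7), 101.4)
add_tick(buckets, "ACME", datetime(2015, 3, 2, 9, 31, 0), 101.3)
# Result: two buckets, with the 09:30 bucket holding two ticks
```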
Query performance can either be a constant headache or the unsung hero of an application. MongoDB provides extremely powerful querying capabilities when used properly. As a member of the support team, I will share common mistakes observed as well as tips and tricks for avoiding them.
The MongoDB Spark Connector integrates MongoDB and Apache Spark, providing users with the ability to process data in MongoDB with the massive parallelism of Spark. The connector gives users access to Spark's streaming capabilities, machine learning libraries, and interactive processing through the Spark shell, Dataframes and Datasets. We'll take a tour of the connector with a focus on practical use of the connector, and run a demo using both Spark and MongoDB for data processing.
Technical review of features introduced in MongoDB 3.4: graph capabilities, the MongoDB UI tool Compass, improvements to replication and to aggregation framework stages and utilities, plus operational improvements in Ops Manager and MongoDB Atlas.
Slide deck presented at http://devternity.com/ around MongoDB internals. We review the usage patterns of MongoDB, the different storage engines and persistency models, as well as the definition of documents and general data structures.
This presentation contains a preview of the upcoming MongoDB 3.2 release, where we explore the new storage engines, aggregation framework enhancements, and utility features like document validation and partial indexes.
This presentation reviews the integration of the Spring Framework and MongoDB. We approach some of the most popular projects of the Spring stack — Spring Data, Spring Boot, Spring Batch, and others — and how we can easily build applications with MongoDB as the backend. This presentation was produced for a webinar hosted by Pivotal.
MongoDB was conceived for the cloud age. Ensuring that MongoDB is compatible and performant across cloud providers is mandatory to achieve complete integration with platforms and systems. Azure is one of the biggest IaaS platforms available and is very popular among developers working on the Microsoft stack.
Agile software development is becoming the de facto way of building software these days. More and more enterprises, from large Fortune 500 companies to small start-up shops, are adopting agile development methodologies. But agile software development is more than just a methodology or a practice: it is also a combined set of tools and platforms at our disposal that allow us to iterate faster, get to market sooner, and also fail faster. These tools augment our development cycles by a few orders of magnitude and allow developers to be much more productive.
When dealing with infrastructure we often go through the process of determining the different resources needed to meet our application requirements. This talk looks into the way resources are used by MongoDB and which aspects should be considered to determine the sizing, capacity, and deployment of a MongoDB cluster given the different scenarios, different sets of operations, and the storage engines available.
Modern architectures are moving away from a "one size fits all" approach. We are well aware that we need to use the best tools for the job. Given the large selection of options available today, chances are that you will end up managing data in MongoDB for your operational workload and with Spark for your high speed data processing needs.
Description: When we model documents or data structures, there are key aspects that need to be examined not only for functional and architectural purposes but also to take into consideration the distribution of data nodes, streaming capabilities, aggregation and queryability options, and how we can integrate different data processing software, like Spark, that can benefit from subtle but substantial model changes. A clear example is the choice between embedding and referencing documents and its implications for high-speed processing.
Over the course of this talk we will detail the benefits of a good document model for the operational workload, as well as the types of transformations we should incorporate in our document model to adjust for the high-speed processing capabilities of Spark.
We will look into the different options that we have to connect these two different systems, how to model according to different workloads, what kind of operators we need to be aware of for top performance and what kind of design and architectures we should put in place to make sure that all of these systems work well together.
Over the course of the talk we will showcase different libraries that enable the integration between Spark and MongoDB, such as the MongoDB Hadoop Connector, the Stratio Connector, and the native MongoDB Spark Connector.
By the end of the talk I expect the attendees to have an understanding of:
How to connect their MongoDB clusters with Spark
Which use cases show a net benefit for connecting these two systems
What kind of architecture design should be considered for making the most of Spark + MongoDB
How documents can be modeled for better performance, both operationally and when processing the data sets stored in MongoDB.
The talk is suitable for:
Developers that want to understand how to leverage Spark
Architects that want to integrate their existing MongoDB cluster and have real time high speed processing needs
Data scientists who know Spark, are experimenting with it, and want to integrate MongoDB as their persistence layer
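The embedding-versus-referencing trade-off discussed above can be sketched in a few lines. This is an illustrative pure-Python example (not the connector API): an embedded trade document is flattened into row-like records, the kind of reshaping that lets an engine like Spark parallelize over individual legs.

```python
# Illustrative: an operational document embeds its legs for fast single-trade
# reads; for analytical processing we "explode" it into one flat record per leg.
trade = {
    "_id": "T1",
    "counterparty": "BANK-A",
    "legs": [  # embedded array: compact for the operational workload
        {"asset": "IRS", "notional": 1_000_000},
        {"asset": "FX", "notional": 250_000},
    ],
}

def explode_legs(doc: dict) -> list:
    # One flat record per leg, carrying the parent keys along
    return [
        {"trade_id": doc["_id"], "counterparty": doc["counterparty"], **leg}
        for leg in doc["legs"]
    ]

rows = explode_legs(trade)
# rows -> two flat records, one per leg
```

In MongoDB itself the same reshaping is what the `$unwind` aggregation stage performs on an embedded array.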
Strongly Typed Languages and Flexible SchemasNorberto Leite
We like to use strongly typed languages, and we use them alongside flexible-schema databases. What challenges do we face, and what strategies do we have, for dealing with data coherence and format validation, using tools like ODMs, schema versioning, migrations, et al.? We also review the trade-offs of such strategies.
This presentation touches on the basics of MongoMK, the Jackrabbit Oak MongoDB persistence layer implementation, and how to deploy, operate, manage, and size your MongoDB cluster in AEM environments.
Ops Manager is MongoDB's management solution to administer, deploy, and back up your MongoDB cluster. It is a complete solution that offers an automation mechanism and automated, point-in-time backups alongside a practical monitoring interface. To integrate better with existing deployment and monitoring tools, Ops Manager exposes a REST API so that you can use the offered functionality from your existing infrastructure and tools such as Docker, Nagios, and HP OpenView. The main purpose is to allow a comprehensive experience of your environment from a pleasant web GUI.
MongoDB 3.0 comes with a set of innovations regarding storage engines and operational facilities, as well as security enhancements. This presentation describes these improvements and the new features ready to be tested.
https://www.mongodb.com/lp/white-paper/mongodb-3.0
MongoDB + Java - Everything you need to know Norberto Leite
Learn everything you need to know to get started building a MongoDB-based app in Java. We'll explore the relationship between MongoDB and various languages on the Java Virtual Machine such as Java, Scala, and Clojure. From there, we'll examine the popular frameworks and integration points between MongoDB and the JVM including Spring Data and object-document mappers like Morphia.
Deploying any software can be a challenge if you don't understand how resources are used or how to plan for the capacity of your systems. Whether you need to deploy or grow a single MongoDB instance, a replica set, or tens of sharded clusters, you probably share the same challenges in trying to size that deployment.
5. The Database of the Post-Relational Era
Combines the foundation of relational databases with the innovations of NoSQL:
- NoSQL: Flexible Data Model, Performance, Scalability
- Relational: Strong Consistency, Powerful Query Language, Rich Indexes
6. MongoDB Features
- JSON Document Model with Dynamic Schemas
- Auto-Sharding for Horizontal Scalability
- Text Search
- Aggregation Framework and MapReduce
- Full, Flexible Index Support and Rich Queries
- Built-In Replication for High Availability
- Advanced Security
- Large Media Storage with GridFS
11. Relational Database Challenges
- Data Types: unstructured data, semi-structured data, polymorphic data
- Agile Development: iterative, short development cycles, new workloads
- Volume of Data: petabytes of data, trillions of records, millions of queries/sec
- New Architectures: horizontal scaling, commodity servers, cloud computing
12. Ps(x, s, e) = eng^e * s / x * C
Application change is constant in today's development process!
17. Storage Engine API
- Allows "plugging in" different storage engines
  - Different working sets require different performance characteristics
  - MMAPv1 is not ideal for all workloads
  - More flexibility
- Storage engines can be mixed within the same replica set / sharded cluster
- Opportunity to integrate further (HDFS, native encrypted, hardware optimized, …)
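Selecting a storage engine is a startup-time choice. As a hedged sketch, a mongod configuration fragment for MongoDB 3.0+ might look like this (the `dbPath` is illustrative):

```yaml
# mongod configuration fragment: choosing a storage engine
storage:
  dbPath: /data/db
  engine: wiredTiger   # or mmapv1; members of one replica set may differ
```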
18. What is WiredTiger?
- Storage engine company founded by BerkeleyDB alums
- Recently acquired by MongoDB
- Available as a storage engine option in MongoDB 3.0
19. Improving Concurrency Control
- 2.2 – Global
- 2.4 – Database-level
- 3.0 MMAPv1 – Collection-level
- 3.0 WiredTiger – Document-level
  - Writes no longer block all other writes
  - A higher level of concurrency leads to more CPU usage
20. Compression
- WiredTiger uses snappy compression by default
- Data is compressed on disk
- 2 supported compression algorithms:
  - snappy: default; good compression, relatively low overhead
  - zlib: better compression
- Indexes are compressed using prefix compression
  - Allows compression in memory
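The compression algorithm can also be chosen per collection. Here is a sketch of the command document for creating a collection with the zlib block compressor instead of the snappy default; the collection name is illustrative.

```python
# Sketch: per-collection compression override via WiredTiger's configString.
# A cold, rarely-read collection can trade CPU for a better on-disk ratio.
create_cmd = {
    "create": "archive_ticks",
    "storageEngine": {
        "wiredTiger": {"configString": "block_compressor=zlib"}
    },
}
# With a live connection: db.command(create_cmd), or the equivalent
# db.createCollection(...) helper.
```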
21. Consistency without Journaling
- MMAPv1 uses a write-ahead log (journal) to guarantee consistency
- WiredTiger doesn't have this need: no in-place updates
  - Write-ahead log committed at checkpoints (2 GB or 60 sec by default – configurable!)
  - No journal commit interval: writes are written to the journal as they come in
  - Better for insert-heavy workloads
- Replication guarantees durability
24. Wider Range of Use Cases
How: Flexible Storage Architecture
- Fundamental rearchitecture, with a new pluggable storage engine API
- Same data model, same query language, same ops
- But under the hood, many storage engines optimized for many use cases
Use cases: Single View, Content Management, Real-Time Analytics, Catalog, Internet of Things (IoT), Messaging, Log Data, Tick Data
33. Risk Aggregation & Reporting
- Intraday controls
  - Less than 1-minute reporting
- Aggregate vast amounts of data from different trading desks (asset classes)
- Manage exposure to counterparty entities
  - Can be thousands depending on the trade
  - A challenge for existing RDBMS systems
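The counterparty-exposure aggregation described on this slide maps naturally onto MongoDB's aggregation framework. Below is a hedged sketch: the pipeline as it would be passed to `aggregate()`, plus a tiny pure-Python evaluation of that same pipeline so the example runs without a server. All field names ("counterparty", "status", "notional") are illustrative.

```python
# Sketch: summing live-trade notional per counterparty.
pipeline = [
    {"$match": {"status": "live"}},
    {"$group": {"_id": "$counterparty", "exposure": {"$sum": "$notional"}}},
]

trades = [
    {"counterparty": "BANK-A", "status": "live", "notional": 500},
    {"counterparty": "BANK-A", "status": "live", "notional": 300},
    {"counterparty": "BANK-B", "status": "closed", "notional": 900},
]

def run_group(docs, pipe):
    # Minimal evaluator for this specific $match + $group/$sum pipeline only
    match = pipe[0]["$match"]
    live = [d for d in docs if all(d[k] == v for k, v in match.items())]
    totals = {}
    for d in live:
        totals[d["counterparty"]] = totals.get(d["counterparty"], 0) + d["notional"]
    return totals

exposure = run_group(trades, pipeline)
# exposure -> {"BANK-A": 800}; the closed BANK-B trade is filtered out
```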
35. Trade Repository
- Scalable database
  - Size, velocity, variety
- Regulatory requirements
  - Dodd-Frank and EMIR
- Any trade, any point in time
- Unified view of products and trades across time
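"Any trade, any point in time" is a bi-temporal lookup. As an illustrative sketch (field names are hypothetical), each trade version can record the interval during which it was current, and a point-in-time query selects the version whose interval contains the requested timestamp:

```python
# Sketch: bi-temporal trade versions and an as-of lookup.
from datetime import datetime

versions = [
    {"trade_id": "T1", "notional": 100,
     "valid_from": datetime(2014, 1, 1), "valid_to": datetime(2014, 6, 1)},
    {"trade_id": "T1", "notional": 150,
     "valid_from": datetime(2014, 6, 1), "valid_to": datetime(9999, 1, 1)},
]

def as_of(docs, trade_id, ts):
    # Equivalent MongoDB filter:
    # {"trade_id": trade_id, "valid_from": {"$lte": ts}, "valid_to": {"$gt": ts}}
    return next(v for v in docs
                if v["trade_id"] == trade_id and v["valid_from"] <= ts < v["valid_to"])

snapshot = as_of(versions, "T1", datetime(2014, 3, 15))
# snapshot carries the notional that was current in March 2014
```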
42. Retail Bank Transactions Log
- Data needs to be fetched from the mainframe
  - That costs money!
- Read requests
  - Mobile apps
  - Home banking
  - Analytics
  - Marketing workloads
47. Reference Data Management
Data: Securities master, corporate actions, market data, counterparty information, economic calendar, legal entity identifiers.
Problem: Replicating reference data across geographies in a timely and efficient manner. Ensuring that data replication meets service level agreements. Ensuring a congruent view across all trading entities in a global organisation.
Business Benefit: Reduced cost in managing infrastructure. Timely reference data replicated within SLA. The company in question will save about $40m in costs and penalties over 5 years, and is charged only once for data from TR / Bloomberg / etc. instead of regionally as before.
Why MongoDB? A dynamic data model means no schema changes across geographies, and the built-in robust replication mechanism simplifies infrastructure and removes the requirement for additional integration technologies. Data is replicated on each change rather than in batches, so both cache and database are always up to date. Simple data modelling and analysis: easy changes and easy understanding.
Case Studies: Large American investment / retail bank.
48. Risk Aggregation & Reporting
Data: Risk metrics from upstream systems; for instance, data from the front-office system for monitoring counterparty exposure.
Problem: Investment banks need a congruent view of exposures across their business in order to effectively manage risk – a need for intraday controls, with risk measures less than 1 minute old. Could not scale with an RDBMS. Data was distributed across multiple silos and consequently needed to be aggregated. Need for versioning for data lineage and auditing, with auditors requiring a longer time window.
Business Benefit: Single view of exposure / risk data across the business. Can make application changes much faster. Can hedge / trade with more confidence and be more competitive. Can hold less capital in reserve.
Why MongoDB? Scalable, replicable, flexible (a quick time-to-market); can handle more data and users easily. Dynamic schema: can store disparate data and make changes easily. Replication: local reads and high availability. Sharding: can add data and users easily by scaling out.
Case Studies: Tier-1 bank – prime services; large American banking group; Swiss bank (equity derivatives).
49. Trade Repository
Data: Trade data for each new or updated trade.
Problem: Dodd-Frank and EMIR (the European Market Infrastructure Regulation) have mandated that firms store all trade data (including updates) for seven years. Investment banks also have the requirement to be able to query and report to the regulators at any time in a bi-temporal manner. Each application builds its own persistence and audit trail. As an example, one customer wanted one unified framework and persistence layer for all trades and products, and found it hard to find a solution that could handle the many variable structures across all securities.
Business Benefit: Quick access to data and reporting to ensure that the regulators have what they need in a timely manner. Ensures compliance with regulatory mandates and helps avoid the consequences of not complying.
Why MongoDB? Scalable; dynamic schema, since trade information can vary over time; scalable cost structure as data volumes grow – “pay as you grow”.
Case Studies: Global leader in institutional research and investment management; large Australian bank.
50. DBaaS
Data: Market, client/customer, trade – any data.
Problem: The bank wanted application groups to focus on building apps, not data access logic. It takes 6 months for app groups to get new infrastructure ordered and delivered. Application developers are not very interested in speaking with hardware/DBA groups. Horizontal scaling was done separately by each application.
Business Benefit: Time-to-market decreased by at least 50%. Object persistence included in the framework. DB capacity added in minutes, not months. Same environment from prototype to production.
Why MongoDB? For new data marts and single views, the flexible schema allows integration of disparate systems to be simplified and “loosely coupled”, i.e. changes to upstream systems won't break downstream applications. Native language drivers: groups can focus on agile application development. Auto-replication: data distributed globally in real time.
Case Studies: Large US investment and retail bank.
51. Single View of Customer
Data: Client/customer data – addresses, personal details, purchase history, status, etc.
Problem: Siloed data across the organisation, with no consistent view of the customer. Difficult to identify the needs of the customer for cross-sell / up-sell opportunities. Not able to deal positively with the customer, as source systems are hard to change or touch, so the business and IT are normally stuck. In the customer example, they had 70 source systems and 20 screens to view customer policies, so they couldn't feasibly see a single view.
Business Benefit: Provides the business with an accurate view of their customer base.
Why MongoDB? The flexible schema allows integration of any disparate systems to be simplified and “loosely coupled”, i.e. changes to upstream systems won't break downstream applications. Performance: can handle all data in one DB. Replication: local reads and high availability. Sharding: can add more data and users globally by scaling out.
Case Studies: MetLife.
53. We’re Always Looking for Top Talent
What are employees saying?
- “Working with a group of individuals who you know will have your back is one of the reasons I love working at MongoDB”
- “Every day, we get to solve hard problems that make distributed databases more accessible to developers all over the world”
- “MongoDB lets you tackle real problems that affect hundreds of thousands of users”
Why work with us?
- We’re by developers, for developers
- $311 MM in capital raised to date
- #4 on the DB-Engines list of top Database Management Systems… and climbing
- Scaling our EMEA/APAC operations aggressively
What are we hiring for?
- Technical Services Engineers (Dublin)
- Consulting Engineers (UK or France)
- Solution Architects (France, Spain, Germany)
- Enterprise Account Executives (France, Italy, UK, Germany)
- Corporate Account Executives (Dublin)
- Renewals Account Managers (Dublin)
Visit us at www.mongodb.com/careers to see a full list of opportunities, or email your resume to jobs@mongodb.com
54. For More Information
- Case Studies: mongodb.com/customers
- Presentations: mongodb.com/presentations
- Free Online Training: education.mongodb.com
- Webinars and Events: mongodb.com/events
- Documentation: docs.mongodb.org
- MongoDB Downloads: mongodb.com/download
- Additional Info: info@mongodb.com
We are not in the business of doing things like before
We are in the disruptive technology business
MongoDB provides agility, scalability, and performance without sacrificing the functionality of relational databases, like full index support and rich queries. Indexes: secondary, compound, text search, geospatial, and more.
x = number of features
s = size of the team
e = expertise
In 1985, storage was the key expense: $100,000 per GB; developer salary: $28,000 per year
So relational databases were built to optimize for storage
In 2013, storage is cheap: $0.05 per GB. Developers are expensive: $90,000 per year
So MongoDB was built to optimize for developer productivity
This is what the ratio of those expenses looks like, in 1985 and today
Assumptions:
3-year TCO
1985: 2 developers and 5 GB
2013: 2 developers and 5 TB
Developer costs comprise the lion’s share relative to storage today. So optimize for developer productivity
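Working out the ratio in these notes with the figures given (3-year TCO, 2 developers, 5 GB in 1985 vs 5 TB in 2013, taking 5 TB as 5,000 GB):

```python
# Storage vs developer cost, 1985 and 2013, using the speaker-note figures.
storage_1985 = 5 * 100_000    # 5 GB at $100,000/GB
devs_1985 = 2 * 28_000 * 3    # 2 developers, 3 years, $28,000/year

storage_2013 = 5_000 * 0.05   # 5 TB (5,000 GB) at $0.05/GB
devs_2013 = 2 * 90_000 * 3    # 2 developers, 3 years, $90,000/year

# 1985: storage ($500k) outweighs developers ($168k).
# 2013: developers ($540k) dwarf storage (~$250), hence the shift toward
# optimizing for developer productivity.
```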
Analysis of large sets of information
New streams of data from big data scenarios and IoT
Data formats that are highly variable and constantly changing
Enrichment of an existing feed and feed onboarding takes months
Data updates reach the traders with intra-day frequency
Sub-optimal data access and global availability
Licensing agreements are not effective