How a financial services company preserved legacy email data on the cloud and gained from ediscovery
For any business today, more than 70% of email carries critical information: agreements, contract negotiations, commitments, issues, invoices, reports, notifications, contacts, and more. Our customer, a leading financial services company, understands this; for them, safely storing email for the long term is critical to managing risk and ensuring compliance.
This particular customer of Mithi is one of India's leading NBFC brands, offering a diverse range of financial products and services across the rural, housing and infrastructure finance sectors. It also offers mutual fund products and investment management services.
What was the use case at this customer?
The customer has been accumulating a high volume of email data belonging to employees who exit the company. This email data was downloaded from the primary mail platform and stored in a data lake before the email account was disabled. The legacy email data of these inactive users must then be preserved for extended periods of time for compliance purposes.
What were the challenges faced by this customer?
1. Leaving the email data of inactive users on the primary mail platform in a disabled state was expensive, since the company continued to pay the cost of a live mailbox for each.
2. Traditional methods of preserving data on tapes, drives and other media were risky and "passive", with no easy access when required.
3. Downloading and preserving the email of a large number of exiting employees was a manual and tedious process, prone to errors.
4. Email preserved on the cloud in a dark/passive email data lake was hard to search and retrieve from.
Join Mithi and Amazon to learn how this customer leveraged Vaultastic HOLD to
1. optimize the cost and reliability of storing legacy email data
2. automate the migration of data from the primary platform to the archive, improving IT team productivity
3. stay compliance-ready at all times with on-demand eDiscovery
4. benefit from the pay-per-use billing model
Focus on your core mission. Instead of building and running your own infrastructure, leverage the AWS Global Infrastructure.
AWS serves hundreds of thousands of customers in more than 190 countries.
As of May 2019 (https://aws.amazon.com/about-aws/global-infrastructure/), the AWS Cloud spans 66 Availability Zones within 21 geographic Regions around the world, with announced plans for 12 more Availability Zones and four more Regions in Bahrain, Cape Town, Jakarta, and Milan.
To deliver content to end users with lower latency, Amazon CloudFront uses a global network of 180 Points of Presence (169 Edge Locations and 11 Regional Edge Caches) in 69 cities across 30 countries (https://aws.amazon.com/cloudfront/features/).
AWS has been continually expanding its services to support virtually any cloud workload, and it now offers more than 100 services spanning compute, storage, networking, databases, analytics, application services, deployment, management, developer tools, mobile, Internet of Things (IoT), Artificial Intelligence (AI), security, and hybrid and enterprise applications. We offer both managed and unmanaged database options. The Analytics and Application Services offerings enable advanced data processing workloads. Our management tools give you the insight and flexibility to manage your AWS resources either through our tools or through the management tools you're already familiar with.
Recent expansion into enterprise applications has been entirely driven by customer feedback on where they’d like us to deliver value.
One of the very clear ways that this manifests itself is in our instance delivery, where every year we ensure that you all have the absolute latest and greatest platforms on which to build your applications. I won’t go through each and every new instance that we launched this past year, but I will point out a couple of the highlights.
We have grown from 3 to over 70 instance types and from 1 to 12 instance families, with lots of innovation across our instance families over the past year, and some new ones we will talk about today.
- Compute is at the core of nearly every AWS customer's infrastructure, whether in the form of instances, containers or serverless compute.
- Within each of those areas, we are rapidly adding completely new capabilities.
To meet the requirements of this wide variety of use cases and others, AWS offers a storage platform with different types of storage suited to different needs. These include…
Q: How reliable is Amazon S3? Amazon S3 gives any developer access to the same highly scalable, highly available, fast, inexpensive data storage infrastructure that Amazon uses to run its own global network of web sites. The S3 Standard storage class is designed for 99.99% availability, the S3 Standard-IA storage class is designed for 99.9% availability, and the S3 One Zone-IA storage class is designed for 99% availability.
All of these storage classes are backed by the Amazon S3 Service Level Agreement.
Q: How durable is Amazon S3? Amazon S3 Standard, S3 Standard–IA, S3 One Zone-IA, and Amazon Glacier are all designed to provide 99.999999999% durability of objects over a given year. This durability level corresponds to an average annual expected loss of 0.000000001% of objects. For example, if you store 10,000,000 objects with Amazon S3, you can on average expect to incur a loss of a single object once every 10,000 years. In addition, Amazon S3 Standard, S3 Standard-IA, and Amazon Glacier are all designed to sustain data in the event of an entire S3 Availability Zone loss. As with any environment, the best practice is to have a backup and to put in place safeguards against malicious or accidental deletion. For S3 data, that best practice includes secure access permissions, Cross-Region Replication, versioning, and a functioning, regularly tested backup.
Durability & Availability
Amazon S3 Standard storage and Standard-IA storage provide high levels of data durability and availability by automatically and synchronously storing your data across both multiple devices and multiple facilities within your selected geographical region. Error correction is built in, and there are no single points of failure. Amazon S3 is designed to sustain the concurrent loss of data in two facilities, making it very well suited to serve as the primary data storage for mission-critical data.
Additionally, you have a choice of enabling cross-region replication on each Amazon S3 bucket. Once enabled, cross-region replication automatically copies objects across buckets in different AWS Regions asynchronously, providing eleven 9s of durability and four 9s of availability on both the source and destination Amazon S3 objects.
Performance
In scenarios where you use Amazon S3 from within Amazon EC2 in the same Region, access to Amazon S3 from Amazon EC2 is designed to be fast. Amazon S3 is also designed so that server-side latencies are insignificant relative to internet latencies. In addition, Amazon S3 is built to scale storage, requests, and numbers of users to support an extremely large number of web-scale applications. If you access Amazon S3 using multiple threads, multiple applications, or multiple clients concurrently, total Amazon S3 aggregate throughput typically scales to rates that far exceed what any single server can generate or consume.
To improve the upload performance of large objects (typically over 100 MB), Amazon S3 offers a multipart upload command to upload a single object as a set of parts. After all parts of your object are uploaded, Amazon S3 assembles these parts and creates the object. Using multipart upload, you can get improved throughput and quick recovery from any network issues. Another benefit of using multipart upload is that you can upload multiple parts of a single object in parallel and in case of failures restart the upload of smaller parts instead of restarting the upload of the entire large object.
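The arithmetic behind multipart upload is simple; a minimal Python sketch (the object size and part size below are illustrative choices, not fixed S3 limits) computes the byte ranges that could then be uploaded in parallel and retried individually:

```python
def plan_multipart(total_size, part_size):
    """Split an object of total_size bytes into (start, end) byte ranges
    of at most part_size bytes each. Each range corresponds to one part
    of a multipart upload that can be sent in parallel and, on a network
    failure, restarted on its own instead of re-uploading the whole object."""
    if part_size <= 0:
        raise ValueError("part_size must be positive")
    parts = []
    start = 0
    while start < total_size:
        end = min(start + part_size, total_size)
        parts.append((start, end))
        start = end
    return parts

# A 250 MB object split into 100 MB parts yields three parts,
# the last one smaller than the rest.
MB = 1024 * 1024
print(plan_multipart(250 * MB, 100 * MB))
```

Each planned range would then be sent with the upload-part call of your SDK, and the upload completed once all parts succeed.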
To speed up access to relevant data, many developers pair Amazon S3 with a search engine such as Amazon Elasticsearch or a database such as Amazon DynamoDB or Amazon RDS. In these scenarios, Amazon S3 stores the actual information, and the search engine or database serves as the repository for associated metadata (for example, the object name, size, keywords, and so on). Metadata in the database can easily be indexed and queried, making it very efficient to locate an object’s reference by using a search engine or a database query. This result can be used to pinpoint and retrieve the object itself from Amazon S3.
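This store-the-body/index-the-metadata pattern can be sketched with plain dictionaries standing in for the two stores (all names here are illustrative; a real deployment would use the S3 API for bodies and DynamoDB or Elasticsearch for the index):

```python
object_store = {}    # stands in for the S3 bucket (key -> object body)
metadata_index = {}  # stands in for the database/search engine (key -> metadata)

def put(key, body, keywords):
    """Store the object body, and register its metadata for querying."""
    object_store[key] = body
    metadata_index[key] = {"size": len(body), "keywords": set(keywords)}

def find_keys(keyword):
    """Query only the small metadata index to locate object references."""
    return [k for k, meta in metadata_index.items() if keyword in meta["keywords"]]

put("mail/2019/msg-001.eml", b"...", ["invoice", "acme"])
put("mail/2019/msg-002.eml", b"...", ["contract"])

# The cheap metadata query pinpoints the key; only then is the
# (potentially large) body fetched from the object store.
hits = find_keys("invoice")
bodies = [object_store[k] for k in hits]
```

The design choice is the one the paragraph describes: searches never touch the object bodies, so the index stays small and fast regardless of how large the archived objects grow.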
Amazon S3 also offers something called Transfer Acceleration which enables fast, easy, and secure transfer of files over long distances between your client and your Amazon S3 bucket. It leverages Amazon CloudFront's (which is AWS's content distribution network) globally distributed edge locations to route traffic to your Amazon S3 bucket over an Amazon-optimized network path. To get started with Amazon S3 Transfer Acceleration you first must enable it on an Amazon S3 bucket. Then modify your Amazon S3 PUT and GET requests to use the s3-accelerate endpoint domain name (<bucketname>.s3-accelerate.amazonaws.com). The Amazon S3 bucket can still be accessed using the regular endpoint. Some customers have measured performance improvements in excess of 500% when performing intercontinental uploads.
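Assuming acceleration has already been enabled on the bucket, the endpoint switch described above amounts to a different host name (the bucket name below is hypothetical; regional endpoints of the form `<bucketname>.s3.<region>.amazonaws.com` also exist):

```python
def s3_endpoint(bucket, accelerate=False):
    """Return the virtual-hosted-style endpoint for a bucket: the regular
    global endpoint, or the s3-accelerate endpoint once Transfer
    Acceleration has been enabled on the bucket."""
    suffix = "s3-accelerate" if accelerate else "s3"
    return f"{bucket}.{suffix}.amazonaws.com"

print(s3_endpoint("legacy-mail-vault"))                   # regular endpoint
print(s3_endpoint("legacy-mail-vault", accelerate=True))  # accelerated endpoint
```

As the text notes, both endpoints remain valid for the same bucket, so clients can adopt acceleration incrementally.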
Scalability & Elasticity
Amazon S3 has been designed to offer a very high level of automatic scalability and elasticity. Unlike a typical file system that encounters issues when storing a large number of files in a directory, Amazon S3 supports a virtually unlimited number of files in any bucket. Also, unlike a disk drive that has a limit on the total amount of data that can be stored before you must partition the data across drives and/or servers, an Amazon S3 bucket can store a virtually unlimited number of bytes. You can store any number of objects (files) in a single bucket, and Amazon S3 will automatically manage scaling and distributing redundant copies of your information to other servers in other locations in the same region, all using Amazon's high-performance infrastructure.
Security
Amazon S3 is highly secure. It provides multiple mechanisms for fine-grained access control to Amazon S3 resources, and it supports encryption. You can manage access to Amazon S3 by granting other AWS accounts and users permission to perform the resource operations by writing an access policy. You can also define object-level permissions via Amazon S3 bucket policies.
You can protect data at rest in Amazon S3 by using server-side encryption, in which you request Amazon S3 to encrypt your object before it's written to disks in data centers and decrypt it when you download the object, or by using client-side encryption, in which you encrypt your data on the client side and upload the encrypted data to Amazon S3. You can protect the data in transit by using Secure Sockets Layer (SSL) or client-side encryption.
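With an SDK such as boto3, requesting server-side encryption is one extra parameter on the upload call. The sketch below only builds the request parameters rather than calling AWS (the bucket and key names are hypothetical):

```python
def sse_put_params(bucket, key, body, kms_key_id=None):
    """Build parameters for an S3 PUT with server-side encryption:
    SSE-KMS when a KMS key id is supplied, otherwise SSE-S3 (AES256).
    The resulting dict would be passed to boto3's s3.put_object(**params)."""
    params = {"Bucket": bucket, "Key": key, "Body": body}
    if kms_key_id:
        params["ServerSideEncryption"] = "aws:kms"
        params["SSEKMSKeyId"] = kms_key_id
    else:
        params["ServerSideEncryption"] = "AES256"
    return params

# Hypothetical archive write, encrypted with S3-managed keys.
params = sse_put_params("legacy-mail-vault", "mail/2019/msg-001.eml", b"...")
```

Decryption on download is transparent to the caller in both cases, which is why server-side encryption is usually the lowest-friction starting point.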
You can use versioning to preserve, retrieve, and restore every version of every object stored in your Amazon S3 bucket. With versioning, you can easily recover from both unintended user actions and application failures. Additionally, you can add an optional layer of security by enabling Multi-Factor Authentication (MFA) Delete for a bucket. With this option enabled for a bucket, two forms of authentication are required to change the versioning state of the bucket or to permanently delete an object version: valid AWS account credentials plus a six-digit code (a single-use, time-based password) from a physical or virtual token device.
To track requests for access to your bucket, you can enable access logging. Each access log record provides details about a single access request, such as the requester, bucket name, request time, request action, response status, and error code, if any. Access log information can be useful in security and access audits.
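A simplified sketch of consuming such records in Python (the layout below is a reduced, hypothetical view of a log line; the real S3 server access log format carries many more space-delimited fields such as bucket owner, request ID and bytes sent):

```python
import re

# Simplified view of an access log record: requester, bucket,
# bracketed timestamp, operation, key, HTTP status.
LOG_PATTERN = re.compile(
    r'(?P<requester>\S+) (?P<bucket>\S+) \[(?P<time>[^\]]+)\] '
    r'(?P<operation>\S+) (?P<key>\S+) (?P<status>\d{3})'
)

def parse_record(line):
    """Return the named fields of one simplified access log record,
    or None if the line does not match."""
    m = LOG_PATTERN.match(line)
    return m.groupdict() if m else None

# Hypothetical record for a GET of an archived message.
rec = parse_record(
    'arn:aws:iam::123456789012:user/auditor legacy-mail-vault '
    '[06/May/2019:10:15:00 +0000] REST.GET.OBJECT mail/2019/msg-001.eml 200'
)
```

A security audit would aggregate such parsed records, e.g. grouping by requester and error code.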
Interfaces
Amazon S3 provides standards-based REST APIs for both management and data operations.
Most developers building applications on Amazon S3 use a higher-level toolkit or SDK that wraps the underlying REST API. AWS SDKs are available for Android, Browser, iOS, Java, .NET, Node.js, PHP, Python, Ruby, and Go. The integrated AWS Command Line Interface (AWS CLI) also provides a set of high-level, Linux-like Amazon S3 file commands for common operations.
Then there is the AWS Management Console, which you can use to easily create and manage Amazon S3 buckets, upload and download objects, and browse the contents of your S3 buckets through a simple web-based user interface.
Amazon S3 also has a notifications feature you can use to receive notifications (via SNS, SQS, or Lambda) when certain events happen in your bucket.
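The notification configuration itself is a small document attached to the bucket. A sketch of building one that would invoke a Lambda function for every object created under a prefix (the function ARN and prefix are hypothetical; with boto3 this dict would be passed to `put_bucket_notification_configuration`):

```python
def lambda_on_create(function_arn, prefix=""):
    """Build an S3 notification configuration that triggers the given
    Lambda function whenever an object is created, optionally scoped
    to keys under a prefix."""
    config = {
        "LambdaFunctionConfigurations": [{
            "LambdaFunctionArn": function_arn,
            "Events": ["s3:ObjectCreated:*"],
        }]
    }
    if prefix:
        config["LambdaFunctionConfigurations"][0]["Filter"] = {
            "Key": {"FilterRules": [{"Name": "prefix", "Value": prefix}]}
        }
    return config

# Hypothetical: index each newly ingested mail object for search.
cfg = lambda_on_create(
    "arn:aws:lambda:ap-south-1:123456789012:function:index-mail",
    prefix="mail/",
)
```

In an archival pipeline like the one this webinar describes, such a trigger is a natural place to keep a search index in sync with newly ingested mail.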
AWS operates a shared responsibility model.
These security tools are incredibly powerful.
Of note, we have Virtual Private Cloud: the ability to logically isolate your resources in a virtual network that you define.
The Key Management Service: the ability to encrypt your data in the cloud and, if you choose, to bring your own encryption key.
CloudTrail: a service that records API calls for your account and delivers log files to you, enabling detailed compliance auditing.
And new services like Macie: a machine learning-powered security service to discover, classify, and protect sensitive data.
• Amazon VPC: Amazon Virtual Private Cloud lets you provision a logically isolated section of the AWS Cloud where you can launch resources in a virtual network that you define.
• AWS KMS: AWS Key Management Service is a managed service that makes it easy for you to create and control the encryption keys used to encrypt your data, and uses Hardware Security Modules (HSMs) to protect the security of your keys. AWS KMS is integrated with several other AWS services to help you protect the data you store with them, and with AWS CloudTrail to provide you with logs of all key usage to help meet your regulatory and compliance needs.
• AWS CloudTrail: AWS CloudTrail is a web service that records AWS API calls for your account and delivers log files to you. The recorded information includes the identity of the API caller, the time of the API call, the source IP address of the API caller, the request parameters, and the response elements returned by the AWS service. With CloudTrail, you can get a history of AWS API calls for your account, including calls made via the AWS Management Console, AWS SDKs, command line tools, and higher-level AWS services (such as AWS CloudFormation). This API call history enables security analysis, resource change tracking, and compliance auditing.
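CloudTrail delivers those records as JSON log files with a Records array; a sketch of extracting the audit-relevant fields from one (the record below is abbreviated and hypothetical):

```python
import json

def summarize(record):
    """Extract the audit-relevant fields of a CloudTrail record:
    who called which API, when, and from where."""
    return {
        "who": record.get("userIdentity", {}).get("arn"),
        "what": record.get("eventName"),
        "when": record.get("eventTime"),
        "where": record.get("sourceIPAddress"),
    }

# Abbreviated, hypothetical record for an S3 GetObject call.
raw = '''{"Records": [{
    "eventTime": "2019-05-06T10:15:00Z",
    "eventName": "GetObject",
    "sourceIPAddress": "203.0.113.10",
    "userIdentity": {"arn": "arn:aws:iam::123456789012:user/auditor"}
}]}'''
summaries = [summarize(r) for r in json.loads(raw)["Records"]]
```

Aggregating such summaries over time is exactly the security analysis and compliance auditing the slide describes.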
You've just had a bunch of information on this from previous presenters. What I'm here to cover is not only how we get to these (you'll have heard about that already), but how you can map technologies to these standards, and others, so you can work toward the security and compliance of your environments on top of what we do.
How a financial services company preserved legacy email data on the cloud and gained from ediscovery
Welcome to this Vaultastic webinar: 1 hour, starting at 3 PM IST, including a 15-min Q&A.
Mr. Abhishek Mahanty
Amazon Internet Services Pvt. Ltd.
Mithi Software Technologies
In this webinar, we'll talk about:
• Why AWS: Why AWS's elastic cloud platform is the best place to store data with high durability and strong security
• Why Vaultastic: How the SLA-guaranteed SaaS cloud solution can provide peace of mind with RBAC access, industry-compliant security, and more
• The CONTEXT: What are the risks and opportunities related to business communication for banks and financial services companies?
• The USE CASE: A little about our customer's business, the need to preserve data, and the challenges faced while attempting to stay compliant
• The SOLUTION: How automation of legacy data migration, a pay-per-use model and eDiscovery can optimize the cost and reliability of storing legacy email data
Risks, Challenges & Concerns related to Business Communication for Banks and Financial Agencies
Which of these data-related business risks and challenges does your organisation face?
a. Fines & penalties for delayed compliance or non-compliance with regulations
b. Inability to find data for handling litigation
c. Negative operational impact due to data theft or data loss
d. Loss of trust due to compromised private data
e. Downtime/disruptions due to cyber attacks / DDoS attacks
<Choose as many as apply>
With data fragmented across individual mailboxes and devices, it remains hard to secure and unavailable for compliance and investigation.
Without a secure, dependable and coherent data management system for the information exchanged between team members and with customers, banks and financial services companies could be running multiple risks around data privacy and compliance.
Negative impact on the bank's business without a secure, reliable, coherent & durable data management platform:
• Failure to meet compliance requirements on data privacy and security as stipulated by the regulator.
• Loss of public trust in case of a compromise of private citizen data or of data related to customers and bank operations.
• Regulatory action for failure to secure the personal data of the public they serve.
• Negative impact on the quality of customer service and team productivity as a result of loss of context due to data fragmentation or loss.
• Lack of a central repository for the knowledge gathered during the course of working with teams and customers makes that knowledge hard to find and secure, and renders it unavailable for aggregate search and analysis to uncover actionable data and insights.
Preservation: Banks and financial agencies are required to preserve all traces of electronic communication data and to protect all this critical information. The duration of data retention varies from one agency to another.
Information Governance: Banks and financial services companies must carefully deploy data retention and management policies for the email of their staff, to ensure availability for the right amount of time, control the download and sharing of information, and ensure accurate disposal of the data.
Litigation support: To ensure transparency in all transactions and complaint management, and to be ready for compliance, banks and financial agencies should be able to locate historical electronic communication instantly and produce it on demand in a form acceptable as evidence.
Growing need for Compliance & Information Governance
System Security Concerns
The rising importance of email has also made it the #1 vehicle for security breaches in the form of spam, viruses, ransomware etc. 88% of businesses experience data loss, and email is the main culprit.
Data Security Concerns
With businesses becoming more digital, email is gaining even more importance as a destination for authentication, notification and authorization, besides communication, making email data even more critical. An estimated 60% of business-critical data is captured in email boxes.
Uptime & Availability Concerns
Downtime, delayed or failed mail delivery, false positives, malware and ransomware can cause serious disruption to your workflow, resulting in loss of productivity and reputation for the organization. Medium businesses (101-1000 employees) lose an average of 1% of their annual revenue, or $867,000, to downtime.
• High risk of loss of critical pieces of information, because data resides in employee mailboxes and devices
• Inadequate systems to capture and preserve the legacy email data of inactive users
• Inadequate cyber security measures can lead to the risk of personal citizen data being compromised, given mounting targeted cyber attacks on financial organizations
• Hard to search and retrieve information records for compliance and business needs, with data scattered across devices such as phones, tapes and drives
• Difficulty recovering from a disaster, breakdown, accident or data loss, due to the lack of centralized, up-to-date and durable storage
Why are organizations ill-prepared to deal with these risks & concerns?
Your Data Management methodology could look like one of these
• Poor recovery from ransomware, malware, viruses and spam
• No business continuity to handle failures
• Data fragmentation & lack of access to historical data
• Nearly impossible to search and extract data
• Lack of mechanisms for oversight and compliance
• No capability for aggregate analysis
• Lack of durability and security
• Slow retrieval, running out of space, capex
• Risk of obsolescence, theft, loss, corruption, tampering
• Lacking the ability to leverage accumulated data
• Slow extraction
• Poor discoverability
• No defense against malicious data tampering
• High management & maintenance workloads
• Data fragmentation & lack of access to historical data
• Nearly impossible to search and extract data
• Lack of mechanisms for oversight and compliance
• No capability for aggregate analysis
• No control over data residency
• Lack of data protection
• Absence of data governance
• Data lock-in, making data extraction & system migration hard
• Lacking flexibility, slow retrieval, limited scalability
• Hard to do e-discovery
• High costs resulting in limited use of digital communication tools
• High bandwidth consumption
• Lack of openness and flexibility to support a wider choice of tools
• Lack of integration with existing business applications
• May lack business continuity to handle systemic failures
• High capex, high TCO, scale bottlenecks
• High management & maintenance workloads
• Performance issues
What does your email data management methodology look like?
a. Email data resides in individual mailboxes
b. Data resides on offline back-up devices or personal devices
c. Data resides on in-premise archival devices
d. Data resides on cloud stores (passive data lakes)
e. A hybrid of one or more of the above solutions
Solution: Enable cloud archival for all incoming and outgoing mail for greater security, elasticity and improved data discovery. Move historical data on devices to the cloud as needed and retire the back-up devices.
Solution: Move the data from PST files etc. into an indexed archive for quick e-discovery and compliance management.
Solution: Configure your email system to archive all incoming and outgoing mail to a central, secure and private data archive on the cloud for quick discovery. Manage the extent of historical data to maintain the archive as per business needs. Reduce the active mailbox sizes.
Solution: Add cloud-based email archival for all future mail exchanges. If required for analysis, quick access or defense against device obsolescence, tampering etc., upload the archived historical data to the cloud.
The Data Management Maturity Matrix
Possible Solution for each quadrant
Positive impact on banks and financial agencies from a secure, private and dependable email data management system:
• Bring about uniformity and compatibility, ensuring easy movement of information between various team members, departments, agencies and working groups.
• Create an overarching system for oversight and compliance.
• Reverse data fragmentation, mitigating risks related to data breaches and loss.
• Enable the creation of a central repository of the information gathered in the normal course of work, which can be mined for references and intelligence to further improve the organization's offerings and processes.
• Provide a platform for building more specific workflows to improve processes.
• Greater data security and privacy of customer data.
• Increased transparency and trust.
The USE CASE:
A leading financial services company needed to preserve the email of ex-employees, in a discoverable form, for quick turnaround on compliance requests.
The story at one of India's leading financial services companies, offering a diverse range of financial products and services across the rural, housing and infrastructure finance sectors, as well as mutual fund products and investment management services.
The company uses a popular cloud email platform for business communication and has been accumulating a high volume of email data belonging to employees who exit the company.
This email data was downloaded from the primary mail platform and stored in a data lake before the email account was disabled.
The legacy email data of these inactive users must then be preserved for extended periods of time for compliance purposes.
This data has to be search-ready at all times to enable accurate and timely responses to compliance requests.
The trouble with the approach was…
• Storing the downloaded email data on-premise or on the cloud didn't make the data easy to find when required. The accumulated data was passive and not search-ready.
• Storing data on-premise also added the risk of corruption or data loss.
• Disabling accounts and leaving the data on the primary cloud email platform was not an option due to high costs: it would mean an active account for every exiting employee.
• The entire process was manual, demanding large amounts of IT team members' time to manage the process and serve requests.
• Uploading the data to a cloud store and downloading it when required was a tedious and slow option.
Automation of legacy data migration with a flexible pay-per-use model and on-demand eDiscovery
Which of these data management and information governance capabilities are critical to managing your email data?
a. Fast, accurate eDiscovery to help you locate any information instantly
b. Easy recovery/extraction of selected information, to be ready for compliance
c. A self-service portal for your users to search and extract information themselves
d. Role-based eDiscovery to allow searches across department users
e. The ability to encode compliance rules/watches using stored searches
<choose as many as applicable>
Cloud-native email data management platform to store, manage, govern, access, discover, restore, integrate and share email data with ease.
• Tamper-proof vaults store email data over the long term and provide tools to discover and export data in various formats.
• Supports a hierarchical storage architecture, enabling the organisation to achieve leaner mail server storage & mailboxes, pushing up performance.
• Pre-processing & post-processing tools, like saved eDiscovery queries & collaboration, can help maintain vigil on conversations.
• Supports an organisation's IG strategy by storing all email data, controlling it based on retention, and making it accessible based on roles.
• Data security: Data on our cloud is highly durable and secured at multiple layers, putting the organisation at ease.
• Productivity: eDiscovery to find useful knowledge from the "corporate archive".
Covering a wide range of use cases and concerns:
#BackupAutomation #DataSecurity #Compliance #E-discovery #DecisionSupport
Organisations globally are opting for SaaS, and 92% of enterprises are using a public cloud for their workloads, to gain from elasticity, scale, reliability and security.
Vaultastic follows best practices for email archival:
• SaaS on cloud for zero infrastructure requirement and simpler management
• Performance guarantees on scalability, reliability, security and privacy
• Copy to a separate operational infrastructure to gain from redundancy and ease data migration
• Easy data discovery and retrieval
• Data exit policy to prevent vendor lock-in
• Pay as you go / pay per use
• Self service
• SLA-backed warranties
• Data portability and easy migration
• Auto updates & upgrades
• Users empowered to access their data; they can help themselves discover and recover email, reducing the load on the IT team
• Uptime of 99.9%
• High data durability
• RPO of near zero
• Data belongs to you; we are only the custodians
• Options for bulk data import and export using AWS
• Individual users can export data in PST or EML formats
• Historical email data in EML or PST format can be uploaded to Vaultastic
The AWS ADVANTAGE:
A secure & scalable cloud platform available across regions provides a robust foundation for data management applications like Vaultastic.
AWS: The new normal
Partner Solutions Architect
AWS Global Infrastructure
21 Regions – 66 Availability Zones – 180 PoPs
Region & number of Availability Zones:
• AWS GovCloud (US-West) (3)
• US West: Oregon (3), Northern California (3)
• US East: N. Virginia (6), Ohio (3)
• Canada: Central (2)
• South America: São Paulo (3)
• EU: Frankfurt (3), London (3), Paris (3)
• Asia Pacific: Singapore (3), Sydney (3), Tokyo (4), Seoul (2), Mumbai (3)
• China
• Announced: Bahrain, Cape Town, Jakarta, Milan
• A Region is comprised of multiple Availability Zones
• Isolation from other Availability Zones (power, network, flood plains)
• Low-latency (<10 ms) direct connections between Availability Zones
• 1 AZ can include multiple data centers
• Physical separation < 100 km
Example: the Mumbai region has Availability Zones ap-south-1a, ap-south-1b and ap-south-1c.
[Slide: the breadth of AWS services: compute (the broadest set of EC2 instance types, including C4, C5, M4, M5, R4, I3, I2, D2, X1, X1e, G2, G3, P2, P3, F1, H1, T2, T2 Unlimited and EC2 Bare Metal), storage (EC2 instance store, block storage, archive, Amazon S3, Amazon Glacier), databases (Aurora, MySQL, PostgreSQL, Oracle, SQL Server, NoSQL), networking (scalable DNS, global CDN), voice & text, Auto Scaling, Batch, and more.]
Amazon Simple Storage Service (Amazon S3)
Simple, durable, massively scalable low cost object storage
• Highly Durable & Available
• data stored across multiple facilities and multiple devices in each facility
• 99.999999999% durability per object
• 99.99% annual availability of objects with SLA
• cross regional replication
• multiple versions of an object - point-in-time recovery
• choose AWS region - optimize for latency and/or address regulatory requirements
• High Performance
• throughput scales to exceed what any single server can generate or consume
• multi-part upload
• pair with search engine to speed access
• transfer acceleration
• Scalable & Elastic
• virtually unlimited storage - store as much data as you want and access as needed
• agility - scale up and down as needed
• Secure
• object permissions (bucket policies) and fine-grained access control (AWS IAM)
• encryption at rest and in transit
• multi-factor authentication
• access logging
• Interfaces: AWS Management Console, REST APIs, AWS SDKs, or ISV integration
Security: Shared responsibility model
AWS is responsible for security OF the cloud; the customer is responsible for security IN the cloud.
Customer (security IN the cloud):
• Customer data
• Platform, applications, identity & access management
• Operating system, network & firewall configuration
• Client-side data encryption & data integrity authentication
• Server-side encryption (file system and/or data)
• Network traffic protection (encryption, integrity, and/or identity)
AWS (security OF the cloud):
• AWS Global Infrastructure
• Availability Zones
• Edge Locations
Access a deep set of cloud security tools:
• Virtual Private Cloud: isolated cloud resources
• Filter malicious web traffic
• Manage creation and control of encryption keys; flexible data encryption
• Manage user access, with SAML 2.0 support for federated sign-in
• Host and manage Microsoft Active Directory
• Identity & access management
• Track resource inventory and changes
• Track user activity and API usage
• Monitor resources and applications
• Discover, classify & protect sensitive data
AWS and Compliance Standards
Certifications & Attestations Laws, Regulations and Privacy Alignments & Frameworks
Cloud Computing Compliance Controls DE 🇩🇪 CISPE EU 🇪🇺 CIS (Center for Internet Security) 🌐
Cyber Essentials Plus UK 🇬🇧 EU Model Clauses EU 🇪🇺 CJIS (US FBI) US 🇺🇸
DoD SRG US 🇺🇸 FERPA US 🇺🇸 CSA (Cloud Security Alliance) 🌐
FedRAMP US 🇺🇸 GLBA US 🇺🇸 Esquema Nacional de Seguridad ES 🇪🇸
FIPS US 🇺🇸 HIPAA US 🇺🇸 EU-US Privacy Shield EU 🇪🇺
IRAP AU 🇦🇺 HITECH 🌐 FISC JP 🇯🇵
ISO 9001 🌐 IRS 1075 US 🇺🇸 FISMA US 🇺🇸
ISO 27001 🌐 ITAR US 🇺🇸 G-Cloud UK 🇬🇧
ISO 27017 🌐 My Number Act JP 🇯🇵 GxP (US FDA CFR 21 Part 11) US 🇺🇸
ISO 27018 🌐 Data Protection Act – 1988 UK 🇬🇧 ICREA 🌐
MLPS Level 3 CN 🇨🇳 VPAT / Section 508 US 🇺🇸 IT Grundschutz DE 🇩🇪
MTCS SG 🇸🇬 Data Protection Directive EU 🇪🇺 MITA 3.0 (US Medicaid) US 🇺🇸
PCI DSS Level 1 💳 Privacy Act [Australia] AU 🇦🇺 MPAA US 🇺🇸
SEC Rule 17-a-4(f) US 🇺🇸 Privacy Act [New Zealand] NZ 🇳🇿 NIST US 🇺🇸
SOC 1, SOC 2, SOC 3 🌐 PDPA - 2010 [Malaysia] MY 🇲🇾 Uptime Institute Tiers 🌐
PDPA - 2012 [Singapore] SG 🇸🇬 Cloud Security Principles UK 🇬🇧
PIPEDA [Canada] CA 🇨🇦
Agencia Española de Protección de Datos ES 🇪🇸
🌐 = industry or global standard
Vaultastic, powered by AWS, is the most secure, reliable and scalable choice of platform to help BFSI manage and leverage their email data.
Why Vaultastic is a good fit for banks & financial services companies:
• A choice of region for storing your data, ensuring compliance with data residency regulations.
• With all email data stored on a highly elastic, available and durable cloud platform, instances of outages are reduced to near zero, with quick recovery from any glitches.
• Designed to ensure customer data privacy in the multitenant SaaS setup. For highly sensitive customers, it can also be offered as a dedicated private setup on the cloud.
• Reduced IT costs: Fully managed SaaS, which implies zero hardware at your end and zero management overhead.
• Tight security at multiple layers of the stack ensures that sensitive data stored on our platform is encrypted, immutable and tamper-proof.
• No vendor lock-in: Built on the premise that ownership of the data lies with the customer. Processes and tools specially designed to prevent vendor lock-in are in place to allow extraction of data on demand.
Get More Done with Mithi's secure, dependable and durable Digital Collaboration Environment.
Vaultastic is part of Mithi's Secure Digital Collaboration Environment:
• Email security: Industry-compliant gateway protection, policies, RBAC, encryption and more.
• Chat: Chat with your team to speed decision making, track progress on initiatives, and build on ideas.
• Calendars: Create & share calendars, schedule tasks, meetings and events, and check free/busy status.
• Contacts: Create your own address book or search for colleagues & contacts in the shared address book.
• eDiscovery: Discover useful and critical data for reuse and insights.
• Integration: Leverage automated tools and APIs to integrate with legacy data and applications for improved productivity.
• Voice & video chat: Powered by Amazon Chime, you can meet, chat, and place business phone calls with a single, secure application.
• Email access: Access & send email from the browser, desktop & mobile.
An AWS-driven cloud, architected for scale. Preserve, manage and discover your most critical business asset for productivity and compliance.
Vaultastic: Secure, Dependable, Productive
• High level of automation through the complete product lifecycle.
• Complete lifecycle services framework: easy on-boarding, easy subscription management, easy to use, and 'easy on the wallet'.
• Flexible plans and an adoption roadmap useful in meeting varied needs.
• Multi-layered email security framework covering authentication, authorisation, encryption, RBAC access and more.
• Meets stringent security and compliance requirements of local regulations like IRDA, SEBI & RBI.
• API-enabled systems integration with existing systems and other 3rd-party tools to build custom solutions.
• Automated tools for legacy data migration and management.
• Useful in heterogeneous set-ups, providing a uniform system of data/records and enabling easy migration.
• AWS cloud platform as the infrastructure foundation: built-in redundancies, monitoring and automatic recovery mechanisms guarantee uptime and data durability.
Serving customers across the demanding Government, Banking and Financial Services segments.
Breakthrough in productivity through effective collaboration.
To explore more, please visit