Breaking Down the Monolith
The company
Poste Italiane

With 35 million users, Poste Italiane is the Italian company serving the largest customer base, providing a wide range of services across several industries, spanning financial, banking, insurance, telco, utilities, and logistics products. It realizes a comprehensive customer journey through an advanced digital ecosystem, satisfying heterogeneous customer needs and ensuring seamless and reliable interactions.

With 12,800 offices spread all over the country, Poste Italiane is a solid reference point for Italian citizens, inspiring trust and reflecting a strong social commitment.

source: https://www.posteitaliane.it/files/1476534398194/PI_2024_Sustain_Innovate.pdf
transposing the multi-industry operational plane onto the analytical layer
Poste Italiane

In the evolutionary journey toward digitalization, the legacy architectural landscape, which relied on centralized data management through monolithic systems, revealed several limitations that needed to be addressed smoothly but effectively.

The rise of a new paradigm advocating a distributed informational architecture and domain-driven decentralized ownership is leading the company into a new architectural era: shortening time to market, reducing gap analysis, improving quality, increasing the efficiency of infrastructure and processes, and, last but not least, extracting the most from the value of data, enabling new and innovative opportunities.

For the past year, the Data Mesh paradigm has been successfully adopted, with more and more use cases embracing it, enabling the development of new Data Opportunities to support trustworthy, Data Product-driven business decisions.
architectural shift and centralized hub segmentation
Limits of a Monolith-Centric Data Architecture

In the evolutionary journey toward digitalization, the legacy architectural landscape, relying on centralized data management through monolithic systems, revealed several limitations that needed to be addressed smoothly but effectively.
Pain Points
• The distributed data lake carries a significant maintenance effort and requires frequent infrastructure renewal and augmentation.
• It requires the centralization of knowledge, potentially creating gap analysis and poor ecosystem resiliency to changes in surrounding systems.
• It doesn't scale at the required pace.
• It must handle a remarkable variety of formats and transport protocols for data acquisition.
• It doesn't support stream processing; it mainly works on batch workflows lasting 2 or 3 days.
Acquire Source Operational System Knowledge
Understanding the source system's data model is an overhead for data analysis.
(diagram: mainframe operational systems feeding marketing campaigns and business intelligence)
• Centralizes the ownership of data, acquiring domain-specific knowledge and skills
• Accomplishes the evaluation of data insights through centrally governed batch processes
• Implements the Data Warehousing practice
The limits of the data warehouse practice
Centralized Data Architectural Landscape (before)
• Centralized data governance and knowledge
• Implements the Data Warehouse practice
• Handles multiple heterogeneous data formats and transport layers
Breaking Down the Monolith
• For the past year, the Data Mesh paradigm has been successfully adopted on several use cases, and the evolutionary roadmap has been laid out, envisaging outstanding benefits but also raising new challenges from both technical and organizational perspectives.
• Mainframe offloading was the first project to benefit from the Data Mesh paradigm, applying the domain-oriented ownership and data-as-a-product pillars.
• Transactions on bank accounts and credit card variations are just a few of the data products that enabled multiple use cases.
Decentralized data ownership towards Financial Domains
Envision the Evolutionary Journey Outcome
• The rise of a new paradigm advocating a distributed informational architecture and domain-driven decentralized ownership is leading the company into a new architectural era.
• It means shortening time to market, reducing gap analysis, improving quality, increasing the efficiency of infrastructure and processes, and, last but not least, extracting the most from the value of data, enabling new and innovative opportunities.
• A mesh of continuously growing data domains will create information that is understandable across the organization, and distributed data and consumer teams will scale better.
towards a new domain-driven analytical ecosystem
a mesh of domains enabling customer analytics at scale
Customer's Domain
• At Poste Italiane, customer data are collected in several different scenarios that can be grouped into specific contextual domains.
• In the engagement, onboarding, and support phases, analytical data are collected seamlessly throughout the overall customer journey.
• Each domain, dedicated to a specific phase of the customer lifecycle, provides invaluable insights aimed at improving the omnichannel experience across the customer analytical ecosystem.
• Domains will exchange accountable analytical data, implementing both source-aligned and consumer-aligned data products.
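The distinction between source-aligned and consumer-aligned data products in the bullets above can be sketched in code. This is a minimal illustrative model only; the class and product names (`DataProduct`, `customer_onboarding_events`, `churn_features`) are assumptions for the example, not Poste Italiane's actual implementation:

```python
from dataclasses import dataclass, field
from typing import List

# Hypothetical sketch: a source-aligned data product exposes analytical
# data as produced by its operational domain, while a consumer-aligned
# product derives insights from one or more source-aligned products.

@dataclass
class DataProduct:
    domain: str               # owning domain
    name: str                 # unique product name within the domain
    upstream: List[str] = field(default_factory=list)  # products it consumes
    output_ports: List[str] = field(default_factory=list)

# Source-aligned: owned by the onboarding domain, mirrors operational facts
onboarding_events = DataProduct(
    domain="onboarding",
    name="customer_onboarding_events",
    output_ports=["events_table"],
)

# Consumer-aligned: owned by an analytics domain, built on source products
churn_features = DataProduct(
    domain="customer_analytics",
    name="churn_features",
    upstream=["onboarding.customer_onboarding_events"],
    output_ports=["features_table"],
)
```

The key design point is that each product has exactly one owning domain, and consumer-aligned products declare their upstream dependencies explicitly, making the mesh's lineage discoverable.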
Leveraging the Data Mesh Boost platform
Boost Data Mesh Adoption
• Build: the Data Mesh Boost platform helps us build our data product definitions by leveraging an abstraction language that shortens the distance between business and technical users.
• Deploy: the provisioning of data product infrastructural services and artefact scaffolds happens in a self-service fashion, applying infrastructure-as-code principles.
• Experience: the data product marketplace provides advanced catalog capabilities paired with a collaborative platform, improving mesh node interoperability and reducing time to market for new opportunities.
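The "abstraction language" of the Build step can be pictured as a declarative data product descriptor that both business and technical users can read, with mandatory fields checked before deployment. This is a hypothetical sketch; the field names below are illustrative assumptions, not the platform's actual schema:

```python
# Hypothetical data product descriptor: a declarative document describing
# the product's identity, ownership and output ports. Field names are
# illustrative only, not the real Data Mesh Boost schema.
descriptor = {
    "id": "finance.bank-account-transactions.v1",
    "domain": "finance",
    "name": "bank-account-transactions",
    "owner": "finance-data-team",
    "outputPorts": [
        {"name": "transactions", "type": "table", "format": "parquet"},
    ],
    "tags": ["GDPR"],
}

def validate(descriptor: dict) -> list:
    """Return the list of missing mandatory fields (empty if valid)."""
    mandatory = ["id", "domain", "name", "owner", "outputPorts"]
    return [f for f in mandatory if f not in descriptor]

assert validate(descriptor) == []  # all mandatory fields present
```

Because the descriptor is plain data, the platform can scaffold infrastructure from it in a self-service fashion and run governance checks against it, which is what makes the Deploy step automatable.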
Still a long roadmap ahead
Tackle ground-breaking changes @PosteItaliane
• Mindset Shift
• Explain the advantages of Data Mesh in order to achieve global awareness
• Promote data trustworthiness/quality at the source
• Promote product thinking
• Architectural Shift
• Data lake segmentation, moving data ownership to source systems
• Decentralize data consumption and insight generation
• Centralize provisioning of the tools and assets enabling data product realization and consumption (i.e. Data Infrastructure as a Platform)
• Organizational Shift
• Create autonomous cross-functional teams with data engineering capabilities
• Support the evolutionary journey, avoiding disruption and advocating a smooth transition
• A multi-disciplinary, cross-department team supporting the definition of Federated Computational Governance policies
Who we are:
• We value transparency, collaboration and results
• Totally decentralized and self-managed
• International culture and mindset
• Laser-focused on customers
What we do:
• Data Engineering has been our mission since 2013
• Crafting end-to-end data platforms
• Data Strategy
• Managed Data Services
The importance of an enabling platform
Data Mesh's main goal is to gain agility and speed in data initiatives at large scale. At the same time, this approach shifts many responsibilities onto DP teams. If they are not supported by a platform providing ready-to-use experiences, they will struggle to implement interoperable and compliant Data Products.
Finance Domain · HR Domain · Manufacturing Domain
• Are they going to deploy in a standard, auditable, replicable and atomic way?
• Will they follow the same security and implementation standards?
• Will these three DPs be interoperable? Addressable? Discoverable?
• Do they have the competences to start from scratch? Did they understand all the facets of the data mesh practice?
(diagram: Data Product Marketplace and Data Product Provisioner over shared infrastructure services, connecting data products)
• Speed up mesh development
• Data product owner experience
• Self-service deployment
• Data consumption experience
• Transparency / Information
• Trust / Ownership
(diagram: Data Product Builder)
• Data Product Templates: pipeline, SQL engine, storage, data quality, GDPR, documentation, metadata
• Data Product Descriptor API
• Actors: Data Product Team · Federated Governance (coding computational policies) · Computational Policies Repository · Enablement Team · Platform Team
(diagram: Data Product Provisioner on the Infrastructure Plane, Data Product Marketplace serving Data Consumers)
Workflow steps:
• Clone templates to build the DP with embedded standards
• Deploy the DP on a target environment
• Generate a new release of the DP and test it
• Define a new template to support a new technology
• Create a new policy to enforce quality standards
Notes:
• Computational policies are tested and enforced against the DP descriptor.
• Deployment is technology agnostic: through specific provisioners (e.g. GDPR, observability, Dremio, API, storage) it is possible to integrate whatever stack you prefer.
• In the marketplace it is possible to discover, understand, observe and buy data products belonging to all the domains.
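A computational policy "tested and enforced against the DP descriptor" can be sketched as a simple predicate run at deployment time. This is a hypothetical example of the technique, not the platform's actual policy engine; the policy rule and field names (`tags`, `retentionDays`) are illustrative assumptions:

```python
# Hypothetical computational policy: every data product handling personal
# data (here, tagged "GDPR") must declare a retention period before it can
# be deployed. The policy is machine-readable code, checked automatically.
def gdpr_retention_policy(descriptor: dict) -> tuple:
    """Return (passed, message) for the given data product descriptor."""
    if "GDPR" in descriptor.get("tags", []):
        if "retentionDays" not in descriptor:
            return (False, "GDPR-tagged product must declare retentionDays")
    return (True, "ok")

# A compliant descriptor passes...
ok, _ = gdpr_retention_policy(
    {"name": "bank-account-transactions", "tags": ["GDPR"], "retentionDays": 365}
)
# ...while one missing the retention period is rejected before deployment.
bad, reason = gdpr_retention_policy({"name": "x", "tags": ["GDPR"]})
```

Keeping such policies in a shared repository, as the diagram suggests, lets Federated Governance evolve the rules centrally while each DP team's deployments are checked against them automatically.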
Wrapping Up
• Lower the barrier to developing and deploying Data Products by using templates
• Help adopt a healthy end-to-end lifecycle with embedded standards and computational policies
• Achieve faster time to market
• Fully customizable technology stack, metadata, standards and processes
• Create a marketplace experience to discover, understand, observe and buy data products
• Help the Federated Governance create a centralized repository of human- and machine-readable computational policies
• Lower risk by avoiding common pitfalls
Next Steps?
Thank you!
www.instagram.com/agilelab_official/
www.agilelab.it
info@agilelab.it
www.linkedin.com/company/agile-lab

Data Mesh Implementation - a practical journey
