This white paper analyses the need for Reference Data Management (RDM) in the financial services industry and elucidates the challenges associated with its implementation. The paper also focuses on the critical elements of RDM implementation and some of the major benefits an organization can derive by integrating robust Reference Data Management into its IT infrastructure.
Reference data is something we often encounter in our projects. In our experience, it is often underestimated and does not get enough attention. In the webinar, we want to make you aware of some interesting aspects of reference data, such as how it relates to MDM, with which it is often confused.
Reference data management:
Three categories of structured data:
Master data: data associated with core business entities such as customer, product, asset, etc.
Transaction data: the recording of business transactions such as orders in manufacturing, loan and credit card payments in banking, and product sales in retail.
Reference data: any kind of data used solely to categorize other data found in a database, or solely for relating data in a database to information beyond the boundaries of the enterprise.
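As an illustrative sketch (all table and field names here are hypothetical), the three categories can be seen in a minimal order-processing model, where reference data is the small, slowly changing lookup set:

```python
# Hypothetical illustration of the three structured-data categories.

# Reference data: small code sets used solely to categorize other data.
COUNTRY_CODES = {"US": "United States", "DE": "Germany", "IN": "India"}
ORDER_STATUSES = {"N": "New", "S": "Shipped", "C": "Cancelled"}

# Master data: core business entities (customer, product, ...).
customers = {
    1001: {"name": "Acme Corp", "country": "US"},   # country -> reference data
}
products = {
    "P-17": {"name": "Widget", "unit_price": 25.0},
}

# Transaction data: records of business events, referencing master data.
orders = [
    {"order_id": 1, "customer_id": 1001, "product_id": "P-17",
     "qty": 4, "status": "N"},                      # status -> reference data
]

def describe(order):
    """Resolve an order's codes against master and reference data."""
    cust = customers[order["customer_id"]]
    prod = products[order["product_id"]]
    return (f'{cust["name"]} ({COUNTRY_CODES[cust["country"]]}) ordered '
            f'{order["qty"]} x {prod["name"]}: {ORDER_STATUSES[order["status"]]}')

print(describe(orders[0]))
# Acme Corp (United States) ordered 4 x Widget: New
```

The reference tables change rarely and categorize everything else; the master records describe entities; the transactions record events that point at both.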
Master Data Management - Practical Strategies for Integrating into Your Data ... – DATAVERSITY
Master Data Management (MDM) provides organizations with an accurate and comprehensive view of their business-critical data such as Customers, Products, Vendors, and more. While mastering these key data areas can be a complex task, the value of doing so can be tremendous – from real-time operational integration to data warehousing & analytic reporting. This webinar provides practical strategies for gaining value from your MDM initiative, while at the same time assuring a solid architectural and governance foundation that will ensure long-term, enterprise-wide success.
Most companies do not think about data when they start out, let alone the quality of that data. With the proliferation of data and its uses, organizations are compelled to focus more and more on their data and its quality.
Join Kasu Sista of The Wisdom Chain to understand how to think about, implement, and maintain data quality.
You will learn about:
What do data people think about?
How do you get them to listen to what you want?
Business processes and data life span
Impact of data capture and data quality on downstream business processes
Data quality metrics and how to define and use them
Practical metadata and data governance
What are the takeaways from the session?
How to talk to your data people
Understanding the importance of capturing data in the right way
Understanding the importance of quality metrics and benchmarks
Understanding how to operationalize data quality processes
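The metrics point above can be made concrete with a small sketch (the rules and field names are hypothetical, not from the session): two common data quality metrics, completeness and validity, computed over a batch of records:

```python
# Hypothetical sketch: completeness and validity metrics over customer records.
VALID_COUNTRIES = {"US", "DE", "IN"}  # reference data doubling as a validity rule

records = [
    {"name": "Acme Corp", "email": "ops@acme.example",   "country": "US"},
    {"name": "Globex",    "email": None,                 "country": "DE"},
    {"name": "Initech",   "email": "it@initech.example", "country": "XX"},
]

def completeness(rows, field):
    """Fraction of rows where `field` is populated."""
    return sum(1 for r in rows if r.get(field)) / len(rows)

def validity(rows, field, allowed):
    """Fraction of rows whose `field` value is in the allowed code set."""
    return sum(1 for r in rows if r.get(field) in allowed) / len(rows)

print(f"email completeness: {completeness(records, 'email'):.2f}")                 # 0.67
print(f"country validity:   {validity(records, 'country', VALID_COUNTRIES):.2f}")  # 0.67
```

Publishing such scores on a schedule, with agreed thresholds, is one simple way to operationalize a data quality process.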
Master Data Management – Aligning Data, Process, and Governance – DATAVERSITY
Master Data Management (MDM) provides organizations with an accurate and comprehensive view of their business-critical data such as customers, products, vendors, and more. While mastering these key data areas can be a complex task, the value of doing so can be tremendous – from real-time operational integration to data warehousing and analytic reporting. This webinar will provide practical strategies for gaining value from your MDM initiative, while at the same time assuring a solid architectural and governance foundation that will ensure long-term, enterprise-wide success.
Master Data Management - Aligning Data, Process, and Governance – DATAVERSITY
Master Data Management (MDM) can provide significant value to the organization in creating consistent key data assets such as Customer, Product, Supplier, Patient, and the list goes on. But getting MDM “right” requires a strategic mix of Data Architecture, business process, and Data Governance. Join this webinar to learn how to find the “sweet spot” between technology, design, process, and people for your MDM initiative.
LDM Slides: Conceptual Data Models - How to Get the Attention of Business Use... – DATAVERSITY
Achieving a ‘single version of the truth’ is critical to any MDM, DW, or data integration initiative. But have you ever tried to get people to agree on a single definition of “customer”? Or to get Sales, Marketing, and IT to agree on a target audience?
This webinar will discuss how a conceptual data model can be used as a powerful communication tool for data-intensive initiatives. It will cover how to build a high-level data model, how the core concepts in a data model can have significant business impact on an organization, and will provide some easy-to-use templates and guidelines for a step-by-step approach to implementing a conceptual data model in your organization.
Gartner: Master Data Management Functionality – Gartner
Gartner will further examine key trends shaping the future MDM market during the Gartner MDM Summit 2011, 2-3 February in London. More information at www.europe.gartner.com/mdm
Business Value Through Reference and Master Data Strategies – DATAVERSITY
Data tends to pile up and can be rendered unusable or obsolete without careful maintenance processes. Reference and Master Data Management (MDM) has been a popular Data Management approach to effectively gain mastery over not just the data but the supporting architecture for processing it. This webinar presents MDM as a strategic approach to improving and formalizing practices around those data items that provide context for many organizational transactions: the master data. Too often, MDM has been implemented technology-first and has achieved a very poor track record (only one-third of projects succeeding on time, within budget, and achieving planned functionality). MDM success depends on a coordinated approach, typically involving Data Governance and Data Quality activities.
Learning Objectives:
• Understand foundational reference and MDM concepts based on the Data Management Body of Knowledge (DMBoK)
• Understand why these are an important component of your Data Architecture
• Gain awareness of reference and MDM frameworks and building blocks
• Know what MDM guiding principles consist of and best practices
• Know how to utilize reference and MDM in support of business strategy
DAS Slides: Data Quality Best Practices – DATAVERSITY
Tackling Data Quality problems requires more than a series of tactical, one-off improvement projects. By their nature, many Data Quality problems extend across and often beyond an organization. Addressing these issues requires a holistic architectural approach combining people, process, and technology. Join Nigel Turner and Donna Burbank as they provide practical ways to control Data Quality issues in your organization.
Data Modeling, Data Governance, & Data Quality – DATAVERSITY
Data Governance is often referred to as the people, processes, and policies around data and information, and these aspects are critical to the success of any data governance implementation. But just as critical is the technical infrastructure that supports the diverse data environments that run the business. Data models can be the critical link between business definitions and rules and the technical data systems that support them. Without the valuable metadata these models provide, data governance often lacks the “teeth” to be applied in operational and reporting systems.
Join Donna Burbank and her guest, Nigel Turner, as they discuss how data models & metadata-driven data governance can be applied in your organization in order to achieve improved data quality.
Data-Ed Webinar: Data Quality Success Stories – DATAVERSITY
Organizations must realize what it means to utilize data quality management in support of business strategy. This webinar will demonstrate how chronic business challenges can often be attributed to the root problem of poor data quality. Showing how data quality should be engineered provides a useful framework in which to develop an effective approach. Establishing this framework allows organizations to more efficiently identify business and data problems caused by structural issues versus practice-oriented defects; giving them the skillset to prevent these problems from re-occurring.
Learning Objectives:
Understanding foundational data quality concepts based on the DAMA DMBOK
Utilizing data quality engineering in support of business strategy
Case Studies illustrating data quality success
Data quality guiding principles & best practices
Steps for improving data quality at your organization
You Need a Data Catalog. Do You Know Why? – Precisely
The data catalog has become a popular discussion topic within data management and data governance circles. “What is it?” and “Do I need one?” are two common questions, along with “How does a catalog relate to and support the data governance program?”
The data catalog plays a key role in the governance process; how well information can be managed, aligned to business objectives, and monetized depends in large part on what you know about your data.
In this webinar you will learn about:
- The role of the data catalog
- What kinds of information should be in your data catalog
- Those catalog items that can be harvested systemically versus those that require stewardship involvement
- The role of the catalog in your data quality program
We hope you’ll join this on-demand webinar and learn how a data catalog should be part of your governance and data quality program!
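A minimal sketch (the fields and their split are hypothetical, not Precisely's model) of a catalog entry that separates systemically harvested metadata from metadata requiring stewardship involvement:

```python
# Hypothetical catalog entry: harvested vs. stewarded metadata.
from dataclasses import dataclass, field

@dataclass
class CatalogEntry:
    # Harvested systemically (scanners can pull these from the source system).
    table_name: str
    columns: list
    row_count: int
    last_profiled: str
    # Requires stewardship involvement (human knowledge, not scannable).
    business_definition: str = ""
    data_owner: str = ""
    quality_rules: list = field(default_factory=list)

    def needs_stewardship(self):
        """True if the human-supplied fields are still missing."""
        return not (self.business_definition and self.data_owner)

entry = CatalogEntry(
    table_name="CUST_MASTER",
    columns=["cust_id", "name", "country"],
    row_count=125_000,
    last_profiled="2024-01-15",
)
print(entry.needs_stewardship())  # True: definition and owner not yet filled in
entry.business_definition = "Golden record for all active customers"
entry.data_owner = "Sales Operations"
print(entry.needs_stewardship())  # False
```

The split matters operationally: the harvested half can be refreshed automatically, while the stewarded half is exactly the backlog a governance program has to work through.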
To take a “ready, aim, fire” approach to implementing Data Governance, many organizations assess themselves against industry best practices. The process is not difficult or time-consuming and can directly assure that your activities target your specific needs. Best practices are always a strong place to start.
Join Bob Seiner for this popular RWDG topic, where he will provide the information you need to set your program in the best possible direction. Bob will walk you through the steps of conducting an assessment and share with you a set of typical results from taking this action. You may be surprised at how easy it is to organize the assessment and may hear results that stimulate the actions that you need to take.
In this webinar, Bob will share:
- The value of performing a Data Governance best practice assessment
- A practical list of industry Data Governance best practices
- Criteria to determine whether a practice is a best practice
- Steps to follow to complete an assessment
- Typical recommendations and actions that result from an assessment
Describes what Enterprise Data Architecture in a software development organization should cover, listing over 200 data architecture-related deliverables an Enterprise Data Architect should remember to evangelize.
Data Catalogs Are the Answer – What is the Question? – DATAVERSITY
Organizations with governed metadata made available through their data catalog can answer questions their people have about the organization’s data. These organizations get more value from their data, protect their data better, gain improved ROI from data-centric projects and programs, and have more confidence in their most strategic data.
Join Bob Seiner for this lively webinar where he will talk about the value of a data catalog and how to build the use of the catalog into your stewards’ daily routines. Bob will share how the tool must be positioned for success and viewed as a must-have resource that is a steppingstone and catalyst to governed data across the organization.
Big Data - Investment - Wavelets – Serene Zawaydeh
Big data solutions are being implemented in the investment industry, among other industries, enabling the processing of large volumes of variables, including real-time changes.
In addition to highlighting current applications of big data in the investment industry, this paper identifies applications of Wavelets in finance and Big Data. Wavelets are used for the analysis of non-stationary signals. Academic studies have demonstrated the benefits of using Wavelets for forecasting financial time series and for data mining, among other applications.
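As a sketch of the idea (a minimal single-level Haar transform, the simplest wavelet; the studies referenced typically use richer wavelet families via libraries such as PyWavelets):

```python
import math

def haar_step(signal):
    """One level of the Haar discrete wavelet transform.

    Splits an even-length signal into approximation coefficients
    (local trend) and detail coefficients (local fluctuation); the
    detail part is what highlights abrupt, non-stationary behaviour.
    """
    s = math.sqrt(2.0)
    approx = [(signal[i] + signal[i + 1]) / s for i in range(0, len(signal), 2)]
    detail = [(signal[i] - signal[i + 1]) / s for i in range(0, len(signal), 2)]
    return approx, detail

# A toy "price" series with a jump between the 2nd and 3rd samples.
prices = [4.0, 2.0, 6.0, 8.0]
approx, detail = haar_step(prices)
print([round(a, 4) for a in approx])  # [4.2426, 9.8995]
print([round(d, 4) for d in detail])  # [1.4142, -1.4142]
```

Applying the step recursively to the approximation coefficients yields the multi-level decomposition used in forecasting studies.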
Big Data is Here for Financial Services White Paper – Experian
Conquering Big Data Challenges
Financial institutions have invested in Big Data for many years, and new advances in technology infrastructure have opened the door for leveraging data in ways that can make an even greater impact on your business.
Learn how Big Data challenges are easier to overcome and how to find opportunities in your existing data and scale for the future.
FIMA's latest whitepaper evaluates how financial services companies are managing the challenges posed by data quality management. By analyzing which data types and data characteristics businesses are struggling with, it uncovers the true business costs associated with data quality. It also gauges how data governance programs are maturing and how they are being measured. Finally, it assesses how data is being managed within financial institutions.
Key findings include:
Data quality has never been more important for financial institutions, but most of those companies feel their data is only mediocre: Quality data serves a myriad of central business goals, from risk reduction to increased productivity. Unfortunately, many businesses continue to struggle with data quality, despite the fact that four-fifths of them have it ranked as a top priority.
The top two business functions impacted by poor data quality are regulatory compliance and risk management: Because these concerns tend to be the most important drivers of data quality, many financial institutions see data governance as a “must-do,” rather than an ROI-boosting activity. Furthermore, the vast majority of financial services companies cannot quantify the business cost of poor data quality.
Financial institutions vary greatly in the maturity of their data governance programs: Data governance cannot be overlooked. Unsurprisingly, businesses with formalized data governance programs reported that their data was of higher quality than that of other groups.
Data quality management requires close collaboration between business and IT leaders: That collaboration already exists for 83% of respondents in this study, who say that IT and business leaders work together to manage data quality in their organizations. However, the tools these businesses use to manage their data are not all equal, leading to an uneven allocation of resources.
As businesses generate and manage vast amounts of data, companies have more opportunities to gather data, incorporate insights into business strategy, and continuously expand access to data across the organisation. Doing so effectively (leveraging data for strategic objectives) is often easier said than done, however. This report, Transforming data into action: the business outlook for data governance, explores the business contributions of data governance at organisations globally and across industries, the challenges faced in creating useful data governance policies, and the opportunities to improve such programmes.
A successful data governance capability requires a strategy to align regulatory drivers and technology enhancement initiatives with business needs and objectives, taking into account the organizational, technological and cultural changes that will need to take place.
Research: How To Manage Regulatory Compliance – Conor Coughlan
This is a special report based on the latest market research relating to how your peers and other market players are addressing regulatory compliance and the management of regulatory data.
Specifically, this report outlines the market's reaction to:
- MiFID II
- Solvency II
- Basel
- AIFMD
- CRD
- IFRS
- EMIR
- Volcker and many more regulations.
The survey included practitioners from Asset Management, Wealth Management, Insurance, Banking and other FS entities.
Stewarding Data: Why Financial Services Firms Need a Chief Data Officer – Capgemini
The C-suite could soon start to feel a little crowded, with Chief Digital Officers, Chief Innovation Officers, Chief Risk Officers, and Chief Data Officers joining the more established functional leaders. To avoid C-suite proliferation, companies need to decide whether to elevate a new functional role to “chief” based on the strategic importance of the issue for the organization and its sector. For example, in many organizations, marketing will be so essential to performance that few would deny the need for a CMO. In financial services, data has become so mission-critical that the role of Chief Data Officer is simply essential.
The Economic Value of Data: A New Revenue Stream for Global Custodians – Cognizant
Global custodians' big data offers myriad opportunities for generating value from analytics solutions; we explore various paths and offer three use cases to illustrate. Data aggregation, risk management, digital experience, operational agility and cross-selling are all covered.
The new ‘A and B’ of the Finance Function: Analytics and Big Data - Evolutio... – Balaji Venkat Chellam Iyer
Published in 2013, this White Paper discusses how the finance function would evolve under the combined forces of Big Data and Analytics, and the levers that could help catalyze the change. It draws upon the Global Trend Study conducted by Tata Consultancy Services (TCS) on how companies were investing in Big Data and deriving returns from it.
Boosting Cybersecurity with Data Governance (peer reviewed) – Guy Pearce
Data Governance has a significant role to play in information security: special data classes beyond the four regular cyber classes (public, confidential, classified, and restricted) can help the organization identify whether sensitive data was exposed in a breach.
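A minimal sketch of the idea (any class beyond the four standard ones, and all field names, are hypothetical): tagging fields with a classification and checking what a breached table actually exposed:

```python
# Hypothetical data-classification sketch for breach triage.
# The four regular cyber classes, plus a special class for personal data.
SENSITIVITY = {"public": 0, "confidential": 1, "classified": 2,
               "restricted": 3, "personal": 3}  # "personal" is a special class

# Field-level classification for a hypothetical customer table.
field_classes = {
    "cust_id": "public",
    "name": "personal",
    "email": "personal",
    "credit_limit": "confidential",
}

def exposed_sensitive(breached_fields, threshold=1):
    """Return the breached fields whose class meets the sensitivity threshold."""
    return sorted(f for f in breached_fields
                  if SENSITIVITY[field_classes[f]] >= threshold)

print(exposed_sensitive(["cust_id", "email", "credit_limit"]))
# ['credit_limit', 'email']
```

With classifications governed up front, answering "was sensitive data exposed?" after a breach becomes a lookup rather than an investigation.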
James J Okarimia
Managing Partner
Aligning Finance, Risk and Data Analytics in Meeting the Requirements of Emerging Regulations
Banks must meet more (and more varied) regulations today than ever. The sheer scale and scope of banking regulations, including Dodd-Frank, Basel III and IFRS, pose challenges to all financial institutions, from the smallest bank to the largest financial services enterprise.
Q2 Highlights:
Revenues grew 19% YoY and 8.2% QoQ
Profit after tax was up 12.2% YoY and 17.0% QoQ
Order intake of US$ 176 mn, marking the 10th consecutive quarter of sequential increase in order intake
The Board recommends an interim dividend of Rs 10 per share. The record date for this payout will be 5th November 2019.
Consolidated revenues for the quarter under review grew 19.0% over the same period last year and 8.2% sequentially over the preceding quarter to Rs 1038.5 crore. EBITDA margin for the quarter expanded to 18.3%, up 118 basis points QoQ.
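As a quick check on the growth arithmetic (the implied prior-period figures below are derived from the stated percentages, not reported numbers):

```python
# Back out implied prior-period revenues from the stated growth rates.
revenue_q = 1038.5        # Rs crore, quarter under review
yoy_growth = 0.190        # 19.0% over the same period last year
qoq_growth = 0.082        # 8.2% over the preceding quarter

implied_prev_quarter = revenue_q / (1 + qoq_growth)  # about Rs 959.8 crore
implied_year_ago = revenue_q / (1 + yoy_growth)      # about Rs 872.7 crore

print(f"Implied preceding quarter: Rs {implied_prev_quarter:.1f} crore")
print(f"Implied year-ago quarter:  Rs {implied_year_ago:.1f} crore")
```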
Among verticals, Insurance grew 15.3% QoQ, contributing 31.1% of overall revenues; BFS expanded 9.4% QoQ, contributing 16.7% of revenue; and Travel, Transport and Hospitality (TTH) was up 5.8% QoQ, contributing 27.8% of revenue. Other segments collectively grew 0.8% QoQ and now represent 24.4% of overall revenues.
Digital revenues grew by 56% YoY and 18% QoQ, contributing 38% of the total revenues in the quarter under review. Americas, EMEA, APAC, and India contributed 49%, 37%, 10%, and 4% of the revenue mix respectively.
Fresh business of US$176mn was secured by the company during the quarter. As a result, the order book executable over the next twelve months has also increased to US$405mn.
“We have delivered robust revenue and margin performance yet again in line with our stated intent to drive robust, predictable and profitable growth for our business. The fundamentals of the business continue to be strong, as reflected in the sustained deal wins and the operating margin threshold that we have established,” said Mr. Sudhir Singh, Chief Executive Officer, NIIT Technologies Ltd.
Acknowledgements:
A TBR Perspective on Transform at the intersect - NIIT Technologies and the near future of Digital and Post-digital Transformation
A special blog by NelsonHall on how NIIT Technologies Delivers Digital Transformation with Capacity & Capability at Speed and Scale
HfS Research PoV on Change the game with verticalized AI: NIIT Technologies’ unique play as a post-digital firm
Q1 Highlights:
Revenues grew 16.7% YoY.
Profit after taxes up 2.0% YoY on reported basis, up 17.3% YoY after adjusting for non-recurring expenses.
Order intake of US$ 175 mn, marking the 9th consecutive quarter of sequential increase in order intake.
The quarter under review had one-time non-recurring expenses of Rs. 235 mn translating to a negative impact of 240 bps. Adjusted for that, the EBITDA margin for the quarter stood at 16.9%, an expansion of 103 basis points YoY, and PAT increased 17.3% YoY to Rs 100.6 crore.
In constant currency terms, BFS expanded 2.8% QoQ contributing to 16.5% of revenue, Travel & Transportation (TT) was up 5.9% QoQ contributing to 28.3% of revenue and Insurance grew 6.6% QoQ contributing 29.1% of overall revenues. Others segments collectively grew 1.5% QoQ and they now represent 27.0% of overall revenues.
Digital revenues grew 46% YoY contributing to 34% of the total revenues. Americas, EMEA, APAC and India contributed 49%, 35%, 11% and 5% of the revenue mix.
The Company secured fresh business of US$175mn during the quarter. The order executable over the next twelve months has also increased to US$395mn.
“We registered a good performance in Q1FY20 and the fundamentals of the business are strong,” said Mr. Sudhir Singh, Chief Executive Officer, NIIT Technologies Ltd.
Acknowledgements:
NIIT Technologies ranked #1 in ‘Business Understanding’ for the second consecutive year in ‘Whitelane’s 2019 UK IT Sourcing Study’.
NIIT Technologies named as a Leader among midsize agile software development service providers, by Forrester Research Inc., an independent research and advisory firm, in their report, The Forrester WaveTM: Midsize Agile Software Development Service Providers, Q2 2019.
NIIT Technologies companies Incessant Technologies and RuleTek received Pega Partner Award 2019 for ‘Excellence in Growth and Delivery’.
FY’19 Key Highlights
• Revenues expand 22.9%
• Operating profit up 28.7%
• Operating margin improved 80 bps to 17.6%
• Net Profits improved by 43.9%
• Cumulative order intake for the year is USD 646 MN. Up 27% over previous year
FY’19 Geo mix
Americas- 49%
EMEA-33%
India-8%
APAC- 10%
FY’19 Industry mix
Insurance- 28.7%
BFS- 16.1%
Travel & Transportation- 26.9%
Leadership Speaks
“FY 19 was one of the most successful years in our firm’s history. Not only did we deliver very significant growth but we also increased operating margin simultaneously. Our strategy of transforming the three industries we serve at their intersection with emerging technologies continues to differentiate and drive growth.”
Mr. Sudhir Singh, Chief Executive Officer, NIIT Technologies Ltd.
“The year was characterized by strong deal momentum. Order intake improved steadily in each quarter with large deal wins and new logo additions. USD 170 m of fresh business was secured during the quarter”.
Mr. Arvind Thakur, Vice Chairman and Managing Director, NIIT Technologies Ltd.
“With strong leadership in place, the platform is set for our next phase of growth”.
Mr. Rajendra S Pawar, Chairman, NIIT Technologies Ltd.
Acknowledgements
• Recognized in the Best of The Global Outsourcing 100® list produced by IAOP
• Positioned as a Leader in the NelsonHall NEAT Report for RPA & AI in Banking 2019
Q2 FY19 PAT up 66.3% YoY
Q2 Highlights:
• Revenues up 23.1% YoY and 10.0% QoQ
• Strong improvement in Operating Profits, by 37.2% YoY and 25.1% QoQ
• Operating Margins expand by 186 bps YoY and 217 bps QoQ
• Fresh Order Intake expands to USD 160 Mn
NIIT Technologies delivers robust 145% growth in PAT for FY’16NIIT Technologies
NIIT Technologies Limited, a leading global IT solutions organization, announced its financial results for the year FY15-16 resulting in revenues of `2,682 Crores, operating profits at `473 Crores and net profits at `280 Crores.
4 factors to consider before finalizing a Cargo Management SystemNIIT Technologies
With a gradual increase in the air cargo traffic year on year, most airlines and cargo handling companies are increasingly facing challenges in cargo management. Cargo handling has become extremely complex and unpredictable, involving multiple stakeholders with multi-warehouse operations in a multi-location environment. An efficient cargo management system can enhance service capabilities and significantly improve customer experience, while saving precious time and money, especially at the time when the industry faces an uphill battle to restore competitiveness and increase its share of trade growth.
Unlock value potential from Cargo Management OperationsNIIT Technologies
Cargo Management Systems are beneficial in times where quantity of cargo and burden on infrastructure has increased. The improved cargo turnaround rate, easy integration with system landscapes and partner networks make it an effective solution.
New Distribution Capability benefits and challengesNIIT Technologies
Is NDC (New Distribution Capability) in Airlines a norm or a phenomenon?
Explore the benefits and challenges of New Distribution Capability: goo.gl/K5n5wO
Essentials of Automations: Optimizing FME Workflows with ParametersSafe Software
Are you looking to streamline your workflows and boost your projects’ efficiency? Do you find yourself searching for ways to add flexibility and control over your FME workflows? If so, you’re in the right place.
Join us for an insightful dive into the world of FME parameters, a critical element in optimizing workflow efficiency. This webinar marks the beginning of our three-part “Essentials of Automation” series. This first webinar is designed to equip you with the knowledge and skills to utilize parameters effectively: enhancing the flexibility, maintainability, and user control of your FME projects.
Here’s what you’ll gain:
- Essentials of FME Parameters: Understand the pivotal role of parameters, including Reader/Writer, Transformer, User, and FME Flow categories. Discover how they are the key to unlocking automation and optimization within your workflows.
- Practical Applications in FME Form: Delve into key user parameter types including choice, connections, and file URLs. Allow users to control how a workflow runs, making your workflows more reusable. Learn to import values and deliver the best user experience for your workflows while enhancing accuracy.
- Optimization Strategies in FME Flow: Explore the creation and strategic deployment of parameters in FME Flow, including the use of deployment and geometry parameters, to maximize workflow efficiency.
- Pro Tips for Success: Gain insights on parameterizing connections and leveraging new features like Conditional Visibility for clarity and simplicity.
We’ll wrap up with a glimpse into future webinars, followed by a Q&A session to address your specific questions surrounding this topic.
Don’t miss this opportunity to elevate your FME expertise and drive your projects to new heights of efficiency.
Generating a custom Ruby SDK for your web service or Rails API using Smithyg2nightmarescribd
Have you ever wanted a Ruby client API to communicate with your web service? Smithy is a protocol-agnostic language for defining services and SDKs. Smithy Ruby is an implementation of Smithy that generates a Ruby SDK using a Smithy model. In this talk, we will explore Smithy and Smithy Ruby to learn how to generate custom feature-rich SDKs that can communicate with any web service, such as a Rails JSON API.
Dev Dives: Train smarter, not harder – active learning and UiPath LLMs for do...UiPathCommunity
💥 Speed, accuracy, and scaling – discover the superpowers of GenAI in action with UiPath Document Understanding and Communications Mining™:
See how to accelerate model training and optimize model performance with active learning
Learn about the latest enhancements to out-of-the-box document processing – with little to no training required
Get an exclusive demo of the new family of UiPath LLMs – GenAI models specialized for processing different types of documents and messages
This is a hands-on session specifically designed for automation developers and AI enthusiasts seeking to enhance their knowledge in leveraging the latest intelligent document processing capabilities offered by UiPath.
Speakers:
👨🏫 Andras Palfi, Senior Product Manager, UiPath
👩🏫 Lenka Dulovicova, Product Program Manager, UiPath
UiPath Test Automation using UiPath Test Suite series, part 4DianaGray10
Welcome to UiPath Test Automation using UiPath Test Suite series part 4. In this session, we will cover Test Manager overview along with SAP heatmap.
The UiPath Test Manager overview with SAP heatmap webinar offers a concise yet comprehensive exploration of the role of a Test Manager within SAP environments, coupled with the utilization of heatmaps for effective testing strategies.
Participants will gain insights into the responsibilities, challenges, and best practices associated with test management in SAP projects. Additionally, the webinar delves into the significance of heatmaps as a visual aid for identifying testing priorities, areas of risk, and resource allocation within SAP landscapes. Through this session, attendees can expect to enhance their understanding of test management principles while learning practical approaches to optimize testing processes in SAP environments using heatmap visualization techniques
What will you get from this session?
1. Insights into SAP testing best practices
2. Heatmap utilization for testing
3. Optimization of testing processes
4. Demo
Topics covered:
Execution from the test manager
Orchestrator execution result
Defect reporting
SAP heatmap example with demo
Speaker:
Deepak Rai, Automation Practice Lead, Boundaryless Group and UiPath MVP
UiPath Test Automation using UiPath Test Suite series, part 3DianaGray10
Welcome to UiPath Test Automation using UiPath Test Suite series part 3. In this session, we will cover desktop automation along with UI automation.
Topics covered:
UI automation Introduction,
UI automation Sample
Desktop automation flow
Pradeep Chinnala, Senior Consultant Automation Developer @WonderBotz and UiPath MVP
Deepak Rai, Automation Practice Lead, Boundaryless Group and UiPath MVP
Securing your Kubernetes cluster_ a step-by-step guide to success !KatiaHIMEUR1
Today, after several years of existence, an extremely active community and an ultra-dynamic ecosystem, Kubernetes has established itself as the de facto standard in container orchestration. Thanks to a wide range of managed services, it has never been so easy to set up a ready-to-use Kubernetes cluster.
However, this ease of use means that the subject of security in Kubernetes is often left for later, or even neglected. This exposes companies to significant risks.
In this talk, I'll show you step-by-step how to secure your Kubernetes cluster for greater peace of mind and reliability.
Accelerate your Kubernetes clusters with Varnish CachingThijs Feryn
A presentation about the usage and availability of Varnish on Kubernetes. This talk explores the capabilities of Varnish caching and shows how to use the Varnish Helm chart to deploy it to Kubernetes.
This presentation was delivered at K8SUG Singapore. See https://feryn.eu/presentations/accelerate-your-kubernetes-clusters-with-varnish-caching-k8sug-singapore-28-2024 for more details.
GraphRAG is All You need? LLM & Knowledge GraphGuy Korland
Guy Korland, CEO and Co-founder of FalkorDB, will review two articles on the integration of language models with knowledge graphs.
1. Unifying Large Language Models and Knowledge Graphs: A Roadmap.
https://arxiv.org/abs/2306.08302
2. Microsoft Research's GraphRAG paper and a review paper on various uses of knowledge graphs:
https://www.microsoft.com/en-us/research/blog/graphrag-unlocking-llm-discovery-on-narrative-private-data/
Connector Corner: Automate dynamic content and events by pushing a buttonDianaGray10
Here is something new! In our next Connector Corner webinar, we will demonstrate how you can use a single workflow to:
Create a campaign using Mailchimp with merge tags/fields
Send an interactive Slack channel message (using buttons)
Have the message received by managers and peers along with a test email for review
But there’s more:
In a second workflow supporting the same use case, you’ll see:
Your campaign sent to target colleagues for approval
If the “Approve” button is clicked, a Jira/Zendesk ticket is created for the marketing design team
But—if the “Reject” button is pushed, colleagues will be alerted via Slack message
Join us to learn more about this new, human-in-the-loop capability, brought to you by Integration Service connectors.
And...
Speakers:
Akshay Agnihotri, Product Manager
Charlie Greenberg, Host
Slack (or Teams) Automation for Bonterra Impact Management (fka Social Soluti...Jeffrey Haguewood
Sidekick Solutions uses Bonterra Impact Management (fka Social Solutions Apricot) and automation solutions to integrate data for business workflows.
We believe integration and automation are essential to user experience and the promise of efficient work through technology. Automation is the critical ingredient to realizing that full vision. We develop integration products and services for Bonterra Case Management software to support the deployment of automations for a variety of use cases.
This video focuses on the notifications, alerts, and approval requests using Slack for Bonterra Impact Management. The solutions covered in this webinar can also be deployed for Microsoft Teams.
Interested in deploying notification automations for Bonterra Impact Management? Contact us at sales@sidekicksolutionsllc.com to discuss next steps.
Slack (or Teams) Automation for Bonterra Impact Management (fka Social Soluti...
NIIT Technologies White Paper

Reference Data Management in Financial Services Industry

Vinit Sharma

www.niit-tech.com
CONTENTS

Introduction
Data Management
Drivers behind Data Management
Data Classification
Reference Data Management in Financial Services
Challenges of Reference Data Management
Our Reference Data Management Process
Implementation
Conclusion
NIIT Experience and Benefits
About Author
About NIIT Technologies
Introduction

Data Management is becoming increasingly challenging in the financial services industry. Financial institutions, exchanges, and market participants are undergoing significant and fundamental transformation. In this context, it is extremely important to manage the creation and maintenance of data to ensure its relevance and to mitigate any risks arising out of data inconsistency. Data accuracy and reliability are vital for a financial organization, as data is mission critical and a key enabler for all of its business operations, including trade execution, risk management and compliance reporting. Effective data management calls for seamless integration between all elements of the overall data management lifecycle:

• Strategy
• Governance
• Operations
• Review, analysis and actions

In most financial institutions, data is spread across multiple regions, departments and systems. Several of these entities have to reference data pertaining to the parent company; however, there is no central source of data. Instead, the entities have their own nomenclature and data sources piled in silos, with redundant systems designed to extract and process data for individual requirements. Apart from being an inefficient design, this is extremely cost ineffective and prone to data inconsistency.

Reference Data Management (RDM) is a solution that addresses all of the above issues. It is a methodology for managing the creation and maintenance of data that can be shared across multiple regions, departments and systems. RDM collates data from multiple sources, normalizes it into a standard format, validates the data for accuracy, and consolidates it into a single consistent data copy for distribution.

This white paper analyses the need for Reference Data Management in the financial services industry and elucidates the challenges associated with its implementation. The paper also focuses on the critical elements of RDM implementation and some of the major benefits an organization can derive by implementing robust Reference Data Management in its IT infrastructure.

Data Management

Data management is the development and execution of architectures, policies, practices, and procedures to manage the information lifecycle needs of an enterprise in an effective manner.

Drivers behind Data Management

Fundamental changes in the financial services industry have created a significant impact on data management platforms. Some of the key drivers of change are:

Diverse Instruments
In the quest to offer compelling products to clients, brokers/dealers have created many innovative financial instruments. Currently, there are more than eight million instruments, each requiring a firm to maintain detailed, timely and accurate information. Derivative issues are only one example of financially engineered securities that did not exist just a few years ago. These new financial products and their complex terms have become a challenge for executives managing financial information.
Changes in Market Mechanism
Trade execution mechanisms have been altered by the shifting composition of market participants. For example, there has been a rapid increase in the number of hedge funds and the emergence of mega "buy-side" firms, many of which use program trading and other algorithmic execution models. Decimalization and program trading have led to a reduction in trade size with a corresponding increase in volume. These factors have put a strain on data management platforms, as they are required to deliver high volumes of data with low latency to black-box trading systems.

Regulations and Compliance
Regulation and compliance are also key drivers in the march towards an improved data management platform. The emergence of Basel II, Sarbanes-Oxley and other key risk and compliance considerations has forced firms to place a high priority on the production of accurate and timely data to feed internal risk management systems. As a result, institutions must now meet a more stringent fiduciary responsibility to provide correct data to regulatory agencies. Faulty information can result in dire consequences and catastrophic financial exposure.

Data Aggregators' Expanding Role
The industry's demand for a wide range of security attributes and pricing information has given rise to an entire sub-industry populated by vendors who specialize in financial data capture and distribution. These vendors are playing an increasingly significant role in managing and providing data. However, managing multiple sources of data creates cost and consistency issues that must be fixed.
Data Classification

Data is not a homogeneous entity. It consists of different categories, each with its own set of characteristics, and each of these categories may have strong dependencies on the others. Failure to recognize these differences is risky: projects that do not address the unique nature of each data category will invariably encounter problems and are likely to fail.

Primarily, data can be categorized into the following types:

Transaction Activity Data – It represents the transactions that operational systems are designed to automate.

Transaction Audit Data – It is the data that tracks the progress of an individual transaction, such as web logs and database logs.

Enterprise Structure Data – This is the data that represents the structure of an enterprise, particularly for reporting business activity by responsibility. It includes organizational structure and charts of accounts.

Master Data – Master Data represents the parties to the transactions of the enterprise. It describes the interactions when a transaction occurs.

Reference Data – Reference Data is any kind of data that is used solely to categorize other data found in a database, or solely for relating data in a database to information beyond the boundaries of the enterprise. In financial services, it includes descriptive information about securities, corporations and individuals.

Market Data – In financial services, market data refers to real-time or historical information about prices.

Derived Data – Derived data refers to data that is derived from other data. It is calculated by various calculators and models and made available to a wide range of applications.
Fig. 1 Categories of Data – a pyramid of data categories (Metadata, Reference Data, Master Data, Enterprise Structure Data, Transaction Activity Data, Transaction Audit Data), annotated with increasing semantic content and data-quality importance towards the top, and increasing data volume, rates of update, later population in time and shorter life spans towards the base.
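The distinction between these categories can be illustrated with a minimal sketch. Here reference data is a simple lookup set, master data is a core entity that points into it, and transaction data records an event; the record layouts and identifier values are hypothetical, chosen only for illustration:

```python
from dataclasses import dataclass

# Reference data: used solely to categorize other data, e.g. a currency code list.
CURRENCIES = {"USD": "US Dollar", "EUR": "Euro", "GBP": "Pound Sterling"}

@dataclass
class Security:
    """Master data: a core business entity (a security)."""
    isin: str
    name: str
    currency: str  # categorized by the reference data above

@dataclass
class Trade:
    """Transaction activity data: a record of a business event."""
    trade_id: int
    isin: str
    quantity: int
    price: float

def validate(sec: Security) -> bool:
    """A security passes validation only if its currency exists in reference data."""
    return sec.currency in CURRENCIES

bond = Security("US0000000001", "Example Corp 5% 2030", "USD")
print(validate(bond))  # True
```

The point of the sketch is the dependency direction: transaction and master data reference the (small, slowly changing) reference data, never the other way around, which is why errors in reference data propagate so widely.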
Reference Data Management in Financial Services

Increased global regulatory pressure, coupled with a fragmented regulatory landscape, is making financial institutions realize the value of putting a data governance strategy in place. Improving data quality is an ongoing effort, and financial institutions face the challenge of improving their technology infrastructure to address this issue. Reference data management projects are major technology investments to improve data quality. Data integration and the concept of a single source is a massive challenge, especially in APAC banks, as data is still being managed in silos. An increasing volume of data means working with multiple data sources. Client data and a single view of the customer is a critical area driven by regulations such as Anti-Money Laundering (AML) and Know Your Customer (KYC).

Historically, firms have built, maintained and managed their own security and client master databases in isolation from other market participants. As these organizations expanded, organically or through acquisition, data silos matching each line of business emerged. Most of these data platforms are similar in style and content within and across firms. Typically they are maintained through a combination of automated data feeds from external vendors, internal applications, and manual entries and adjustments. It is not uncommon for these platforms to contain aging infrastructure and disparate, highly de-centralized data stores.

Challenges of Reference Data Management

Some of the common challenges financial institutions face are:

• Challenges in managing the exponential increase of asset classes, new securities and volume
• Duplicate data vendor purchases, expensive manual data cleansing and poor data management, leading to high aggregate costs
• Challenges in managing multiple securities masters, multiple repositories and different sources of all asset classes across different geographical markets
• Different identifiers (CUSIP, ISIN, SEDOL, internal identifier) used by front offices and middle offices

Our Reference Data Management Process

NIIT Technologies deploys new methodologies, proprietary software, and tools from industry-leading software vendors to tackle reference data management challenges. There are many third-party product providers who focus on specific elements in the chain of reference data management without having a holistic view of the complexities surrounding the entire life cycle of reference data. Our Reference Data Management (RDM) processes focus on these complexities and are divided into four critical stages:

Data Acquisition
a. Data is acquired via robust market-facing interfaces such as Bloomberg, Reuters, and JJ Kenney
b. Data is continuously updated and monitored, as this is critical for successful data acquisition

Data Validation and Mapping
a. Reference data validation and mapping are automated via rule engines, with exception management; considerable support is still required to perform manual data mapping

Data Enrichment and Transformation
a. Reference data is enriched and standardized
b. A golden copy of the data is created for instrument pricing

Data Distribution
a. Golden data is distributed to external third-party systems
b. Audit trail and action tracking are performed, as they are extremely important at this stage
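The four stages above can be sketched end to end as a small pipeline. This is a minimal illustration, not the actual platform: the vendor names, field layouts and validation rule are all hypothetical:

```python
def acquire():
    # Stage 1 – acquisition: two hypothetical vendor feeds describe the same
    # instrument with different field names and conventions.
    return [
        {"source": "vendorA", "id": "US0000000001", "cpn": "0.5", "ccy": "usd"},
        {"source": "vendorB", "id": "US0000000001", "coupon": 0.5, "currency": "USD"},
    ]

def validate_and_map(records):
    # Stage 2 – validation and mapping: each vendor layout is mapped onto a
    # common schema by rule; records failing a rule go to an exception queue
    # for manual handling.
    mapped, exceptions = [], []
    for r in records:
        rec = {
            "id": r["id"],
            "coupon": float(r.get("coupon", r.get("cpn"))),
            "currency": (r.get("currency") or r.get("ccy", "")).upper(),
        }
        (mapped if rec["currency"] in {"USD", "EUR", "GBP"} else exceptions).append(rec)
    return mapped, exceptions

def enrich(mapped):
    # Stage 3 – enrichment and transformation: duplicates are consolidated
    # into a single "golden copy" per instrument (first valid record wins here).
    golden = {}
    for rec in mapped:
        golden.setdefault(rec["id"], rec)
    return golden

def distribute(golden):
    # Stage 4 – distribution: the golden copy is handed to downstream systems
    # (here we simply return the records).
    return list(golden.values())

mapped, exceptions = validate_and_map(acquire())
golden_records = distribute(enrich(mapped))
print(golden_records)  # one golden record for US0000000001
```

Even at this scale, the design choice is visible: normalization and validation happen once, centrally, so every downstream consumer receives the same consolidated copy instead of reconciling vendor feeds independently.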
Implementation

Based on the fundamental components of the data life cycle, NIIT Technologies has developed a nine-step solution for end-to-end reference data management. Our reference data management solution enables firms to manage the entire reference data environment - from vendor data rationalization to enterprise reference data architecture design and integration; from indexing to automated data cleansing and distribution. Our reference data management offering includes the following elements:

Reference and Data Rationalization – This process workflow creates a cross-reference of each data element and rationalizes reference data spend by identifying duplicate purchases.

Enterprise Data Architecture Assessment & Package Implementations – This process is used to evaluate the current architecture, align it with future growth plans, and identify constraints for the enterprise reference data architecture.

Index and Normalize Securities Data – Uses a set of industry-standard tools to create a consistent, single enterprise-wide key matrix for all securities.

Automated Data Cleansing System – This system supports rule-based commercial reference data cleansing systems to process reference data.

Data Validation and Mapping – This process automates data mapping and data validation based on a rules engine. This prevents automatic overrides.

Corporate Actions Processing – Helps maintain security reference data by automatically applying corporate actions, with manual support for complex electives.

New Securities Setup – Enables continuous monitoring of security masters and sets up new securities on demand.
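The "Index and Normalize Securities Data" element above amounts to maintaining a cross-reference between the external identifier schemes mentioned earlier (CUSIP, ISIN, SEDOL) and one internal key per security. A minimal sketch of such a key matrix, with purely illustrative identifier values and no claim to match the actual NIIT implementation:

```python
class KeyMatrix:
    """Maps external security identifiers onto a single internal key."""

    def __init__(self):
        self._next = 1
        self._by_external = {}  # (scheme, value) -> internal key
        self._by_internal = {}  # internal key -> {scheme: value}

    def register(self, **identifiers):
        # Reuse the existing internal key if any supplied identifier is known;
        # otherwise mint a new key for a previously unseen security.
        for scheme, value in identifiers.items():
            key = self._by_external.get((scheme, value))
            if key is not None:
                break
        else:
            key = self._next
            self._next += 1
            self._by_internal[key] = {}
        for scheme, value in identifiers.items():
            self._by_external[(scheme, value)] = key
            self._by_internal[key][scheme] = value
        return key

    def lookup(self, scheme, value):
        return self._by_external.get((scheme, value))

km = KeyMatrix()
k1 = km.register(isin="US0000000001", cusip="000000000")
k2 = km.register(cusip="000000000", sedol="0000001")  # same security, new alias
print(k1 == k2)  # True
```

This is what lets a front office working in SEDOLs and a middle office working in CUSIPs resolve to the same golden record.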
Fig 2 Reference Data Management Solution – vendor feeds and market-facing data (financial instruments, issuers, instrument prices, corporate actions, daily and annual tax figures) flow through the Reference Data Management platform's rule and configuration engine, which performs data acquisition, mapping, transformation, validation and distribution, supported by exception management, data integrity monitoring, audit trail and action tracking. Validated, enriched data records are delivered to client-facing data receivers such as accounting, compliance, the back office and the settlement platform.
Enterprise Reference Data Distribution – Enables BOCADE (Buy Once, Clean and Distribute Everywhere) reference data distribution across the enterprise and builds audit capability for price requests.

Instrument Pricing – Provides timely and accurate instrument pricing data to bankers and financial advisors.

Reference Data Efficiency Dashboard – Makes the RDMS black box transparent by monitoring reference data consumption, quality and cleansing status.

Fig 3 Reference Data Management Offerings – a holistic RDMS offering comprising the elements listed above, from reference and data rationalization through enterprise data architecture assessment and package implementations to the reference data efficiency dashboard.

Conclusion

Financial services organizations deal with numerous financial instruments, ranging from stocks and funds to derivatives, so as to meet the ever-increasing demands of the global securities marketplace. As such, they need to handle a huge amount of data to trade and keep track of these instruments.

NIIT Technologies' Reference Data Management (RDM) solution helps clients rationalize the process of reference data consumption. It is designed to consolidate, cleanse, govern, and distribute these key business data objects across the enterprise and across time. It includes pre-defined, extensible data models and access methods, with powerful applications to centrally manage the quality and lifecycle of business data.

Clean, consolidated and accurate data seamlessly propagated throughout the enterprise can save companies millions of dollars a year, dramatically increase supply chain and selling efficiencies, improve customer loyalty, and support sound corporate governance. NIIT Technologies has the implementation know-how to develop and apply best data management practices with proven industry knowledge. These strengths have led to a large ecosystem with a large number of partners.

Companies around the world are consolidating data, modernizing applications, re-engineering business processes, improving customer loyalty scores and managing risk more efficiently by making use of NIIT Technologies' Reference Data Management solution. The solution delivers a single, well-defined, accurate, relevant, complete, and consistent view of the data across multiple regions, departments and systems. The results for companies that have implemented these solutions are dramatic: they are successfully achieving the elusive goal of a consolidated version of the data across the enterprise.

NIIT Experience and Benefits

Strong Industry Focus
NIIT has several thousand person-years of experience in designing, building and maintaining large-scale applications for day-to-day business, and has considerable experience in Front Office, Middle Office and Back Office operations. As per the Datamonitor Black Book of Outsourcing 2010 survey, NIIT Technologies is ranked number 1 in overall satisfaction for Data Management Services. NIIT's team has working knowledge of Charles River, Calypso, Advent Moxy, Linedata Longview, MacGregor ITG, Eze Castle, Omgeo, Bloomberg, Reuters, Yodlee solutions such as "Yodlee Account Data gathering", and many other tools and products used in the industry.
Access to large resource base
NIIT has a large resource base of over 5,000 analysts and consultants and hence is able to quickly source professionals with the desired skill sets required for a project. Furthermore, we also possess the capability to ensure a quick ramp-up of project resources when needed.

Technology Bandwidth
NIIT offerings span business and technology consulting, application development and management services, IT infrastructure services, and business process outsourcing. Our services to customer-partners across the world have led to the evolution of a strong value-optimizing framework for offering similar services through a cost-effective delivery model that can be used in single-shore, dual-shore or multi-shore formats.

Mature Best-in-class Process Framework
NIIT software factories are ISO 27001, CMMi Level 5 and PCMM Level 5 accredited. Our resources are therefore well versed with operating in a highly mature, process-oriented and secure environment, and bring this expertise to all client engagements.
A global IT sourcing organization | 21 locations and 14 countries | 7000+ professionals | Level 5 of SEI-CMMi, ver1.2 | ISO 27001 certified | Level 5 of People CMM Framework
Write to us at marketing@niit-tech.com www.niit-tech.com
NIIT Technologies is a leading IT solutions organization, servicing customers in North America,
Europe, Middle East, Asia and Australia. The company offers services in Application
Development and Maintenance, Managed Services, Cloud Computing and Business Process
Outsourcing to organizations in the Financial Services, Insurance, Travel, Transportation and
Logistics, Manufacturing and Distribution and Government sectors.
The company’s deep domain knowledge, new approaches to customer experience management, robust outsourcing capabilities, and a dual-shore delivery model have made NIIT Technologies a preferred IT partner for global majors in these chosen industries. Profound and enduring customer engagements have become a hallmark of NIIT Technologies.
NIIT Technologies’ vision is to be the “First Choice” provider of services for its focused segments. The company has a simple strategy: to focus and differentiate. It competes on the strength of its specialization.
Over the years the company has forged extremely rewarding relationships with global majors, a testimony to mutual commitment and its ability to retain marquee clients, drawing repeat business from them. Whether it is a global banking and insurance major, a leading asset management solutions provider, the number-two cement manufacturer, or travel industry leaders, NIIT Technologies has been able to scale its interactions with these marquee clients into extremely meaningful, multi-year collaborations.
About NIIT Technologies
Vinit Sharma is a Business Solution designer within the Banking and Financial Services practice
at NIIT Technologies Ltd. He has over 8 years of experience. His expertise includes Capital
Markets, Corporate Finance, Credit Card and US Mortgage business.
About the Author
Europe
Singapore
India
NIIT Technologies Ltd.
Corporate Heights (Tapasya)
Plot No. 5, EFGH, Sector 126
Noida-Greater Noida Expressway
Noida – 201301, U.P., India
Ph: +91 1 120 399 9555
Fax: +91 1 120 399 9150
Americas
NIIT Technologies Inc.,
1050 Crown Pointe Parkway
5th Floor, Atlanta, GA 30338, USA
Ph: +1 (770) 551 9494
Toll Free: +1 (888) 454 NIIT
Fax: +1 (770) 551 9229
NIIT Technologies Limited
2nd Floor, 47 Mark Lane
London - EC3R 7QQ, U.K.
Ph: +44 (0) 20 70020700
Fax: +44 (0) 20 70020701
NIIT Technologies Pte. Limited
31 Kaki Bukit Road 3
#05-13 Techlink
Singapore 417818
Ph: +65 68488300
Fax: +65 68488322