This document provides a summary of Brooke Guthrie's experience and qualifications. She has over 20 years of experience in data governance, data modeling, database design, and business analysis. Her expertise includes data warehousing, dimensional modeling, logical and physical data modeling, and working with tools like ERWin and ER/Studio. She has worked on projects in various industries for companies such as CatamaranRX, Kaiser Permanente, Cigna, and Mass Mutual.
This document summarizes Jay Zabinsky's experience as a consultant with over 15 years of experience in metadata management, data modeling, database design, data analysis, data integration, and data governance. He has extensive experience with software such as Oracle, SQL Server, IBM Information Analyzer, Erwin, and SSIS. His experience includes roles managing metadata repositories, data modeling, ETL development, data warehousing, and data integration projects.
The document discusses several data-related careers including Chief Data Officer, Data Analyst, Data Scientist, Data Engineer, Data Modeler, Data Architect, and Data Entry Specialist. For each role, it provides a brief description of typical tasks and the average salary. It also notes common skills and experience levels associated with higher pay for some of the roles. The document serves as an overview of the various types of data-focused jobs that have emerged with the growth of data and its importance in business.
K.Sandeep has over 7.8 years of experience in data analytics, machine learning, business analysis, and statistical modeling. He has worked on projects in various domains including retail, utilities, telecom, banking, and finance. Some of the technologies he lists expertise in include R, SQL, Hive, and SPSS. Currently he works as a data scientist at Development Bank of Singapore where his responsibilities include data analysis, predictive modeling, and identifying trends to help with fraud detection and market analytics.
This document is a resume for Annupriya, who is currently pursuing an MS in Management Information Systems from the University at Buffalo. She has 5 years of experience as a Senior Systems Engineer at Infosys, where she implemented data warehousing and business intelligence solutions for banking clients using tools like SAS and Tableau. Her skills include SAS, SQL, databases, and data analytics. She has also completed academic projects in data warehousing and dashboard design.
Jaspreet Kaur Chawla has over 5 years of experience as a Data Analyst. She has a Master's degree in Management Information Systems from SUNY Buffalo and a Bachelor's degree in Information Technology. Her technical skills include SQL, PL/SQL, R, SAS, Tableau, and Microsoft BI tools. She has experience designing ETL processes, data warehouses, and dashboards.
Practical Applications for Data Warehousing, Analytics, BI, and Meta-Integrat... – DATAVERSITY
The presentation provides an overview of data warehousing, business intelligence, analytics, and meta-integration technologies, explaining their definitions and importance for enabling analysis of previously unintegrated information to support better business decision making. It also discusses common data warehouse failures and outlines best practices for implementing these technologies, including the use of meta-models and a focus on data quality. The presentation concludes by emphasizing the takeaways and providing references and an opportunity for questions.
Data Quality in Data Warehouse and Business Intelligence Environments - Disc... – Alan D. Duncan
Time and again, we hear about the failure of data warehouses – while things may be improving, they're moving only slowly. One explanation for data quality being overlooked is that the I.T. department is often responsible for delivering and operating the DWH/BI environment. What ensues is an agenda based on "how do we build it", not "why are we doing this". This needs to change. In this discussion paper, I explore the issues of data quality in data warehouse, business intelligence and analytic environments, and propose an approach based on "Data Quality by Design".
This document discusses BP's data modelling challenges and solutions. BP has over 100,000 employees operating in over 100 countries with 250 data centers and over 7,000 applications. Their challenges included decentralized management of data modelling, lack of standards and governance, and models getting lost after projects. Their solution included a self-service DMaaS portal for ER/Studio licensing and model publishing. It provides automated reporting, judicious use of macros, and a community of interest. Next steps include promoting data modelling to SAP architects and expanding training, certification and the online community.
The document discusses business analysis and data warehousing. It covers the syllabus for Unit III which includes topics like business analysis, reporting and query tools, OLAP, patterns and models, statistics, and artificial intelligence. It then discusses business analysis in more detail including defining it, the business analysis process, ensuring goals are oriented, and roles of business analysts like strategist, architect and systems analyst. Finally, it covers business process improvement and different reporting and query tools.
This summary provides an overview of the professional experience and qualifications of Jack D. Erickson based on his resume. Erickson has over 20 years of experience in business analysis, project management, and quality assurance roles across various industries. He has strong skills in requirements gathering, documentation, testing, and acting as a liaison between technical and business stakeholders. Erickson has extensive experience leading projects involving system upgrades, implementations, and the selection and deployment of new technologies.
This presentation briefly discusses the following topics:
Classification of Data
What is Structured Data?
What is Unstructured Data?
What is Semistructured Data?
Structured vs Unstructured Data: 5 Key Differences
The Need to Know for Information Architects: Big Data to Big Information – DATAVERSITY
The document discusses the roles and skills of an information architect. It states that an information architect must be able to bridge various groups through skills like UI/UX, data warehousing, taxonomy, and knowledge management. The document also discusses how information architects can help organizations transform big data into big information through tools like master data management, data warehouses, and data hubs. It emphasizes that information architects should continue growing their careers through certification, training, mentorship programs, and contributing to their professional community.
Pavankumar Akula has over 8 years of experience in ETL, Oracle, Teradata SQL, and database administration. He has worked as a senior Teradata DBA for several large companies in retail, healthcare, and banking. His skills include query tuning, performance monitoring, database design, and utilities like FastLoad and MultiLoad. He is currently working as a Teradata DBA for Tesco on behalf of Tescra.
06. Transformation Logic Template (Source to Target) – Alan D. Duncan
This document template defines an outline structure for the clear and unambiguous definition of the movement of data from one data storage location to another (a.k.a. source-to-target mapping).
Data mapping is an important part of every data process. This eBook will help you understand what data mapping is and how it can help you establish connections between disparate data sets.
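To illustrate what such a mapping pins down, here is a minimal sketch in SQL; it is not taken from the template itself, and the staging table, dimension table, and transformation rules (stg_customer, dim_customer, and so on) are invented for the example.

```sql
-- Hypothetical source-to-target mapping expressed as INSERT ... SELECT.
-- Each target column is paired with its source column and the rule that
-- transforms it, which is exactly what a mapping document records.
INSERT INTO dim_customer (customer_key, full_name, birth_date, country_code)
SELECT
    s.cust_id,                                       -- direct move, no rule
    TRIM(s.first_name) || ' ' || TRIM(s.last_name),  -- concatenation rule
    CAST(s.dob AS DATE),                             -- type-conversion rule
    COALESCE(s.country, 'UNK')                       -- default-value rule
FROM stg_customer AS s;
```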
The Importance of MDM - Eternal Management of the Data Mind – DATAVERSITY
Despite its immaterial nature, data has a tendency to pile up as time goes on, and can quickly be rendered unusable or obsolete without careful maintenance and streamlining of processes for its management. This presentation will provide you with an understanding of reference and master data management (MDM), one such method for keeping mass amounts of business data organized and functional towards achieving business goals.
MDM’s guiding principles include the establishment and implementation of authoritative data sources, effective means of delivering data to various business processes, and improvements to the quality of information used in organizational analytical functions (such as BI); a brief SQL sketch of the authoritative-source idea follows the list below.
To that end, attendees of this webinar will learn how to:
- Structure their data management processes around these principles
- Incorporate data quality engineering into the planning of reference and MDM
- Understand why MDM is so critical to their organization’s overall data strategy
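For illustration only, the "authoritative data sources" principle can be sketched as survivorship logic in SQL; the webinar does not prescribe this code, and the customer_sources table and the CRM-first priority rule are assumptions.

```sql
-- Hypothetical golden-record selection: when the same customer appears in
-- several source systems, keep the row from the most trusted source.
SELECT customer_id, full_name, email, source_system
FROM (
    SELECT c.*,
           ROW_NUMBER() OVER (
               PARTITION BY c.customer_id
               ORDER BY CASE c.source_system  -- assumed survivorship rule
                            WHEN 'CRM'     THEN 1
                            WHEN 'BILLING' THEN 2
                            ELSE 3
                        END
           ) AS rn
    FROM customer_sources AS c
) AS ranked
WHERE rn = 1;  -- one authoritative row per customer
```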
Why BI?
Performance management
Identify trends
Cash flow trend
Fine-tune operations
Sales pipeline analysis
Future projections
Business forecasting
Decision Making Tools
Convert data into information
How to Think?
What happened?
What is happening?
Why did it happen?
What will happen?
What do I want to happen?
The document discusses data clustering and preparation techniques applied to a dataset on start-up companies. It clusters the data into bins or buckets based on business logic to reduce redundancy and inconsistencies. It uses fuzzy logic mapping to identify textually similar records and remove duplicates. It also creates a cross table to cluster nominal data and reduce three columns of data into a single column, thereby reducing dimensionality. The clustering is done for each column individually and then the clustered columns are intersected to further categorize the data into bins based on column relationships.
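The two techniques in that summary, similarity-based de-duplication and business-logic binning, can be loosely approximated in SQL. The sketch below is an analogue rather than the document's actual method: SOUNDEX stands in for the fuzzy matching described, the startup_companies table and funding thresholds are invented, and SOUNDEX support varies by DBMS.

```sql
-- Bucket textually similar company names together (crude fuzzy matching)
-- and bin a numeric column by business logic, then count records per bin.
SELECT
    SOUNDEX(company_name) AS name_cluster,   -- similarity bucket
    CASE
        WHEN funding_usd < 1000000  THEN 'seed'
        WHEN funding_usd < 10000000 THEN 'early'
        ELSE 'late'
    END                   AS funding_bin,    -- business-logic bin
    COUNT(*)              AS records_in_bin
FROM startup_companies
GROUP BY SOUNDEX(company_name),
         CASE
             WHEN funding_usd < 1000000  THEN 'seed'
             WHEN funding_usd < 10000000 THEN 'early'
             ELSE 'late'
         END;
```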
ER/Studio XE3 is the fastest, easiest and most collaborative way for data modeling professionals to build and maintain enterprise-scale databases and data warehouses. ER/Studio XE3 sets a new standard for data management. ER/Studio XE3 empowers data management professionals to easily share, document, and publish models and metadata to distributed teams.
Learn more at
http://www.embarcadero.com/products/er-studio
Sindhuja Ramanathan is a senior data analyst based in Manchester, UK. She has 9 years of experience in data analysis, collection, management, and reporting. Her skills include SQL, UNIX scripting, data modeling, and tools like Teradata, Oracle, and MS Office. She holds certifications in ITIL and financial markets. Sindhuja has worked on banking campaigns and payment systems at Barclays and Visa, focusing on data delivery, validation, and problem-solving.
Harish Sanga has over 2 years of experience as an MDM developer. He has extensive experience creating complex objects, mappings, hierarchies, and rules in MDM applications like Informatica. He has worked on projects in the US and Singapore to consolidate customer master data from multiple sources and ensure data quality and accuracy. Some of his key skills include SQL, PL/SQL, Java, Informatica MDM, data modeling, ETL processes, and delivering robust MDM solutions. He holds a B.Tech degree from JNTU Hyderabad.
Metadata contains answers to questions about the data in a data warehouse. It is stored in a metadata repository and describes pertinent details about the data to users, developers, and the project team. Metadata is necessary for using, building, and administering the data warehouse as it provides information about data extraction, transformations, structure, refreshment, and more. It serves important roles for both business users and IT staff across the data acquisition, storage, and delivery processes.
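As a rough sketch of what the simplest possible metadata repository might look like (the schema below is illustrative, not from the document):

```sql
-- A deliberately minimal metadata repository table; real repositories are
-- far richer, and every name here is invented for illustration.
CREATE TABLE metadata_repository (
    table_name     VARCHAR(128),
    column_name    VARCHAR(128),
    business_def   VARCHAR(1000),  -- what the data means to business users
    source_system  VARCHAR(128),   -- where the data was extracted from
    transform_rule VARCHAR(1000),  -- how it was transformed on the way in
    refresh_cycle  VARCHAR(32)     -- how often it is refreshed
);

-- A user or developer asking "where does this column come from?"
SELECT source_system, transform_rule, refresh_cycle
FROM metadata_repository
WHERE table_name = 'SALES_FACT'
  AND column_name = 'NET_REVENUE';
```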
Data-Ed Online Presents: Data Warehouse Strategies – DATAVERSITY
Integrating data across systems has been a perpetual challenge. Unfortunately, the current technology-focused solutions have not helped IT to improve its dismal project success statistics. Data warehouses, BI implementations, and general analytical efforts achieve the same levels of success as other IT projects – approximately one-third are considered successes when measured against price, schedule, or functionality objectives. The first step is determining the appropriate analysis approach to the data system integration challenge. The second step is understanding the strengths and weaknesses of various approaches. It turns out that proper analysis at this stage makes actual technology selection far more accurate. Only when these are accomplished can proper matching between problem and capabilities be achieved as the third step and true business value be delivered. This webinar will illustrate that good systems development more often depends on at least three data management disciplines in order to provide a solid foundation.
Takeaways:
Data system integration challenge analysis
Understanding of a range of data system-integration technologies, including the problem space (BI, Analytics, Big Data), data approaches (Warehousing, Vault, Cube), and alternative approaches (Virtualization, Linked Data, Portals, Meta-models)
Understanding foundational data warehousing & BI concepts based on the Data Management Body of Knowledge (DMBOK)
How to utilize data warehousing & BI in support of business strategy
This document introduces an online course on data warehousing from Edureka. It provides an overview of key topics that will be covered in the course, including what a data warehouse is, its architecture, the ETL process, and modeling dimensions and facts. It also shows examples of using PostgreSQL to create tables and Talend to populate them as part of a hands-on project in the course. The course modules will cover data warehousing introduction, dimensions and facts, normalization, modeling, ETL concepts, and a project building a data warehouse using Talend.
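To give a flavor of the dimension-and-fact modeling the course covers, here is a minimal sketch in PostgreSQL-style DDL; the product/sales schema is invented and is not the course's actual project.

```sql
-- A dimension table holds descriptive attributes behind a surrogate key.
CREATE TABLE dim_product (
    product_key  SERIAL PRIMARY KEY,  -- surrogate key
    product_name TEXT NOT NULL,
    category     TEXT
);

-- A fact table holds additive measures keyed to its dimensions.
CREATE TABLE fact_sales (
    product_key INT  NOT NULL REFERENCES dim_product (product_key),
    sale_date   DATE NOT NULL,
    units_sold  INT  NOT NULL,
    revenue     NUMERIC(12, 2) NOT NULL
);
```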
A database is a large collection of integrated data that models real-world entities and relationships. A database management system (DBMS) is software that stores, manages, and provides access to databases. Key functions of a DBMS include data independence, efficient data access, data integrity and security, concurrent access, and crash recovery. While databases provide many advantages, their use requires substantial resources for setup, maintenance, and administration.
Enterprise data serves both running business operations and managing the business. Building a successful data architecture is challenging due to data complexity, competing stakeholder interests, data proliferation, and inaccuracies. A robust data architecture must address key components like data repositories, capture and ingestion, definition and design, integration, access and distribution, and analysis.
Data-Ed Online: Unlock Business Value through Reference & MDM – DATAVERSITY
In order to succeed, organizations must realize what it means to utilize reference and MDM in support of business strategy. This presentation provides you with an understanding of the goals of reference and MDM, including the establishment and implementation of authoritative data sources, more effective means of delivering data to various business processes, as well as increasing the quality of information used in organizational analytical functions, e.g. BI. We also highlight the equal importance of incorporating data quality engineering into all efforts related to reference and master data management.
Learning objectives include:
What is Reference & MDM and why is it important?
Reference & MDM Frameworks and building blocks
Guiding principles & best practices
Understanding foundational reference & MDM concepts based on the Data Management Body of Knowledge (DMBOK)
Utilizing reference & MDM in support of business strategy
Jingjing Chen is seeking a position in data analysis. She received her M.A. in Economics and Education from Teachers College, Columbia University in 2015 and B.A. in Economics from Central University of Finance and Economics in 2013. She has skills in R, SAS, Stata, SQL, Python and Excel. Chen has completed internships in business intelligence, data analysis, and business analysis. Her projects include using text mining and machine learning techniques like LDA and kNN for tasks like text classification, face recognition, and analyzing bond election proposals.
Siva Kanagaraj has over 18 years of experience in information technology, including data modeling, ETL architecture, data warehousing, business intelligence, and data integration projects. He has extensive experience working with Fortune 500 companies in retail, banking, and telecommunications. Some of his key roles and responsibilities included designing conceptual, logical, and physical data models; defining ETL architectures and data mapping; managing software delivery from vendors; and developing enterprise data warehousing and master data management programs. He is proficient in various technologies such as IBM Infosphere, Oracle, SQL, Java, and mainframe applications.
David Colbourn is an experienced information architect seeking a senior role. He has over 25 years of experience in areas such as software analysis, design, data modeling, project management, and big data integration. His core competencies include information architecture, data modeling, database design, project management, and relational and non-relational systems. He has worked in various industries including banking, healthcare, telecommunications, and government.
This document is a resume for R. Michael Levin summarizing his objective, clearances, technical skills, and professional experience. Levin has over 20 years of experience in fields like enterprise architecture, data architecture, data modeling, database administration, and project management. He currently works as a Solution Architect and Technical Project Manager developing data warehouses and managing teams. His previous roles include positions as a Principal Solution Data Architect, Senior Data Architect, and Senior Database Architect at various companies.
Meggan Carter has over 20 years of experience in data analysis, data modeling, and data architecture. She holds an MBA from the University of Kansas and a BS in Management Information Systems from the University of Nebraska-Lincoln. Carter has worked as a Senior Data Architect at UMB Bank and a Senior Data Analyst at Commerce Bank. She has extensive experience designing data warehouses, data marts, and enterprise databases. Carter also has experience working for Sprint, DST Systems, and IBM. She currently serves on the board of the DAMA Kansas City chapter.
Suneel Mandam has over 17 years of experience in data modeling and BI architecture. He has extensive experience designing data warehouses, data marts, and dimensional models across various industries including aviation, insurance, government, and healthcare. Some of his key projects include designing a data vault model for a large organization, an enterprise party model for Delta Dental, and a master data management solution for Dubai Municipality. He is proficient in various data modeling, ETL, and reporting tools and has worked with clients such as Qantas Airlines, GE, and TechMahindra.
This document contains the professional summary and experience of Madhukar Eunny. He has over 12 years of experience working as a senior consultant on data warehousing projects. His roles have included ETL architect, developer, team lead, and production support. He has strong skills in ETL tools like Informatica and databases like Teradata, Oracle, and SQL Server. He currently works as a senior BI consultant for Medibank where he is responsible for requirements gathering, data modeling, ETL development, and providing business support.
Varadarajan Sourirajan is a data architect with over 16 years of experience seeking a new position. He has extensive experience in data modeling for both online transaction processing and data warehousing applications. Currently he is working on implementing a data warehouse for the treasury line of business at a large bank in the US, drawing on his experience delivering previous data warehouse projects and a proven track record of success.
- The document contains the resume of Abdul Mohammed, an ETL developer with 8 years of experience using Informatica for data warehousing projects.
- He has expertise in requirements gathering, data extraction from various sources, transforming the data using Informatica tools, and loading the data into target databases.
- His most recent role was as an ETL/SR Informatica Lead from 2015-present where he worked on building a data warehouse for a pharmaceutical company using Informatica to extract data from Oracle and flat files.
Prasad Kommoju has 5 years of experience in ETL development using Informatica. He has worked on projects in banking and manufacturing. His skills include Informatica PowerCenter, IDQ, MDM, DIH, SQL, and UNIX. Currently he works as a Senior Software Engineer at Tech Mahindra on a project with MasterCard managing customer data.
This document provides a detailed summary of Arun Mathew Thomas's work experience in IT and data warehousing. It outlines his over 9 years of experience in developing and maintaining data warehouse applications, with expertise in ETL processes, data modeling, performance tuning, and working with tools like Informatica, Teradata, and SQL. It also provides details on two specific roles he held, including developing mappings to load data from various sources into an enterprise data warehouse for Anthem Inc., and serving as an ETL/data quality architect for customer data projects.
Diane England is a data architect with over 20 years of experience designing databases including Oracle, SQL Server, and Essbase. She has extensive experience in data warehousing, data modeling, ETL processes, and ensuring data quality and governance. Her background includes projects in various industries from telecommunications to retail. She is proficient in modeling, documentation, and working with both technical and non-technical teams.
This curriculum vitae summarizes Pretesh Gungaram's career and qualifications. He has over 15 years of experience in data analytics, business intelligence, data warehousing, and information technology. He holds a B-Tech in Information Technology from Technikon Witwatersrand and has worked in roles at Nedbank, ABSA, KPMG, and other organizations developing data warehouses, performing ETL, and conducting analytics. He has strong skills in SQL, databases, data modeling, and project management.
An experienced IT professional with over 18 years of experience in areas such as IT management, database administration, project management, security, and ETL development. Skilled in technologies like Oracle, SQL Server, .NET, and Informatica. Provides leadership and delivers results for projects involving the design, development and maintenance of database solutions and information systems. Holds an MS in Information Technology Management and ITIL certification.
This document provides a summary of Kiran Annamaneni's experience working as an Informatica ETL developer over 9.5 years. It outlines his technical skills and experience with Informatica PowerCenter, data warehousing, ETL processes, Oracle, and SQL. Specific experiences are listed for projects at Accenture, BCBSM, Farmers Insurance, and Level 3 Communications developing and supporting BI/DW solutions.
Alphonso Triplett is currently an Enterprise Data Manager and TDM/MDM consultant at BOKF, where he provides guidance on test data management, data masking, and data privacy solutions. Previously, he held several roles as a senior data architect and database administrator, where he was responsible for data modeling, database design, test data management, and implementing data governance policies.
This document provides a summary of Rizvi Shaik's professional experience as an Informatica ETL Developer over 9+ years. It outlines his extensive skills in areas like data warehousing, ETL development, data modeling, testing and production support. Recent roles include working with Horizon Blue Cross Blue Shield of NJ on projects involving data integration, replication and synchronization between various data sources using Informatica PowerCenter and Cloud.
Exploring Data Modeling Techniques in Modern Data Warehouses – Priyanka Rajput
This article delves deep into data modeling techniques in modern data warehouses, shedding light on their significance and various approaches. If you are aspiring to be a data analyst or data scientist, understanding data modeling is essential, making a Data Analytics Course in Bangalore, Lucknow, Pune, Delhi, Mumbai, Gandhinagar, and other cities across India an attractive proposition.
The document provides a summary of Gerald Donaldson's experience and qualifications. It includes his contact information, objective of seeking an enterprise architecture role, and summaries of his past roles including Enterprise Data Architect, Data Warehouse Architect, and BI Architect. He has over 30 years of experience designing and implementing data warehouse and BI solutions primarily using Microsoft technologies. The document also lists his education background and technical skills.
Abhishek Ray has over 9 years of experience in data warehousing and ETL. He has expertise in designing and developing data warehouses, data modeling, ETL processes, and reporting solutions. Some of his skills include Oracle PL/SQL, Unix, Java, C, Oracle databases, ODI, and OFSAAI. He has worked on banking data warehouse projects for clients like Mizuho Bank and NAB. Currently he is working as a principal consultant on a Basel III CRD IV development project for Mizuho Bank.
Conceptual vs. Logical vs. Physical Data Modeling – DATAVERSITY
A model is developed for a purpose. Understanding the strengths of each of the three Data Modeling types will equip you with a more robust analyst toolkit. The program will describe modeling characteristics shared by each modeling type. Using the context of a reverse engineering exercise, delegates will be able to trace model components as they are used in a common data reengineering exercise that is also tied to a Data Governance exercise. A small sketch of the three levels follows the learning objectives below.
Learning objectives:
- Understand the role played by models
- Differentiate appropriate use among conceptual, logical, and physical data models
- Understand the rigor of the round-trip data reengineering analyses
- Apply appropriate use of various Data Modeling types
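To make the three levels concrete, here is a small hedged example (the Customer/Order entities are invented, not drawn from the program): the conceptual and logical levels are captured in comments, and only the physical level becomes DBMS-specific DDL.

```sql
-- Conceptual: a Customer places Orders (entities and a relationship only).
-- Logical:    CUSTOMER(customer_id PK, name); ORDER(order_id PK,
--             customer_id FK, order_date) -- keys and attributes, no DBMS detail.
-- Physical:   the logical model realized for a specific DBMS, with data
--             types, constraints, and an index chosen for access paths.
CREATE TABLE customer (
    customer_id INT          PRIMARY KEY,
    name        VARCHAR(200) NOT NULL
);

CREATE TABLE customer_order (  -- "order" is a reserved word, hence the rename
    order_id    INT  PRIMARY KEY,
    customer_id INT  NOT NULL REFERENCES customer (customer_id),
    order_date  DATE NOT NULL
);

CREATE INDEX ix_order_customer ON customer_order (customer_id);
```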
Brooke Guthrie
Charlotte, NC
415.412.1336
Brooke@MamaEarthRocks.com
Summary:
20+ years of combined experience in Data Governance, Data Architecture, Logical Modeling, and Physical Database Design Modeling in Agile environments. Experience includes facilitating JAD sessions, Business Analysis, ERwin, ACORD, ER/Studio, Data Warehouse architecture, Dimensional Data Modeling, and Kimball Data Warehouse and Data Mart design; Gap Analysis; Object and Relational Data Mapping; Data Modeling; Master Data Management (MDM); Data Administration; Metadata; Data Cleansing; Data Dictionary; Health Care Insurance and Claims at CatamaranRx, Kaiser Permanente, Cigna, and Mass Mutual Insurance, and on the Army Medical Project/Army Medical Clinical Trials; Financial Services (SAP environment) at the World Bank in Washington DC and at American Express Credit Card; International Shipping at APL; Project Management and SDLC; and Data Architecture, Claims, and Payor experience at Mass Mutual.
Professional Summary and Career Highlights
18+ years of strong experience as a Senior Consultant: Data Governance, Business Analyst, Information Engineer, Requirements Analyst, Financial Systems Analyst, Financial Systems Data Modeler, Data Modeler, Gap Analysis, Master Data Management (MDM), Data Administrator, Manual Data Cleansing, Metadata, Data Dictionary, Object and Relational Data Mapping, Process Modeler, and Junior Database Administrator (DBA).
Expertise in Data Warehouse architecture and Kimball Data Mart design, logical data modeling in 3rd Normal Form using IDEF1X, Process Modeling using IDEF0, ER/Studio, the ERwin/ERX CASE tool, the IEF CASE tool, BP/Win, creating and reviewing Use Cases, SAP, and CLARIFY.
Strong experience in developing and implementing fully normalized physical models and Oracle, DB2, Sybase, and Informix relational databases using the ER/Studio and ERwin/ERX CASE tools.
Experience working in Relational Development environments.
Strategic Systems Architect, Information Engineering, Business Area Analyst,
Systems Re-Engineer, Data Modeler, Requirement Analyst, Process Modeler and
JAD Workshop Facilitator.
Experienced in logical data modeling in the 3rd Normal Form and logical data
mapping, at the entity level, to the physical database.
EDUCATION:
North Carolina State University – Textile Management
Developing SQL Queries for Oracle® Databases Training
Designing an Effective Data Warehouse Training
PROFESSIONAL SKILLS:
Data Warehouse Architect
2. Data Warehouse Modeler
Data Modeler
Physical Database Designer
CASE TOOLS: ER/Studio Data Architect Version 9.5.1
Embarcadero
ErWIN/ERX versions 4.0 to 9.6
IEF
BP/Win
RELATIONAL DATABASES: PL/SQL Developer Version 9.0
Netezza Workbench 6.0
Oracle RDB/NT 11g
Teradata
ORACLE RDB/UNIX
SQL Server - SSIS
Oracle RDB/NETWARE
Sybase
Informix
METHODOLOGIES: Kimball Data Warehouse Methodology
Data Governance
Business Process Reengineering
Process Redesign
Business Area Analysis
Gap Analysis
Use Cases
Requirements Analysis
Data Administration
Master Data Management (MDM)
Project Management
Data Modeling
IDEF1X
PROFESSIONAL DEVELOPMENT TRAINING:
ORACLE: ORACLE Database Administration I
ORACLE Database Administration II
Tune and Troubleshoot the ORACLE Database
Develop Data Models and Design Databases
BRIO: Brio Query
INFORMIX: Informix Database Administration, MetaCube
NETWORKING: Advanced Network Administration
Microcomputer Concepts DOS
Network Administration
PROFESSIONAL EXPERIENCE:
11/14 - 4/15 CRG - Billion Dollar Developers, 5-month Contract – Senior-Level Data Architect/Warehouse Architect. Responsible for Data Warehouse source-to-target mapping and Data Warehouse model review. Responsible for Master Data Management (MDM) and Customer Data Integration (CDI), participating in Epic Healthcare data modeling using the IBM Healthcare Provider Data Model as the baseline model, JADs, data mapping, and review sessions. Also responsible for source-to-target mappings; creating Business Conceptual Models, Logical Data Models, and Physical Models; and comparing the IBM model to our data model design. Facilitated model review and mapping sessions.
07/14 – 10/14 – Microsoft, Seattle, WA, Short-Term Contract
Billion Dollar Developers – Senior-Level Data Architect and Project Manager assigned to the MS SAP HR MDM Project. Responsible for SAP HR Master Data Management (MDM) and Customer Data Integration (CDI), participating in data modeling, JADs, data mapping, and review sessions. Also assigned as Information Architect on the Microsoft HR SAP Project. Responsible for source-to-target mappings and creating Business Conceptual Models, Logical Data Models, and Physical Models. Facilitated model review and mapping sessions. Collaborated on final solution recommendations for HR systems consolidation.
10/13 – 3/28/14 – SAI
CatamaranRX, Philadelphia – Mid-Level Data Warehouse Architect. Responsible for creating Data Warehouse and Kimball Data Mart designs using ERwin. Senior Data Modeler and Database Designer. Experienced in using Netezza to compare, update, and create SQL scripts. Knowledgeable in comparing Netezza databases to ERwin models. Also responsible for performing Gap Analysis reviews, Data Analysis, Data Model reviews, Data Mapping, and Project Management.
05/13 – 09/13 Randstad, Short-Term Contract
AIG, Berkeley Heights, NJ – Responsible for implementing a Data Governance and Master Data Management (MDM) program for a multi-national insurance project. Also responsible for performing Gap Analysis reviews, Data Analysis, Data Model reviews, Data Mapping, and Project Management.
01/11 – 03/13 Multi-Million Dollar Developers
Dun & Bradstreet – Data Modeler/Data Architect/Project Manager, Parsippany, NJ
I perform Gap Analysis reviews, Data Analysis, Data Model reviews, Data Mapping, and Project Management. I am also responsible for analyzing and designing the As-Is Data Model and the To-Be Dun & Bradstreet Enterprise Business Logical and Physical Data Models. I assist in identifying data gaps in various projects. I act as Process Facilitator for all projects on the team and am responsible for ensuring that changes are completed in a timely manner. I am responsible for identifying risks and issues for all assigned projects.
Express Scripts – Data Modeler/Data Architect, Minneapolis, MN
I am responsible for ensuring that their Data Architecture and Data Administration standards are followed and enforced. I am also responsible for Data Governance, data analysis, and data mapping of Health Care Insurance and Claims related data. I create, update, and maintain logical data models in 3rd Normal Form using IDEF1X and ER/Studio, and perform ERD and Entity-Relationship Modeling using ERwin. Queried data using Teradata.
Dish – Data Modeler/Data Architect, Denver, CO
I am responsible for performing Data Governance, data analysis, business analysis, conducting JAD sessions with Subject Matter Experts, Metadata Management, Manual Data Cleansing, updating the Metadata Repository, Netezza, and data modeling of my assigned business area. Review and create Use Cases and technical documents. Created and maintained Staging and Target Tables for the Data Warehouse. I am also responsible for analyzing and designing the As-Is Data Model and the To-Be Dish Enterprise Data Models using ER/Studio, and for identifying business areas for improvement in the Conceptual and Enterprise-Level Data Models. I create ERDs and perform Entity-Relationship Modeling, and have also been responsible for creating Data Flow Diagrams and conducting JAD sessions. Maintaining existing Logical and Physical Models and consolidating models were also among my responsibilities.
HP – Data Governance/Data Architect
Somerset, NJ
Performed Data Governance and data quality analysis of corporate-level data. Acted as Data Architect and Data Modeler on assigned projects. I am also responsible for setting data standards, performing data analysis, business analysis, conducting JAD sessions with Subject Matter Experts, and data modeling of my assigned business area. Master Data Management (MDM), Object and Relational Data Mapping, and Oracle Physical Database Design Modeling. DA Team Leader for various HP projects. I create, update, and maintain logical data models in 3rd Normal Form using IDEF1X and ER/Studio, and perform ERD and Entity-Relationship Modeling using ERwin.
Key Tech Staff Consultant - 04/2008 – 10/2010 Mass Mutual
Springfield, MA
Data Governance/Data Modeler
As a member of Insurance and Claims Data Governance/Data Modeling team assigned at
Mass Mutual, I am responsible for Data Warehouse Architecture and Dimensional Data
Modeling for Data Warehouse design. I am also responsible for performing Gap Analysis
reviews, Data Analysis, Data Model reviews, Data Mapping and Project Management. I
am responsible for ensuring that their Data Governance, Data Architecture and Data
Administration Standards are followed and enforced.
Created and maintained Staging and Target Tables for the Data Warehouse. Enforced data standards and BPR for various business areas. I am also responsible for data analysis and data mapping. I create, update, and maintain logical data models in 3rd Normal Form using IDEF1X and ERwin. Participated in JAD sessions. Mapped objects to relational elements in the model. Performed software tool evaluation and acted as backup Vendor Liaison during the evaluation. Acted as Project Manager for assigned Data Modeling projects. ERD and Entity-Relationship Modeling using ERwin for the Claims and Payer business areas.
CASE TOOLS: Erwin 7.3.2.1724
METHODOLOGIES: IDEF1X, Information Engineering, Business Area Analysis
Multi-Million Dollar Developers
02/2005 – 12/2007
Cigna – Data Governance/Data Modeler Consultant, Hartford, CT
As a member of Insurance and Claims Data Modeling team assigned at Cigna,
I am responsible for ensuring that their Data Architecture and Data Administration
Standards are followed and enforced.
I am also responsible for Data Governance, data analysis and data mapping of Claims
data.
I create, update, and maintain logical data models in 3rd Normal Form using IDEF1X and ER/Studio.
ERD and Entity-Relationship Modeling using ErWin.
CASE TOOLS: Erwin 4.1.4.4224
METHODOLOGIES: IDEF1X, Information Engineering, Business Area Analysis
APL – Data Governance/Data Modeler Consultant, Oakland, CA
As a member of Data Modeling team assigned at APL,
I am responsible for ensuring that APL's Data Governance, Data Architecture and
Data Administration Standards are followed and enforced.
I am also responsible for setting data standards, performing data analysis, business analysis, conducting JAD sessions with Subject Matter Experts, and data modeling of my assigned business area.
I create, update, and maintain logical data models in 3rd Normal Form using IDEF1X and ER/Studio.
ERD and Entity-Relationship Modeling using ErWin.
CASE TOOLS: ER/Studio 7.1 & 7.0
METHODOLOGIES: IDEF1X, Information Engineering, Business Area Analysis
Delta Dental Enterprise Streamlining Project (ESP) - Data Governance/Data
Architect Consultant
San Francisco, CA
As a member of a team of Insurance and Claims Data Architects assigned to Delta
Dental's Enterprise Streamlining Project,
I am responsible for ensuring that the ESP Data Architecture is coordinated with the
Delta Dental Enterprise Data Architecture and Delta’s Data Governance.
I am also responsible for setting data standards, performing data analysis, business analysis, conducting JAD sessions with Subject Matter Experts, and data modeling of my assigned business area.
I create, update, and maintain logical data models in 3rd Normal Form using IDEF1X and ERwin.
I also ensure that current data model entities and attributes comply with data model
standards.
ERD and Entity-Relationship Modeling using ErWin.
CASE TOOLS: ErWIN/ERX CASE Tool version 4.1.4.4
METHODOLOGIES: IDEF1X, Information Engineering, Business Area Analysis
Global Product Platform (GPP) Project/ Pre-Paid Card Financial System at
American Express Credit Card
Data Governance/Data Architect Consultant, San Francisco, CA
Assigned to re-engineer the American Express Credit Card Global Product Platform (GPP) Pre-Paid Card Financial System. I am responsible for performing Data Governance, data analysis, business analysis, conducting JAD sessions with Subject Matter Experts, Metadata Management, Manual Data Cleansing, updating the Metadata Repository, and data modeling of my assigned business area. Also responsible for analyzing and designing the As-Is Data Model for the GPP Financial System, which currently resides on an AS400 mainframe. Identify business areas for improvement in the Conceptual Data Model. Data Administration.
Participate in project management. ERD and Entity-Relationship Modeling using ERwin.
CASE TOOLS: ErWIN/ERX CASE Tool
METHODOLOGIES: IDEF1X, Information Engineering, Business Area Analysis
Medicare Modernization Act (MMA) & Medicare Part D
Data Governance/Business Analyst Consultant
Volt (Subcontractor to Sapient Short Contract) - Kaiser Permanente
Oakland, CA
Kaiser Permanente's (KP) Insurance and Claims Medicare Prescription Drug Benefit Improvement (Pharma), Medicare Modernization Act (MMA) and Part D Benefit Program, Part D Benefit Program Data Feed, and MMA Reporting to the Centers for Medicare and Medicaid Services (CMS); responsibilities include:
Use Case design and development; performing Requirements Analysis, Business Analysis, and Data Analysis of National Medicare Finance (NMF) data, Prescription Drug Event (PDE) Record Data, Pharmacy Claim Data, Membership Data, True Out-Of-Pocket Expense (TrOOP) Data, Manual Data Cleansing, and MMA Reporting Data.
Create Concept and Event Response Diagrams, Use Case Business Designs and
Initial Data Element List for KP MMA / Part D Benefit Program.
Participate in MMA Report Requirements Workshops and Data Governance.
Investigate and gather MMA data elements for NMF Standard Reports and Data
Submission.
Create MMA Initial Data Element documentation.
Identify data groupings. Identify data sources.
Identify the data needed for each NMF report.
ERD and entity-relationship modeling using ERwin.
TOOLS: Visio
Owner - Dream Marketing, LLC
08/2002 - 01/2004 Consultant Alexandria, VA
Website designer responsible for designing and preparing content for the World
Wide Web, including text, images, and site architecture. Leveraged each client's brand
identity in a web-specific way. Anticipated what visitors would want to do on a site and
created navigational interfaces to facilitate those needs. Established the look and feel of
web pages, including typography, graphics, color, layout, and other factors.
Spherion - Battelle/Voter News Service
Short Term Contract 08/2001 - 10/2001
Data Governance/Data Architect Consultant
Arlington, VA
Acted as Project Manager for assigned Data Governance and Data Modeling Projects.
Create the Business Data Model for the ABC, CBS, FOX, CNN, and NBC Voter News
Service (VNS) Board of Directors.
Acted as Vendor Liaison between the data modeling team and the client, Voter News
Service. Document the current Business Systems Design and Architecture in an
object-oriented, Oracle back-end development environment for VNS.
Create and maintain fully normalized Enterprise data models and logical data models.
Create and maintain entity relationship diagrams (ERD) by applying IDEF1X
techniques using ERwin CASE tools.
Transform users' business requirements into business assets in the form of
client/server systems.
Define system requirements as entities, attributes and relationships.
Evaluate and choose the appropriate design options for complex entities, attributes
and relationships.
Document business rules.
Maintain the data repository.
Perform quality analysis, Manual Data Cleansing and scrubbing of data and metadata.
CASE TOOLS: ErWIN/ERX CASE Tool
METHODOLOGIES: IDEF1X, INFORMATION ENGINEERING, BUSINESS AREA ANALYSIS
Data Governance/Financial Systems Analyst - World Bank/IFC
11/1999 - 06/2001
Data Governance/Data Architect - New Boston Systems
Consultant - GRC & American Red Cross
Falls Church, VA
Perform Data Governance. Create and maintain data models for the Data Warehouse.
Create and maintain entity relationship diagrams (ERD) by applying advanced formal
data modeling concepts and techniques using various CASE tools. Transform users'
business requirements into business assets in the form of client/server systems.
Define system requirements as entities, attributes and relationships. Evaluate and
choose the appropriate design options for complex entities, attributes and
relationships. Document business rules.
Maintain the data repository. Perform quality analysis, Manual Data Cleansing and
scrubbing of data and metadata. Compare and synchronize different data models and
ERD. Perform Business Area Analysis and Financial Systems Analysis.
Facilitate data modeling workgroup sessions. Produce system documentation. Create
and maintain logical naming standards. Interview Functional Area Experts.
Acted as Project Manager for assigned Data Modeling Projects.
Acted as Vendor Liaison between the Red Cross and the software vendor.
Evaluate and recommend CASE Tools and other relevant software to resolve
customer Business Systems Design requirements. Document the current Business
Systems Design and Architecture for SAP and CLARIFY, covering object-oriented
front ends and Oracle and DB2 back-end development and database environments.
Analyze as-is Business System Design and Architecture.
CASE TOOLS: ER/Studio CASE Tool, ErWIN/ERX CASE Tool, BP/Win
METHODOLOGIES: IDEF1X, IDEF0, INFORMATION ENGINEERING, BUSINESS AREA ANALYSIS
RELATIONAL DATABASES: ORACLE, DB2
Data Governance/Data Architect - Princeton Information Ltd.
04/1999 - 10/1999 Short Term Contract
Consultant MCI/Telecommunications
McLean, VA
Requirements Analyst - Supervise the development, testing, and coordination of the
Telecommunications Business Requirements being produced.
Perform Data Governance and Business Area Analysis. Facilitate JAD sessions. Apply
knowledge of the telecommunications business, including defining which fields of
information should be grouped together, to the business data in the database systems.
Work closely with the user to develop Telecommunications Business Requirement
Reports.
Design database queries and reports using Brio Query. Participate in the analysis
phase of the system development cycle. Project Management - Responsible for
scheduling and planning the Requirements development. Establish time frames for
completion of the project. Set priorities for the work to be done.
RELATIONAL DATABASE: ORACLE 7.3.3/NT
REPORTING TOOL: BRIOQUERY 5.5
CASE TOOL: ErWIN/ERX CASE TOOL
Data Governance/Data Architect - MicroLink, LLC
07/1998 - 03/1999
Consultant - Army Medical Project/Army Medical Clinical Trials
Data Governance and Business Analyst - Create and maintain fully normalized
logical data models.
Create and maintain entity relationship diagrams (ERD) by employing IDEF1X
modeling concepts and techniques using various CASE tools. Transform users'
business requirements into business assets in the form of client/server systems.
Define system requirements for Army Medical Clinical Trials entities, attributes and
relationships.
Evaluate and choose the appropriate design options for complex entities, attributes
and relationships. Document business rules. Maintain the data repository. Perform
quality analysis, Manual Data Cleansing and scrubbing of data and metadata.
Compare and synchronize different data models and ERD. Perform Business Area
Analysis.
Act as Vendor Liaison between the Department of Defense vendor and government programs.
Facilitate data modeling workgroup sessions. Produce system documentation. Create
and maintain logical naming standards. Interview Functional Area Experts.
CASE TOOLS: ErWIN/ERX CASE TOOL
METHODOLOGIES: IDEF1X, INFORMATION ENGINEERING
Data Governance/Data Architect - New Boston Systems
Consultant - United States Enrichment Corporation, Inc
Rockville, MD
Responsible for Data Governance and for performing the pre-conversion analysis of
converting the existing Sales and Marketing Oracle data and database structures.
Analyzed the impact of conversion on the database structures, datatypes, and data.
Created SQL scripts and recommended solutions to conversion problems. Evaluated
and chose the appropriate conversion design options for complex entities, attributes,
and relationships using IDEF1X.
Created data models and designed Oracle database structures. Created and maintained
physical naming standards for each database. Documented the physical naming
standards. Created and maintained entity relationship diagrams (ERD) by applying
advanced formal data modeling concepts and techniques using various CASE tools.
Forward- and re-engineered Oracle databases.
Acted as Project Manager for assigned Data Modeling Projects.
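As a hedged sketch of what such a pre-conversion SQL script might look like, this query
interrogates Oracle's standard data dictionary view USER_TAB_COLUMNS to flag columns
likely to need attention during conversion (the datatype list and length threshold are
illustrative assumptions, not the actual project criteria):

    -- Flag columns whose datatypes may not convert cleanly to the target platform
    SELECT table_name, column_name, data_type, data_length
    FROM   user_tab_columns
    WHERE  data_type IN ('LONG', 'LONG RAW')
       OR  (data_type = 'VARCHAR2' AND data_length > 255)
    ORDER  BY table_name, column_name;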
RELATIONAL DATABASES: ORACLE 7.3/UNIX
CASE TOOLS: ErWIN/ERX CASE TOOL
METHODOLOGIES: IDEF1X, INFORMATION ENGINEERING
Multi-Million Dollar Developers
04/1997 - 06/1998
Financial Systems Analyst/Consultant/Data Governance/Data Architect
AMS Momentum Financial Baseline Project Dulles, VA
Data Governance Project Manager for the establishment and implementation of the
Physical Naming Standards Project and the Meta-Data Repository Project. I created and
maintained physical naming standards for each of the Momentum Financial databases. I
documented the physical naming standards and created the meta-data repository.
Directed the team effort to recreate the existing Momentum Financial Oracle and Sybase
databases using the new physical naming standards. I created and maintained entity
relationship diagrams (ERD) by applying advanced formal data modeling concepts and
techniques using various CASE tools. Forward- and re-engineered Oracle and Sybase
databases. Created Power Designer and ERwin data dictionaries. Led the team effort to
define entities and attributes for the Meta-Data Repository.
I was also the Team Leader for the Analysis Phase Project of the Informix Database
Conversion Project. I was responsible for performing the pre-conversion analysis of
converting the existing Momentum Financial Oracle database to a Momentum Financial
Informix database using IDEF1X. I analyzed the impact of conversion of all the database
structures, datatypes, and data. I recommended solutions to conversion problems. I
evaluated and chose the appropriate conversion design options for complex entities,
attributes, and relationships.
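For illustration, a minimal sketch of what the core of such a meta-data repository might
look like in Oracle DDL (the table and column names are hypothetical, assumed for the
example rather than taken from the Momentum project):

    -- Repository of logical entities and their attribute-level metadata
    CREATE TABLE meta_entity (
        entity_name VARCHAR2(30) NOT NULL,
        definition  VARCHAR2(2000),
        CONSTRAINT pk_meta_entity PRIMARY KEY (entity_name)
    );

    CREATE TABLE meta_attribute (
        entity_name    VARCHAR2(30) NOT NULL,
        attribute_name VARCHAR2(30) NOT NULL,
        definition     VARCHAR2(2000),
        physical_name  VARCHAR2(30),  -- column name per the physical naming standard
        CONSTRAINT pk_meta_attribute PRIMARY KEY (entity_name, attribute_name),
        CONSTRAINT fk_attr_entity FOREIGN KEY (entity_name)
            REFERENCES meta_entity (entity_name)
    );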
RELATIONAL DATABASES: ORACLE 7.3/UNIX, SYBASE 10, INFORMIX 9.13
CASE TOOLS: ErWIN/ERX CASE TOOL, POWER DESIGNER CASE TOOL
METHODOLOGIES: IDEF1X, INFORMATION ENGINEERING
Data Governance/Data Architect
Consultant - MCI Local Service Profile/ Telecommunications
McLean, VA
Create and maintain fully normalized logical data models. Create and maintain entity
relationship diagrams (ERD) by applying IDEF1X using various CASE tools.
Transform users' business requirements into business assets in the form of
client/server systems. Define system requirements as entities, attributes and
relationships. Evaluate and choose the appropriate design options for complex
entities, attributes and relationships. Forward and re-engineer Oracle databases.
Perform Data Governance, which includes mapping the data model, at the entity level,
to an initial database table design. Document business rules.
Maintain the data repository. Perform quality analysis and scrubbing of data and
metadata. Compare and synchronize client application and database servers. Compare
and synchronize different data models and ERD. Perform Business Area Analysis.
Facilitate data modeling workgroup sessions. Produce system documentation. Create
and maintain logical naming standards. Interview Functional Area Experts.
Perform Oracle Database Administration (DBA). Write SQL scripts and queries.
Create SQL tables. Define the system architecture of the Oracle 7.3 server. Provide
strategic planning and implementation of information technology solutions. Design
Oracle databases. Install, create and maintain Oracle 7.3 Relational Database (RDB)
and data dictionary. Configure the system for heterogeneous network environments
when additional software options are required. Perform database performance tuning.
Create and maintain physical schema. Manage data. Monitor system performance.
Implement basic security and data integrity measures. Enroll, monitor and maintain
database users. Manage access to data by granting privileges to individual users.
Support end users. Create and maintain physical naming standards.
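A hedged sketch of the routine user-enrollment and privilege-grant work described
above, in Oracle SQL (the user, password, tablespace, and table names are hypothetical,
chosen only for illustration):

    -- Enroll a database user with a storage quota, then grant least-privilege access
    CREATE USER report_user IDENTIFIED BY temp_pw
        DEFAULT TABLESPACE users
        QUOTA 50M ON users;

    GRANT CREATE SESSION TO report_user;                  -- allow the user to connect
    GRANT SELECT ON lsp.service_profile TO report_user;   -- read-only on one table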
RELATIONAL DATABASE: ORACLE 7.3/UNIX
CASE TOOL: ErWIN/ERX CASE TOOL
METHODOLOGIES: IDEF1X, INFORMATION ENGINEERING
Data Architect
FAA - Senior Consultant Information Engineer/Oracle DBA/Data Modeler
Washington, DC
Perform Oracle Database Administration (DBA). Write SQL scripts and queries. Create
SQL tables. Define the system architecture of the Oracle 7.3 server. Provide strategic
planning and implementation of information technology solutions. Design Oracle
databases. Install, create and maintain Oracle 7.3 Relational Database (RDB) and data
dictionary. Configure the system for heterogeneous network environments when
additional software options are required. Perform database performance tuning. Create
and maintain physical schema. Manage data. Monitor system performance. Implement
basic security and data integrity measures. Enroll, monitor and maintain database users.
Manage access to data by granting privileges to individual users. Support end users.
Create and maintain physical naming standards.
Create and maintain fully normalized logical data models. Create and maintain entity
relationship diagrams (ERD) employing Information Engineering and IDEF1X modeling
concepts and techniques using various CASE tools. Transform users' business
requirements into business assets in the form of client/server systems.
Define system requirements as entities, attributes and relationships. Evaluate and choose
the appropriate design options for complex entities, attributes, and relationships.
Forward- and re-engineer Oracle databases. Map the data model, at the entity level, to an initial
database table design. Document business rules. Maintain the data repository. Perform
quality analysis and scrubbing of data and metadata. Compare and synchronize client
application and database servers. Compare and synchronize different data models and
ERD. Perform Business Area Analysis. Facilitate data modeling workgroup sessions.
Produce system documentation. Create and maintain logical naming standards. Interview
Functional Area Experts.
RELATIONAL DATABASES: ORACLE 7.3/NT, ORACLE 7.0/NETWARE
CASE TOOL: ErWIN/ERX CASE TOOL
METHODOLOGIES: IDEF1X, INFORMATION ENGINEERING
Data Governance/Data Architect - BDM CORPORATION
03/1995 – 03/1997
Environmental Project-Senior Information Engineer / Oracle DBA / Data Modeler
Falls Church, VA
Perform Data Governance and Oracle Database Administration (DBA). Define the
system architecture of the Oracle 7.3 server. Provide strategic planning and
implementation of information technology solutions. Process Change Requests. Design
Oracle databases. Install, create and maintain Oracle 7.3 Relational Database (RDB) and
data dictionary. Configure the system for heterogeneous network environments when
additional software options are required. Perform database performance tuning. Create
and maintain physical schema. Manage data. Monitor system performance. Implement
basic security and data integrity measures. Enroll, monitor and maintain database users.
Manage access to data by granting privileges to individual users.
Support end users. Create and maintain physical naming standards. Act as Vendor
Liaison between the Department of Defense vendor and government programs.
Perform the duties of a Senior Functional Analyst and Data Administrator for a full
life-cycle development project. Develop detailed functional specifications, system
specifications, and program specifications using Information Engineering, IDEF1X,
ERwin, and IEF CASE tools. Work directly with management and users to analyze,
specify, and design business applications. Utilize data modeling techniques to analyze
and specify data usage within an application area. Responsible for defining both logical
views and physical data structures. Administer and control the organization's data
resources. Utilize data dictionary software packages to discover corrupted data and
eliminate data redundancy, and tuning tools to improve database performance.
Maintain and update data models. Create description files for entities and attributes that
include entity names, definitions, attribute names, definitions and metadata. Work
directly with systems analysts to write and edit system documentation, user manuals,
training courses, and procedures. Prepare proposals and technical reports. Responsible
for creating reports that include entity names and definitions and attribute names and
definitions for submission into the Defense Data Dictionary System (DDDS). Perform
cross-functional reviews of proposal packages submitted to DESCIM for incorporation
into DOD's Enterprise Data Dictionary. Prepare the ERwin ERD strawman model for the
functional area. Review functional requirements in preparation for the ERD strawman
model. Initiate and coordinate preparations for Toxic Substance GroupWare sessions,
including creating text files for the facilitators that contain entity names, definitions,
attribute names, definitions, and relationships. Facilitate GroupWare modeling sessions.
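One hedged example of the kind of SQL used to surface data redundancy of this sort
(the staging table and its columns are hypothetical, invented for illustration):

    -- Report candidate data elements recorded more than once, i.e. redundant entries
    SELECT entity_name, attribute_name, COUNT(*) AS dup_count
    FROM   candidate_data_element   -- hypothetical staging table
    GROUP  BY entity_name, attribute_name
    HAVING COUNT(*) > 1;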
RELATIONAL DATABASE: ORACLE 7.2/UNIX
CASE TOOLS: IEF CASE TOOL, ErWIN/ERX CASE TOOL, BP/Win
JAD SOFTWARE: GROUPWARE
METHODOLOGIES: IDEF1X, IDEF0, INFORMATION ENGINEERING
Data Governance/Data Architect - AMERIND INCORPORATION
04/1994 - 03/1995
ARMY HUMAN RESOURCE PROJECT -
DATA ADMINISTRATOR/ INFORMATION ENGINEER/ DATA MODELER
Arlington, VA
Perform Data Governance, business area analysis, data administration, information
engineering and data modeling in support of the DOD Corporate Information
Management/Business Process Improvement Program (CIM/BPIP) for the Department of
the Army's Human Resource (Personnel & Readiness (P&R)) military project. Reverse-
engineered HR source code into logical elements for analysis and integration into the P&R
logical data model. Perform data repository management and quality assurance of data.
Design, develop and analyze HR data models. Participate in subject matter expert
workshops to determine and obtain descriptive HR data element definitions and to
evaluate data analysis of source data. Produce new documentation from data analysis.
Maintain HR logical data models, structured diagrams and entity-relationship diagrams.
Map source data elements to logical elements for use in the data dictionary. Develop
re-engineering strategies using IDEF1X and the ERwin/ERX CASE tool.
CASE TOOLS: ErWIN/ERX CASE TOOL, DDDS DATA REPOSITORY
METHODOLOGIES: IDEF1X, INFORMATION ENGINEERING
NAVAL INTELLIGENCE COMMAND
1992-1993
COMPUTER SPECIALIST/ INFORMATION ENGINEER/ DATA
GOVERNANCE/ DATA MODELER
Suitland, MD
Act as Vendor Liaison between the Department of the Navy and program vendors. Provide Data
Governance, business area analysis, data administration, and information modeling in
support of the DOD Corporate Information Management/Business Process Improvement
Program (CIM/BPIP) for the Department of the Navy's (DON) Data Administration (DA)
program. Plan, direct and manage the implementation phase of the DON DA program.
Provide program and project management for DON. Establish Plan of Actions and
Milestones (POA&M) using MS Project Manager for the DA program.
Provide quality assurance of source data in Naval databases. Responsible for performing
quality analysis and control of logical data models. Produce data standards and structures
manuals, training plans, and meeting reports. Coordinate formal responses to DOD and
DON manuals, policies and candidate data element reviews. Brief Senior Staff members
on project status. Participate in subject matter expert workshops to determine descriptive
data element definitions and to evaluate data analysis of source data. Maintain logical
data models, structured diagrams, and entity-relationship diagrams. Additionally, develop
re-engineering strategies using IDEF1X and the ERwin/ERX CASE tool.
CASE TOOLS: ErWIN/ERX CASE TOOL, DDDS DATA REPOSITORY
METHODOLOGIES: IDEF1X, INFORMATION ENGINEERING
CSC
05/1990 - 09/1992
SENIOR COMPUTER SCIENTIST / DATABASE SUPPORT
Suitland, MD
Provide system support for U.S. Naval Fleet Operations requirements. Produce
specialized system reports for Senior Staff members. Write and produce briefings and
presentations for Senior Staff members. Brief Senior Staff members on various topics
related to operational requirements. Analyze specialized operational data. Update Naval
databases. Provide database support for end users and data input support. Develop
retrieval queries for operational requirements.
SOFTWARE: AOS/VS, DMS, CEO, VM
EDS
01/1990 – 05/1990
COMPUTER OPERATOR/DATABASE SUPPORT
Dulles, VA
Perform Automated Data Processing (ADP) for end users. Manage Damage Recovery
Storage (DRS) of the tape library. Provide database support. Provide end-user support
for system users. Execute system commands and provide system quality control. Produce system
reports.
SOFTWARE: VM, MVS/XA, JES2, JCS2
U.S. NAVY
01/1986 – 01/1990
CRYPTOLOGIC TECHNICIAN (TECHNICAL)/SYSTEM ANALYSIS
Japan
Provide Signals Intelligence (SIGINT) processing and system analysis in support of
DOD and DON Naval Fleet operational requirements. Perform signal analysis, tasking,
and quality control of SIGINT. Produce SIGINT reports for Senior Staff members.
Write and produce briefings and presentations for Senior Staff.