The document provides a summary of an ETL developer's skills and experience. It includes 3+ years of experience developing ETL processes in IBM InfoSphere Datastage 9.1. Specific experience includes developing Datastage jobs using various stages, debugging, performance tuning, implementing slowly changing dimensions, and working with databases like Oracle, SQL Server and Netezza. Project experience is provided for three projects involving reverse mortgage data warehousing, risk data warehousing, and an order tracking application. Responsibilities included developing ETL processes, testing, and supporting production environments.
Pedro P. Gatica Jr. provides his contact information and an extensive professional profile summarizing his experience and skills as an IT Lead Software Developer/Integrator with over 25 years of experience in data warehousing, ETL development, and project management. He has expertise in technologies like DataStage, Informatica, DB2, Oracle, and SQL Server. His experience includes roles at USAA from 1997 to 2014 where he led many strategic data integration projects and established batch and real-time ETL environments.
Rajarao Marisa has over 4 years of experience in data warehousing and ETL using Informatica. He has worked on multiple projects extracting data from sources like SAP, Salesforce, and flat files and loading it into databases including Teradata, Oracle, and SQL Server. Rajarao is proficient in Informatica PowerCenter and has experience with transformations, mappings, performance tuning, and testing. He currently works as a senior software engineer for Tech Mahindra.
Ramachandran has over 9 years of experience as a Senior Informatica Developer. He has expertise in data warehousing, ETL development and implementation using Informatica PowerCenter. Some of his key skills include dimensional modeling, mapping complex transformations, tuning Informatica workflows, and working with databases such as Oracle, SQL Server and Teradata. He has worked on multiple projects in the healthcare domain for clients in the US and India.
Shipra Jaiswal has over 6 years of experience in data warehousing and business intelligence solutions using tools like Informatica and Teradata. She has worked on ETL projects in various domains including healthcare, banking, e-commerce, and aviation. Her responsibilities have included requirements gathering, data modeling, mapping design, development, testing, implementation, and support.
Mukhtar Ahmed has over 8 years of experience in data warehousing and ETL projects. He has designed, developed, deployed and supported large scale ETL processes involving sources over 100 terabytes. He is specialized in IBM InfoSphere Datastage and Teradata utilities. He has worked on multiple industries including healthcare, banking and insurance.
Sivakumar has over 9 years of experience in data warehousing and ETL development using tools like Informatica and Teradata. He has extensive experience designing and developing ETL processes, performing testing, and collaborating with other teams on data migration projects for clients in various industries.
This resume is for Basu K S, an SAP BODS ETL Developer with over 3 years of experience developing and maintaining data warehouses and performing data migration. He has extensive experience using tools like SAP BODS, Information Steward, SQL Server, and writing SQL stored procedures. Some of his responsibilities include providing ETL designs, developing ETL jobs, performing data cleansing, testing, and supporting production loads. He is currently working at Utopia and has previously worked at Mindtree.
Sreekanth has over 5 years of experience in data warehousing and ETL development using IBM DataStage. He has worked on projects in the telecom and automotive industries, extracting data from various sources and loading it into data warehouses. His responsibilities included designing and developing ETL jobs, testing, troubleshooting, performance tuning, and providing production support. He is proficient in DataStage, Oracle, Teradata, UNIX, and scheduling tools like Autosys.
This document provides a summary of Rizvi Shaik's professional experience as an Informatica ETL Developer over 9+ years. It outlines his extensive skills in areas like data warehousing, ETL development, data modeling, testing and production support. Recent roles include working with Horizon Blue Cross Blue Shield of NJ on projects involving data integration, replication and synchronization between various data sources using Informatica PowerCenter and Cloud.
This document contains a resume summary for Richa Sharma highlighting her 8.5 years of experience in data warehousing, ETL development, and business intelligence. She has expertise in IBM Datastage and Cognos reporting tools. Her experience includes data modeling, ETL development, database administration, performance tuning, and project experience with clients in the retail and telecom industries.
The document provides a technical summary and experience profile of Nootan Sharma. It summarizes his 8 years of experience in data warehousing and business intelligence projects. It details his expertise in tools like Informatica PowerCenter, Oracle, SQL Server and data quality management. It also lists his past work experience with companies like Capgemini, Birlasoft and Infogain on various BI and data warehousing projects for clients in different sectors.
The candidate has over 4 years of experience as an ETL developer using Informatica. They have extensive experience developing mappings using transformations like aggregator, lookup, filter and joiner. They have worked on multiple projects in the telecommunications domain for clients like British Telecom extracting data from various sources and loading to data warehouses. Their skills include Informatica, Oracle, SQL and they have experience in requirements analysis, mapping development, testing and support.
This document contains a resume for Tefari Heyie. It summarizes his professional experience which includes over 3 years working with SQL Server, databases, and data integration. He has expertise in SQL queries, indexes, stored procedures, SSIS ETL packages, and SSRS reports. He is proficient in designing databases, data warehousing, and business intelligence solutions. His education includes a Bachelor's degree in Computer Science and Microsoft SQL Server certifications.
Kunal N Kadbhane has over 3 years of experience as a software engineer in the IT industry. He has extensive experience with Informatica PowerCenter 9x, developing ETL mappings and transformations between various data sources like Oracle, SQL Server, and flat files. He has worked on multiple projects involving data warehousing and migration for clients such as Ultra Tech Cement and Reliance Industries. His responsibilities included requirement gathering, mapping development, testing, production support, and issue resolution.
Kallesha has over 4 years of experience as an Informatica/PLSQL developer. She has extensive experience developing mappings in Informatica to extract, transform and load data from various sources into data warehouses. She has worked on projects in various domains including storage, sales, banking, and finance. Kallesha is proficient in technologies like Informatica, Pentaho, Hive, HBase, Pig, Oracle, Teradata, and Shell scripting.
PharmMD has a new job opening for an ETL Developer. See job description attached. Interested parties should send their resume to Cynthia.Sandahl@pharmmd.com
IT professional with 9 years of data warehousing experience in ETL design and development. Excellent experience in requirement gathering, designing, developing, documenting, and testing ETL jobs and mappings as parallel jobs using DataStage to populate tables in data warehouses and data marts.
The document provides a summary of an ETL Developer's skills and experience. The developer has 3 years of experience using IBM InfoSphere Datastage for ETL projects involving data extraction, transformation, and loading. Responsibilities include developing and debugging ETL jobs, testing and tuning performance, implementing changes, and working with databases like Oracle. The developer has worked on risk data warehousing and order tracking projects, developing jobs to move data between systems and load enterprise data warehouses.
This document contains a professional summary and resume for Arun Roy Kondra. It summarizes his experience of over 4 years working with data warehousing and ETL solutions using Informatica PowerCenter. It details his technical skills and 3 projects he has worked on extracting and transforming data from various sources like Oracle, GreenPlum and flat files into data warehouses.
Sudhir Gajjela has over 3 years of experience as a Big Data Hadoop Administrator and Informatica Administrator. He has expertise in architecting, building, supporting, and troubleshooting both Cloudera and Hortonworks Big Data clusters as well as various Informatica tools. He has also worked as an Informatica Developer. His skills include Hadoop services like HDFS, Hive, HBase, Zookeeper, Flume, Sqoop, Oozie, and Storm, as well as Informatica tools such as PowerCenter, PowerExchange, Data Quality, MDM, Cloud, BDE, and BDM. He has delivered training sessions to colleagues on topics such as Big Data, Hadoop, and Informatica.
Shivaprasada Kodoth is seeking a position as an ETL Lead/Architect with experience in data warehousing and ETL. He has over 8 years of experience in data warehousing and Informatica design and development. He is proficient in technologies like Oracle, Teradata, SQL, and PL/SQL. Some of his key projects include developing ETL mappings and workflows for integrating various systems at Boehringer Ingelheim and UBS. He is looking for opportunities in Bangalore, Mangalore, Cochin, Europe, the USA, Australia, or Singapore.
Tamilarasu Uthirasamy has over 10 years of experience in data warehousing, database design, ETL processes, and analytics. He has skills in technologies like Python, Spark, UNIX shell scripting, databases like Netezza and Oracle, and tools like Datastage and R. He has worked on projects in healthcare, retail, and banking domains, designing data models and warehouses and developing ETL processes.
The document provides a summary of Praveena Chandrasekaran's professional experience and skills. She has over 7 years of experience in data warehousing, business intelligence, and ETL development using Informatica. Some of her roles included developing ETL mappings and workflows to load data from various sources into targets, performing testing, and leading a project as an onsite coordinator. She has strong skills in SQL, databases, data modeling, Informatica, and UNIX scripting.
Prasad Kommoju has 5 years of experience in ETL development using Informatica. He has worked on projects in banking and manufacturing. His skills include Informatica PowerCenter, IDQ, MDM, DIH, SQL, and UNIX. Currently he works as a Senior Software Engineer at Tech Mahindra on a project with MasterCard managing customer data.
The document provides a summary of Ashutosh Pandey's experience as an Oracle Database professional with over 10 years of experience. He has extensive skills in database administration, performance tuning, high availability solutions, database backups and recovery, and data replication technologies. Some of the key projects listed in his experience include Oracle 11g RAC implementations, database upgrades, data migrations involving large databases, and GoldenGate setups for active-active replication.
Pratik Dey is an IT professional with over 4 years of experience in data warehousing and ETL development. He has strong skills in Informatica PowerCenter and experience loading data from various sources into Teradata. Currently he works as an ETL Data Specialist for Thomson Reuters implementing their Connect Data Warehouse. Previously he worked on projects in healthcare and banking to develop ETL processes and load data into data warehouses.
Hemasundar Mondhi is seeking a job in a reputable organization where he can improve his skills through collaboration with others. He has a B.Tech in computer science from JNTU-K with a 63.51% score. He is proficient in programming languages like C, C++, C# and databases like SQL Server and Oracle 10g. He completed a social networking project using C#, ASP.NET, and SQL Server. In his free time, Hemasundar enjoys playing volleyball, listening to music, and surfing the internet. He believes his ability to work well in a team and take initiative will help him succeed in his career.
Dr. J. Ross Tanner is writing a letter of recommendation for Kimberly Anderson, FNP-C. He was her direct supervisor at Providence Hospital in Alaska and later worked with her at his specialty clinic. Dr. Tanner states that Kimberly did an outstanding job caring for patients who needed primary care services. He recommends Kimberly without reservation, noting that she is very personable, responsible, and interested in continuing to learn to help her patients.
The document describes five writing camps for teens offered by the Bay Area Writing Project in the summer of 2016. The camps include Flash Fiction, Flash Memoir, Rebellious Writing, Teenage Reporter, and The Personal Statement. The camps run on various dates from June to August and are open to students in grades 4 through 12. Camp costs range from $320 to $575.
This document provides information about various young writers camps being offered throughout the San Francisco Bay Area in summer 2016. The camps will run from June to August, are held 2-3 weeks per session, and cost $600 each. Locations include Bacich Elementary in Kentfield, School of the Madeleine in North Berkeley, Dougherty Valley High in San Ramon, St. Vincent de Paul and St. Ignatius Prep in San Francisco, Seven Hills School in Walnut Creek, Westlake Middle in Oakland, and a digital storytelling camp.
BAWP 2016 Summer Open Programs for Teachers, by Karin Seid
The document lists various summer writing and English-related workshops for teachers from June to August 2016. The workshops cover topics such as argumentative writing, grammar, bringing writing workshop to life, research and writing skills, poetry, strategies for English learners, using memoir to teach history, addressing microaggressions, blogging, and using Google Drive. The workshops range from 3 to 5 days and are offered at different locations, with registration fees ranging from $301 to $546. Teachers can use Title II funds to attend the workshops and improve writing instruction.
This document provides details on various knit collections for 2015, including pullovers, cardigans, and ponchos. It lists the article number, brand, yarn composition, gauge, weight, minimum order quantity, lead time, and FOB price for each item. The collections include brands like AJC, Arizona, Chillytime, Flash Light, and KangaROOS, and yarn compositions range from 100% acrylic to blends of cotton, acrylic, polyester, and mohair. Lead times range from 60 to 90 days and FOB prices are provided in US dollars per piece.
This document contains the professional profile of Abdul Malik S.A, including his contact information, skills, certifications, and work experience in systems administration, networking, and technical support roles over the past 7 years. He has extensive experience working with Windows and Linux operating systems, networking protocols, firewalls, antivirus software, and networking tools. His most recent role was as an IT Network Support Engineer from 2011-2016 where he provided level 1 and 2 support and administered routers, switches, Active Directory, and network security devices.
Personal Values and Values of the Escuela Colombiana de Carreras Industriales, by Juan Esteban Vega Suarez
Personal values are internal norms that guide a person to live well and improve each day. Universidad ECCI emphasizes values such as honesty, respect, and empathy. By following sound personal values along with those of the university, a person can grow spiritually and stand out positively among others.
Ahmed Mohammed Ahmed Ibrahim is seeking a challenging position in electro-mechanics engineering that utilizes his academic and work experience. He has a BSc in Power Mechanics Engineering from Zagazig University in Egypt and over 5 months of work experience designing plumbing and firefighting systems for commercial buildings. His skills include HVAC design, firefighting system design, plumbing, hydraulics, AutoCAD, and he is proficient in English.
The document summarizes work on chiral phosphine-catalyzed asymmetric γ-addition of oxygen nucleophiles to alkynoates. Key findings include: the 3-thiophene-substituted alkynoate reacted well under standard conditions while other heterocycles did not; increasing the nucleophile loading and using slow addition improved yields; increasing the distance of the oxygen from the aryl group in the nucleophile decreased yield but increased ee; and methanol and alkynamide alkynoates gave good yields. Slow addition and altered reaction conditions improved the reactivity of problematic substrates. The author thanks the advisors and the SURF program for the opportunity.
Creating Content with Brand Consistency
Content Organization
Content Calendars
Social Media Scheduling
Blog Organization
Categories
Permalinks
Content for Passive Income
E-books
E-courses
Niti Srivastava is seeking opportunities as an Application Developer with 2.7 years of experience working with IBM India Pvt Ltd. She has experience in COBOL, JCL, VSAM, DB2, and other mainframe technologies. She has worked on projects for clients like National Grid involving gas billing migration, data extraction, and system conversions. Niti holds a B.Tech in Computer Science and a postgraduate diploma in Advanced Computing. She is proficient in various programming languages, databases, and operating systems.
Gathara Thumbi is a senior software developer with over 30 years of experience developing mainframe applications using languages like COBOL, PL/1, SQL, and JCL. He has extensive experience in software development, testing, and mentoring at various companies including General Motors, Wipro, South East Data Cooperative, and Prudential Insurance. He is seeking a new position to utilize his expertise in software solutions and development.
Lizabeth A. Stetz is a registered nurse with over 10 years of experience in emergency departments, critical care units, and orthopedic units. She has a passion for patient care and strong communication, analytical, and problem-solving skills. Currently, she is working as an RN at Ohio Health - Riverside Methodist Hospital while pursuing her BSN degree. She has several certifications including ACLS, BLS, and is an Advanced EMT-Intermediate.
Lizabeth A. Stetz completed clinical rotations in adult health, medical/surgical, pediatrics, critical care, women's health, and mental health. She gained skills in areas such as tracheal suctioning, IV insertion, medication administration, and communication with patients and families at various facilities including Mayfair Village, Doctor's West Hospital, Nationwide Children's Hospital, Grant Hospital, and Regency Manor between 2012 and 2014 while attending Chamberlain College of Nursing.
Strong analytical skills used in gathering requirements and converting them into effective solutions.
Understands other processes and methodologies, can speak intelligently about them, and leverages other techniques to provide value to a team or enterprise.
Understands the fundamentals of software development processes and procedures.
Understands backlog tracking, burn-down metrics and task definition.
Keen problem-solving skills allowing rapid assimilation and resolution of complex problems.
Excellent written and oral communication skills, with the ability to communicate appropriately to business and technical teams at all levels.
Experience of both the client's and the developer's viewpoint in outsourcing situations.
Vast experience in project management, IT business analysis, and functional and technical design.
Experience working with multilingual clients and multiple vendor models.
Ian Perryman has over 25 years of experience in the insurance industry as a senior support analyst and programmer. He has extensive experience with COBOL and Visual Basic programming. Key skills include full life cycle development, client relationship management, and supervising colleagues. He has worked on various projects from system maintenance to developing new interfaces.
This document is a curriculum vitae for Sandeep Garrepalli that provides information about his work experience, skills, education, and projects. It summarizes that he has over 4 years of experience working with IBM Mainframe technologies like COBOL, DB2, and VSAM. His experience includes projects in the insurance and healthcare domains for clients like CNO and Express Scripts.
Paul Vlasek is an experienced IT professional with expertise in COBOL, mainframe systems, and leading migrations from mainframe to open systems platforms. He has over 20 years of experience consulting for companies on software development, data migration, and modernization projects. His technical skills include COBOL, Unix, Linux, Oracle, SQL Server, and he has led teams in migrating applications from IBM mainframes to distributed platforms like AIX, Windows, and Linux.
This resume is for Arun Mantha, a Mainframe COBOL Developer with over 4.8 years of experience developing and enhancing applications using COBOL, JCL, VSAM, IMS DB, CICS, DB2 and other tools. He has strong skills in the software development lifecycle and programming in COBOL. His experience includes working as a developer for IBM India and Capgemini on insurance applications for clients like Fireman's Fund Insurance Company and De Goudse Verzekeringen B.V. He is proficient in technologies like COBOL, JCL, VSAM, DB2, IMS/DB and CICS.
This document provides a summary of Rajesh Dheeti's professional experience and qualifications. It summarizes his 4+ years of experience developing ETL processes using Informatica PowerCenter to extract, transform, and load data from sources like Oracle and Teradata. It also lists 5 projects he has worked on involving building ETL mappings and workflows to load data into data warehouses.
Sivakumar has over 9 years of experience in data warehousing and ETL development using tools like Informatica and Teradata. He has extensive experience designing and developing ETL processes for data migration, analytics projects for clients in various industries. His roles have included requirement analysis, mapping design, testing, performance tuning and managing project timelines.
This document contains a resume for Kumaraswamy N summarizing his experience and skills as an ETL developer. He has over 8 years of experience in ETL development using Informatica and developing data warehouses. Some of his key skills and experiences include designing and developing ETL mappings in Informatica, data modeling, working with various data sources, performance tuning, and implementing slowly changing dimensions. He has worked on multiple projects for clients in various industries extracting data from sources like Oracle, SQL Server, and flat files and loading it into data warehouses and data marts.
Sakthi Shenbagam is a senior Informatica lead developer with over 7 years of experience in data warehousing and ETL development. She has extensive experience designing and developing complex ETL mappings and workflows to load data into data warehouses from various sources like Oracle, Netezza, and flat files. Some of her responsibilities include requirement gathering, design, development, testing, performance tuning, and support of ETL processes and data warehousing projects for clients across various industries. She is proficient in Informatica PowerCenter, QlikView, Oracle, and other tools.
Mani Sagar is an ETL Sr Developer and Lead with over 8 years of experience in designing, developing, and maintaining large enterprise applications. He has expert knowledge of ETL technologies like Informatica and data management processes including data migration, profiling, quality, security, and warehousing. He has led teams of up to 8 developers and delivered projects on time for clients across various industries.
Ashish Mishra has over 1 year of experience as an ETL Developer working with Cognizant Technology Solutions and XL Catlin Insurance Company. He has extensive experience designing and developing mappings using Informatica PowerCenter to extract, transform, and load data from various sources like Oracle and SQL Server into data warehouses. His projects involved tasks like data profiling, claim conversion, and performance tuning of mappings.
Deepak Sharma has over 9 years of experience as an ETL programmer focusing on data warehousing, data integration, and business intelligence. He has expertise in designing, developing, and implementing data warehouse/data integration solutions using tools like Informatica PowerCenter and Oracle. Some of his responsibilities include interacting with business stakeholders to understand requirements, designing mappings, developing ETL processes, testing solutions, and supporting production environments. He currently works as a senior ETL developer at Bank of America on their Corporate Investment Data Warehouse.
The document describes an Ab Initio developer with over 8 years of experience developing ETL applications using Ab Initio. Some key responsibilities included designing ETL processes to extract data from various sources like DB2, Oracle, SQL Server, load it into staging tables, transform the data using Ab Initio components, and load it into data warehouses and datamarts. The developer has extensive experience building complex graphs with various Ab Initio components, implementing data parallelism, and automating processes with shell scripts. Programming skills include SQL, Perl, and experience with databases like Oracle, SQL Server, and Teradata.
This document provides a summary of Neethu Abraham's skills and experience. She has over 9 years of experience in ETL development, data warehousing, and business intelligence solutions using tools like Informatica and Oracle. She has extensive experience designing and developing ETL processes for healthcare, retail, and hospitality clients. Her roles have included team lead, senior consultant, and she has experience working on projects involving customer data management, data migration, and clinical data warehousing. She is proficient in technologies like Informatica, Oracle, Unix, and has a history of successfully delivering ETL solutions on schedule and within budget.
The document provides a summary of Subramanyam M's professional experience including 4+ years working with ETL tools like IBM Datastage and Ascential Datastage. He has experience designing, developing and testing data warehouse projects involving data extraction, transformation and loading. He also has skills in SQL, Oracle, DB2, Unix and data warehousing techniques. The profile outlines various ETL projects he has worked on involving large datasets for clients like HSBC Bank, Scope International and Bharti Airtel.
This document is a resume for Sujit Kumar Jha, an Oracle Certified Professional (OCP) and Tuning expert with 13 years of experience in PL/SQL development. He has extensive experience designing, developing and implementing solutions for clients in finance, telecom and insurance. Some of his key skills include Oracle SQL, PL/SQL, Unix shell scripting, data modeling, ETL processes, and working in Agile methodologies. He has worked as a lead developer on numerous projects involving building databases, ETL code, reports and batch processes to meet business requirements.
The document provides a professional summary and work experience for VenkataSubbaReddy (gvenkat9bi@gmail.com). It outlines over 5 years of experience in ETL development using Informatica and data warehousing. Specific skills and responsibilities mentioned include developing mappings and transformations in Informatica to load data from various sources into staging and target databases, designing logical and physical data models, working on projects in the banking and insurance domains, and developing reports in OBIEE. Details of 6 projects are also provided demonstrating experience implementing ETL processes and data warehousing solutions for clients.
Shrikantha DM is a senior software engineer with over 3.5 years of experience developing applications using Oracle PL/SQL. He has skills in application design, development, testing and implementation using Oracle databases, PL/SQL and SQL. His experience includes developing applications for tasks such as URL filtering, managing IP addresses and domain servers, and organizing meetings using wireless networks.
Rajesh S has over 3 years of experience in developing ETL applications using IBM Datastage. He has extensive experience designing and developing Datastage jobs to extract, transform and load data from various sources such as Oracle and Teradata databases into data warehouses. Some of his key skills include Datastage, Unix scripting, Oracle, Teradata and working on projects in the healthcare and retail domains.
Rameshkumar has over 7 years of experience in ETL development using Informatica. He has worked on multiple projects involving data extraction from various sources like flat files, CSV files, COBOL files and loading the data into Oracle databases. He is proficient in requirements analysis, designing mappings, workflows and sessions in Informatica. He has expertise in performance tuning of ETL processes and managing SQL queries for optimization.
- The document contains the resume of Abdul Mohammed, an ETL developer with 8 years of experience using Informatica for data warehousing projects.
- He has expertise in requirements gathering, data extraction from various sources, transforming the data using Informatica tools, and loading the data into target databases.
- His most recent role was as an ETL/SR Informatica Lead from 2015-present where he worked on building a data warehouse for a pharmaceutical company using Informatica to extract data from Oracle and flat files.
This document provides a summary of Magesh Babu Arumugam's experience as a data warehousing professional with over 10 years of experience. It highlights his strengths such as experience tuning SQL queries, loading and extracting data using ETL tools like WhereScape RED, and working with databases like Teradata and Greenplum. It also lists his technical expertise and professional experience managing projects involving data migration, ETL development, and performance tuning for clients in various industries.
This document is a curriculum vitae for Bodala Jagadeesh summarizing his skills and experience. He has 1.5 years of experience in designing data warehousing projects using Informatica and Oracle. He has created mappings to extract data from various sources, load data into databases, and create reusable objects. He also has experience with Tableau creating dashboards from various data sources according to client requirements. His technical skills include Informatica, Oracle, Postgres, SQL, and Tableau.
This document provides a summary of experience and qualifications for Ram Prasad SVK, an Informatica developer with over 4 years of experience developing and enhancing data warehousing projects using Informatica PowerCenter. It includes details of his work experience on 3 projects involving ETL development from various data sources to load into data warehouses and data marts. It also lists his educational qualifications and technical skills including expertise in Informatica, Oracle, and Windows.
Suneel Manne has over 4.9 years of experience in IT with a focus on data warehousing using tools like IBM Data Stage. He has expertise in designing, developing, and implementing ETL processes to extract data from various sources and load it into data warehouses. Some of his key skills include SQL, DB2, Data Stage, Linux, bug tracking tools, and data modeling. He has worked on several projects for clients like GTECH Corporation and T-Mobile BI, taking on roles such as ETL Developer and Big Data Developer.
Resume ratna rao updated
Resume of Ratna Rao Yamani
ratnarao_yamanidwh2016@yahoo.com +91 – 9542366354
Competent professional offering around 9.5 years of experience in IT, including 7 years in data warehousing with Informatica PowerCenter 9.1, and participation in all phases of the SDLC. Good exposure to Informatica MDM (Landing, Staging, Raw & Reject tables, Base Objects, Match, AutoMerge, Manual Merge, Data Steward and Golden Record) and Informatica Data Director.
Summary:
Served as the lead troubleshooter to diagnose technology and functional issues.
Interacted with Business Analysts to understand the business requirements and developed the technical design documents.
Coordinated between offshore team members and onsite business analysts.
Created logical and physical models using the Erwin tool.
Proficient in the development of ETL code using Informatica PowerCenter and Informatica PowerExchange.
Extensive working knowledge of CDC (change data capture) and data profiling.
Excellent knowledge of Informatica IDQ.
Exposure to Informatica MDM Hub configurations.
Proficient in performance tuning at the mapping, session and query levels.
Designed testing strategies to ensure functionality and performance.
Developed test approaches, test plans and test cases.
Sound knowledge of Oracle database concepts; proficient in writing functions and procedures.
Wrote and executed functional and technical test cases.
Excellent problem-solving skills with a strong technical background; a result-oriented team player with excellent communication and interpersonal skills.
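As a rough illustration of the change-data-capture work mentioned above (the actual CDC was done with Informatica PowerExchange; the field names and structures here are hypothetical), a minimal key-compare CDC pass classifies source rows into inserts, updates and deletes:

```python
# Minimal CDC sketch: compare a source snapshot against the current target
# by business key and classify each row. Illustrative only.

def cdc_classify(source_rows, target_rows, key="cust_id"):
    target_by_key = {r[key]: r for r in target_rows}
    inserts, updates = [], []
    for row in source_rows:
        old = target_by_key.get(row[key])
        if old is None:
            inserts.append(row)          # new business key -> insert
        elif old != row:
            updates.append(row)          # same key, changed attributes -> update
    deleted_keys = set(target_by_key) - {r[key] for r in source_rows}
    return inserts, updates, sorted(deleted_keys)

src = [{"cust_id": 1, "city": "Austin"}, {"cust_id": 3, "city": "Denver"}]
tgt = [{"cust_id": 1, "city": "Dallas"}, {"cust_id": 2, "city": "Reno"}]
ins, upd, dels = cdc_classify(src, tgt)
print(ins, upd, dels)  # one insert (id 3), one update (id 1), one delete (id 2)
```

In a real PowerExchange setup this classification is driven by captured change records rather than a full snapshot compare; the sketch only shows the end result of the classification.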
Education:
Master of Computer Applications (M.C.A) from Periyar University, INDIA.
Certifications:
I have cleared the Informatica Mapping Designer and Administration papers.
Technical Skills:
Data Warehousing: Informatica PowerCenter 9.1/8.1/6.2/5.1.1/5.0, PowerMart 5.1.1/5.0/4.7 (Repository Admin Console, Repository Manager, Designer, Workflow Manager, Workflow Monitor), PowerConnect, PowerExchange and IDQ
Databases: DB2 10.1/9.7/9.5/9.0, Oracle 10g/9i/8i/8.0/7.x, MS Access 2000/97/7.0
BI Tools: Business Objects 5.x/6.5 (Supervisor, Designer, Reporter)
Data Modeling: Logical and physical modeling using the Erwin tool
Programming: SQL, PL/SQL
Environment: Windows 98/NT/2000/2003/XP, Sun Solaris 2.6/2.7 and MS-DOS 6.22
Project Details: 1
Organization: Trianz IT Solutions
Client: Liberty Mutual
Project Name: LM – Data Management
Duration: June 2014 – Till Date
Liberty Mutual Group is a leading insurance provider in the US, with a presence in the Property, Auto and Liability insurance sectors. This project implements the Commercial Insurance Claims Transformation effort, moving the Commercial Markets platform to its future-state solution. Its scope is to bring all Liberty Mutual Agency Markets commercial lines of business PAL (Property, Auto and Liability) claims onto a single platform, Commercial Insurance Claim Center.
Responsibilities:
Involved in gathering Business Requirements and developed the Technical Design documents.
Prepared Effort estimations for ETL build and Resource plan.
Extensive working knowledge of CDC and data profiling.
Mentored the team in troubleshooting validation issues and data analysis, and made recommendations for system improvements.
Reviewed all deliverables (code and documents).
Worked on Informatica Power Center tools- Designer, Repository Manager, Workflow Manager, and
Workflow Monitor.
Parsed high-level design specification to simple ETL coding and mapping standards.
Designed and customized data models for the data warehouse, supporting data from multiple sources in real time.
Involved in building the ETL architecture and source-to-target mappings to load data into the data warehouse.
Created mapping documents to outline data flow from sources to targets.
Extracted data from flat files and other RDBMS databases into the staging area and populated it into the data warehouse.
Maintained source definitions, transformation rules and target definitions using Informatica Repository Manager.
Used various transformations like Filter, Expression, Sequence Generator, Update Strategy, Joiner,
Stored Procedure, and Union to develop robust mappings in the Informatica Designer.
Developed mapping parameters and variables to support SQL override.
Created mapplets to use them in different mappings.
Developed mappings to load into staging tables and then to Dimensions and Facts.
Used existing ETL standards to develop these mappings.
Worked on different workflow tasks such as session, event-raise, event-wait, decision, e-mail, command, worklet, assignment and timer, and on scheduling of the workflow.
Created sessions and configured workflows to extract data from various sources, transform it, and load it into the data warehouse.
Used Type 1 and Type 2 SCD mappings to update Slowly Changing Dimension tables.
Extensively loaded data from flat files into DB2 and Oracle database tables.
Modified existing mappings for enhancements of new business requirements.
Used Debugger to test the mappings and fixed the bugs.
Environment: Informatica Power Center 9.1, Oracle 10g, DB2 10.1, Erwin 9, Windows NT 4.0 and Sun
Solaris.
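The Type 1 and Type 2 slowly-changing-dimension handling described above was built with Informatica's SCD mappings; as a hedged sketch of the underlying logic (column names such as cust_id, current_flag, start_date are mine, not from the project), the two strategies differ as follows:

```python
# Sketch of SCD Type 1 (overwrite, no history) vs Type 2 (expire + insert new
# current version). Illustrative only; the real work used Informatica mappings.
from datetime import date

def scd_type1(dim_rows, key, new_row):
    """Type 1: overwrite the existing row in place."""
    for row in dim_rows:
        if row[key] == new_row[key]:
            row.update(new_row)
            return
    dim_rows.append(dict(new_row))

def scd_type2(dim_rows, key, new_row, today=None):
    """Type 2: flag the current row as expired, then insert a new current row."""
    today = today or date.today().isoformat()
    for row in dim_rows:
        if row[key] == new_row[key] and row["current_flag"] == "Y":
            row["current_flag"] = "N"
            row["end_date"] = today
    dim_rows.append({**new_row, "current_flag": "Y",
                     "start_date": today, "end_date": None})

dim = [{"cust_id": 7, "city": "Austin", "current_flag": "Y",
        "start_date": "2013-01-01", "end_date": None}]
scd_type2(dim, "cust_id", {"cust_id": 7, "city": "Denver"}, today="2014-06-01")
# dim now holds the expired Austin row plus a new current Denver row
```

Type 2 preserves history at the cost of extra rows and an effective-date lookup on every read; Type 1 keeps the dimension compact when history is not needed.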
Project Details: 2
QWEST, Denver, CO
Duration: Sept 2012 to May 2014
Client: Morgan Stanley
Role: Team Lead
Qwest is a large telecommunications carrier. Qwest Communications provides long-distance services and broadband data, as well as voice and video communications, globally. This project involved developing a data warehouse from different data feeds and other operational data sources, building a central database fed from sources such as Oracle, SQL Server and flat files. Actively involved as an analyst in preparing design documents, and interacted with the data modelers to understand the data model and design the ETL logic.
Responsibilities:
Responsible for business analysis and requirements collection.
Worked on Informatica Power Center tools- Designer, Repository Manager, Workflow
Manager, and Workflow Monitor.
Parsed high-level design specification to simple ETL coding and mapping standards.
Designed and customized data models for the data warehouse, supporting data from multiple sources in real time.
Involved in building the ETL architecture and Source to Target mapping to load data into Data
warehouse.
Created mapping documents to outline data flow from sources to targets.
Extracted data from flat files and other RDBMS databases into the staging area and populated it into the data warehouse.
Maintained source definitions, transformation rules and target definitions using Informatica Repository Manager.
Used various transformations like Filter, Expression, Sequence Generator, Update Strategy,
Joiner, Stored Procedure, and Union to develop robust mappings in the Informatica Designer.
Developed mapping parameters and variables to support SQL override.
Created mapplets to use them in different mappings.
Developed mappings to load into staging tables and then to Dimensions and Facts.
Used existing ETL standards to develop these mappings.
Worked on different workflow tasks such as session, event-raise, event-wait, decision, e-mail, command, worklet, assignment and timer, and on scheduling of the workflow.
Created sessions and configured workflows to extract data from various sources, transform it, and load it into the data warehouse.
Used Type 1 and Type 2 SCD mappings to update Slowly Changing Dimension tables.
Extensively loaded data from flat files into DB2 and Oracle database tables.
Modified existing mappings for enhancements of new business requirements.
Used Debugger to test the mappings and fixed the bugs.
Wrote UNIX shell scripts and pmcmd commands to FTP files from the remote server and to back up the repository and folders.
Involved in Performance tuning at source, target, mappings, sessions, and system levels.
Prepared migration document to move the mappings from development to testing and then to
production repositories.
Environment: Informatica PowerCenter 8.6.1 (Workflow Manager, Workflow Monitor), Oracle 10g/9i, Autosys, DB2 and UNIX AIX.
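The shell scripting around pmcmd mentioned above typically wraps a `pmcmd startworkflow` call. As a hedged sketch (domain, service, folder and workflow names are hypothetical, not from this project), a wrapper might assemble the call like this, shown as a dry run:

```python
# Dry-run sketch of assembling a `pmcmd startworkflow` invocation.
# -pv reads the repository password from an environment variable rather than
# passing it on the command line. Names are illustrative only.
import subprocess

def pmcmd_startworkflow(service, domain, user, pwd_env, folder, workflow,
                        dry_run=True):
    cmd = ["pmcmd", "startworkflow",
           "-sv", service, "-d", domain,
           "-u", user, "-pv", pwd_env,
           "-f", folder, "-wait", workflow]
    if dry_run:
        return " ".join(cmd)      # show what would be executed
    return subprocess.call(cmd)   # requires an Informatica client install

cmd_line = pmcmd_startworkflow("IS_ETL", "Domain_ETL", "etl_user",
                               "REPO_PWD", "DWH_LOADS", "wf_daily_load")
print(cmd_line)
```

The `-wait` flag makes pmcmd block until the workflow finishes, so a scheduler such as Autosys can act on the exit code.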
Project Details: 3
Client: Zurich, Germany
Project Name: ZDW (Zurich Data Warehouse)
Role: Senior Software Engineer
Duration: Feb 2008 – Jan 2012
Prior to ZDW there were several stand-alone products, or Data Marts, but the need arose to have a general data warehouse for Zurich. There are now many Data Marts, of which General Insurance is one that has recently been moved into production. Data is loaded into it on a daily basis from the various branches, and in turn it provides data to the various Data Marts. There are four different environments to work in: Development, Integration, Production and a special FIX environment:
Development – every developer works using a particular DEB ID.
Integration – testing is done in a deeper sense; only once it is complete is anything moved to Production.
Production – the real environment, where everything that has been worked on in Dev and Integration has to work properly.
FIX – a special environment where programs are rectified if something fails in Production. Using this environment, we avoid having to follow the complete Dev, Integration, Prod cycle again.
Environment: Informatica PowerCenter 6.1, DB2 9.7, flat files, COBOL files and UNIX.
Responsibilities:
Understanding existing business model and customer requirements.
Created mappings for the business transactions, based on the sources and targets loaded in Informatica.
Used the Source Analyzer and Warehouse Designer to import the source and target database schemas, and the Mapping Designer to map sources to targets.
Used the Transformation Developer to create the Expression, Filter, Lookup and Aggregator transformations used in the mappings. Maintained source definitions, transformation rules and target definitions using Informatica Repository Manager.
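The Filter, Expression and Aggregator chain described above can be sketched in plain code (a rough analogue only; the real logic lives in Informatica transformations, and the field names here are invented):

```python
# Rough analogue of a Filter -> Expression -> Aggregator transformation chain:
# drop bad rows, cast types, then sum premiums per branch.
from collections import defaultdict

rows = [{"branch": "DE01", "premium": "120.50"},
        {"branch": "DE01", "premium": "79.50"},
        {"branch": "", "premium": "10.00"}]          # bad row, filtered out

filtered = [r for r in rows if r["branch"]]           # Filter transformation
expressed = [{**r, "premium": float(r["premium"])}    # Expression: type cast
             for r in filtered]
totals = defaultdict(float)                           # Aggregator: sum by key
for r in expressed:
    totals[r["branch"]] += r["premium"]
print(dict(totals))  # {'DE01': 200.0}
```

Each stage mirrors one transformation: the Filter removes rows failing a condition, the Expression derives or converts ports, and the Aggregator groups by a key and computes the aggregate.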
Project Details: 4
Client: Sun Microsystems
Sales Data Mart
Software Engineer
Sep 2005 –Dec 2007
Mainframes re-hosting at Super Value: many legacy application systems residing on mainframes were being re-hosted to the Windows platform. The project involved legacy enhancements as part of the mainframes re-hosting. The systems were re-hosted on the Windows platform using Micro Focus Enterprise Server and Micro Focus Net Express. As part of this program, one of the legacy languages, MARK V, which accesses IMS DB, was re-engineered to COBOL-CICS-IMS DB on the Windows platform using Micro Focus.
Responsibilities:
Understanding and implementing the re-hosting concept.
Understanding the MARK V IMS programs and developing the COBOL-CICS-IMS programs.
Testing and debugging the new application.
Providing support to seniors.
Environment:
Platform/OS: Mainframes / z/OS / UNIX / Windows
Databases: IMS DB
Languages/Tools/Packages: COBOL, Micro Focus COBOL, CICS, SQL/PL-SQL, File-AID, Micro Focus Enterprise Server, Micro Focus Net Express, TSO/ISPF, SPUFI, JCL
Employment History:
Company Name                 Designation    Start date     End date
Techno Sphere IT Solutions   Team Lead      August 2012    Till Date
CSC India                    Team Lead      Sep 2005       Jan 2012