Gopi
singamsettigopi21@gmail.com
+91-8970777649
Experience Summary
• Over 3 years of experience implementing data warehousing projects with Teradata.
• Working at Tata Consultancy Services Ltd (Bangalore) as a Software Engineer from July 2013 to the present.
Qualification
• B.Tech (2012) in Electrical and Electronics Engineering from Prakasam Engineering College.
Professional Skills
• Good experience with Teradata utilities such as BTEQ, FastLoad, MultiLoad, TPump and FastExport.
• Extensive experience loading data into Teradata from flat files using FastLoad scripts (see the sketch after this list).
• Worked extensively with Teradata SQL Assistant.
• Good understanding of data warehousing concepts.
• Performed error handling and performance tuning of Teradata queries.
• Performed data reconciliation across various source systems and Teradata.
• Worked with EXPLAIN plans and COLLECT STATISTICS.
• Worked with ET and UV error tables.
• Addressed ad-hoc requests from clients.
• Used subqueries, joins, set operations and advanced OLAP functions extensively.
• Prepared unit test specifications.
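Below is a minimal FastLoad sketch of the flat-file loads described above, assuming a pipe-delimited file and an empty staging table; the database, table, file path and column names are illustrative assumptions, not the actual project objects.

   /* FastLoad requires the target staging table to exist and be empty */
   .LOGON tdpid/username,password;
   .SET RECORD VARTEXT "|";

   /* Error table 1 catches conversion/constraint errors; error table 2 catches duplicate-row violations */
   DROP TABLE stg_db.stg_customer_err1;
   DROP TABLE stg_db.stg_customer_err2;

   DEFINE
      cust_id   (VARCHAR(10)),
      cust_name (VARCHAR(50)),
      open_dt   (VARCHAR(10))
   FILE = /data/in/customer.txt;

   BEGIN LOADING stg_db.stg_customer
      ERRORFILES stg_db.stg_customer_err1, stg_db.stg_customer_err2
      CHECKPOINT 100000;

   INSERT INTO stg_db.stg_customer
   VALUES (:cust_id, :cust_name, :open_dt);

   END LOADING;
   .LOGOFF;

Rows rejected during the load land in the two error tables, which is where the error-handling and reconciliation work listed above starts.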
Core Competencies
Databases : Teradata 12/13/14, Oracle
Languages : SQL, C
Operating systems : UNIX, Windows XP, 2003
Tools : Teradata SQL Assistant
Scheduling Tool : WLM (Workload Manager)
Ticketing Tool : Tivoli
Projects Handled/Recent Accomplishments:
Project #1:
Project : Blackhawk Network Financial Data Reporting System
Client : Blackhawk Network Financial Services
Role : Teradata Developer
Team Size : 8
Environment : Teradata utilities (FastLoad, MultiLoad, BTEQ), Teradata SQL Assistant 13, UNIX, Tivoli
Duration : Mar 2015 – present
Project Description:
The Blackhawk Network Financial Data Reporting System provides different types of commercial loans to prospective customers in a short span of time. The objective is a single point of reference for company, contact, process and deal data held in various databases. Distributed data residing in heterogeneous data sources is consolidated into the target Teradata database. The project involved creating the data warehouse, analyzing the source data, and deciding on the appropriate extraction, transformation and loading strategy. The data flows from the data sources to the target tables were then built along with the required utilities.
Roles and Responsibilities:
• Analyzed the specifications provided by the clients.
• Created scripts for Teradata utilities such as FastLoad, MultiLoad, FastExport and BTEQ.
• Wrote hundreds of DDL scripts to create tables, views and indexes in the company data warehouse.
• Reduced Teradata space usage by optimizing tables, adding compression where appropriate and ensuring optimum column definitions (see the DDL sketch after this list).
• Loaded data into Teradata from legacy systems and flat files using complex MultiLoad and FastLoad scripts.
• Analyzed table and index choices for data distribution and access, including primary and secondary indexes.
• Modified queries to use Teradata features for performance improvement.
• Worked with EXPLAIN plans and COLLECT STATISTICS.
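As an illustration of the compression and statistics work above, a hedged DDL sketch follows; the database, table, columns and compressed values are hypothetical, not the actual warehouse objects.

   CREATE MULTISET TABLE edw_db.customer_dim
   (
      cust_id    INTEGER NOT NULL,
      cust_name  VARCHAR(100),
      country_cd CHAR(2) COMPRESS ('US', 'GB', 'IN'),  -- multi-value compression on frequent codes
      status_cd  CHAR(1) COMPRESS ('A', 'I'),
      load_dt    DATE FORMAT 'YYYY-MM-DD'
   )
   PRIMARY INDEX (cust_id);  -- primary index chosen for even data distribution and access path

   COLLECT STATISTICS ON edw_db.customer_dim COLUMN (cust_id);  -- refresh optimizer statistics after loads

   EXPLAIN
   SELECT country_cd, COUNT(*)
   FROM   edw_db.customer_dim
   GROUP  BY 1;  -- review the plan before promoting the query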
Project #2:
Project : Customer Enterprise Data Warehouse
Client : Verizon, UK
Role : Teradata Developer
Team Size : 12
Environment : Teradata utilities (FastLoad, MultiLoad, BTEQ), Teradata SQL Assistant 13, UNIX, WLM, Tivoli
Duration : Aug 2013 – Feb 2015
Project Description:
This project designed and built a billing mart for the Customer Enterprise Data Warehouse. The objective is a single point of reference for customer data held in various databases. Distributed data residing in heterogeneous data sources is consolidated into the target Teradata database. The project involved creating the data warehouse, analyzing the source data, and deciding on the appropriate extraction, transformation and loading strategy. The data flows from the data sources to the target tables were then built along with the required utilities.
Roles and Responsibilities:
• Worked with different sources such as flat files and XML files.
• Worked extensively on data extraction, transformation and loading from source to target systems using BTEQ, FastLoad and MultiLoad.
• Worked extensively with Teradata SQL Assistant.
• Created BTEQ scripts to move data from staging to target tables.
• Optimized queries using EXPLAIN plans and COLLECT STATISTICS.
• Worked with ET, UV and WT tables for error handling in MultiLoad jobs (see the sketch after this list).
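A minimal MultiLoad sketch illustrating the ET, UV and WT tables mentioned above; the log table, work/error table names, layout and file path are assumptions for illustration only.

   /* ET = acquisition-phase errors, UV = uniqueness violations, WT = work table used in the apply phase */
   .LOGTABLE stg_db.customer_ml_log;
   .LOGON tdpid/username,password;

   .BEGIN IMPORT MLOAD
      TABLES      edw_db.customer_dim
      WORKTABLES  stg_db.customer_wt
      ERRORTABLES stg_db.customer_et  stg_db.customer_uv;

   .LAYOUT cust_layout;
   .FIELD cust_id   * VARCHAR(10);
   .FIELD cust_name * VARCHAR(50);

   .DML LABEL ins_cust;
   INSERT INTO edw_db.customer_dim (cust_id, cust_name)
   VALUES (:cust_id, :cust_name);

   .IMPORT INFILE /data/in/customer.txt
      FORMAT VARTEXT '|'
      LAYOUT cust_layout
      APPLY ins_cust;

   .END MLOAD;
   .LOGOFF;

After the job, any rows left in the ET and UV tables are reviewed and reconciled before the work table is dropped.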
(Gopi S)
