Lei Liu
11 Peachtree, Houston, TX 77064
Home: 281-890-6436 | Email: leiliu525@gmail.com


SKILLS
• Strong background in pipeline database models and data manipulation
• Advanced knowledge of and experience with ArcGIS
• Excellent experimental and problem-solving skills
• Expert in both database development and database administration
• Skilled in web development; familiar with the major frameworks
• Proficient in object-oriented programming (OOP)
• Fluent in several programming languages (PL/SQL, T-SQL, C#,
Python, C++, JavaScript, jQuery, the MEAN stack, Unix shell scripting)
CERTIFICATES
• Oracle DBA, MongoDB Developer Associate
RELATED WORK
EXPERIENCE
04/2016-Present Independent Consultant, Houston, TX
• Continue to provide technical support for Eagle Information
Mapping, Inc. applications
• Primary focus on web development, particularly the MEAN stack as
well as other major web development frameworks such as Ruby on
Rails, PHP, React, Ember and Backbone
01/2015-04/2016 Eagle Information Mapping at G2 Partners, Houston, TX
Principal Developer
• Implemented and supported SQL Server database systems, Reporting
Services, Analysis Services and Integration Services on enterprise and
dedicated Windows servers
• Maintained, tuned and monitored SQL Server and database products,
including storage management, performance tuning, software
debugging, backup and recovery, and configuration maintenance
12/1995-01/2015 Eagle Information Mapping Inc., Houston, TX
Senior GIS Developer
• Designed and developed the PDAT pipeline data management suite,
with a complete set of functionality for maintaining pipeline facility data
in the Pipeline Open Data Standard (PODS) model. This database/GIS
product has been deployed at several major North American oil and
pipeline companies, including Chevron, Chevron Indonesia, BP, Oneok,
Marathon and Koch Pipeline Company.
• As the primary developer for the PDAT product suite, responsible for
coding stored procedures for complex pipeline centerline maintenance
operations and for the event loading and editing process, using PL/SQL
for Oracle and T-SQL for SQL Server.
• Served as an Oracle database administrator: installed, tuned,
maintained and monitored Oracle database systems; implemented
storage management; performed backup and recovery operations;
exported and imported database dump files; and supported all related
database tasks.
• Developed and maintained C++/Python applications to create GIS spatial
data in SDE layers for pipeline data stored in the PODS relational database.
• Developed a Python/PL-SQL application to transfer data between the
Pipeline Open Data Standard (PODS™) database model and the ArcGIS
Pipeline Data Model (APDM™) using a JSON data-interchange format.
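As an illustrative sketch only (not the production application), a PODS-to-APDM transfer like the one above might serialize each source row into a JSON interchange record under remapped field names. The table and column names here are hypothetical, invented for the example.

```python
import json

# Hypothetical mapping from PODS-style column names to APDM-style names.
PODS_TO_APDM_FIELDS = {
    "LINE_ID": "LineEventID",
    "BEGIN_MEASURE": "FromMeasure",
    "END_MEASURE": "ToMeasure",
}

def pods_row_to_apdm_json(row: dict) -> str:
    """Serialize one PODS row to a JSON record keyed by APDM field names."""
    apdm_record = {apdm: row[pods] for pods, apdm in PODS_TO_APDM_FIELDS.items()}
    return json.dumps(apdm_record, sort_keys=True)

# Example usage with a made-up centerline event row.
row = {"LINE_ID": 1001, "BEGIN_MEASURE": 0.0, "END_MEASURE": 12.5}
print(pods_row_to_apdm_json(row))
```

JSON works well here as a neutral interchange format because neither database model needs to know the other's physical schema, only the shared field mapping.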
• Developed a C++/Python application implementing geographic search
operations for pipeline companies.
• Developed ETL applications to migrate pipeline data from the legacy
ISAT database model into the PODS database model, mapping all fields
and loading all data.
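The core of a field-mapped migration like the one above can be sketched as follows; the ISAT and PODS field names below are invented for illustration, not taken from either model.

```python
# Hypothetical field map from a legacy ISAT table to its PODS equivalent.
ISAT_TO_PODS = {
    "PIPE_SEG_NO": "ROUTE_ID",
    "BEG_STA": "BEGIN_MEASURE",
    "END_STA": "END_MEASURE",
    "DIAM_IN": "NOMINAL_DIAMETER",
}

def migrate_rows(isat_rows):
    """Map every ISAT field onto its PODS column so no data is dropped."""
    for row in isat_rows:
        yield {pods: row[isat] for isat, pods in ISAT_TO_PODS.items()}

# Example usage with one made-up legacy record.
legacy = [{"PIPE_SEG_NO": "A-7", "BEG_STA": 0, "END_STA": 530, "DIAM_IN": 12}]
print(list(migrate_rows(legacy)))
```

Driving the migration from a single explicit field map makes "mapping all fields" auditable: any source column missing from the map raises an error instead of silently disappearing.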
• Worked on-site at client offices to assist in correcting and transferring
pipeline data.
• Developed packages and stored procedures to upgrade PODS models to
newer database versions while maintaining complete data integrity.
• Designed and developed a complete set of quality-control procedures to
monitor PODS database activity and detect both existing data errors and
ongoing data-entry errors. The QC process runs daily and produces
detailed error reports to assure the quality of pipeline data.
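One typical check in a daily QC pass like the one above is detecting overlapping measure ranges on the same centerline. This is a hedged sketch under assumed field names (`ROUTE_ID`, `EVENT_ID`, `BEGIN_MEASURE`, `END_MEASURE`), not the actual PDAT QC code.

```python
def find_overlaps(events):
    """Flag pairs of events on the same line whose measure ranges
    overlap -- a common class of pipeline centerline data error."""
    by_line = {}
    for event in events:
        by_line.setdefault(event["ROUTE_ID"], []).append(event)

    errors = []
    for line, line_events in by_line.items():
        # Sort by begin measure, then compare each event with its neighbor.
        line_events.sort(key=lambda e: e["BEGIN_MEASURE"])
        for a, b in zip(line_events, line_events[1:]):
            if b["BEGIN_MEASURE"] < a["END_MEASURE"]:
                errors.append((line, a["EVENT_ID"], b["EVENT_ID"]))
    return errors

# Example usage: two events on line "L1" that illegally overlap.
events = [
    {"ROUTE_ID": "L1", "EVENT_ID": 1, "BEGIN_MEASURE": 0, "END_MEASURE": 10},
    {"ROUTE_ID": "L1", "EVENT_ID": 2, "BEGIN_MEASURE": 5, "END_MEASURE": 15},
]
print(find_overlaps(events))
```

Running dozens of such checks nightly and collecting the offending IDs into a report gives the "detailed error reports" described above.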
• Designed a new sub-model extending the PODS database model with
High Consequence Area (HCA) data, and developed a Python/PL-SQL
application to load HCA data into a PODS database.
• Designed and developed the Seismic Location Database System (SLDS),
a GIS/database application to manage seismic shot-point data. The
application takes standard seismic data as input and generates SDE
layers (maps) for both shot points and line attributes.
• Designed and developed an alignment-sheet application for generating
high-quality maps of pipeline oil and gas assets. This became a very
popular product.
EDUCATION
M.S., Computer Science
M.S., Geological Engineering
THE UNIVERSITY OF MISSISSIPPI, Oxford, MS
