Kumar R L D P Godasi
Email id: durga.kumar.g@gmail.com
Mobile No: 9505758555
Professional Summary:
 8 years of experience in development and reporting.
 Currently working at UHG (4 years); prior to that, worked at BACS (3 years).
 1.5 years of hands-on experience with HDFS, Spark with Scala, Hive, Pig, and Sqoop.
 Worked extensively with the Scala language (Spark).
 Hands-on experience developing UDFs in Spark.
 Developed Pig Latin and Spark scripts for data formatting and transformation.
 Implemented Sqoop for transferring large datasets between RDBMS and Hadoop.
 Expertise in SQL Server BI tools.
 Hands-on experience with the ETL tool SSIS and the reporting tool SSRS.
 Knowledge of Tableau, including creating dashboards in Tableau.
 Knowledge of VBA macros (Excel, MS Access, MS Word).
 Good verbal and written business English communication skills.
 A resourceful team player with an analytical bent of mind
 Good technical knowledge and willingness to shoulder higher responsibilities.
Technical Skills:
 Big Data ecosystem: HDFS, Spark with Scala, Hive, Pig, Sqoop
 Programming languages: Scala, VBA
 Databases: SQL Server 2008 R2, AS400 (mainframe database)
 BI tools: SSIS, SSRS, and Tableau
 Platforms: Windows, Linux
 Tools: MS Excel and MS Access.
 Certificates: “Introduction to Pig” and “Accessing Hadoop Data Using Hive V2” from Big Data University
Experience Summary:
2011 - Till date: United Health Group, Hyderabad, as Development and Automation Lead.
2008 - 2011: Bank of America, as Developer.
2006 - 2008: Karvy Stock Broking Pvt Ltd.
2005 - 2006: I-Process India Pvt Ltd (wholly owned subsidiary of ICICI Bank Ltd).
Professional Experience:
1) Company : United Health Group.
Period of work : October 2011 till date
Designation : Sr. Software Engineer
1) Project Name: Optum Call Center Project (BD PASS)
Project Description:
Optum, a UHG subsidiary, serves more than 30 million customers in the US and runs its own call centers for its different products. This project covers 5000+ MIS reports, both standard and ad-hoc.
Problem Statement:
Call logs are generated daily and loaded into a DB2 database, amounting to nearly 34 GB of data per day. This high inflow was difficult to manage in our traditional database, and the analysis had to be delivered to end users through Tableau. Operating on this volume of data and providing results to end users involved very high latency.
Solution:
We implemented a Hadoop environment to overcome this problem. Data is ingested from various sources through Sqoop, processed and transformed in Spark or Pig as the requirement dictates, and pushed into Hive for further processing. Reports are delivered to end users through Tableau and are leveraged by a small group of BAs to modify reports and to automate report distribution.
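As a rough illustration of this flow, the Spark-with-Scala sketch below reads a day of Sqoop-landed call logs from HDFS, applies a basic transformation, and writes the result to Hive. This is a minimal sketch only; the HDFS path, database, table, and column names are assumptions rather than actual project artifacts.

// Illustrative sketch only: paths, table names, and columns are hypothetical.
import org.apache.spark.sql.SparkSession
import org.apache.spark.sql.functions._

object CallLogDailyLoad {
  def main(args: Array[String]): Unit = {
    val spark = SparkSession.builder()
      .appName("CallCenterDailyLoad")
      .enableHiveSupport()               // required to write Hive tables
      .getOrCreate()

    // Read the day's call logs that Sqoop landed on HDFS (assumed CSV layout).
    val calls = spark.read
      .option("header", "true")
      .option("inferSchema", "true")
      .csv("hdfs:///data/raw/call_logs/2014-06-01")

    // Basic cleansing and formatting before the data is used for reporting.
    val cleaned = calls
      .filter(col("call_duration_sec") > 0)
      .withColumn("call_date", to_date(col("call_start_ts")))

    // Push the transformed data into Hive for further processing and Tableau.
    cleaned.write
      .mode("overwrite")
      .saveAsTable("callcenter.daily_call_logs")

    spark.stop()
  }
}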
Duration:
June 2014 till now.
Environment: HDFS, Spark with Scala, Pig, Hive, Sqoop, and Tableau.
Role:
Hadoop developer.
Roles and Responsibilities
 Data ingestion from various sources using Sqoop.
 Writing Spark and Pig scripts to shape the data into a format that can be processed further.
 Calculating aggregate values using Spark with Scala (see the sketch after this list).
 Writing Pig scripts to further clean the output into reporting format.
 Loading Pig-transformed data into Hive for mining and reporting purposes.
 Creating external tables in Hive to prepare dashboards in Tableau.
 Delivering reports to end users through Tableau.
 Enabling a small group of BAs to modify reports and automating report distribution.
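The aggregation and external-table steps above could look roughly like the following Spark-with-Scala sketch. It is illustrative only: the UDF logic, table names, columns, and HDFS location are assumptions, not the project's actual code.

// Illustrative sketch only: UDF logic, table names, columns, and paths are hypothetical.
import org.apache.spark.sql.SparkSession
import org.apache.spark.sql.functions._

object CallAggregates {
  def main(args: Array[String]): Unit = {
    val spark = SparkSession.builder()
      .appName("CallCenterAggregates")
      .enableHiveSupport()
      .getOrCreate()

    // Example UDF: bucket call durations for reporting.
    val durationBucket = udf((seconds: Int) =>
      if (seconds < 60) "under_1_min"
      else if (seconds < 300) "1_to_5_min"
      else "over_5_min")

    // Aggregate the cleansed Hive data per product, date, and duration bucket.
    val aggregates = spark.table("callcenter.daily_call_logs")
      .withColumn("duration_bucket", durationBucket(col("call_duration_sec")))
      .groupBy("product", "call_date", "duration_bucket")
      .agg(count(lit(1)).as("calls"), avg("call_duration_sec").as("avg_duration_sec"))

    // Write the aggregates to HDFS and expose them as a Hive external table for Tableau.
    aggregates.write.mode("overwrite").parquet("hdfs:///data/marts/call_aggregates")

    spark.sql(
      """CREATE EXTERNAL TABLE IF NOT EXISTS callcenter.call_aggregates (
        |  product STRING, call_date DATE, duration_bucket STRING,
        |  calls BIGINT, avg_duration_sec DOUBLE)
        |STORED AS PARQUET
        |LOCATION 'hdfs:///data/marts/call_aggregates'""".stripMargin)

    spark.stop()
  }
}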
2) Project Name: OptumRx Claims Operations.
Project Description:
The OptumRx Claims Operations project contains 5000+ MIS reports, both standard and ad-hoc. These reports are mainly based on member claim information, client and plan usage, drug analysis, and pharmacy claims information.
Client : Client Service Management, OptumRx
Role Definition : SQL reporting, SSIS data mining, and SSRS technical lead
Duration : September 2011 till date
Technologies : SQL Server, SSRS, SSIS, AS400, and Excel
Roles and Responsibilities
 Gather requirements from the business and create the requirement documentation.
 Create a request on SharePoint to raise a ticket and set the TAT.
 Assign SharePoint tickets to team resources, including myself, and take daily updates on requests.
 Take sign-off from the client on each ad-hoc request.
 Use SQL Server as the centralized database, pulling data from SQL and from the AS400 batch system in the form of flat files.
 Data mining using SSIS packages.
 Create MIS reports in SSRS and, where required, provide Excel output.
 All 26 standard reports are automated in SSRS.
 Encrypt flat-file output according to the standard process rules.
3) Project Name: Benefit Operation Management.
Project Description:
Benefit Operation Management is the process in which the team sets up new plans, creates new clients, and handles new drug implementations and pharmacy setup. Overall, this process is responsible for regulating the benefits for the various products.
If any product is not set up properly, copay or benefit errors can occur during claim processing; these must be corrected for all affected claims with proper adjustments and the proper benefit schedule.
Client : Benefit Operation Management
Role Definition : SQL reporting, SSIS data mining, and SSRS technical lead
Duration : September 2011 till date
Technologies : SQL Server, SSRS, SSIS, AS400, and Excel
Business unit : Client Service Management, OptumRx
Roles and Responsibilities
 Pin down the start date of the error in the plan setup and fetch data for the affected timeframe and product.
 Analyze the data to confirm the issue.
 Send the data in an Excel template to the Mass Adjustment team to reprocess the claims.
 Involved in all of these processes as an individual contributor.
 Developed SSIS packages for the import process using SQL Server 2008 R2.
 Responsible for standardization, normalization, and collapsing of data.
 Responsible for applying business logic to the data.
 Providing technical support to both onshore and offshore team resources.
2) Company : Bank of America (BACS)
Period of work : Aug 2008 to September 2011
Designation : Reporting Analyst
1. Project: Associate Learning Portal.
The existing process involves extracting reports from the Associate Learning Portal site and manually formatting them: hiding unwanted columns, highlighting delinquencies, adding headers, setting the print preview size, adding page numbers, and applying filters. The same procedure is followed for various mandatory training courses.
Technology Used: SQL Server, MS Access, VBA
Roles and Responsibilities
 Involved in all phases of database creation and application design and development.
 Used an MS Access VBA front end with a SQL Server database.
 Created daily/weekly/quarterly/yearly report templates in Excel.
Education Qualifications:
Qualification University/Board Year of Passing
B.Com (Computers) Andhra University 2004
Personal Profile:
Father’s Name : V D S Murty
Date of Birth : 20th Jul, 1979
Marital Status : Married
Languages Known : English and Telugu.
Nationality : Indian
Kumar R L D P Godasi