SUDHIR GAJJELA Email: sudhir.gs77@gmail.com
Deloitte Consulting LLP, Hyderabad Mobile: +91 9949781990
PROFILE:
• Big Data/Hadoop Administrator
• Informatica Administrator
• Informatica Developer
Experience Summary:
• Working at Deloitte as an Informatica Administrator and Big Data/Hadoop Administrator for the past 27 months.
• Worked as an Informatica Developer for 10 months during the initial phase of his professional career.
Big Data/Hadoop Administration
Role: To architect, build, support and troubleshoot Big Data/Hadoop environments for several internal and external client projects.
Technical Environments Include:
• Cloudera Big Data Clusters
• Hortonworks Big Data Clusters
Other Technical Skills Include:
• Worked on the complete setup of Cloudera Manager with all CDH components, from the architecture phase through final project release.
• Installed and configured Hortonworks environments with HDP applications, including setting up the Ambari user interface.
• Integrated a Hortonworks cluster with Informatica BDM version 10 Update 1, enabling Informatica jobs to read/write data from/to the Hadoop cluster.
• Performed regular health checks and applied resolutions for all big data environments.
Copyright © 2007 Deloitte Development LLC. All rights reserved.
• Tasks include complete administration: architecture/infrastructure setup, user management and troubleshooting issues faced by developers.
• Worked on setting up high availability for individual Hadoop services.
• Takes part in architecture discussions for Hadoop clusters and load balancing.
• Worked on multiple Hadoop services such as HDFS, Hive, HBase, ZooKeeper, Flume, Sqoop, Oozie and Storm.
• Installed Cloudera Navigator to view data lineage and integrated it with Informatica Metadata Manager.
• Performed user management for clusters, web UIs and services such as HBase, HDFS and Hive.
• Worked on Cloudera backup and disaster recovery mechanisms such as BDR and DistCp.
• Experienced in graph analysis in Cloudera Manager covering I/O, RAM and disk usage, performance analysis and root-cause analysis.
• Works on critical security frameworks such as Kerberos and Sentry, establishing connectivity between Informatica components (tools like BDM and BDRM) and big data services.
• Worked on regenerating Kerberos service keytabs to synchronize key version numbers (KVNOs) between runtime service keytabs and the principals in the Kerberos database.
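The KVNO reconciliation described above can be sketched as a small parser over `klist -kt` output; the sample keytab listing, path and principal below are illustrative assumptions, not output from any specific cluster.

```python
import re

def keytab_kvnos(klist_output):
    """Parse `klist -kt` output and map each principal to the highest
    key version number (KVNO) present in the keytab."""
    kvnos = {}
    for line in klist_output.splitlines():
        # Entry lines look like: "   3 01/01/2017 09:00:00 hdfs/host@REALM"
        m = re.match(r"\s*(\d+)\s+\S+\s+\S+\s+(\S+)", line)
        if m:
            kvno, principal = int(m.group(1)), m.group(2)
            kvnos[principal] = max(kvno, kvnos.get(principal, 0))
    return kvnos

sample = """Keytab name: FILE:/etc/security/keytabs/hdfs.service.keytab
KVNO Timestamp           Principal
---- ------------------- ------------------------------------------
   3 01/01/2017 09:00:00 hdfs/host1.example.com@EXAMPLE.COM
   4 02/01/2017 09:00:00 hdfs/host1.example.com@EXAMPLE.COM
"""
print(keytab_kvnos(sample))
# → {'hdfs/host1.example.com@EXAMPLE.COM': 4}
```

Comparing the result against `kvno <principal>` from the KDC shows whether the keytab has fallen behind; if it has, regenerating the keytab (for example with `kadmin`'s `ktadd`) restores synchronization.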
• Worked on securing individual Hadoop services with the Kerberos mechanism.
• Installed the Apache Storm service and configured it with Kerberos so that Storm can write streaming data to Kerberized HBase.
• Worked on embedded and external databases such as PostgreSQL and MySQL.
• Worked on advanced troubleshooting, identifying the root causes of several cluster health issues.
Informatica Administration
Role: To architect, build, support and troubleshoot Informatica environments for internal and client projects.
Technical Environments Include:
• Informatica PowerCenter, Informatica PowerExchange and Informatica Data Quality.
• Informatica MDM and Informatica Cloud Customer 360.
• Informatica Cloud, Informatica Vibe and Informatica BDRM.
• Informatica BDE and Informatica BDM v10 tools.
Other Technical Skills Include:
• Installed and configured various Informatica tools, including PowerCenter, PowerExchange, Informatica Data Quality, Informatica Vibe, Informatica Cloud, Informatica MDM, Informatica Cloud Customer 360/Cloud MDM, and the Informatica BDE and BDM editions.
• Worked with Informatica BDM (Big Data Management) tools and established HDFS and Hive connectivity to Kerberos-enabled Cloudera/big data clusters.
• Experienced in configuring Data Quality services such as the Model Repository and Data Integration services to read/write data from/to Hadoop clusters.
• Worked on Informatica upgrades using both the parallel and in-place upgrade approaches in Windows and Linux environments.
• Troubleshot Informatica Developer client issues by adding custom variables to client configuration files so the tool could access the correct server information.
• Researched and resolved domain- and host-level issues where domain information needed to be modified in underlying server files using infasetup commands.
• Worked on Informatica-SAP connectivity and configuration for a project.
• Handled user management, group management, parameter settings, hosts file entries and environment variable setup in Linux environments.
• Set up a new Informatica release, the 'Big Data Relationship Management' (BDRM) tool, which manages data across Informatica MDM and big data applications such as Hive, HBase and HDFS.
• Integrated Informatica CC360 (Cloud Customer 360) with AddressDoctor and configured the connection with a DQC (Data Quality Center) account.
• Set Linux parameters for open-file and process limits as recommended for an Informatica environment.
• Migrated Informatica objects such as workflows, sessions and mappings between environments using the XML import/export mechanism.
• Deployed Informatica mappings in IDQ environments, creating applications that run as workflows.
• Worked on scripting: the required Kerberos-related Linux commands were placed in a script file and scheduled with a cron job.
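The cron-based scheduling described above can be sketched as the fragment below; the script path, keytab location, principal and schedule are hypothetical placeholders, not values from any actual environment.

```shell
#!/bin/sh
# Hypothetical /opt/scripts/krb_refresh.sh: renew the Kerberos ticket
# cache from a service keytab so scheduled Informatica jobs always
# hold a valid ticket.
kinit -kt /etc/security/keytabs/infa.service.keytab infa@EXAMPLE.COM

# Example crontab entry (added via `crontab -e`): run the refresh
# every 8 hours and log output for troubleshooting:
#   0 */8 * * * /opt/scripts/krb_refresh.sh >> /var/log/krb_refresh.log 2>&1
```

This is a configuration sketch only; in practice the refresh interval should be shorter than the ticket lifetime configured in the KDC.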
• Troubleshot Kerberos authentication issues in environments where Informatica connects to Kerberos-enabled Hadoop clusters to run Informatica jobs in pushdown mode.
• Handles user/group management under Informatica Security, granting service-specific Informatica privileges according to project scope, requirements and access policies.
• Monitors all current Informatica environments regularly to validate the status of various services and ensure the environments stay up and running.
• Knowledgeable about the code pages typically used when configuring repository services during Informatica environment setup.
• Prepares technical documentation of the work performed and provides it to clients and the internal team at project release, so they understand the process followed and can use it for future troubleshooting.
Informatica Developer
Role: To design and develop Informatica mappings, sessions and workflows as per project requirements.
Technical Environments Include:
• Informatica PowerCenter
Other Technical Skills Include:
• Worked as an Informatica Developer on a life sciences/healthcare project and a banking project.
• Developed mappings that transform data according to project and state policies and regulations.
• Worked on command-task and email-task workflows.
• Took part in all phases of the SDLC, from design through the final phase, across various projects.
• Used SQL clients such as SQL*Plus and SQL Developer for various data operations to load and retrieve data from databases.
Project Management Tasks
• Experienced in providing cost estimates (install/support/hardware) to clients.
• Worked on analyzing resource and time estimates for various tasks.
• Trained new joiners on Big Data and Hadoop concepts with both descriptive analysis and hands-on exercises.
• Prepared SOW (Statement of Work) documents during the initial phases of projects and release guides during the final phases.
• Works frequently with the internal network and server admin teams on firewall and Linux/Windows admin requests.
• Manages weekly status reports and health-check sheets for various Informatica and big data tools across several key projects.
System Experience
Software/Products:
Informatica Tools
Cloudera Big Data, Hortonworks Big Data
Development Tools / Languages:
C, C++, Data Structures, .NET (university-level experience)
Hardware/Operating Systems:
Red Hat Linux
Windows
Work Experience: 3 years 1 month
Education:
• Bachelor of Technology (Computer Science and Engineering) from the National Institute of Technology (NIT), Warangal, Telangana.
• Started his professional career at Deloitte as a campus hire from the 2013 batch.
Eminence:
• Delivered training sessions on big data, Cloudera Hadoop, Informatica BDM and the Kerberos protocol to internal teams at Deloitte.
White Papers:
• Prepared a white paper on 'Informatica PowerCenter In-Place Upgrade and Parallel Upgrade', published on the Informatica CoP portal.
• Prepared a white paper on 'Apache Storm HBase Integration with Kerberos'.
Technical Documents:
• Contributed to an internal Informatica newsletter with a technical paper on "Hive and HDFS connection configuration for a Kerberized Cloudera cluster in the Informatica BDE tool".
Trainings Delivered:
• Gave multiple training sessions to internal teams on Big Data/Hadoop and Kerberos topics.
• Delivered a session to a fresher batch on "Kerberos Authentication Protocol".
Language Skills:
• English
• Telugu (native)
• Hindi
Personal Details:
Name: Sudhir Gajjela
Father's Name: Rajeshwar Gajjela
Marital Status: Single
Nationality: Indian
