This resume summarizes Arbind Kumar Jha's experience working with big data technologies like Hadoop, Hive, Pig, and HBase. He has over 12 years of IT experience, including 1.5 years working with Hadoop. He is currently a Technical Architect Lead at HCL Technologies, where he architects, designs, and develops solutions involving big data, NoSQL, Hadoop, and BIRT. His technical skills include programming languages like Java, databases like Oracle and SQL Server, and big data tools like Hadoop, Hive, Pig, Cassandra, and Flume.
Borja González is a Big Data Architect with over 5 years of experience in information technologies. He has expertise in Splunk, Big Data, cloud solutions, and other technologies. Currently, he leads a team using Splunk to analyze large volumes of data and create dashboards to monitor business metrics for a major natural gas company. Previously, he deployed a Hadoop cluster for a bank and worked as a software architect for Telefonica Movistar developing their website.
Jayaram Parida - Big Data Architect and Technical Scrum Master
Jayaram Parida has over 19 years of experience in IT, including 3 years as a Big Data Technical Solution Architect. He has extensive skills in technologies like Hadoop, HDFS, HBase, Hive, MapReduce, Kafka, Storm, YARN, Pig, Python, and data analytics tools. He has experience architecting and developing big data solutions for clients in various industries. His roles have included designing Hadoop infrastructures, developing real-time analytics platforms, and creating visualizations and reports.
This document contains a summary of Renuga Veeraragavan's work experience and qualifications. It outlines 7 years of experience in IT with expertise in areas like Hadoop, Java, SQL, and web technologies. Specific roles are highlighted including current role as Hadoop Developer at Lowe's where responsibilities include data analysis, Hive queries, and HBase. Previous roles include Senior Java UI Developer at TD Bank and Accenture developing web applications. Educational background includes a B.E. in IT from Avinashilingam University.
Praveen Reddy Gajjala has over 2 years of experience as a Hadoop Developer at Wipro in Hyderabad. He has extensive skills in technologies like Java, Hadoop, Pig, Hive, HBase, and Sqoop. As part of a project for Sears, he helped analyze clickstream data from their websites and apps using Hadoop to improve the customer experience.
This document contains the resume of Hassan Qureshi. He has over 9 years of experience as a Hadoop Lead Developer with expertise in technologies like Hadoop, HDFS, Hive, Pig and HBase. Currently he works as the technical lead of a data engineering team developing insights from data. He has extensive hands-on experience installing, configuring and maintaining Hadoop clusters in different environments.
This document summarizes a candidate's experience and qualifications. The candidate has over 2 years of experience in big data technologies like Hadoop, Hive, and Spark, and has worked on projects involving data ingestion, analytics, and building recommendation engines. They are seeking a position in big data or data science that provides opportunities for learning, growth, and career advancement.
Abhinav is an ETL Hadoop developer with over 3 years of experience working with technologies like Hadoop, Informatica Power Centre, PL/SQL, and Unix. He has 1.5 years of experience on big data projects using Hadoop and technologies like Hive, Pig, HDFS, and Sqoop. He is proficient in Informatica for ETL development, data modeling, and data integration. Abhinav has also worked on projects involving data migration, requirement analysis, automation, and testing. He is looking for a role that offers learning opportunities while allowing him to utilize his skills.
Suresh Yadav is seeking a challenging position to improve his skills. He has a degree in Electronics and Communication Engineering from Malla Reddy Engineering College. He has technical skills in Hadoop, Java, SQL, HBase, Eclipse, Linux, and Windows. He has experience developing MapReduce programs and migrating data to HDFS. For a POC, he analyzed Twitter data using Hadoop, Hive, and Pig. He previously worked as a Technical Support Engineer at Krish Technologies, where he troubleshot computers and built marketing products using VPN technology.
This document contains Anil Kumar's resume. It summarizes his contact information, professional experience working with Hadoop and related technologies like MapReduce, Pig, and Hive. It also lists his technical skills and qualifications, including being a MapR certified Hadoop Professional. His work experience includes developing MapReduce algorithms, installing and configuring MapR Hadoop clusters, and working on projects for clients like Pfizer and American Express involving data analytics using Hadoop, Spark, and Hive.
Vasudevan Venkatraman has over 11 years of experience working in the IT industry, including 7+ years of experience with Oracle PL/SQL, data warehousing, and 3+ years in performance consulting and applications database administration. He has experience designing and developing applications using Oracle PL/SQL, Hadoop, and big data technologies. Currently he works as an Assistant Consultant at TCS focusing on data warehousing projects using Oracle and Hadoop.
Sergey L. Sundukovskiy has over 15 years of experience leading IT teams and delivering business applications. He has served as Chief Information Officer, VP of Engineering, and Director of Architecture. He has expertise in software development, project management, product management, and marketing analytics. His technical skills include Java, .NET, databases, cloud computing, and various project management methodologies.
Abinash Bindhani - Senior Systems Engineer at Infosys with 2.4 years of experience in Big Data and Hadoop
Abinash Bindhani is seeking a position as a Hadoop developer where he can utilize over 2 years of experience with Hadoop and Java technologies. He currently works as a senior systems engineer at Infosys where he has gained experience migrating data from Oracle to Hadoop platforms and collecting/analyzing log data using tools like Flume, Pig, and Hive. His technical skills include MapReduce, HBase, HDFS, Java, Spring, MySQL, and Apache Tomcat. He has expertise in Hadoop architecture, cluster concepts, and each phase of the software development life cycle.
Pankaj Kumar - Resume for Hadoop, Java, and J2EE
Pankaj Kumar is seeking a challenging position utilizing his 7.9 years of experience in big data technologies like Hadoop, Java, and machine learning. He has deep expertise in technologies such as MapReduce, HDFS, Pig, Hive, HBase, MongoDB, and Spark. His experience includes successfully developing and delivering big data analytics solutions for healthcare, telecom, and other industries.
Sasmita Swain is a Hadoop Administrator and Developer with over 3.9 years of experience implementing Big Data solutions using Hadoop, Java, and Liferay Portal. She has expertise in Hadoop Distributed File System, MapReduce, Pig, Hive, Sqoop, HBase, and Cloudera distributions. She currently works as a Senior Software Engineer at Accenture implementing their software portal using Hadoop and AWS. Previously, she developed portlets for the Studentnext education portal using Liferay Portal.
Kishore Babu has over 7 years of experience in data analytics, business analytics, and project management. He is currently an Associate Business Analyst at GlobalLogic Technologies working on projects for Google. He leads a team that manages project dashboards, performs cost analysis, and automates reports using tools like Hive, Google SQL, and Dremel. Previously he has held roles as a Senior Lead and Lead at GlobalLogic where he mentored teammates and ensured project metrics were achieved.
Romy Khetan is a senior software engineer with over 3 years of experience in big data technologies like Elasticsearch, MongoDB, Hadoop, Spark, and Java. She has worked on multiple projects involving sentiment analysis, vertical search, and identifying relationships across social media data. Her roles have included backend development, designing plugins, APIs, and interfaces between applications and services. She is proficient in technologies such as Scala, Redis, RabbitMQ, and graph databases.
Triveni Patro - Big Data and Hadoop Professional
Triveni Patro is currently working as a Hadoop admin at Tata Consultancy Services in India with over 4 years of experience in IT development, administration, implementation, and 24/7 support of Hortonworks Hadoop distribution. Some key responsibilities include supporting production clusters of 400+ nodes and troubleshooting Hadoop cluster issues. Previous experience includes working as a Hadoop developer on projects for clients like Comcast and automotive companies, developing MapReduce, Pig, and Hive scripts for data processing and report generation.
This document contains the resume of Bharath Kumar Rapolu, which summarizes his professional experience working with big data technologies like Hadoop, HDFS, MapReduce, Apache Pig, Hive, Sqoop, HBase and Oozie. It lists his 1.5 years of experience in Hadoop and skills in setting up Hadoop clusters, writing Pig and Hive scripts, importing/exporting data with Sqoop, and scheduling jobs with Oozie. It also provides details of his 4+ years of experience in application development using PL/SQL and his work on projects involving data processing with Hadoop and reporting with SQL.
Suresh Yadav is seeking a challenging position to improve his skills. He has a B.Tech in E.C.E from Malla Reddy Engineering College with 67% and expertise in Java, Hadoop, and Ubuntu. His one year of work experience was as a Technical Support Engineer at Krish Technologies, an IT services company. He completed an academic project on a traffic density analyzer and signal system using GSM technology. Suresh has strengths in learning new things, a positive attitude, and self-confidence.
Ayush Singhal's CV summarizes his work experience and qualifications. He has over 3 years of experience working as a programmer and assistant programmer at the National Informatics Centre HQ in New Delhi. He holds an MBA from IMT Ghaziabad and a B.Tech in Computer Science. His technical skills include Hadoop, Hive, Pig, Java, PHP and Oracle. He has led 6 projects including websites for the National Tiger Conservation Authority and Central Zoo Authority.
Robin David is seeking a position that utilizes his 9+ years of experience in IT and 3+ years experience with Hadoop. He has extensive experience designing, implementing, and managing Hadoop clusters and data solutions. His experience includes building data lakes, ETL processes, data integration, and analytics solutions for clients across various industries. He is proficient in Hadoop ecosystem tools like Hive, Pig, Sqoop, Flume, and Spark and has expertise in Hadoop administration, performance tuning, and support.
Rodney Matejek - SQL Server BI Developer
Rodney Matejek has over 15 years of experience developing business intelligence solutions using SQL Server Integration Services (SSIS) and SQL Server Reporting Services (SSRS). He has experience developing ETL processes to extract data from over 176 databases for the Kentucky Department of Education. He is proficient in T-SQL, SSIS, SSRS, and has experience maintaining large scale reporting and data extraction systems.
Ch. Manoj is a senior software professional with over 8 years of experience developing Java/J2EE applications. He has extensive skills in technologies like Java, JSP, HTML, CSS and frameworks like JEE. He has worked on projects for clients like TCS and USAA developing applications like an online test taking system, a document conversion tool for disabled users, and a knowledge delivery system to help customer service representatives. He is proficient in agile methodologies and tools like Eclipse and has received several achievements and certifications.
I have 6 years of experience in Java programming and have been involved in building Java web applications that scale on cloud infrastructure. I have good experience with RDBMS databases and the NoSQL database MongoDB, as well as solid knowledge of REST API design. I have worked in the business intelligence and financial domains.
Sukhwant Singh has over 19 years of experience as a database manager and software developer with skills in SQL Server, SSIS, SSRS, PowerBuilder, C#, .NET, and Oracle. He has worked on projects integrating data from various sources for companies in the energy sector, including Williams and Gazelle Transportation. His technical skills include database administration, ETL processes, report development, and application development.
Terrance Bowerman is a hands-on software team lead and developer based in Dallas, Texas with over 8 years of experience. He has extensive skills in languages like Java, Scala, and Python as well as frameworks like Play, Spring, and Angular. Currently he works as a lead Java developer at Intuit where he leads teams, builds applications, and helps introduce agile practices. He is looking for senior opportunities involving cutting edge technologies.
Arnab Chakraborty has over 9 years of experience as a Java Technical Lead. He is seeking a challenging role to effectively deliver projects for a reputed IT company. He has extensive experience in software development using technologies like Java, J2EE, Spring Framework and leading teams of developers.
Vijay has over 4 years of experience in IT, including 2 years of experience in Hadoop administration and 1 year of experience in WAS. He has expertise in installing and configuring multi-node Hadoop clusters, understanding and managing Hadoop log files, and working with the Hadoop ecosystem including HDFS, MapReduce, HBase, Pig, Sqoop, Zookeeper and Hive. He also has experience migrating data between HDFS and relational databases using Sqoop, performing health checks and node decommissioning, and troubleshooting issues on live systems.
This document is a curriculum vitae for Amith Rayappa that summarizes his work experience and qualifications. It outlines his 6 years of experience in IT and over 2 years working as a Hadoop Administrator, along with expertise in installing and managing Hadoop clusters. It also provides details of his past roles as a Siebel Administrator and his educational background, including a master's degree from BITS Pilani.
Tejasri Pavuluri is seeking a full-time role in computer science utilizing her skills in programming, distributed computing, software defined networking, machine learning, and big data. She has a Master's degree in computer science from UT Dallas and a Bachelor's degree in electronics and communication engineering from JNTUH, India. For internships, she developed applications related to Bluetooth data streaming, software defined networking, and packet capturing. She also completed academic projects involving Hadoop, machine learning algorithms, search engines, peer-to-peer file sharing, and distributed algorithms.
Farhan Mazhar is a Masters student at the University of Southern California expected to graduate in May 2017 with a GPA of 3.5/4.0. He received his Bachelors in Computer Science from Siddaganga Institute of Technology in 2013 with a GPA of 3.8/4.0. He has work experience as a Software Engineer at Robert Bosch from July 2014 to June 2015 and as an Associate Software Engineer from August 2013 to June 2014. His technical skills include programming languages like Java, Python, C++ and data analytics tools like Hadoop, Spark and MongoDB. He has experience developing various projects involving machine learning, cloud computing, e-commerce applications and web search engines.
This document provides a summary of qualifications and experience for Eugene A. Klaus. It outlines over 25 years of experience in systems administration across various hardware and software platforms, including expertise in networking, security, and troubleshooting. Key skills include AIX, Linux, Solaris administration, backup and disaster recovery, and programming languages like Perl, Python, Shell.
This document provides a detailed summary of Poorna Chandra Rao Kommana's professional experience and technical skills. It outlines his 8 years of experience in big data technologies including Hadoop, Hive, Pig, Spark, Kafka and AWS services. It details his roles and responsibilities in building scalable big data solutions, developing ETL pipelines, performing data analysis, and optimizing performance. His skills include Java, Python, SQL, Pig Latin, HiveQL, and tools like Eclipse and PyCharm.
The document provides a summary of a senior big data consultant with over 4 years of experience working with technologies such as Apache Spark, Hadoop, Hive, Pig, Kafka and databases including HBase, Cassandra. The consultant has strong skills in building real-time streaming solutions, data pipelines, and implementing Hadoop-based data warehouses. Areas of expertise include Spark, Scala, Java, machine learning, and cloud platforms like AWS.
Abhijeet Murlidhar Ghag is a manager at Axis Bank with over 8 years of experience in information technology. He has expertise in data warehousing, analytics and implementing solutions using tools like Informatica, Hadoop, and Oracle. Currently, he is working on projects to automate reporting processes, build a data infrastructure, and develop data marts and dimensions for analytics at Axis Bank.
Progressive system engineer with 8 years of hands-on experience developing and implementing innovative software
products and solutions that significantly increase productivity and profitability. Adept at delivering high-quality products
while establishing solid analytical and problem solving abilities. Skilled using Core Java, PHP, OOP, Design Patterns,
SOA, Data Structure / Algorithms, JavaScript, jQuery, CSS, XML, HTML, JSON, MySQL, Oracle, and Informix while
leading comprehensive software development. Experienced in implementing application through entire Software
Development Life Cycle.
This document provides a summary of Kiran Immadi's professional experience and qualifications. He has over 6 years of experience in software development using technologies like Java, Servlets, JSP, Struts and EJB. He also has experience with Big Data technologies including Hadoop, HDFS, MapReduce, Pig and Hive. Kiran has worked as a senior consultant on projects for clients in Singapore and has led development teams. He holds an MSc and has received several awards and certifications for his work.
This document contains a resume summary for Rajesh Kumar Bharathan. It lists his contact information, objective to develop his career as a software professional and contribute innovative ideas as a valuable team member. It also outlines his 15 years of work experience including roles as a Technical Architect Associate Manager at Accenture and Software Engineer at other companies. It details his Microsoft certifications and skill sets in technologies like ASP.Net, C#, SQL Server and tools like Visual Studio. It provides descriptions of some of the projects he worked on at Accenture and other companies in roles like Technical Architect and responsibilities including architecture, design, development and mentoring.
Umesh Kumar is seeking senior level assignments in project management and product development. He has over 9 years of experience managing large and complex IT projects for banks. He is proficient in Oracle, PL/SQL, and Sybase and has experience across the full software development life cycle, including requirement analysis, design, development, testing, and implementation. He has successfully led global project teams and delivers projects on time, on budget, and according to best practices.
Mayur Kumar Budha has over 11 years of experience in IT testing with expertise in test automation using tools like QTP, Selenium, and Rational Robot. He has extensive experience developing test automation frameworks and scripts for various domains including financial services, healthcare, and project management. Some of Budha's roles and responsibilities included developing test plans, frameworks, and reusable assets, automating test cases, reviewing test scripts, and reporting on project status.
The document provides a summary of an individual's experience and skills. It includes over 10 years of IT experience, with a focus on data warehousing, ETL development and implementation. Specific skills and technologies highlighted include IBM DataStage, Oracle, SQL, UNIX scripting, and experience leading teams on various projects in industries such as retail, insurance, and energy.
Mahesh Sibbadi is a software developer with over 3 years of experience developing applications using Microsoft technologies like C#.NET, ASP.NET, and SQL Server. He has expertise in developing web applications, databases, and reporting solutions. His experience includes projects in security management, business intelligence, government spending, and textile billing systems. He is looking for opportunities to contribute his IT and development skills.
Sharique Khan has over 15 years of experience in software engineering, DevOps, cloud computing and SRE roles. He has extensive skills in technologies like Python, Azure, AWS, Docker, Kubernetes, Terraform, Ansible and more. Some of his key projects include developing 360Oncology, a cloud-native care coordination platform, AURA reporting and analytics suite, and various products for Varian Medical Systems and TechMahindra. He holds certifications like Certified Scrum Master and GIAC Secure Software Programmer.
This resume summarizes the qualifications and experience of Nagaraj Hiremath. He has 9 years of experience working with Oracle 9i (SQL and PL/SQL) and VB 6.0. He has a Master's degree in Computer Science and experience in roles such as branch manager, computer technician, and technical incharge developing software solutions for clients. His skills include databases, programming languages, and computer hardware and he has experience developing projects for insurance management and agricultural products automation.
This document provides a professional summary for Samarendra Sahoo. It outlines his 7 years of experience developing client server applications using technologies like Java, C#, C, and databases like Oracle, SQL, and Sybase. It also lists his technical skills including programming languages, operating systems, databases, and tools. Finally, it provides details on 4 projects he worked on for Avis Budget Car Rental including developing applications, providing production support, and gathering competitive rental rates.
Ravi Kiran Dhiman is a Senior System Engineer with over 4.8 years of experience in SQL Server and Microsoft Business Intelligence tools. He has expertise in developing ETL packages, data cubes, and reports for projects involving data warehousing and business intelligence. His skills include SSIS, SSAS, SSRS, and Axiom. He is looking for new opportunities to grow professionally and add value through his technical skills and commitment.
Umesh Kumar has over 9 years of experience in IT projects for banks and other industries. He has experience leading large, complex projects from initiation through delivery. His technical skills include Oracle, SQL, PL/SQL, and Sybase. He has expertise in full lifecycle development, requirements analysis, design, testing, and implementation. Key projects include developing reporting systems for banks, migrating systems from Sybase to Oracle, and building applications using Oracle APEX.
Amit Porwal is a manager and platform specialist with 12 years of experience designing, developing, and testing Java/J2EE applications. He has extensive experience in capital markets applications involving portfolio analysis, order management, and risk and pricing systems. He is skilled in technologies like Java, Spring, Hibernate, Apache Drools, Tibco EMS, Oracle, and more. Currently he is working as an architect at Sapient to design a data lake solution for RBS using Hadoop, Hive, Spark, and related tools.
Yusuf Osmani is a highly experienced technical and data architect with over 19 years of experience in the financial and insurance industries. He has a proven track record of leading transformation projects, delivering data warehouse solutions, and developing enterprise applications. Throughout his career, he has consistently managed technically diverse teams, provided technical expertise to projects, and worked with business stakeholders. Most recently, he served as an IT consultant and technical architect at Guardian Insurance, where he spearheaded several data integration projects and systems implementations.
Ramesh Mannam has over 10 years of experience as a SAP BO consultant. He has extensive experience designing, developing and maintaining SAP BO universes and reports for clients in various industries. Some of his skills include SAP BO 4.2, XI 3.1, Oracle, SQL, and he has worked on projects for clients such as Shell, Johnson & Johnson, HSBC, and Pfizer.
HarishKumar Chennupati provides a curriculum vitae summarizing his professional experience and technical skills. He has over 8 years of experience in information technology as a team lead, scrum master, and senior developer working with technologies like .NET, SQL Server, SSIS, SSRS, Informatica, and QlikView. Some of his projects include applications for HP, Western Digital, and American International Assurance involving development, testing, reporting, ETL processes, and maintenance support. He is proficient in languages like C#, VB.NET, and databases like SQL Server, Oracle, and Vertica.
Suresh Kumar has over 10 years of experience developing and supporting web applications using technologies like ASP.NET, C#, SQL Server, and AngularJS. He has worked as a consultant and team lead on projects for clients in various industries including retail, manufacturing, and financial services. Suresh is proficient in .NET frameworks, web APIs, databases, and agile methodologies.
Big Data Resume
ARBIND KUMAR JHA Mail to: arbind_jha@rediffmail.com
Contact No: +91- 9731067370, Location: Bangalore
Passport No: G3082986 (Valid From 18/07/2007 To 17/07/2017)
CAREER PROFILE
I am an IT professional with more than 12 years of experience in IT, including more than 1.5 years
in the Hadoop ecosystem.
CURRENT PROFILE
Present Status :-- Working at HCL Technologies Limited since Dec 2009 in BFSI Technology as a
Technical Architect Lead.
SERVICE DESCRIPTION
Wide range of services around Big Data, NoSQL, Hadoop and BIRT, including the following:
• Architecture, Design, Development, Hadoop installation and Tuning
• HDFS, MapReduce and HIVE installation and configuration. PIG scripting, Python development, log
analysis and ETL processing.
• Collaborated with the infrastructure, network, database, application and BI teams to ensure data
quality and availability.
• Created HBase tables to load large sets of structured, semi-structured and unstructured data coming
from UNIX, NoSQL and a variety of portfolios.
• Designed and developed a reporting framework using BIRT writing HIVE Query.
• Hands on experience in MySQL, Data Ware Housing and Unix.
• Ability to achieve project goals within project constraints such as scope, timing and budget.
• Participated in project planning, budgeting, requirement analysis, and client interaction.
• Proven ability to deliver high quality modules and services on schedule and under budget.
TECHNICAL SKILLS
• Big Data Ecosystems: Hadoop, Map Reduce, HDFS, HBase, Zookeeper, Hive, Pig, Sqoop,
Cassandra, Oozie, Flume and Talend
• Programming Languages: Java, PL/SQL, T-SQL, Perl, Pro*C, UNIX Shell, C, C++.
• Scripting Languages: JSP & Servlets, PHP, JavaScript, XML, HTML, Shell, Perl and Bash
• Databases: NoSQL, Oracle 11.2, Sybase 15.0, SQL Server 2008.
• Tools: Eclipse, PowerBuilder 11.5 with PFC, VSS, Toad, BIRT, Informatica, MS Visual Studio
• Platforms: Windows(2000/XP), Linux, Solaris, AIX, HPUX
• Application Servers: Apache Tomcat 5.x 6.0, EA Server 4.1.
• Testing Tools: Eclipse.
• Methodologies: Agile, UML, Design Patterns
ACADEMIC QUALIFICATION
B.Sc. (Hons.) with First Class. (Weighted aggregate: 61%)
I.Sc. (XII) with First Division. (Weighted aggregate: 60%)
S.S.C. (X) with First Division. (Weighted aggregate: 63%)
PROFESSIONAL QUALIFICATION :-- MCM (Post Graduate in Computer Science) from I.M.E.D.
(Bharati Vidyapeeth), Pune University.
PROFESSIONAL EXPERIENCE
Organization - HCL Technologies. Nov 2005 to Mar 2008 and Nov 2009 to present
XI. Project Name :-- LBG Digital, UUID
Client :-- HCL Technologies.
Role :-- Architect / Hadoop Analyst
Facilitated insightful daily analyses of 60 to 80 GB of website data collected by external sources,
generating recommendations and tips that increased traffic 38% and advertising revenue 16% for this
online provider of financial market intelligence.
• Developed MapReduce programs to parse the raw data, populate staging tables and store the refined
data in partitioned tables in the EDW.
• Created Hive queries that helped market analysts spot emerging trends by comparing fresh data with
EDW reference tables and historical metrics.
• Enabled speedy reviews and first mover advantages by using Oozie to automate data loading into the
Hadoop Distributed File System and PIG to pre-process the data.
• Provided design recommendations and thought leadership to sponsors/stakeholders that improved
review processes and resolved technical problems.
• Managed and reviewed Hadoop log files.
• Tested raw data and executed performance scripts.
• Shared responsibility for administration of Hadoop, Hive and Pig.
• Created HBase tables to load large sets of structured, semi-structured and unstructured data coming
from UNIX, NoSQL and a variety of portfolios.
• Supported code/design analysis, strategy development and project planning.
• Created reports for the BI team using Sqoop to export data into HDFS and Hive.
• Developed multiple MapReduce jobs in Java for data cleaning and preprocessing.
• Assisted with data capacity planning and node forecasting.
• Collaborated with the infrastructure, network, database, application and BI teams to ensure data
quality and availability.
• Administrator for Pig, Hive and HBase installing updates, patches and upgrades
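The data-cleaning step described above can be sketched as a plain-Java parsing routine of the kind typically wrapped in a Hadoop Mapper. This is an illustrative sketch only: the tab-delimited layout, the field names, and the class name are assumptions, not the project's actual schema.

```java
import java.util.Optional;

// Minimal sketch of a record-cleaning routine of the kind usually called
// from a MapReduce Mapper before loading staging tables in the EDW.
// The tab-delimited layout and field names are hypothetical.
public class LogRecordCleaner {

    // Returns a normalized "date<TAB>url<TAB>bytes" record, or empty if the
    // raw line is malformed and should be dropped from the refined data.
    public static Optional<String> clean(String rawLine) {
        if (rawLine == null || rawLine.isEmpty()) {
            return Optional.empty();
        }
        String[] fields = rawLine.split("\t");
        if (fields.length < 3) {
            return Optional.empty();          // drop short or garbled records
        }
        String date = fields[0].trim();
        String url = fields[1].trim().toLowerCase();
        String bytes = fields[2].trim();
        try {
            Long.parseLong(bytes);            // reject non-numeric byte counts
        } catch (NumberFormatException e) {
            return Optional.empty();
        }
        return Optional.of(date + "\t" + url + "\t" + bytes);
    }

    public static void main(String[] args) {
        System.out.println(clean("2013-04-01\tHTTP://Example.COM/a\t512").orElse("DROPPED"));
        System.out.println(clean("garbled-line").orElse("DROPPED"));
    }
}
```

In a real job this logic would sit inside a `Mapper` subclass, with the cleaned records written to partitioned Hive tables downstream.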
Java Developer
• Improved user satisfaction and adoption rates by designing, coding, debugging, documenting,
maintaining and modifying a number of apps and programs for ATM and online banking.
• Participated in Hadoop training and development as part of a cross-training program.
Led the migration of monthly statements from a UNIX platform to an MVC web-based Windows
application using Java, JSP and Struts technology.
• Prepared use cases, designed and developed object models and class diagrams.
• Developed SQL statements to improve back-end communications.
• Incorporated custom logging mechanism for tracing errors, resolving all issues and bugs before
deploying the application in the WebSphere Server.
• Received praise from users, shareholders and analysts for developing a highly interactive and intuitive
UI using JSP, AJAX, JSF and jQuery techniques.
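The custom logging mechanism mentioned above might be sketched as a thin wrapper over `java.util.logging`. The class and method names here are illustrative assumptions, not the application's actual design.

```java
import java.util.logging.Level;
import java.util.logging.Logger;

// Illustrative sketch only: a thin tracing wrapper of the kind a custom
// logging mechanism for error tracing could provide. Names are hypothetical.
public class TraceLog {
    private static final Logger LOGGER = Logger.getLogger(TraceLog.class.getName());

    // Logs the error tagged with its originating component so issues can be
    // traced back to a module before deploying to the application server.
    public static String trace(String component, Throwable error) {
        String message = "[" + component + "] " + error.getClass().getSimpleName()
                + ": " + error.getMessage();
        LOGGER.log(Level.SEVERE, message, error);
        return message;   // returned so callers can surface it to support tools
    }

    public static void main(String[] args) {
        System.out.println(trace("statement-migration",
                new IllegalStateException("missing statement period")));
    }
}
```

Tagging every log entry with a component name is what makes errors traceable across the ATM and online-banking modules without grepping raw stack traces.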
X. Project Name: - CTB LASER (DB LS2 LOAN LIRA)
Client - Deutsche Bank
The LIRA Group provides functional and technical analysis, design and daily support for applications
that use the LS2 Reporting Database.
IX. Project Name: - Loan IQ.
Client - BNP Paribas
The system handles the forecasting and realization of transactions. Loan IQ is an integrated
commercial lending system designed for financial institutions in the business of origination,
syndication and distribution of commercial loans. The system integrates the Origination,
Distribution, Trading, Loan Servicing, Management Control and Portfolio Management business
processes.
VIII. Project Name: - Custom System Nov 2005 to Mar 2008
Client : -- Cadbury Schweppes.
The Custom module is a customized system. It supports case sales history management and pricing
management, and serves as a support system for all Beverages features such as promotions and franchisees.
Role :-- Wrote PL/SQL code, performed Oracle query performance tuning and problem resolution.
Developed and maintained a PowerBuilder application. Fixed tickets and work orders. Prepared SRS,
technical documents and test cases. Interacted with the onsite team and handled a small offshore team.
Technology :-- Oracle 9i, Unix, Pro*C, Power Builder 9.0 with PFC.
Organization - Mahindra Satyam. Apr 2008 to Dec 2009
VII. Project Name:- VPTS.
Module: VPTS is a customized module for Satyam Computer Services. The VPTS system tracks
visitors and packages within the organization.
VI. Project Name: - Essar SAP and PPC
Module: This module is a system for Essar Steel. It supports steel coil and slab management and
serves as a support system for SAP and Level 3.
Client : -- Essar Steel.
Platform: -- Oracle 8i.
Organization - Vitech Asia System Pvt Ltd. Jul 2004 to Nov 2005
V. Project Name: - V3 System.
V3 is a powerful enterprise-class benefits solution. It offers powerful features for member and employer
administration, benefit enrollment, wage and contribution reporting, benefits eligibility determination,
benefit calculation, claims adjudication, benefits disbursement, rule-based definition of contribution
formulas, benefit plan and fund structures, eligibility calculations, disbursement procedures and
business processes.
Client :-- PSERS, Ohio.
Platform :-- Oracle 9i and PowerBuilder 8.0 with PFC.
Organization - Tradeship India Pvt. Ltd. Oct 2002 to July 2004
IV. Project Name: -- SEA.LINER
MODULE 1 : IJS (Intermodal Job Order System)
The business of this module in SEA.LINER is to create transportation contracts, define routing (single
leg/multiple leg) and define cost and billing rates (contract level and route level).
MODULE 2 : MPR (Management Personal Report)
This module provides all the MIS reports of the company and all the summary transactions of the
project.
Role :-- SRS creation; wrote PL/SQL code; tuned SQL code.
Client: -- Malaysia Shipping Industries.
Platform: -- Oracle 8i and Power Builder 7.0,
Organization - Self Consultancy. Aug 2001 to Oct 2002
III. Project Name: -- MIL.ERP
Following are the Modules:
HRD (Human Resource Development), Finance (Financial Accounting) and FD (Fixed Deposit).
Role :-- Study, design and document preparation; developing and maintaining the application using
PowerBuilder script and PL/SQL programming; database administration and maintenance of the
existing system.
Client :-- Milton Cycle Indus.
Platform :-- PowerBuilder 6.0, Oracle 8.x; DBA activity and PL/SQL
Organization - Virtual Reality System P Ltd. Mar 2000 to Aug 2001
Organization - InfoWorld Consultancy P Ltd. Jun 1999 to Mar 2000
(ARBIND KUMAR JHA)