AKSHITA
905 Pleasant Street, Apt 2W, Oak Park, IL 60302
Email: akshit@ncsu.edu | Phone: +1-860-371-0206
TECHNICAL INTERESTS: SQL data analysis, Informatica/SSIS ETL development, SSRS report development, BI analysis
EDUCATION: North Carolina State University (NCSU), Raleigh, NC Spring 2015 - Present
Master of Science in Computer Engineering (CPE)
Current GPA: 3.83/4.0 (anticipated graduation in July 2016)
Galgotia’s College of Engineering and Technology, India Fall 2007 - Spring 2011
B. Tech. in Electronics and Communication Engineering (ECE)
WES GPA: 3.82/4.0 (ranked first in the graduating batch)
Awarded a Merit Scholarship in the second year of undergraduate studies.
TECHNICAL SKILLS:
Relational Databases: Oracle 10g, Sybase, MS SQL Server
BI ETL, Analysis & Reporting: Informatica 9.1, SSIS, SSAS, SSRS (v 2008), Actuate 11, BO, MS Office, MS Visio
Report Scheduling & Version Control: Autosys, Perforce
Programming Languages: SQL, PL/SQL, C++, Perl, Shell, CUDA/OpenCL, Python
HW Description Languages & Design Tools: Verilog, SystemVerilog, Quartus II, ModelSim, Synopsys DesignWare
Networking: OPNET, Wireshark, TCP/IP, Ethernet, OpenStack, AWS-Hadoop, MapReduce
PROFESSIONAL EXPERIENCE:
Profile Summary: Senior Systems Engineer, Infosys Limited (2011 – 2014)
IT professional with 3+ years of extensive experience in data modelling, data warehousing, ETL, data analysis, reporting,
development, maintenance, testing and documentation.
Six months of extensive training on data warehousing, RDBMS, data integration, analysis and reporting, data mining, data purging, multi-dimensional data modelling, SQL, MS SQL Server, Oracle 10g, SSIS, SSAS and SSRS.
Experience in dimensional data modelling, snowflake schema modelling, star schema modelling, E-R modelling, and data normalization using SQL Server Management Studio.
Experience in ETL development for staging and populating Operational Data Store (ODS) and Enterprise Data Warehouse
(EDW) from heterogeneous data sources using Informatica PowerCenter 9.1 (Source Analyzer, Mapping/Mapplet Designer,
Transformation Designer, Workflow Manager, Workflow Monitor and Repository Manager) on Sybase and SSIS (Control
Flow, Data Flow, Event Handler, Package Explorer) on MS SQL server.
Experience in data analysis with SSAS, using facts, dimensions and measures to build data cubes, and in reporting with SSRS/Actuate/BO to develop parameterized, cascading, tabular, graphical, drill-down and drill-through reports.
Experience in developing SQL queries for data validation, PL/SQL procedures and functions used in Informatica mappings and Actuate reports, and UNIX shell scripts used in Actuate reports.
Experience in preparing high level design, low level design, exhaustive test plans and test exit criteria to conduct end-to-end
and efficient testing of the design.
Experience with the Autosys scheduling tool to schedule the execution of workflows and reports and to assign dependencies.
Received the Insta Award along with the ‘Certificate of Appreciation for Outstanding Performance’ at Infosys Ltd.
Completed Infosys’ internal certifications in Software Quality and Oracle.
Experience in training new employees on BI concepts, data warehousing, ETL, and analysis and reporting tools.
Developed effective working relationships with the client team to understand support requirements.
Experience working in a development team and handling critical situations to complete tasks and projects successfully.
Ability to set effective priorities to achieve immediate and long-term goals and meet operational deadlines.
Self-motivated and passionate about learning new tools and technologies, with excellent interpersonal, verbal and written communication skills.
Project Details:
Client: Capital Group Companies (CGC), Irvine, CA
Project Name: Portfolio Accounting (PA)
Role: ETL/Reporting Developer
Capital Group Companies is one of the largest investment management organizations. It comprises a group of investment management companies, including Capital Research and Management, American Funds, Capital Bank and Trust, Capital Guardian and Capital International. This project involved updating the ODS with daily transactional data and the enterprise data warehouse for analysis and reporting. The project also included implementing new mappings and reports for Dodd-Frank compliance.
Tools/Technologies: Informatica PowerCenter 9.1, Actuate 11 Report Developer, Autosys, Perforce, UNIX shell, SQL, Sybase
Responsibilities:
Captured and analyzed client requirements and developed detailed documentation covering the requirement understanding, the logical steps and procedures for completing the design, exhaustive test plans, and any open questions.
Involved in creating new table structures and modifying existing tables in the ODS based on new client requirements.
Developed new mappings in Informatica using Source and Target Analyzer, Mapping Designer and Transformation
Designer to perform ETL of source data into staging tables and then from staging tables to destination tables stored in Sybase.
Used various PowerCenter transformations such as Source Qualifier, Lookup, Rank, Joiner, Expression, Union, Aggregator, Filter, Router, Sequence Generator, Stored Procedure, SQL and Update Strategy to implement business logic in the mappings.
Modified existing mappings for the implementation of new business logic in the ETL structure.
Developed Informatica workflows and sessions associated with the mappings in Workflow Manager.
Developed new reports in Actuate 11 eRD using Actuate functionalities such as Filter, Containers, Group by, OnRow and
OnRead methods, Frames, Section, and stored procedures.
Developed UNIX shell scripts to implement the complex logic used in the report to fetch the data from the database.
Created parameterized reports containing sequential and parallel sections and deployed them on Actuate server.
Developed automation scripts in shell for Autosys to schedule the execution of the workflows and reports as required.
Involved in debugging the mappings by checking the workflow, session, event and error logs and testing the mappings by
executing SQL queries to check the validity and consistency of the data in the database.
Involved in debugging the reports by checking error logs, and testing the reports and shell scripts by executing SQL queries that mimicked the report logic and comparing the two outputs.
Involved in providing maintenance and support to the on-shore team during the release of the project into production.
As a senior team member, managed a team of 4, prioritizing and dividing tasks, performing peer reviews, and providing on-call support.
ACADEMIC EXPERIENCE: MS Student, NCSU, Computer Engineering (2015-16)
(Specialization: Computer networking/security, Cloud Services, Digital Circuit Design)
High-Performance ASIC-based Emulation (Tools/Languages: Verilog, SystemVerilog, ModelSim, Design Vision):
Designed a synthesizable digital ASIC in Verilog with Synopsys Design Vision for Jacobi iteration, optimized for maximum performance per unit area, to solve the system equation YV = I with a sparse matrix. Designed a layered testbench in SystemVerilog incorporating a generator, driver, scoreboard and monitor to verify the data and control paths of an LC-3 microcontroller in ModelSim using a comprehensive instruction set.
High-Performance Cloud Services (Tools/Languages: Amazon Web Services, DynamoDB, OpenStack, Python):
Analyzed the Infrastructure-as-a-Service (IaaS), Platform-as-a-Service (PaaS) and Software-as-a-Service (SaaS) models of cloud computing, as well as its deployment models (hybrid cloud, community cloud, private cloud and public cloud). Analyzed current cloud platforms such as Amazon Web Services (AWS), Google App Engine, IBM SoftLayer and Microsoft Azure.
Computer Networking & Security (Tools/Languages: OPNET, Wireshark, Python, Ethernet, TCP/IP):
Analyzed the behavior of internet protocols (TCP, UDP, OSPF, IP, ARP, BGP, DHCP, DNS) using Wireshark. Modeled the structure of the internet, comprising edge hosts, core routers and switches, and physical media (wired or wireless), and captured its behavior using OPNET. Analyzed major security threats to data on the internet (replay attack, reflection attack, man-in-the-middle attack, meet-in-the-middle attack, and dictionary attack). Analyzed strong security protocols: encrypted key exchange, secure remote password, secret-key cryptography and public-key cryptography.