SIVA PRASAD VUNDAVALLI
Phone: +65 91044959 Email: v_sivaprasad_v@yahoo.co.in
SUMMARY
• Over twelve years and ten months of IT experience in database development,
maintenance, and implementation.
• SQL Server implementation, management, maintenance.
• Performed all aspects of administration, including data modeling, backups,
replication, partition of tables and recovery.
• Gathering database requirements & functional specifications.
• Development of SQL scripts for maintenance.
• Extensive knowledge of database programming and development using SQL
Server (2000/2005/2008/2012/2014) and MS Access, including database
architecture and logical and physical design.
• Excellent knowledge in T-SQL Programming (Stored Procedures, Functions,
and Triggers)
• Works well in teams; a rapid learner with excellent problem-solving skills,
a strong technical background, and good interpersonal skills.
• Transferring Data using Data Transformation Services (DTS) Packages and
SSIS packages
• Excellent in high-level design of ETL DTS and SSIS packages for integrating
data over OLE DB connections from heterogeneous sources (Excel, CSV, Oracle,
flat files, text-format data) using SSIS transformations such as Data
Conversion, Conditional Split, Bulk Insert, Merge, and Union All.
• Involved in Business Analysis of the OLTP and OLAP data and in decision
support.
• Knowledge of Data Warehousing methodologies and concepts, including star
schemas, snowflakes, ETL processes, dimensional modelling and reporting tools.
• Experience in MS SQL Server 2008/2012 Business Intelligence in MS SQL
Server Integration Services (SSIS), MS SQL Server Reporting Services (SSRS)
and MS SQL Server Analysis Services (SSAS).
• Development of ad hoc Reports using SSRS.
• Good knowledge of creating report models using SSRS and creation of drill down
and drill through reports.
• Monitoring and tuning SQL Server performance with multiple tools including
Performance Monitor, SQL Profiler, and Index Tuning Wizard/Tuning
Advisor to achieve better performance, accessibility and efficiency
• Use of T-SQL for advanced SQL queries, stored procedures for data
manipulation.
• Excellent knowledge in data snapshot and data archiving.
• Excellent knowledge of Backup, Restore, Log Shipping, Replication, Failover
Clustering and Database Mirroring.
• Excellent knowledge in performance tuning to optimize DB scripts.
• Independently handled the migration of non-Unicode databases to support
Unicode; developed the migration tool in C#.
• Involved in creating SSIS packages to process XML files, apply validations
through C# script component tasks, and load the data into SQL Server
databases.
• Maintained log shipping during the Unicode migration.
• Substantial knowledge of data compression techniques in SQL Server
2008/2012/2014
• Monitored the most expensive queries consuming high CPU and I/O and resolved
them on demand (a sample monitoring query appears at the end of this summary).
• Handled Volumetric testing and analysis to identify growth factors and set
application and database threshold values.
• Implementation of DR setup including Transactional Replication and Log
shipping.
• Involved in the design of a Data Vault model to provide long-term historical
storage of data coming in from multiple operational systems.
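The expensive-query monitoring noted above typically boils down to a DMV query of the following shape; this is a minimal sketch assuming SQL Server 2005 or later, with an arbitrary TOP value rather than any specific production threshold.
-- Top 10 cached statements by total CPU, with average logical I/O per execution
SELECT TOP (10)
    qs.total_worker_time / 1000 AS total_cpu_ms,
    qs.execution_count,
    (qs.total_logical_reads + qs.total_logical_writes) / qs.execution_count AS avg_logical_io,
    SUBSTRING(st.text, (qs.statement_start_offset / 2) + 1,
        ((CASE qs.statement_end_offset
              WHEN -1 THEN DATALENGTH(st.text)
              ELSE qs.statement_end_offset
          END - qs.statement_start_offset) / 2) + 1) AS statement_text
FROM sys.dm_exec_query_stats AS qs
CROSS APPLY sys.dm_exec_sql_text(qs.sql_handle) AS st
ORDER BY qs.total_worker_time DESC;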
TECHNICAL SKILLS
Languages : C, C++, T-SQL, Core Java, C#
Web Technologies : HTML
Databases : SQL Server 2000/2005/2008/2012 and MS Access
Web Server : IIS 5.0/6.0/8.0
Operating System : Windows 2000/2003/NT 4.0/98/XP/2008
Others : Visual Source Safe, Star team, SVN, TFS, JIRA, PowerShell,
QlikView, Perforce
EDUCATION:
Bachelor’s Degree in Electrical & Electronics Engineering, JNTU, Hyderabad,
Andhra Pradesh, India.
PROJECTS:
Comtel Solutions Pte Ltd, Singapore Aug'15 – Present
Client: Bank Of Singapore, Singapore
Designation: Senior Consultant
Singapore Deposit Insurance Corporation (SDIC):
Scheme members, of which BOS is one, must implement the necessary systems and
processes so that the Specified Information can be delivered to SDIC in the
prescribed data structure and format, and within the timeframe set out by SDIC.
As part of SDIC compliance, BOS has to submit files covering the bank’s
Deposits, Loans, Current Accounts, GL, and SPOWNER information.
Responsibilities
• Handled the project independently, from requirement gathering and creating the
data mapping document through designing the Data Vault model to maintain core
data as well as SDIC-specific tables, and creating schematic views/functions
and export procedures.
• Involved in creating the mapping document to identify data from different
sources and map it to the UDW database, which is designed on the Data Vault model.
• Responsible for creating T-SQL objects such as Tables, Views, Stored
Procedures, Functions to store data in Data Vault model and generate interface
files required by SDIC.
• Created PIT table logic to maintain snapshot data, and config data to generate
PIT procedures and the required tables dynamically (a simplified PIT sketch
follows this responsibilities list).
• Developed a PowerShell utility to generate PIT procedure and table scripts
dynamically.
• Involved in enhancing UDW Utility tool to generate PIT related SSIS packages
using C#.
• Developed SSIS packages to populate PIT tables in parallel.
• Developed SSIS packages to populate HUB, SAT, LINK tables in Data Vault
model.
• Developed Excel macros using VBA to mask data and provide SDIC-specific
Excel-based reports.
• Developed logic using T-SQL procedures to mask the data as per SDIC masking
rules.
• Creation of deployment packages based on the scripts in TFS.
• Support to SIT and UAT users for validation of SDIC files.
• Support to Production support team in assisting SDIC simulation exercise.
• Creation of statistics related to Capacity planning and execution times.
• Involved in extending the Utility tool, which generates the SSIS packages
dynamically.
• Designed the Utility tool to support Excel Import/Flat Import/Excel Export/Flat
Export using SSIS script task in C#.
• Creation of Technical Specification Document and data mapping documents.
• Led the Transaction Life Cycle Management (TLM) project: gathered requirements,
created the source-to-destination mapping document, designed the Data Vault
tables, and monitored project status until go-live.
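For illustration, the PIT logic referenced above can be sketched as follows; the hub, satellite, and PIT table names (HubAccount, SatAccountBalance, PitAccount) are hypothetical stand-ins, since the actual SDIC/UDW structures are not reproduced here.
-- For each hub key, record the latest satellite load date on or before the snapshot date.
DECLARE @SnapshotDate date = '2016-03-31';   -- example snapshot date

INSERT INTO dbo.PitAccount (AccountHashKey, SnapshotDate, SatBalanceLoadDate)
SELECT
    h.AccountHashKey,
    @SnapshotDate,
    (SELECT MAX(s.LoadDate)
       FROM dbo.SatAccountBalance AS s
      WHERE s.AccountHashKey = h.AccountHashKey
        AND s.LoadDate <= @SnapshotDate) AS SatBalanceLoadDate
FROM dbo.HubAccount AS h;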
Environment: MS SQL Server 2000/2005/2008/2012/2014, Windows 2003/2008,
SSIS, SSRS
Randstad Consulting Technologies Pte Ltd, Singapore Oct'14 – Aug'15
Client: EscherGroup, Singapore
Designation: Database Developer
Responsibilities
• Responsible for creating T-SQL objects such as Tables, Views, Stored
Procedures, Functions, and Triggers to implement the requirements of the Report
DB for Post Malaysia.
• Carrying out Volumetric testing and analysis to identify application load
thresholds and database resource thresholds.
• Analysis and implementation of DR set up for Report database server and MO
database server which includes Transactional Replication and Log shipping.
• Creation of database maintenance plans which includes backup frequency and
index maintenance.
• Involved in creating an SSIS package to load XML data into the database and
process it to generate report text files such as ITF-150, ITF-151 and ITF-152
(a simplified XML-load sketch follows this list).
• Enhanced the SSIS packages to validate the XML data using script component
tasks written in C#.
• Creation of SQL Server Jobs to schedule SSIS packages.
• Creation of reports using SSRS for HQ reports.
• Involved in enhancing HQ reports business layer using C#.
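As a rough illustration of the XML load mentioned above, the same shredding can be expressed directly in T-SQL; the file path, element names, and staging table below are invented for the sketch, and the production load was implemented as an SSIS package.
-- Read an XML file into a variable and shred selected elements into a staging table.
DECLARE @doc xml;

SELECT @doc = CAST(BulkColumn AS xml)
FROM OPENROWSET(BULK 'D:\Feeds\itf_input.xml', SINGLE_BLOB) AS src;

INSERT INTO dbo.StagingMailItem (ItemId, Destination, Weight)
SELECT
    n.value('(ItemId)[1]',      'varchar(50)'),
    n.value('(Destination)[1]', 'varchar(100)'),
    n.value('(Weight)[1]',      'decimal(10,3)')
FROM @doc.nodes('/MailItems/MailItem') AS t(n);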
Environment: MS SQL Server 2000/2005/2008/2012, Windows 2003/2008, SSIS,
SSRS
TangSpac Consulting Pte Ltd, Singapore Oct'12 – Oct’14
Client: Barclays Capital, Singapore
Designation: SQL Developer
Specific Risk Engine (SRE)
This project generates PnLs for Barclays Capital for different client systems
such as CVA and OZ.
SRE collects inputs from various internal systems, such as Bedrock and Asset
Control, and supplies PnLs to CVA so that CVA can generate charge calculations.
Credit Value Adjustment (CVA)
This project enables Barclays Capital to comply with the Basel III regulatory
framework. Credit value adjustment (CVA) is by definition the difference between
the risk-free portfolio value and the true portfolio value that takes into
account the possibility of a counterparty’s default. In other words, CVA is the
market value of counterparty credit risk.
In the view of leading investment banks, CVA is essentially an activity carried out by
both finance and a trading desk in the Front Office. Tier 1 banks either already generate
counterparty EPE and ENE (expected positive/negative exposure) under the ownership of
the CVA desk (although this often has another name) or plan to do so. Whilst a CVA
platform is based on an exposure measurement platform, the solution drivers are very
different and it is unwise to create dependencies between the risk exposure management
system and front office CVA system, even if they share similar intermediate outputs.
Responsibilities
• Responsible for creating T-SQL objects such as Tables, Views, Stored
Procedures, Functions, and Triggers to implement the requirements of CVA.
• Involved in database design for implementation of CVA to be compatible with
Service Risk Engine (SRE).
• Involved in analyzing the business requirement document (BRD) and the
functional specification document (FSD) provided by the BA to implement the
features required for CVA.
• Played an active role in creating scripts required for data archiving and feed
file archiving (created PowerShell scripts for feed file archiving).
• Responsible for the data snapshot POC, including its design and the T-SQL
scripts for it.
• Played an active role in tuning stored procedure scripts to optimize
performance and reduce SQL Server resource usage, including tempdb growth, I/O,
and CPU utilization.
• Played an active role in identifying duplicate, missing and unused indexes
(a sample index-usage check follows this list) and was involved in analyzing and
implementing partitioning.
• Responsible for monitoring the daily top expensive queries that consume the
most CPU and I/O and tuning them; involved in creating SSIS packages to load
data from the daily and monthly feeds received from various source systems.
• Analyzed and applied the data compression feature in SQL Server 2008 to save
disk space and reduce I/O; estimated capacity (workspace, I/O and CPU) for
implementing data compression and weighed the gains against the CPU resources
consumed for compression and decompression.
• Provided functional and technical details to the QA team, assisted them with
unit test plans and test cases, and supported other teams with all
database-related activities.
• Handled all database related activities in Full reval and Strategic Month end
projects in SRE.
• Created optimized SSIS packages to copy data from transactional database to
reporting database.
• Creating and enhancing the SSIS packages, writing stored procedure scripts to
support full reval and strategic month end tasks.
• Handled database purging tasks.
• Involved in creating ad-hoc reports using SSRS.
• Responsible for the database model design for the Tradingbook and Tradingbook
Proxy Addon requirements, which included creating tables, writing optimized
stored procedures, copying data to the reporting database using SSIS, and
writing optimized views to support ad-hoc reporting with SSRS and QlikView.
• Involved in performance tuning of the full reval stored procedures.
• Used Perforce for version control management.
• Used JIRA for code maintenance, bug logging, time tracking and document sharing.
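The index-usage analysis mentioned above is typically driven by a DMV check of the following shape; this is a sketch assuming SQL Server 2005 or later, and the filter (nonclustered indexes with writes but no reads since the last restart) is illustrative rather than the exact production rule.
-- Nonclustered indexes maintained on writes but never read since the last restart
SELECT
    OBJECT_NAME(i.object_id) AS table_name,
    i.name                   AS index_name,
    us.user_updates          AS writes_since_restart
FROM sys.indexes AS i
JOIN sys.dm_db_index_usage_stats AS us
  ON  us.object_id   = i.object_id
  AND us.index_id    = i.index_id
  AND us.database_id = DB_ID()
WHERE OBJECTPROPERTY(i.object_id, 'IsUserTable') = 1
  AND i.type_desc = 'NONCLUSTERED'
  AND us.user_seeks + us.user_scans + us.user_lookups = 0
ORDER BY us.user_updates DESC;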
Environment: MS SQL Server 2000/2005/2008, Windows 2003/2008
Host Analytics Software Pvt. Ltd., Hyderabad Aug ’10 –Oct'12
Designation: Lead Software Engineer
Host Analytics is the leading provider of on-demand corporate performance
management. Host Analytics’ solutions help financial and departmental executives
improve their budgeting, forecasting, financial consolidations, dashboarding,
scorecarding, reporting and analysis.
Host CPM is comprised of these modules:
• Host Budget provides connected and streamlined budgeting, planning,
forecasting, reporting and analysis, which eliminates the errors and
cumbersomeness of Excel.
• Host Analytics Revenue Planning is an integrated revenue, forecasting and
budgeting application that supports detailed sales forecasts across thousands of
customers and products and provides multiple methods for managing sales plans.
• Host Analytics Scorecard enables companies to encapsulate the strategic plan
into a dynamic system and allows employees across the organization to
continually monitor and measure progress against the strategic plan.
• Host Analyzer provides robust Host Analyzer reporting and analysis capabilities
for employees across the organization, from sales and marketing to finance and
operations.
• Host Consolidator speeds the process and manages the integrity of collecting,
consolidating and reporting financial information on a global basis.
Responsibilities
• Responsible for creating T-SQL objects such as Tables, Views, Stored
Procedures, Functions, and Triggers to implement new requests raised by product
management, and for providing hot-fix scripts to fix bugs raised by the QA and
services teams on the production databases.
• Played an active role in tuning stored procedure scripts to optimize
performance and reduce SQL Server resource usage, including tempdb growth, I/O,
and CPU utilization.
• Involved in analyzing the functional specification document (FSD) provided by
product management and developing T-SQL scripts to meet its requirements;
responsible for all new database activities, especially the ETL module of the
application.
• Responsible for monitoring the daily top expensive queries that consume the
most CPU and I/O and tuning them; created SSIS packages to load data from flat
files on the FTP server into the local database and delete the files after a
successful load, and created a job to run these packages.
• Involved in performance tuning of the sales and DLR procedures, which are
major modules of the product.
• Responsible for creating the T-SQL database object scripts and maintaining
them under version control; researched tools such as RedGate Source Control and
the MSSCCI provider that support database version control, and assisted the
release management team in preparing the migration scripts for each release.
• Played an active role in providing scripts for Master Tenant management,
including backup, copy, restore, and backup-file deletion scripts, data
validation scripts, and all major data load rules functionality.
• Reviewed peers' work in detail, suggested improvements where needed, and
signed off the peer review document; created a tool to maintain a metadata
description of each table column.
• Independently handled the major task of converting the databases to support
Unicode for internationalization, which included creating a tool to convert all
T-SQL objects to support Unicode, preparing migration scripts, and evaluating
different ways of running them (under full recovery mode by disabling and
re-enabling the log shipping jobs, and under bulk-logged recovery mode) without
disturbing log shipping.
• Involved in analyzing and using the data compression feature available in SQL
Server 2008 to save disk space and reduce I/O; estimated capacity (workspace,
I/O and CPU) for implementing data compression and weighed the gains against
the CPU resources consumed for compression and decompression (a compression
sketch follows this list).
• Research on Visual Studio Database Edition for building schema scripts, test data
generation and schema comparison.
• Active participation in Agile (SCRUM) process which includes sprint planning
and execution of task on time.
• Provided functional and technical details to the QA team, assisted them with
unit test plans and test cases, and supported other teams with all
database-related activities.
• Researched the open-source database MySQL and the feasibility of converting
the existing SQL Server databases to MySQL.
• Studied existing OLAP system(s) and created facts, dimensions and star schema
representation for the data mart and warehouse systems.
• Used SVN and TFS for version control management.
• Used Visual Studio Database Edition for building schema scripts, test data generation
and schema comparison.
• Used JIRA for code maintenance, bug logging, time tracking and document sharing.
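The compression evaluation described above follows the standard SQL Server 2008 pattern sketched below; the table name FactGLData is a hypothetical example, not an actual product table.
-- Estimate page-compression savings for a table, then apply compression if the trade-off is favourable.
EXEC sp_estimate_data_compression_savings
     @schema_name      = 'dbo',
     @object_name      = 'FactGLData',
     @index_id         = NULL,
     @partition_number = NULL,
     @data_compression = 'PAGE';

ALTER TABLE dbo.FactGLData
REBUILD WITH (DATA_COMPRESSION = PAGE);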
Environment: MS SQL Server 2000/2005/2008, Windows 2003/2008
CoreLogic® MarketLinx®, Hyderabad
Tempo4 & NexGen (MLExchange) Mar ’09 –Aug’10
Designation: Senior Software Engineer
The platform that powers MLXchange™ and TEMPO® is ideal for large and medium
sized MLSs looking for uncompromising MLS functionality and unmatched flexibility.
With MarketLinx's MLS platform, you can customize the way your MLS looks and
works, partner with other vendors without worrying about compatibility, and control and
distribute your valuable data any way you want. MarketLinx lets you choose only the
optional MLS products you and your membership need, and makes it easy for your
members to personalize their services so they can stand out from the crowd.
Responsibilities
• Worked on project Tempo4 & NexGen (MLExchange) for CoreLogic® MarketLinx®
• Responsible for creating T-SQL objects such as Tables, Views, Stored
Procedures, Functions, and Triggers to implement new requests raised by product
management, as well as cleanup scripts to clean the data and update it with the
latest data on the production databases.
• Created and monitored SQL jobs that check database consistency and load data
from different real estate agents into the local databases.
• Played an active role in converting SQL Server 2000 DTS packages to SQL
Server 2005 SSIS packages and changing all the associated T-SQL scripts.
• Reviewed peers' work in detail, suggested improvements where needed, and
signed off the peer review document.
• Responsible for creating scripts that capture the latest development changes
(both schema and data) and publishing them from one environment to the others
(Dev, Stage, Pre-production and Production); performed backup and restore tasks
as part of publishing the scripts.
• Reviewed all database scripts to make them compatible with SQL Server 2005 and
2008, and prepared checklist documents to follow when upgrading database scripts
to those versions.
• Involved in rewriting T-SQL scripts written for SQL Server 2000 to be
compatible with SQL Server 2005.
• Involved in analysing the existing T-SQL scripts for performance bottlenecks and
tuned the stored procedure scripts for better optimization.
• Active participation in creating SSIS packages to migrate data to and from Tempo
databases and RETS database and involved in writing replication scripts (jobs and
stored procedures).
• Active role in creating CMA (Comparative Market Analysis) DB scripts for
populating metadata via CMA tools.
• Involved in writing DB scripts for SSO (Single Sign-on) functionality which
include login authentication and user details management.
• Responsible for creating maintenance plans, including backups, restores, and
the scheduling and monitoring of jobs (a backup sketch follows this list).
• Worked on the Tempo4 and NexGen projects simultaneously.
• Active participation in SCRUM meetings; gave training to a DBA on our
team.
• Helped QA team in testing DB issues (DB incidents) by providing functional and
technical details.
• Used MS Visual Source Safe and SVN for version control management
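The backup portion of such a maintenance plan reduces to statements like the following; the database name and paths are placeholders, and the real plans were scheduled through SQL Server Agent.
-- Nightly full backup plus periodic log backups for point-in-time recovery
BACKUP DATABASE MLS_Tempo
TO DISK = 'E:\Backup\MLS_Tempo_full.bak'
WITH INIT;

BACKUP LOG MLS_Tempo
TO DISK = 'E:\Backup\MLS_Tempo_log.trn';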
Environment: MS SQL Server 2000/2005/2008, Windows 2003
Ness Technologies, Hyderabad
Global Reporter Apr ‘07- Mar ‘09
Designation: Senior Software Engineer
Global Reporter is a Web-based application that you can use to generate reports
and analyses of activity, such as incoming calls and trunk usage, in your
organization's phone system network.
You can use reports generated by Global Reporter to improve business by focusing
on activity in the system as a whole, at individual InstantOffice locations, or in other
ways. For example:
• Management can work to enhance efficiency in personnel and technology by
analyzing employee productivity.
• Customer service managers can promote higher levels of customer service and
call handling by analyzing caller behavior, such as trends in the numbers of
incoming and abandoned calls.
• Operations managers can analyze facilities usage to dramatically lower system
maintenance and management costs and find more cost-effective ways of
managing branch/store operations.
• Marketing managers can assess the effectiveness of promotional campaigns by
tracking store traffic during critical periods.
Responsibilities
• Worked on the Project called Global Reporter for the client Vertical
• Created T-SQL objects (tables, stored procedures, triggers and functions) for
new plug-in functionality, especially for Service Response, Fax Manager,
Missing Reports and Administration reports.
• Responsible for designing tables and writing stored procedures to load data
from flat files generated by the wave box into staging and main tables, and to
render data to reports for the major modules: Service Response, Fax Manager,
Missing Reports and Administration reports.
• Created and scheduled jobs to load data from source flat files generated by
telephony wave boxes into local destination tables, and developed T-SQL code
per the given functional specification.
• Created complex stored procedures that pull data from flat files, process it
into staging and main tables, and then render the data needed to generate
reports.
• Played a significant role in performance tuning of several T-SQL scripts by
analyzing SQL Profiler traces, Index Tuning Wizard/Database Engine Tuning
Advisor recommendations, execution plans and performance counters, and by using
various performance checks available in SQL Server.
• Involved in designing various telephonic reports especially for Service Response,
Fax Manager, Missing Reports and Administration reports etc., using SSRS
• Installed new SQL Server 2005/2008 servers, monitored three client production
servers for tempdb growth, wait statistics and transaction log growth, and
performed all aspects of administration, including data modelling, backups,
replication, table partitioning and recovery.
• Involved in the design and implementation of the disaster recovery system,
which includes log shipping (a minimal log-shipping sketch follows this list).
• Responsible for creating design and functional documents for the new plug-in
functionality and preparing unit test cases to test it.
• Provided functional and technical details to the QA team, assisted them with
unit test plans, and supported other teams with all database-related activities.
• Active participation in sprint planning and SCRUM process to estimate time and
task implementation.
• Used MS Visual Source Safe and Star Team for version control management.
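At its core, the log-shipping DR setup referred to above performs the cycle sketched below (shown here as manual statements; the database name and share path are placeholders, and the actual setup used SQL Server's log shipping jobs).
-- On the primary: back up the transaction log to a file share.
BACKUP LOG GlobalReporter
TO DISK = '\\dr-share\logship\GlobalReporter_001.trn';

-- On the secondary: restore the log, leaving the database ready for further log restores.
RESTORE LOG GlobalReporter
FROM DISK = '\\dr-share\logship\GlobalReporter_001.trn'
WITH NORECOVERY;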
Environment: MS SQL Server 2005, Windows 2K/NT/98
First Indian Corporation Private Limited, Bangalore
Title Data Warehouse Feb ‘06- Mar ‘07
Designation: Software Engineer
The objective of the validation process for the First American Data Warehouse is to verify in a
timely manner that data in the Data Warehouse correctly reflect the corresponding data in the
source tables. The standard used is the metadata information contained in the ModelMart for the
data warehouse. Validations cover the Data Warehouse tables for both the Title and Centex
servers.
Although ideally the data being brought into the Data Warehouse ought to be validated during the
Staging phase of the ETL process, the present processing window and amount of data being
processed doesn't permit this. Instead, data are verified once in place in the Data Warehouse. The
validation process begins after the ETL process is complete; count validations run every night,
and detail validations run once a week on a rotating schedule.
Responsibilities:
• Coding on SQL-Server using T-SQL for creating Database Objects
• Performance Monitoring and Performance Tuning
• Responsible for code review, stored procedures tuning and optimization
• Responsible for writing scripts to validate data between the data warehouse database and the
local databases (a simplified count check follows this list).
• Creation of DTS packages for ETL
• Creating and Scheduling of SQL Server Jobs for data validations.
• Creation of DQ Designs and Build Requests for implementing changes
• Coordinating between the on-site and offshore teams
• Checking data quality and initiating actions in case of any discrepancies
• Rewriting of Stored Procedures to improve performance.
• Used MS Visual Source Safe for version control management.
• Used MS Visio for data modeling.
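A count validation of the kind described above can be as simple as the sketch below; the linked-server, database and table names are hypothetical.
-- Compare row counts between a source table and its warehouse copy.
SELECT
    s.cnt AS source_rows,
    w.cnt AS warehouse_rows,
    CASE WHEN s.cnt = w.cnt THEN 'OK' ELSE 'MISMATCH' END AS validation_status
FROM (SELECT COUNT(*) AS cnt FROM SourceServer.TitleDB.dbo.Policy) AS s
CROSS JOIN
     (SELECT COUNT(*) AS cnt FROM TitleDW.dbo.DimPolicy) AS w;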
Environment: MS SQL Server 2000/2005, Windows 2K/NT/98
Elite Technologies, Hyderabad
Organization and Human Resource Planning (OHRP) Dec ‘04 – Jan’06
Designation: Developer
OHRP is designed to give you control over how the HRIS system operates, and it makes it simple
for managers to monitor their employees. This powerful application gives employees instant access
to chosen data from their information set, freeing up valuable time to focus on more important
tasks; employees can make instant changes to some of their information. In all industrialized
countries, and increasingly in developing countries, computer systems are economically critical.
OHRP is the perfect solution for today's managers and human resource professionals who need an
easy way to store, retrieve and safeguard employee data and information. OHRP is a strategic HR
solution for organizations of all sizes: a powerful and highly customizable HR solution that
leverages the power of the web and the latest n-tier application framework to meet the changing
and challenging needs of HR.
Responsibilities:
• Designed the SQL Server database.
• Created and maintained very complex stored procedures.
• Developed standard procedures for SQL Server tuning and optimization
• Established a QA environment to analyze existing unused indexes by running the Index Tuning
Wizard (ITW)
• Wrote stored procedures for monitoring, management and maintenance
• Wrote scripts to extract data from SQL Server databases
• Performed fine-tuning of stored procedures to improve performance
• Designed and implemented indexing strategy
• Improved logical and physical database design
• Wrote stored procedures for data capturing, manipulation and for use in the web search
engine
• Redesigned existing database schema to meet standard normalization requirements
Environment: SQL Server 2000 (including Transact-SQL triggers and stored
procedures), Windows 2K/NT/98
Elite Technologies, A.P.
Smart Portal Apr '04 – Nov '04
Designation: Developer
This revolutionary Web Portal is a great online solution for a Club, Association, Corporation,
Nonprofit or any other company or organization. It combines several online applications into one
easy-to-use package. Now you too can build an online community, intranet or extranet.
The portal comes with several e-content modules, such as:
• Events – An event list of events that occurred recently.
• Links – Links to different areas such as science-based sites, new discoveries, etc.
• Classifieds – Allows viewing of news posted by the administrator.
• Discussions – A discussion environment on various issues with the global community.
• Administration – Restricted to the administrator for administrative activities.
Responsibilities:
• Involved in analysis and design of the system.
• Created data access component to retrieve data from the database, and created dynamic
SQL query for the customer request.
• Responsible for installation of the product and solving queries at the client location.
• Wrote Stored Procedures, Functions, Database Triggers, Packages
• Responsible for backup and recovery, performance tuning and maintenance of SQL
Server 2000.
• Responsible for importing mission critical data into the databases.
• Designed and created tables, triggers, views, and stored procedures in SQL Server.
• Database maintenance, writing queries and generating reports in the database.
• Provided back-end support to developers for tasks like database design and modeling;
stored procedures development and T-SQL scripts optimization
• Gathering database requirements, functional specifications
Environment: SQL Server 2000, Windows 2K/NT/98
Elite Technologies, A.P.
NET SURVEY Aug '03 – Mar '04
Designation: Developer
NetSurvey is a powerful survey engine that can create different types of survey scripts, which
can easily be used to collect public opinion. It includes a survey script builder in which
different questions can be created with different types of output to collect opinion from the
public. The generated survey script can span a single page or multiple pages.
Along with the above mentioned areas it also provides the following
Survey Statistics: Monthly score of the survey, including the number of times it was viewed
and the number of votes cast.
Settings: Activating and deactivating the survey, resetting the score, Notification modes.
Security: Password protection, IP protection, cookie protection, etc.
Form Builder: To generate customized templates for different types of surveys.
Results: Reports results with respect to filter etc. (Graphical Report, Cross Tabulation, Data
Export)
Mailing: Mailing survey invitation to people.
Code Creator: Creates template-based ASP.NET code.
Responsibilities:
• Wrote stored procedures using Transact SQL (T-SQL) for data access from the SQL Server
2000 database.
• Wrote triggers using Transact-SQL (T-SQL) for updating multiple tables in the SQL Server 2000
database (a simplified trigger example follows this list).
• Used MS DTS (Microsoft Data Transformation Services) packages to convert legacy data in
dBase 3.x into a SQL Server 2000 database.
• Interacted with the Accounting Staff and Business Managers for gathering requirements as
well as understanding the business processes involved.
• Analyzed and documented the limited scope of the existing application.
• Extensively worked on database design issues on the SQL Server 2000 backend so that the
data is most efficiently stored.
• Evaluation and implementation of SQL tuning and performance modifications.
• Created and managed views.
• Designed tables and indexes and established relationships.
• Involved in writing triggers and cursors.
• Involved in Exception Handling.
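A simplified example of the multi-table trigger work mentioned above, with invented table and column names; it keeps a per-survey vote count in step with newly inserted votes.
-- AFTER INSERT trigger that maintains an aggregate in a second table (SQL Server 2000 compatible)
CREATE TRIGGER trg_SurveyVote_Insert
ON dbo.SurveyVote
AFTER INSERT
AS
BEGIN
    SET NOCOUNT ON

    UPDATE s
    SET    s.TotalVotes = s.TotalVotes + i.NewVotes
    FROM   dbo.SurveyStatistics AS s
    JOIN  (SELECT SurveyId, COUNT(*) AS NewVotes
           FROM inserted
           GROUP BY SurveyId) AS i
      ON   i.SurveyId = s.SurveyId
END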
Environment: MS SQL Server 2000, Windows 2K/NT/98
PERSONAL DETAILS
Date of Birth: 04-04-1982
Age: 29 years
Marital Status: Married
Languages Known: Telugu, English
Passport Details:
Passport Number: G0556112
Type: P
Surname: VUNDAVALLI
Given Name: SIVA PRASAD
Nationality: INDIAN
Sex: M
Date of Birth: 04-04-1982
Place of Birth: MANDAPETA
Place of Issue: HYDERABAD
Date of Issue: 03-11-2006
Date of Expiry: 02-11-2016
Issuing Authority: Ministry of External Affairs, Passport office, Hyderabad
More Related Content

What's hot

Resume_Navneet_Formatted
Resume_Navneet_FormattedResume_Navneet_Formatted
Resume_Navneet_FormattedNavneet Tiwari
 
ETL Developer Resume
ETL Developer ResumeETL Developer Resume
ETL Developer ResumeTeferi Tamiru
 
simha msbi resume
simha msbi resumesimha msbi resume
simha msbi resumeT.N simha
 
Tx 09 G3 Jayasri Santhappan
Tx 09 G3 Jayasri SanthappanTx 09 G3 Jayasri Santhappan
Tx 09 G3 Jayasri Santhappanjayasrisan
 
Sql developer resume
Sql developer resumeSql developer resume
Sql developer resumeguest9db49b8
 
John Harisiadis - Bi Resume
John Harisiadis  - Bi ResumeJohn Harisiadis  - Bi Resume
John Harisiadis - Bi ResumeJohnHarisiadis
 
Jaime Carlson\'s BI Resume
Jaime Carlson\'s BI ResumeJaime Carlson\'s BI Resume
Jaime Carlson\'s BI Resumejaimecarlson
 
Mohamed sakr Senior ETL Developer
Mohamed sakr   Senior ETL Developer Mohamed sakr   Senior ETL Developer
Mohamed sakr Senior ETL Developer Mohamed Sakr
 
Satwinder SQL.SSK_Brd_2015
Satwinder SQL.SSK_Brd_2015Satwinder SQL.SSK_Brd_2015
Satwinder SQL.SSK_Brd_2015Satwinder Khural
 
Sophia wang (2)
Sophia wang (2)Sophia wang (2)
Sophia wang (2)Dibakar27
 
Nicholas Dragon Sql Developer2
Nicholas Dragon Sql Developer2Nicholas Dragon Sql Developer2
Nicholas Dragon Sql Developer2Nicholas Dragon
 
Resume rodney matejek sql server bi developer
Resume rodney matejek sql server bi developerResume rodney matejek sql server bi developer
Resume rodney matejek sql server bi developerrmatejek
 

What's hot (19)

Resume_Navneet_Formatted
Resume_Navneet_FormattedResume_Navneet_Formatted
Resume_Navneet_Formatted
 
ETL Developer Resume
ETL Developer ResumeETL Developer Resume
ETL Developer Resume
 
simha msbi resume
simha msbi resumesimha msbi resume
simha msbi resume
 
Tx 09 G3 Jayasri Santhappan
Tx 09 G3 Jayasri SanthappanTx 09 G3 Jayasri Santhappan
Tx 09 G3 Jayasri Santhappan
 
Nikhil_MSBI_updated
Nikhil_MSBI_updatedNikhil_MSBI_updated
Nikhil_MSBI_updated
 
Sql developer resume
Sql developer resumeSql developer resume
Sql developer resume
 
John Harisiadis - Bi Resume
John Harisiadis  - Bi ResumeJohn Harisiadis  - Bi Resume
John Harisiadis - Bi Resume
 
Jaime Carlson\'s BI Resume
Jaime Carlson\'s BI ResumeJaime Carlson\'s BI Resume
Jaime Carlson\'s BI Resume
 
Mohamed sakr Senior ETL Developer
Mohamed sakr   Senior ETL Developer Mohamed sakr   Senior ETL Developer
Mohamed sakr Senior ETL Developer
 
Resume_Krishna.M
Resume_Krishna.MResume_Krishna.M
Resume_Krishna.M
 
Bhargavi_MSBI
Bhargavi_MSBIBhargavi_MSBI
Bhargavi_MSBI
 
resume mumbai
resume mumbairesume mumbai
resume mumbai
 
updated_profile_ak
updated_profile_akupdated_profile_ak
updated_profile_ak
 
Satwinder SQL.SSK_Brd_2015
Satwinder SQL.SSK_Brd_2015Satwinder SQL.SSK_Brd_2015
Satwinder SQL.SSK_Brd_2015
 
Sitharthan_CV
Sitharthan_CVSitharthan_CV
Sitharthan_CV
 
Sophia wang (2)
Sophia wang (2)Sophia wang (2)
Sophia wang (2)
 
Nicholas Dragon Sql Developer2
Nicholas Dragon Sql Developer2Nicholas Dragon Sql Developer2
Nicholas Dragon Sql Developer2
 
Resume rodney matejek sql server bi developer
Resume rodney matejek sql server bi developerResume rodney matejek sql server bi developer
Resume rodney matejek sql server bi developer
 
Abdul ETL Resume
Abdul ETL ResumeAbdul ETL Resume
Abdul ETL Resume
 

Viewers also liked

ChallengesFacedinaProject
ChallengesFacedinaProjectChallengesFacedinaProject
ChallengesFacedinaProjectdrvijayindia
 
Propiedades de-la-suma-y-la-multiplicación exposición
Propiedades de-la-suma-y-la-multiplicación exposiciónPropiedades de-la-suma-y-la-multiplicación exposición
Propiedades de-la-suma-y-la-multiplicación exposiciónMiriam Garcia Mendez
 
Coaching emocji-subiektywny-przeglad
Coaching emocji-subiektywny-przegladCoaching emocji-subiektywny-przeglad
Coaching emocji-subiektywny-przegladKrzysztof Skubis
 
Introduction to Information security
Introduction to Information securityIntroduction to Information security
Introduction to Information securityRashad Aliyev
 
COLICIA'S RESUME WITHOUT PHOTOGRAPH
COLICIA'S RESUME WITHOUT PHOTOGRAPHCOLICIA'S RESUME WITHOUT PHOTOGRAPH
COLICIA'S RESUME WITHOUT PHOTOGRAPHColicia Paul
 
dskp pjpk tingkatan 1 pdf
dskp pjpk tingkatan 1 pdfdskp pjpk tingkatan 1 pdf
dskp pjpk tingkatan 1 pdfNas Lin
 

Viewers also liked (7)

ChallengesFacedinaProject
ChallengesFacedinaProjectChallengesFacedinaProject
ChallengesFacedinaProject
 
Propiedades de-la-suma-y-la-multiplicación exposición
Propiedades de-la-suma-y-la-multiplicación exposiciónPropiedades de-la-suma-y-la-multiplicación exposición
Propiedades de-la-suma-y-la-multiplicación exposición
 
Coaching emocji-subiektywny-przeglad
Coaching emocji-subiektywny-przegladCoaching emocji-subiektywny-przeglad
Coaching emocji-subiektywny-przeglad
 
Introduction to Information security
Introduction to Information securityIntroduction to Information security
Introduction to Information security
 
COLICIA'S RESUME WITHOUT PHOTOGRAPH
COLICIA'S RESUME WITHOUT PHOTOGRAPHCOLICIA'S RESUME WITHOUT PHOTOGRAPH
COLICIA'S RESUME WITHOUT PHOTOGRAPH
 
dskp pjpk tingkatan 1 pdf
dskp pjpk tingkatan 1 pdfdskp pjpk tingkatan 1 pdf
dskp pjpk tingkatan 1 pdf
 
Podstawy komunikacji
Podstawy komunikacjiPodstawy komunikacji
Podstawy komunikacji
 

Similar to Siva-CV

Similar to Siva-CV (20)

Rakesh
RakeshRakesh
Rakesh
 
Shane_O'Neill_CV_slim
Shane_O'Neill_CV_slimShane_O'Neill_CV_slim
Shane_O'Neill_CV_slim
 
Colinsobersbiresume 20120723doc
Colinsobersbiresume 20120723docColinsobersbiresume 20120723doc
Colinsobersbiresume 20120723doc
 
HamsaBalajiresume
HamsaBalajiresumeHamsaBalajiresume
HamsaBalajiresume
 
Milind_Dhotarkar - MSBI
Milind_Dhotarkar - MSBIMilind_Dhotarkar - MSBI
Milind_Dhotarkar - MSBI
 
Mohamed Elsafory Detail Skills
Mohamed Elsafory Detail SkillsMohamed Elsafory Detail Skills
Mohamed Elsafory Detail Skills
 
zahidCvFinal(Updated Jan 17)-2
zahidCvFinal(Updated Jan 17)-2zahidCvFinal(Updated Jan 17)-2
zahidCvFinal(Updated Jan 17)-2
 
Sql Resume2010
Sql Resume2010Sql Resume2010
Sql Resume2010
 
Kevin Fahy BI Resume
Kevin Fahy BI ResumeKevin Fahy BI Resume
Kevin Fahy BI Resume
 
Sanyog_Resume
Sanyog_ResumeSanyog_Resume
Sanyog_Resume
 
sql resume
sql resumesql resume
sql resume
 
Balamurugan msbi cv
Balamurugan msbi cvBalamurugan msbi cv
Balamurugan msbi cv
 
SQL_DBA USA_M&T Bank
SQL_DBA USA_M&T BankSQL_DBA USA_M&T Bank
SQL_DBA USA_M&T Bank
 
Chetan.Kumar-SQL_DBA 9115
Chetan.Kumar-SQL_DBA 9115Chetan.Kumar-SQL_DBA 9115
Chetan.Kumar-SQL_DBA 9115
 
Copy of Alok_Singh_CV
Copy of Alok_Singh_CVCopy of Alok_Singh_CV
Copy of Alok_Singh_CV
 
Jeevitha_Resume
Jeevitha_ResumeJeevitha_Resume
Jeevitha_Resume
 
ShashankJainMSBI
ShashankJainMSBIShashankJainMSBI
ShashankJainMSBI
 
Subhoshree_ETLDeveloper
Subhoshree_ETLDeveloperSubhoshree_ETLDeveloper
Subhoshree_ETLDeveloper
 
MSBI_MSSQL_Bhrath
MSBI_MSSQL_BhrathMSBI_MSSQL_Bhrath
MSBI_MSSQL_Bhrath
 
Bi Resume Ejd
Bi Resume EjdBi Resume Ejd
Bi Resume Ejd
 

Siva-CV

  • 1. SIVA PRASAD VUNDAVALLI Phone: 65 -91044959 email: v_sivaprasad_v@yahoo.co.in SUMMARY • Prasad has Over Twelve years and Ten months of IT experience in database development, maintenance, and Implementation. • SQL Server implementation, management, maintenance. • Performed all aspects of administration, including data modeling, backups, replication, partition of tables and recovery. • Gathering database requirements & functional specifications. • Development of SQL scripts for maintenance. • Extensive knowledge in Database Programming and Development using SQL Server (2000/2005/2008/2012/2014) and MS Access to include Database Architecture, Logical, and Physical Design. • Excellent knowledge in T-SQL Programming (Stored Procedures, Functions, and Triggers) • Ability to work well in teams, rapid learner with excellent problem solving skills, strong technical background, and fine interpersonal skill • Transferring Data using Data Transformation Services (DTS) Packages and SSIS packages • Excellent in High Level Design of ETL DTS Packages & SSIS Packages for integrating data using OLE DB connection from heterogeneous sources like (Excel, CSV, Oracle, flat file, Text Format Data) by using multiple transformations provided by SSIS such as Data Conversion, Conditional Split, Bulk Insert , Merge and union all. • Involved in Business Analysis of the OLTP and OLAP data and in decision support. • Knowledge of Data Warehousing methodologies and concepts, including star schemas, snowflakes, ETL processes, dimensional modelling and reporting tools. • Experience in MS SQL Server 2008/2012 Business Intelligence in MS SQL Server Integration Services (SSIS), MS SQL Server Reporting Services (SSRS) and MS SQL Server Analysis Services (SSAS). • Development of ad hoc Reports using SSRS. • Good knowledge of creating report models using SSRS and creation of drill down and drill through reports. • Monitoring and tuning SQL Server performance with multiple tools including Performance Monitor, SQL Profiler, and Index Tuning Wizard/Tuning Advisor to achieve better performance, accessibility and efficiency • Use of T-SQL for advanced SQL queries, stored procedures for data manipulation.
  • 2. • Excellent knowledge in data snapshot and data archiving. • Excellent knowledge in Backup , Restore, Log shipping , Replication, Failover Clustering and Database Mirroring • Excellent knowledge in performance tuning to optimize DB scripts. • Handled independently the migration of non Unicode databases to support Unicode. Developed migration tool using C# • Involved in creation of SSIS packages to process xml files, transform the data to apply validations using script component tasks in C# and load data in SQL Server databases • Maintenance of logshipping under Unicode migration • Substantial knowledge of data compression techniques in SQL Server 2008/2012/2014 • Monitoring top expensive queries that consume considerable amount of CPU and high IO and resolve them on demand basis. • Handled Volumetric testing and analysis to identify growth factors and set application and database threshold values. • Implementation of DR setup including Transactional Replication and Log shipping. • Involved in design of Data Valut model to provide long-term historical storage of data coming in from multiple operational systems. TECHNICAL SKILLS Languages : C, C++, T-SQL, Core Java, C# Web Technologies : HTML Databases : SQL Server 2000/2005/2008/2012 and MS Access Web Server : IIS 5.0/6.0/8.0 Operating System : Windows 2000/2003/NT 4.0/98/XP/2008 Others : Visual Source Safe, Star team, SVN, TFS, JIRA, PowerShell, QlikView, Perforce EDUCATION: Bachelor’s Degree in Electrical & Electronics Engineering, JNTU, Hyderabad, and Andhrapradesh, India. PROJECTS: Comtel Solutions Pte Ltd, Singapore Aug'15 – Present
  • 3. Client: Bank Of Singapore, Singapore Designation: Senior Consultant Singapore Deposit Insurance Corporation (SDIC): The Scheme members, one of scheme member is BOS, should implement the necessary systems and processes such that the Specified Information can be delivered to SDIC according to the prescribed data structure and format, as well as within the timeframe set out by SDIC. As part of SDIC BOS has to submit the files related to Bank’s Deposits, Loans, Current Account, GL, SPOWNER information etc., Responsibilities • Handled the project independently right from requirement gathering, creating data mapping document, design of Data Vault model to maintain core data as well as SDIC specific tables and creation of schematic views/functions and export procedures. • Involved in creation of mapping document to identify data from different sources and map to the data in UDW database which is designed in Data Vault model. • Responsible for creating T-SQL objects such as Tables, Views, Stored Procedures, Functions to store data in Data Vault model and generate interface files required by SDIC. • Creation of PIT table logic to maintain snapshot data and creation of config data to generate PIT procedures and required tables dynamically • Developed utility using Powershell to generate PIT procedures and tables scripts dynamically. • Involved in enhancing UDW Utility tool to generate PIT related SSIS packages using C#. • Developed SSIS packages to populate PIT tables in parallel. • Developed SSIS packages to populate HUB, SAT, LINK tables in Data Vault model. • Developed macro excels using VBA to mask the data and provide SDIC specific excel based reports. • Developed logic using T-SQL procedures to mask the data as per SDIC masking rules. • Creation of deployment packages based on the scripts in TFS. • Support to SIT and UAT users for validation of SDIC files. • Support to Production support team in assisting SDIC simulation exercise. • Creation of statistics related to Capacity planning and execution times.
  • 4. • Involved in extending the Utility tool which will generate the SSIS packages dynamically. • Designed the Utility tool to support Excel Import/Flat Import/Excel Export/Flat Export using SSIS script task in C#. • Creation of Technical Specification Document and data mapping documents. • Involved in leading the project Transaction Life Cycle Management (TLM) to gather requirements, creation of source to destination mapping document, design of Data Vault tables and monitor the project status till it goes live. Environment: MS SQL Server 2000/2005/2008/2012/2014, Windows 2003/2008, SSIS, SSRS Randstad Consulting Technologies Pte Ltd, Singapore Oct'14 – Aug'15 Client: EscherGroup, Singapore Designation: Database Developer Responsibilities • Responsible for creating T-SQL objects such as Tables, Views, Stored Procedures, Functions and Triggers etc to implement the requirements of Report DB for Post Malaysia. • Carrying out Volumetric testing and analysis to identify application load thresholds and database resource thresholds. • Analysis and implementation of DR set up for Report database server and MO database server which includes Transactional Replication and Log shipping. • Creation of database maintenance plans which includes backup frequency and index maintenance. • Involved in creation of SSIS package to load xml data into database and process this data to generate different report text files like ITF-150, ITF-151 and ITF-152. • Enhanced the SSIS packages to apply validation on xml data using script component tasks using C#. • Creation of SQL Server Jobs to schedule SSIS packages.
  • 5. • Creation of reports using SSRS for HQ reports. • Involved in enhancing HQ reports business layer using C#. Environment: MS SQL Server 2000/2005/2008/2012, Windows 2003/2008, SSIS, SSRS TangSpac Consulting Pte Ltd, Singapore Oct'12 – Oct’14 Client: Barclays Capital, Singapore Designation: SQL Developer Specific Risk Engine (SRE) This is the project for Barclays Capital to generate PnLs for different client systems like CVA and OZ etc. SRE collects inputs from various internal systems such as Bedrock and Asset Control etc and sources PnLs to CVA to facilitate CVA to generate charge calculations Credit Value Adjustment (CVA) This is the project for Barclays Capital to comply with the BASELIII regulatory system. Credit value adjustment (CVA) is by definition the difference between the risk-free portfolio value and the true portfolio value that takes into account the possibility of counterparty’s default. In other words, CVA is the market value of a counterparty credit risk. In the view of leading investment banks, CVA is essentially an activity carried out by both finance and a trading desk in the Front Office. Tier 1 banks either already generate counterparty EPE and ENE (expected positive/negative exposure) under the ownership of the CVA desk (although this often has another name) or plan to do so. Whilst a CVA platform is based on an exposure measurement platform, the solution drivers are very different and it is unwise to create dependencies between the risk exposure management system and front office CVA system, even if they share similar intermediate outputs. Responsibilities
  • 6. • Responsible for creating T-SQL objects such as Tables, Views, Stored Procedures, Functions and Triggers etc to implement the requirements of CVA. • Involved in database design for implementation of CVA to be compatible with Service Risk Engine (SRE). • Involved in the analyzing the business requirement document (BRD) and the functional specification document (FSD) provided by BA to implement the features required for CVA. • Played active role in creating scripts required for data archiving and feed file archiving (Created Powershell scripts for feed file archiving) . • Responsible for creating data snapshots POC, design and writing T-SQL scripts for the same. • Active role in tuning the stored procedure scripts to optimize the performance and reduce the SQL Server resource usage which includes reduction of tempdb growth, reduction of I/O and CPU utilization. • Active role in identification of duplicate, missing and unused indexes and involved in analyzing and implementing partitioning. • Responsible to monitor daily top expensive queries that consume more amount of CPU and high IO and tune them and involved in creating SSIS packages to load data from daily and monthly feeds received from various source systems. • Server 2008 to save disk space and reduction of I/O. Estimation of capacity planning (workspace, I/O and CPU) to implement data compression and evaluating the advantages gained by data compression versus CPU resources consumed for compression and decompression. • Helping QA team to provide functional and technical details and assist them with Unit test plans, test cases and assisting across teams for all databases related activities. • Handled all database related activities in Full reval and Strategic Month end projects in SRE. • Created optimized SSIS packages to copy data from transactional database to reporting database. • Creating and enhancing the SSIS packages, writing stored procedure scripts to support full reval and strategic month end tasks. • Handled database purging tasks. • Involved in creating ad-hoc reports using SSRS. • Responsible for design of database model for Tradingbook and Tradingbook Proxy Addon Requirements which includes creation of tables, writing optimized stored procedures, copying data to reporting database using SSIS and write optimized views to support ad-hoc reporting using SSRS and Qlikview. • Involved in performance tuning of procedures in full reval stored procedures. • Used Perforce for version control management. • Used JIRA for code maintenance, bug logging, time tracking and document sharing. Environment: MS SQL Server 2000/2005/2008, Windows 2003/2008
  • 7. Host Analytics Software Pvt. Ltd., Hyderabad Aug ’10 –Oct'12 Designation: Lead Software Engineer Host Analytics is the leading provider of on-demand corporate performance management. Host Analytics’ solutions help financial and departmental executives improve their budgeting, forecasting, financial consolidations, dashboarding, scorecarding, reporting and analysis. Host CPM is comprised of these modules: • Host Budget provides connected and streamlined budgeting, planning, forecasting, reporting and analysis, which eliminates the errors and cumbersomeness of Excel. • Host Analytics Revenue Planning is an integrated revenue, forecasting and budgeting application that supports detailed sales forecasts across thousands of customers and products and provides multiple methods for managing sales plans. • Host Analytics Scorecard enables companies to encapsulate the strategic plan into a dynamic system and allows employees across the organization to continually monitor and measure progress against the strategic plan. • Host Analyzer provides robust Host Analyzer reporting and analysis capabilities for employees across the organization, from sales and marketing to finance and operations. • Host Consolidator speeds the process and manages the integrity of collecting, consolidating and reporting financial information on a global basis. Responsibilities • Responsible for creating T-SQL objects such as Tables, Views, Stored Procedures, Functions and Triggers etc to implement new requests rose by product management and provide hot-fix scripts to fix the bugs raised by QA and services team for the production databases. • Active role in tuning the stored procedure scripts to optimize the performance and reduce the SQL Server resource usage which includes reduction of tempdb growth, reduction of I/O and CPU utilization. • Involved in the analyzing the functional specification document (FSD) provided by Product management and developing T-SQL scripts to cater the needs of FSD and responsible for all new database activities especially the ETL module in the application.
  • 8. • Responsible to monitor daily top expensive queries that consume more amount of CPU and high IO and tune them and involved in creating SSIS packages to load data from the flat files in FTP to the local database, delete the files after successful data load and created job to run these packages. • Involved in performance tuning of sales and DLR procedures which are a major modules of the product. • Responsible for creating the T-SQL database objects scripts to maintain them under version control. Extensive research on tools such as RedGate Source control, MSCCI provider which support database version control and assisted release management team to prepare scripts to provide the migration scripts for each release. • Played active role in providing scripts for Master Tenant management which include backup, copy, restore, deletion of backup file and data validation scripts and all major data load rules functionality. • Reviewing of work done by peers in detail and suggesting them for the improvements if any and signing off the peer review document and created tool to maintain metadata description of each table column. • Handled independently the major task of converting the databases to support Unicode for Internationalization functionality which includes creation of tool to convert all the T-SQL objects to support Unicode and preparation of migration scripts and evaluation of different process to implement the migration scripts (under Full recovery mode by disabling and enabling the logshipping jobs and bulk-logged recovery mode) without disturbing the logshipping functionality. • Involved in analyzing and handling data compression feature available in SQL Server 2008 to save disk space and reduction of I/O. Estimation of capacity planning (workspace, I/O and CPU) to implement data compression and evaluating the advantages gained by data compression versus CPU resources consumed for compression and decompression. • Research on Visual Studio Database Edition for building schema scripts, test data generation and schema comparison. • Active participation in Agile (SCRUM) process which includes sprint planning and execution of task on time. • Helping QA team to provide functional and technical details and assist them with Unit test plans, test cases and assisting across teams for all database related activities. • Researching on open source database “MySQL” and the feasibility to convert the existing SQL Server databases to MySQL databases. • Studied existing OLAP system(s) and created facts, dimensions and star schema representation for the data mart and warehouse systems. • Used SVN and TFS for version control management. • Used Visual Studio Database Edition for building schema scripts, test data generation and schema comparison. • Used JIRA for code maintenance, bug logging, time tracking and document sharing.
  • 9. Environment: MS SQL Server 2000/2005/2008, Windows 2003/2008 CoreLogic® MarketLinx®, Hyderabad Tempo4 & NexGen (MLExchange) Mar ’09 –Aug’10 Designation: Senior Software Engineer The platform that powers MLXchange™ and TEMPO® is ideal for large and medium sized MLSs looking for uncompromising MLS functionality and unmatched flexibility. With MarketLinx's MLS platform, you can customize the way your MLS looks and works, partner with other vendors without worrying about compatibility, and control and distribute your valuable data any way you want. MarketLinx lets you choose only the optional MLS products you and your membership need, and makes it easy for your members to personalize their services so they can stand out from the crowd. Responsibilities • Worked on project Tempo4 & NexGen (MLExchange) for CoreLogic® MarketLinx® • Responsible for creating T-SQL objects such as Tables, Views, Stored Procedures, Functions and Triggers etc to implement new requests raised by product management and cleanup scripts to clean the data and update with latest data on the production databases. • Creation and monitoring of SQL jobs which check for the consistency of databases and loading data from different real estate agents to the local databases. • Played any active role in conversion of SQL Server 2000 DTS packages to SQL Server 2005 SSIS packages and changing all its associated T-SQL scripts. • Reviewing of work done by peers in detail and suggesting them for the improvements if any and signing off the peer review document. • Responsible for creating scripts that include latest changes of every development changes and publish them (both schema and data changes) from one environment to others (Dev, Stage, Preproduction and production) and performed backup and restore tasks as part of publishing the scripts. • Reviewed all database scripts to make them compatible to 2005 and 2008 versions and preparation of checklist documents that need to be followed for upgrading the database scripts to be compatible with sql server 2005 and sql server 2008.
Ness Technologies, Hyderabad
Global Reporter Apr '07 – Mar '09
Designation: Senior Software Engineer

Global Reporter is a Web-based application used to generate reports and analyses of activity, such as incoming calls and trunk usage, in an organization's phone system network. Reports generated by Global Reporter help improve the business by focusing on activity in the system as a whole, at individual InstantOffice locations, or in other ways. For example:
• Management can work to enhance efficiency in personnel and technology by analyzing employee productivity.
• Customer service managers can promote higher levels of customer service and call handling by analyzing caller behavior, such as trends in the numbers of incoming and abandoned calls.
• Operations managers can analyze facilities usage to dramatically lower system maintenance and management costs and find more cost-effective ways of managing branch/store operations.
• Marketing managers can assess the effectiveness of promotional campaigns by tracking store traffic during critical periods.
Responsibilities
• Worked on the Global Reporter project for the client Vertical.
• Created T-SQL objects (tables, stored procedures, triggers and functions) for new plug-in functionality, especially for the Service Response, Fax Manager, Missing Reports and Administration reports modules.
• Designed tables and wrote stored procedures to load data from flat files generated by the wave box into staging and main tables and to render data to reports for the major modules (Service Response, Fax Manager, Missing Reports and Administration reports); a load sketch follows this section.
• Created and scheduled jobs to load source data from flat files generated by the telephonic wave boxes into local destination tables, and developed T-SQL code per the given functional specifications.
• Created complex stored procedures that pull data from the flat files, process it into staging and main tables, and then render the data needed to generate reports.
• Played a significant role in performance tuning of several T-SQL scripts by analyzing SQL Profiler traces, Index Tuning Wizard/Database Engine Tuning Advisor recommendations, execution plans, performance counters and other performance checks available in SQL Server.
• Designed various telephony reports, especially for Service Response, Fax Manager, Missing Reports and Administration reports, using SSRS.
• Installed new SQL Server 2005/2008 servers and monitored three client production servers for tempdb growth, wait statistics and transaction log growth; performed all aspects of administration, including data modelling, backups, replication, table partitioning and recovery.
• Involved in the design and implementation of the disaster recovery system, including log shipping.
• Created design and functional documents for the new plug-in functionality and prepared unit test cases to test it.
• Provided functional and technical details to the QA team, assisted them with unit test plans, and supported other teams on all database-related activities.
• Actively participated in sprint planning and the SCRUM process to estimate time and implement tasks.
• Used MS Visual SourceSafe and StarTeam for version control management.
Environment: MS SQL Server 2005, Windows 2K/NT/98
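A minimal sketch of the flat-file-to-staging load pattern described above; the table names, columns, file path and file format are illustrative assumptions:

-- Hypothetical staging table for call records exported by a wave box.
CREATE TABLE dbo.Stage_CallRecord
(
    CallStart   DATETIME    NULL,
    Extension   VARCHAR(10) NULL,
    TrunkId     VARCHAR(10) NULL,
    DurationSec INT         NULL
);

-- Load the flat file into staging; path and delimiters are assumptions.
BULK INSERT dbo.Stage_CallRecord
FROM 'D:\WaveBox\calls_20090301.txt'
WITH (FIELDTERMINATOR = ',', ROWTERMINATOR = '\n', FIRSTROW = 2);

-- Move only new rows into the main reporting table.
INSERT INTO dbo.CallRecord (CallStart, Extension, TrunkId, DurationSec)
SELECT s.CallStart, s.Extension, s.TrunkId, s.DurationSec
FROM   dbo.Stage_CallRecord AS s
WHERE  NOT EXISTS (SELECT 1
                   FROM dbo.CallRecord AS c
                   WHERE c.CallStart = s.CallStart
                     AND c.Extension = s.Extension);

TRUNCATE TABLE dbo.Stage_CallRecord;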
First Indian Corporation Private Limited, Bangalore
Title Data Warehouse Feb '06 – Mar '07
Designation: Software Engineer

The objective of the validation process for the First American Data Warehouse is to verify, in a timely manner, that data in the Data Warehouse correctly reflect the corresponding data in the source tables. The standard used is the metadata information contained in the ModelMart for the data warehouse. Validations cover the Data Warehouse tables for both the Title and Centex servers. Although the data being brought into the Data Warehouse ought ideally to be validated during the staging phase of the ETL process, the present processing window and the amount of data being processed do not permit this; instead, data are verified once they are in place in the Data Warehouse. The validation process begins after the ETL process is complete: count validations are run every night, and detail validations are run once a week on a rotating schedule.

Responsibilities:
• Coded database objects on SQL Server using T-SQL.
• Performance monitoring and performance tuning.
• Responsible for code review, stored procedure tuning and optimization.
• Wrote scripts to validate data between the data warehouse database and the local databases (a count-validation sketch follows this section).
• Created DTS packages for ETL.
• Created and scheduled SQL Server jobs for data validations.
• Created DQ designs and build requests for implementing changes.
• Coordinated between the on-site and offshore teams.
• Checked data quality and initiated actions in case of any discrepancies.
• Rewrote stored procedures to improve performance.
• Used MS Visual SourceSafe for version control management.
• Used MS Visio for data modeling.
Environment: MS SQL Server 2000/2005, Windows 2K/NT/98
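A minimal sketch of the nightly count-validation idea; the linked server name, table names and logging table are illustrative assumptions, not the project's actual objects:

-- Compare row counts between a source table and its warehouse copy
-- and log any mismatch for follow-up.
DECLARE @src_count BIGINT,
        @dw_count  BIGINT;

SELECT @src_count = COUNT_BIG(*)
FROM   SourceServer.TitleDB.dbo.Policy;      -- linked server reference (assumed)

SELECT @dw_count = COUNT_BIG(*)
FROM   dbo.DW_Policy;                        -- warehouse table (assumed)

IF @src_count <> @dw_count
    INSERT INTO dbo.ValidationLog (TableName, SourceRows, WarehouseRows, CheckedAt)
    VALUES ('Policy', @src_count, @dw_count, GETDATE());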
Elite Technologies, Hyderabad
Organization and Human Resource Planning (OHRP) Dec '04 – Jan '06
Designation: Developer

OHRP is designed to give control over how the HRIS system operates and makes it simple for managers to monitor their employees. This powerful application gives employees instantaneous access to selected data from their information set, freeing up valuable time to focus on more important tasks; employees have the ability to make instant changes to some of their information. In all industrialized countries, and increasingly in developing countries, computer systems are economically critical. OHRP is a solution for today's managers and human resource professionals who need an easy way to store, retrieve and safeguard employee data, and it is a strategic HR solution for organizations of all sizes. OHRP is a powerful and highly customizable HR solution that leverages the power of the web and the latest n-tier application framework to meet the changing and challenging needs of HR.

Responsibilities:
• Designed the SQL Server database.
• Created and maintained very complex stored procedures.
• Developed standard procedures for SQL Server tuning and optimization.
• Established a QA environment to analyze existing unused indexes by running the Index Tuning Wizard (ITW).
• Wrote stored procedures for monitoring, management and maintenance.
• Wrote scripts to extract data from SQL Server databases.
• Fine-tuned stored procedures to improve performance.
• Designed and implemented the indexing strategy (see the sketch after this section).
• Improved the logical and physical database design.
• Wrote stored procedures for data capture and manipulation and for use in the web search engine.
• Redesigned the existing database schema to meet standard normalization requirements.
Environment: SQL Server 2000 (including Transact-SQL triggers and stored procedures), Windows 2K/NT/98
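A minimal sketch of the kind of supporting index and search procedure implied above; the Employee table, its columns and the procedure name are hypothetical, introduced only for illustration:

-- Nonclustered index to support name lookups from the web search engine.
CREATE NONCLUSTERED INDEX IX_Employee_LastName
    ON dbo.Employee (LastName, FirstName);
GO

-- Search procedure used by the web front end; a prefix match keeps the index usable.
CREATE PROCEDURE dbo.usp_SearchEmployees
    @LastNamePrefix VARCHAR(50)
AS
BEGIN
    SET NOCOUNT ON;

    SELECT EmployeeId, FirstName, LastName, Department
    FROM   dbo.Employee
    WHERE  LastName LIKE @LastNamePrefix + '%';
END
GO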
Elite Technologies, A.P.
Smart Portal Apr '04 – Nov '04
Designation: Developer

This Web portal is an online solution for a club, association, corporation, nonprofit or any other company or organization. It combines several online applications into one easy-to-use package with which an online community, intranet or extranet can be built. The portal comes with several e-content modules, such as:
• Events – the event list shows events that occurred recently.
• Links – provides links to different areas such as science-based sites, new discoveries, etc.
• Classifieds – provides the ability to view the news posted by the administrator.
• Discussions – provides a discussion environment on various issues with the global community.
• Administration – restricted to the administrator for administrative activities.

Responsibilities:
• Involved in analysis and design of the system.
• Created the data access component to retrieve data from the database and built dynamic SQL queries for customer requests.
• Responsible for installation of the product and resolving queries at the client location.
• Wrote stored procedures, functions, database triggers and packages.
• Responsible for backup and recovery, performance tuning and maintenance of SQL Server 2000.
• Responsible for importing mission-critical data into the databases.
• Designed and created tables, triggers, views and stored procedures in SQL Server.
• Database maintenance, writing queries and generating reports from the database.
• Provided back-end support to developers for tasks such as database design and modeling, stored procedure development and T-SQL script optimization.
• Gathered database requirements and functional specifications.
Environment: SQL Server 2000, Windows 2K/NT/98

Elite Technologies, A.P.
NET SURVEY Aug '03 – Mar '04
Designation: Developer

NetSurvey is a powerful survey engine that can create different types of survey scripts which can easily be used to collect public opinion. It includes a survey script builder in which different questions can be created with different types of output to collect opinion from the public. The generated survey script can span a single page or multiple pages. It also provides the following:
Survey Statistics: monthly score of the survey, including the number of times it is viewed and the number of votes cast.
Settings: activating and deactivating the survey, resetting the score, notification modes.
Security: password protection, IP protection, cookie protection, etc.
Form Builder: generates customized templates for different types of surveys.
Results: reports results with respect to filters, etc. (graphical report, cross tabulation, data export).
Mailing: mails survey invitations to people.
Code Creator: creates the template-based ASP.NET code.

Responsibilities:
• Wrote stored procedures using Transact-SQL (T-SQL) for data access from the SQL Server 2000 database.
• Wrote triggers using Transact-SQL (T-SQL) for updating multiple tables in the SQL Server 2000 database (a trigger sketch follows this section).
• Used MS DTS (Microsoft Data Transformation Services) packages to convert legacy data in dBase 3.x into a SQL Server 2000 database.
• Interacted with the accounting staff and business managers to gather requirements and understand the business processes involved.
• Analyzed and documented the limited scope of the existing application.
• Worked extensively on database design issues on the SQL Server 2000 back end so that data are stored efficiently.
• Evaluated and implemented SQL tuning and performance modifications.
• Created and managed views.
• Designed tables and indexes and established relationships.
• Involved in writing triggers and cursors.
• Involved in exception handling.
Environment: MS SQL Server 2000, Windows 2K/NT/98
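A minimal sketch of the multi-table update trigger pattern mentioned above; the survey tables and columns are illustrative assumptions:

-- When a vote row is inserted, keep the running totals on the parent survey in sync.
CREATE TRIGGER dbo.trg_SurveyVote_Insert
ON dbo.SurveyVote
AFTER INSERT
AS
BEGIN
    SET NOCOUNT ON;

    -- Bump the vote count for every survey touched by this insert.
    UPDATE s
    SET    s.VoteCount   = s.VoteCount + i.NewVotes,
           s.LastVotedAt = GETDATE()
    FROM   dbo.Survey AS s
    JOIN  (SELECT SurveyId, COUNT(*) AS NewVotes
           FROM inserted
           GROUP BY SurveyId) AS i
      ON   i.SurveyId = s.SurveyId;
END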
PERSONAL DETAILS
Date of Birth: 04-04-1982
Age: 29 years
Marital Status: Married
Languages Known: Telugu, English

Passport Details:
Passport Number: G0556112
Type: P
Surname: VUNDAVALLI
Given Name: SIVA PRASAD
Nationality: INDIAN
Sex: M
Date of Birth: 04-04-1982
Place of Birth: MANDAPETA
Place of Issue: HYDERABAD
Date of Issue: 03-11-2006
Date of Expiry: 02-11-2016
Issuing Authority: Ministry of External Affairs, Passport Office, Hyderabad