Christine Misseri has over 25 years of experience as a software developer and database professional working with various applications and packages in industries such as insurance, finance, aerospace, healthcare, and utilities. She has expertise in technologies including C#, SQL Server, SSIS, Oracle, Microsoft Office, and Visual Studio. The document provides details on her qualifications, technical skills, accomplishments, and work experience developing and maintaining databases and applications.
The world of data architecture began with applications. Next came data warehouses. Then text was organized into a data warehouse.
Then one day the world discovered a whole new kind of data that was being generated by organizations. The world found that machines generated data that could be transformed into valuable insights. This was the origin of what is today called the data lakehouse. The evolution of data architecture continues today.
Come listen to industry experts describe this transformation of ordinary data into a data architecture that is invaluable to business. Simply put, organizations that take data architecture seriously are going to be at the forefront of business tomorrow.
This is an educational event.
Several of the authors of the book Building the Data Lakehouse will be presenting at this symposium.
Whitepaper: Volume Testing Thick Clients and Databases (RTTS)
Even in the current age of cloud computing, there are still many benefits to developing thick client software: independence from browser versions, offline support, low hosting fees, and use of existing end-user hardware, to name a few.
It's more than likely that your organization is utilizing at least a few thick client applications. Now consider this: as your user base grows, does your thick client's back-end server need to grow as well? How quickly? How do you ensure that you provide the right amount of additional capacity without overstepping and unnecessarily eating into your profits? The answer is volume testing.
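The idea behind volume testing can be sketched in a few lines. The following is an illustrative example only, not RTTS's actual methodology (which uses IBM Rational Performance Tester): it measures how a representative query's latency grows as the backing table grows, using an in-memory SQLite database as a stand-in for a thick client's back-end server.

```python
# Illustrative volume-test sketch: grow the data volume in steps and time a
# representative query at each step, so capacity needs can be forecast.
import sqlite3
import time

def measure_latency(conn, volume):
    """Load `volume` additional rows, then time a representative query."""
    conn.executemany(
        "INSERT INTO orders (customer_id, amount) VALUES (?, ?)",
        ((i % 1000, i * 0.01) for i in range(volume)),
    )
    start = time.perf_counter()
    conn.execute(
        "SELECT customer_id, SUM(amount) FROM orders GROUP BY customer_id"
    ).fetchall()
    return time.perf_counter() - start

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE orders (customer_id INTEGER, amount REAL)")

results = {}
for volume in (10_000, 50_000, 100_000):  # simulated user-base growth
    results[volume] = measure_latency(conn, volume)
    print(f"{volume:>7} more rows -> query took {results[volume]:.4f}s")
```

Plotting latency against volume in this way shows when (and how sharply) response time degrades, which is the evidence needed to size additional capacity without over-provisioning.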
Read how RTTS does this with IBM Rational Performance Tester.
Scaling Databricks to Run Data and ML Workloads on Millions of VMs (Matei Zaharia)
Keynote at Scale By The Bay 2020.
Cloud service developers need to handle massive scale workloads from thousands of customers with no downtime or regressions. In this talk, I’ll present our experience building a very large-scale cloud service at Databricks, which provides a data and ML platform service used by many of the largest enterprises in the world. Databricks manages millions of cloud VMs that process exabytes of data per day for interactive, streaming and batch production applications. This means that our control plane has to handle a wide range of workload patterns and cloud issues such as outages. We will describe how we built our control plane for Databricks using Scala services and open source infrastructure such as Kubernetes, Envoy, and Prometheus, and various design patterns and engineering processes that we learned along the way. In addition, I’ll describe how we have adapted data analytics systems themselves to improve reliability and manageability in the cloud, such as creating an ACID storage system that is as reliable as the underlying cloud object store (Delta Lake) and adding autoscaling and auto-shutdown features for Apache Spark.
Leveraging HPE ALM & QuerySurge to test HPE Vertica (RTTS)
Are you using HPE ALM or Quality Center (QC) for your requirements gathering and test management?
RTTS, an alliance partner of HPE and a member of HPE’s Big Data community, can show you how to use ALM/QC and RTTS’ QuerySurge to effectively manage your data validation & testing of Vertica (or any data warehouse).
In this webinar video you will see:
- a custom view of ALM to store source-to-target mappings
- data validation tests in QuerySurge
- the execution of QuerySurge tests from ALM
- the results of data validation tests stored in ALM
- custom ALM reports that show data validation coverage of Vertica
- how we improve your data quality while reducing your costs & risks
Presented by:
Bill Hayduk, Founder & CEO of RTTS, the developers of QuerySurge
Chris Thompson, Senior Domain Expert, Big Data testing
To learn more about QuerySurge, visit www.QuerySurge.com
Implementing Azure DevOps with your Testing Project (RTTS)
Implementing Azure DevOps With Your Testing Project
Are you challenged with different teams working on different platforms making it difficult to get insight into another team’s work?
Is your team seeking ways to automate the code deployments so you can spend more time developing new features and writing more tests, and spend less time deploying and running manual tests?
RTTS, a Microsoft Gold DevOps Partner, will take you through solving these challenges with Azure DevOps.
Tuesday, June 16th 2020 @11am ET
Session Overview
------------------------------------
During the webinar, we will walk you through the following process of utilizing Azure DevOps:
- The challenges that inspired the Azure DevOps solution (challenges you may be experiencing as well)
- The strategy for implementing Azure DevOps
- Solutions built into our everyday processes to increase efficiency and save time
- A demo of an Azure DevOps environment for testing teams
To see a recording of the webinar, please visit:
https://www.youtube.com/watch?v=2vIic3wxaS4
To learn more about RTTS, please visit:
https://www.rttsweb.com
QuerySurge Slide Deck for Big Data Testing Webinar (RTTS)
This is a slide deck from QuerySurge's Big Data Testing webinar.
Learn why Testing is pivotal to the success of your Big Data Strategy.
Learn more at www.querysurge.com
The growing variety of new data sources is pushing organizations to look for streamlined ways to manage complexities and get the most out of their data-related investments. The companies that do this correctly are realizing the power of big data for business expansion and growth.
Learn why testing your enterprise's data is pivotal for success with big data, Hadoop and NoSQL. Learn how to increase your testing speed, boost your testing coverage (up to 100%), and improve the level of quality within your data warehouse - all with one ETL testing tool.
This information is geared towards:
- Big Data & Data Warehouse Architects
- ETL Developers
- ETL Testers, Big Data Testers
- Data Analysts
- Operations teams
- Business Intelligence (BI) Architects
- Data Management Officers & Directors
You will learn how to:
- Improve your Data Quality
- Accelerate your data testing cycles
- Reduce your costs & risks
- Provide a huge ROI (as high as 1,300%)
Data Warehousing in Pharma: How to Find Bad Data while Meeting Regulatory Req... (RTTS)
In the U.S., pharmaceutical firms must meet electronic record-keeping regulations set by the Food and Drug Administration (FDA). The regulation is Title 21 CFR Part 11, commonly known as Part 11.
Part 11 requires regulated firms to implement controls for software and systems involved in processing many forms of data as part of business operations and product development.
Enterprise data warehouses are used by the pharmaceutical and medical device industries for storing data covered by Part 11. QuerySurge, the only test tool designed specifically for automating the testing of data warehouses and the ETL process, is the market leader in testing data warehouses used by Part 11-governed companies.
For more on QuerySurge and Pharma, please visit
http://www.querysurge.com/solutions/pharmaceutical-industry
Query Wizards - data testing made easy - no programming (RTTS)
Fast and easy. No Programming needed. The latest QuerySurge release introduces the new Query Wizards. The Wizards allow both novice and experienced team members to validate their organization's data quickly with no SQL programming required.
The Wizards provide an immediate ROI through their ease-of-use and ensure that minimal time and effort are required for developing tests and obtaining results. Even novice testers are productive as soon as they start using the Wizards!
According to a recent survey of Data Architects and other data experts on LinkedIn, approximately 80% of columns in a data warehouse have no transformations, meaning the Wizards can test all of these columns quickly & easily. (The columns with transformations can be tested using the QuerySurge Design Library with custom SQL coding.)
There are 3 Types of automated Data Comparisons:
- Column-Level Comparison
- Table-Level Comparison
- Row Count Comparison
There are also automated features for filtering (‘Where’ clause) and sorting (‘Order By’ clause).
The Wizards provide both novices and non-technical team members with a fast & easy way to be productive immediately and speed up testing for team members skilled in SQL.
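The three comparison types above all reduce to running matched queries against source and target and comparing the result sets. The following is a minimal sketch of that idea (not QuerySurge itself, which automates this without code), using sqlite3 as a stand-in for the source and target data stores:

```python
# Sketch of the three automated data-comparison types: row count,
# column-level, and table-level, each comparing source vs. target.
import sqlite3

conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE source (id INTEGER, name TEXT);
    CREATE TABLE target (id INTEGER, name TEXT);
    INSERT INTO source VALUES (1, 'Ann'), (2, 'Bob');
    INSERT INTO target VALUES (1, 'Ann'), (2, 'Bob');
""")

def row_count_match(table_a, table_b):
    # Row Count Comparison: do the two tables hold the same number of rows?
    a = conn.execute(f"SELECT COUNT(*) FROM {table_a}").fetchone()[0]
    b = conn.execute(f"SELECT COUNT(*) FROM {table_b}").fetchone()[0]
    return a == b

def column_match(table_a, table_b, column, key):
    # Column-Level Comparison: compare one column, sorted by a key
    # (the 'Order By' feature mentioned above).
    q = "SELECT {c} FROM {t} ORDER BY {k}"
    a = conn.execute(q.format(c=column, t=table_a, k=key)).fetchall()
    b = conn.execute(q.format(c=column, t=table_b, k=key)).fetchall()
    return a == b

def table_match(table_a, table_b, key):
    # Table-Level Comparison: compare entire rows, sorted by a key.
    a = conn.execute(f"SELECT * FROM {table_a} ORDER BY {key}").fetchall()
    b = conn.execute(f"SELECT * FROM {table_b} ORDER BY {key}").fetchall()
    return a == b

print(row_count_match("source", "target"))
print(column_match("source", "target", "name", "id"))
print(table_match("source", "target", "id"))
```

A 'Where'-clause filter would simply be appended to each query before comparison; the point of the Wizards is that none of this SQL has to be written by hand.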
Trial our software either as a download or in the cloud at www.QuerySurge.com. The trial comes with a built-in tutorial and sample data.
Best Laid Plans: Saving Time, Money and Trouble with Optimal Forecasting (Eric Kavanagh)
Expectations have changed. That's true for users, executives and customers alike. There's no time for systems running slowly, or cost overruns. That's why fundamentals like capacity planning have become mission-critical. By paying attention to the details, and doing effective forecasts, companies can optimize their information architecture, keeping everyone happy. Register for this episode of Hot Technologies to learn from veteran Analysts Dr. Robin Bloor and Rick Sherman, who will offer insights about how and why to do capacity planning. They'll be briefed by Bullett Manale of IDERA, who will explain how his company's SQL Diagnostic Manager can track a wide range of usage metrics that can be used for accurate forecasting.
Creating a Data Validation and Testing Strategy (RTTS)
Creating A Data Validation & Testing Strategy
Are you struggling with formulating a strategy for how to validate the massive amount of data continuously entering your data warehouse or data lake?
We can help you!
Learn how RTTS’ Data Validation Assessment provides:
- an evaluation of your current data validation process
- recommendations on how to improve your process and
- a proposal for successful implementation
This slide deck addresses the following issues:
- How do I find out if I have bad data?
- How do I ensure I am testing the proper data permutations?
- How much of my data needs to be validated and automated?
- Which critical data endpoints need to be tested?
- How do I test data in my cloud environments?
And much more!
For more information, visit:
https://www.rttsweb.com/services/solutions/data-validation-assessment
Testing Big Data: Automated Testing of Hadoop with QuerySurge (RTTS)
Are You Ready? Stepping Up To The Big Data Challenge In 2016 - Learn why Testing is pivotal to the success of your Big Data Strategy.
According to a new report by analyst firm IDG, 70% of enterprises have either deployed or are planning to deploy big data projects and programs this year due to the increase in the amount of data they need to manage.
The growing variety of new data sources is pushing organizations to look for streamlined ways to manage complexities and get the most out of their data-related investments. The companies that do this correctly are realizing the power of big data for business expansion and growth.
Learn why testing your enterprise's data is pivotal for success with big data and Hadoop. Learn how to increase your testing speed, boost your testing coverage (up to 100%), and improve the level of quality within your data - all with one data testing tool.
Christine Misseri
28 Evergreen Rd, Avon CT 06001
(T) 860-404-5171 (C) 860-305-2868 (E) cmisseri1818@att.net
Career Overview
I am a software developer and database developer with over twenty-five years of experience as a software/programming and database professional, working with a variety of applications and packages in the insurance, financial, aerospace, hospital, human resource and electrical grid industries. I have the ability to prioritize and meet operational deadlines in a demanding environment, with strong communication, organizational, technical and problem-solving skills.
Qualifications:
MS Visual Studio 2015, Visual SourceSafe, MS-DOS, Oracle 8i - Oracle 9i, Gupta, SQL Server Developer, Microsoft Fax Software, Toad, Microsoft Mail Service Pack 6A, Oracle Enterprise Manager, Norton Utilities, PL/SQL Developer 6.05, Lotus Notes, PowerBuilder 9.0 - 11, Focus, Office 97 - 2015 products, Visio 5.0, Windows 3.1 - Windows 8.1, Quest, Visual Basic 3.0 - VB 6.0, SQL Wintalk, Windows NT 4.0, QuickBasic, Visio Professional 2000, COBOL, ALC, C, Fortran, Microsoft C++, Centura SQL, SmartDraw version 6.11, Data Junction, Micrografx FlowCharter 7, SQL Talk, Erwin, Microsoft Project 97, SAS, PC-DOS, OS/2, Perl, Sybase, Embarcadero, Novell GroupWise, SmartStream 7.0 - 8.0, RapidSQL, OneShield Dragon, Quest Software - SQL Navigator, Oracle 11, Citrix, VPN Client - Cisco Systems, PeopleSoft, Jython, Java, Python, SQL Server 2012 Database Diagrams, DataEase 4.2.
Technical Skills:
Skill                     Experience     Total Years   Last Used
C#                        Intermediate   1             2016
SQL                       Expert         26            2016
SQL Server                Expert         26            2016
SSIS                      Expert         5             2016
SSRS                      Intermediate   1             2016
Oracle 11                 Expert         15            2016
Microsoft Excel 2010      Expert         30            2016
Microsoft Word 2010       Expert         30            2016
Microsoft Access 2010     Expert         30            2016
Microsoft Visual Studio   Expert         5             2016
VBA                       Expert         30            2016
C                         Intermediate   5             1997
Visual Basic 6.0          Intermediate   5             2002
.Net                      Intermediate   1             2005
Perl                      Intermediate   1             2013
Sybase                    Intermediate   2             2014
RapidSQL                  Expert         2             2014
Embarcadero               Intermediate   2             2014
PeopleSoft                Intermediate   2             2014
Java                      Intermediate   1             2014
Jython                    Intermediate   1             2014
Accomplishments:
Database Design:
• Created masking on Social Security Numbers using SQL Server 2016.
• Using C# and Visual Studio, created a functioning application to convert Excel spreadsheets to text format for the company's cash account information.
• Using Microsoft Visual Studio 2016 and SSIS, created a functional back-end database to populate tables.
• Created stored procedures to populate the front-end application and its reports.
• Using SQL Server 2012, created a new database diagram for building the company's new database and application.
• Created stored procedures to export stored data from the DataEase application into newly created SQL Server tables.
• Developed ETL maintenance and development techniques for database loads.
Quality Assurance
• Streamlined the QA process to increase efficiency and reduce new product roll-out time by one week.
• Created and streamlined stored procedures in DataEase to efficiently extract data.
• Streamlined SSIS to push data extracted from DataEase into SQL Server tables.
• Streamlined the SSIS process to correctly insert data into re-created tables each time the package is run.
Network Security
• Updated network systems to support confidential company operations and guard against hacking.
• Restarted the legacy system and replenishment jobs at night when the legacy system stalled.
Programming
• Increased revenues by 12% by developing code for accounting tasks, including cost estimation and revenue generation.
• Created queries in the Oracle and SQL Server applications to check electrical grid information.
• Completed 2 PeopleSoft courses covering PeopleSoft Tools and the PeopleSoft query system.
Analysis and Development
• Resolved core issues through the redesign of the broker data model, data householding processes and daily round-trip database synchronization.
• Set new standards for virtually 100% code coverage, while piloting SQL Designer.
• Resolved issues with the legacy system when the system and jobs stalled.
Product Development
• Worked on scalability and optimization of the production environment.
Work Experience:
10/10/2015 – Present Benistar Admin Services, Inc., Avon, CT
• Created a functioning application to convert Excel spreadsheets to text format for the company's cash account information, using C# and Visual Studio.
• Using SQL Server 2012, created functional tables to populate a front-end web application.
• Using Visual Studio 2015, created SSIS packages to populate the newly created tables.
• Using SQL Server 2012, created stored procedures to populate the SSRS reports.
• Using Visual Studio 2015, created SSRS reports from the stored procedures.
• Created stored procedures in DataEase to export data from the old database.
• Wrote queries and SSIS packages to import and scrub data from the DataEase stored procedure exports into the new SQL Server back-end tables.
• Created stored procedures to mimic the old report procedures in DataEase.
11/03/2014 – 05/29/2015 JVT Advisors, New England Business Center Andover, MA
Software Developer for ISO New England
• Created functional Access 2007 databases for reporting ISO New England's information to upper management.
• Used Oracle 11g and SQL Server to create pass-through queries to retrieve the data that populates the generated reports.
• Designed Access 2007 reports and applications for course registration and for data mining information comparing the use and functions of the electrical grid.
• Created 3 application databases with reports in the time allotted for the project.
• Used object-oriented design/programming to design a new stand-alone application.
• Created a database for a data dump in Access 2007 using Oracle and SQL queries and procedures.
• Created a functional Access database and reports to report ISO New England's megawatt usage and outages for managers and upper managers.
10/10/2012 – 10/31/2014 PDS Tech, Roberts St East Hartford, CT
Database Developer for Hartford Hospital
• Converted Human Resources Access databases from 2000 - 2003 to 2010.
• Created a new Access database using the Sybase 8.0 system for Human Resources.
• Converted Purchasing's Access 2000 - 2003 databases to 2010.
• Created stored procedures to run the Perl programs that capture data.
• Created Perl programs to capture data and send it to clients through an FTP system.
• Used SmartStream 7.0 - 8.0 to connect to the Oracle database to capture data and create reports in Access 2010.
• Used RapidSQL 7.6.4 to create other stored procedures to capture data and send it to the Access programs.
• Tested the new PeopleSoft web application to make sure accrual and payroll data was calculated correctly; found and reported problems in the calculations.
• Created queries using the new PeopleSoft query manager to test data input by the developers, making sure the payroll system pulls information correctly.
• Checked the legacy system to make sure the manufacture reports run correctly from the stored procedures and SmartStream; when reports failed to run, restarted the polls that the stored procedures run from.
• Helped with the new product help desk, assisting customers with login and navigation problems.
07/16/2012 – 10/09/2012 MODIS, Great Meadow Rd #601 Wethersfield, CT
Database Developer for Allied World
• Developed several applications in OneShield Dragon for a new project; finished the project before leaving, with minimal help, using my experience in SQL and application development to master OneShield Dragon.
• Using SQL Navigator 5.5, created stored procedures to populate data into the applications as a back end.
10/01/2011 – 6/1/2012 GeBBS Englewood Cliffs, NJ
Database Developer for NASCO
• Developed an Access 2007 database that houses Blue Cross Blue Shield of Massachusetts data from their mainframe.
• This database creates BLIDs, which are then sent up to Benefact; the data is placed in the proper fields in the Benefact system.
• Created modules and procedures to create the BLID output.
• Completed NASCO courses on Testing and Certification (NLC classes), which are mainframe classes using COBOL and ALC.
• Completed a course in Benefact training.
• Worked with ALC/COBOL to dump and check flags on the mainframe system.
02/2009 - 09/2011 Housing Authority Insurance Group Cheshire, CT
Database Developer
• Developed stored procedures, triggers, new tables and SSIS packages to automatically store data for the reporting system.
• Maintained and developed SQL stored procedures for financial reports.
• Created PowerBuilder interfaces for the customer to run and print out reports using Excel.
• Developed PowerBuilder interfaces for Claims, Marketing, Finance and Risk to help the customer store information in the cloud and reporting systems.
01/2007 - 12/2008 Insurity Hartford, CT
Database Developer
• Developed stored procedures, new tables and triggers using SQL Server 2005 and 2008.
• Created new tools and new technology to improve the database process, achieving quality and ease of use of the application.
11/2006 - 12/2006 Sapphire Technologies Consultant East Hartford, CT
Applications Developer for Aetna
• Developing Access applications using VBA code to automate the client applications.
• Creating database schema using Rational Rose software.
• Single-handedly developed a process to improve workflow and communication between in-house
departments.
09/2006 - 10/2006 On Line Systems Consultant Farmington, CT
Application Developer for the Department of Health State of Connecticut
• With only a month to improve the system, created automated ad hoc reports and an Access
application database.
• Used Toad and Enterprise Manager to fix Oracle views, triggers, and stored procedures so
the application ran more efficiently.
06/2006 - 08/2006 Web MD Avon, CT
Database Developer
• Developed and implemented in-house source organization system, increasing efficiency by 10%.
• Fixed applications to improve efficiency.
• Used a wide variety of techniques and tools in the development and unit testing of software
applications, including object-oriented programming languages and software tools, analysis
and design techniques, database management systems, middleware, and data access and
retrieval programs.
09/2005 - 06/2006 Universal Design Consultant Hartford, CT
Application Developer for Pratt and Whitney
• Used a wide variety of techniques and tools in the development and unit testing of software
applications, including object-oriented programming languages and software tools, analysis
and design techniques, database management systems, middleware, and data access and
retrieval programs.
• Developed Oracle stored procedures, triggers, and views linking information from the Oracle
database to the Access application.
• Developed a vendor-share Access application; worked with clients to analyze computing and
network needs and installed appropriate solutions within each organization's budget.
• Provided daily support to ensure company staff had the tools necessary to perform tasks efficiently.
10/2002 - 12/2003 Sapphire Technologies Consultant East Hartford, CT
Application Developer for One Beacon Professional Partners
• Developed submission and claims databases to store medical information.
• Developed VB and VBA bound and unbound controls in the application using grids, combo
boxes, drop-down boxes, and text boxes.
• Automated the import and export of information between Access forms and Excel, returning the
massaged data to the Access application to store new information for reporting.
• Created hyperlinks to open and display reports in a web-like environment.
• Created security in the application using .NET.
06/2005 - 08/2005 RCG Glastonbury, CT
Applications Developer for Webster Bank
• Repaired and maintained the financial database, which had been calculating and reporting
incorrect amounts.
• Created new reports with new designs and new technology using Access.
03/2004 - 05/2005 Universal Staffing Hartford, CT
Application Developer for Pratt and Whitney
• Implemented company policies, technical procedures and standards for preserving the
integrity and security of data, reports and access.
• Prepared technical architecture proposals for enhancements and integration of existing third party
software systems with SAP.
• Proposed technical feasibility solutions for new functional designs and suggested options
for performance improvement of technical objects.
• Led architectural design effort to migrate existing MS SQL OLTP dependent reporting
environment to enterprise-class BI platform (Business Objects XI 3.0).
• Streamlined and enhanced the corporate accounting and operations system.
06/2002 - 09/2002 Connecticut Staffing Consultant Hartford, CT
Applications Developer for DST Output
• Developed a Time Tracking application to store and retrieve information from SQL Server.
• Developed VB bound and unbound controls in the Time Tracking program using grids, data
combo boxes, drop-down boxes, text boxes, and masked edit boxes.
• Used SQL Server 7.0 to connect to a database storing and retrieving information for the
Maintenance and Time Tracking System programs through VB 6.0.
• Fixed and enhanced the Access 97 STAT system.
01/2001 - 12/2001 Microtech Consultant Simsbury, CT
Applications Developer for The Board Of Education Services for the Blind
• Developed and designed a Maintenance application to store and retrieve information from SQL
Server. Developed VB bound and unbound controls in the Maintenance program using grids,
data combo boxes, drop-down boxes, and text boxes.
• Used SQL Server 7.0 to connect to a database storing and retrieving information for the
Maintenance and Client Management System programs.
• Enhanced and fixed the Client Management System (CMS), including repairing Sheridan
controls and other problems within the system.
• Developed and designed reports in VB 6.0 within the Client Management System, and designed
reports using Seagate Crystal Reports version 7 for the Client Management and Requisitions
systems.
• Wrote views in SQL Server 7.0 to support the Seagate Crystal reports as stand-alone/detached
reports.
• Used VB 6.0 to automate the reports from users' machines.
Professional Development:
1983 Asnuntuck Community College, Enfield, CT, United States
Associate degree, Computer Science
Studied BASIC, COBOL, and FORTRAN application development.
2010 New Horizons, Waterbury, CT, United States
SQL Server 2005 Integration Services
• Professional training in SQL Server 2005 Integration Services for SSIS packaging.
Continuing Education:
Completed a course in PeopleSoft Tools - 2014.
Completed a course in PeopleSoft Query - 2014.
Completed a course in Microsoft Office 2007 - 2010.
Completed Testing and Certification NLC courses (COBOL and ALC) and Claims training.
Completed an in-house training course in PowerBuilder - 2009.
Completed courses in Visual Basic 3.0 through 6.0, along with ActiveX - 1999.
Completed a C++ course at CCSU - 1996.