Mark Finley has over 20 years of experience developing databases and software using technologies like SQL Server, Oracle, C#, and .NET. He has extensive experience architecting, analyzing, designing, developing, documenting, testing, deploying, and supporting complex systems. Some of his past roles include data architect at Quintiles, where he built a data warehouse and data integration systems, and senior developer/architect at MF Global, where he developed risk management applications. He is proficient in technologies like SQL Server, Oracle, SSIS, C#, ASP.NET, and Agile methodologies.
Mark Finley, CSM
329 South Cuyler Avenue
Oak Park IL 60302
312.709.9151
mfinley@finleyinfosys.com
RELEVANT EXPERIENCE:
Over 20 years' experience developing databases and software. Extensive experience
architecting, analyzing, designing, developing, documenting, testing, deploying, and
supporting complex n-layer/n-tier systems and internet applications using Scrum and
Kanban methodologies.
Understand business requirements analysis and can explain in plain language how to
implement a solution.
Quickly grasp the gravity of a situation and respond with answers that can be
implemented in days instead of months.
Built several complex databases in SQL Server and Oracle that use XML payloads as
both inputs and outputs.
Built solutions using the Web API.
Proficient with Oracle PL/SQL, including packages, stored procedures, and DDL.
Proficient in DDL, T-SQL stored procedures, and triggers in SQL Server 2000, 2005,
2008, and 2012.
SSIS including scripting, XML, Visual Studio 2005 through 2015, C# 4.5, ADO.Net,
LINQ, ASP.Net, Ajax, JavaScript, HTML.
Have used TFS, JIRA, Subversion, SourceSafe, and Rational Rose for software version
control.
Current on TopTeam and HP Quality Center for requirements management and roadmapping.
Continuously learning new skills; every day is a school day.
PRIOR EXPERIENCE:
Quintiles
o September 2009 to present
o Data Architect: SQL Server, Oracle, SSIS, T-SQL, and C# developer
o .Net 4.5, WPF, WCF, Cloud, TFS, ETL, RESTful services, Web API
Private clients
o April 2009 to September 2009
o ASP.Net 3.5, Ajax, and Flash
MF Global
o March 2008 to March 2009
o Sr. Developer and Architect
o C#, ASP.Net 3.5, SQL Server 2005, Ajax, JavaScript, XML, DevExpress tools, Agile
ULTA Inc.
o September 2007 to February 2008
o Architect, SQL Server 2000
Private clients
o August 2006 to August 2007
o Architect, Sr. Developer
o C#, ASP.Net 2.0, SQL Server 2005, DevExpress tools, WinForms
o Scrum projects
Honeywell HomeMed Inc.
o June to August 2006
o Architect
o C#, ASP.Net 2.0, SQL Server 2005, SQL Server assemblies
o Scrum project
Exelon Corp
o September 2004 to January 2006:
o Architect, Sr. Developer, team lead
o SQL Server, Access, Excel, C#, ASP.Net 1.1, Winforms
Other companies and history available upon request.
Quintiles
Data Architect
September 2009 to Present
Modeled and developed a data warehouse for OMOP (Observational Medical Outcomes
Partnership) from various sources. This was a data model built from the ground up
following the guidelines at http://omop.org/CDM. We architected the model in Oracle,
then populated it from a non-OMOP source, converting the data into the new model.
The first pass took about a month to load the data. After applying indices and hints
and creating a new batching process, we drove the insert time to under one week.
We developed a request batch process designed to restart at the point of failure
rather than at the beginning. We batch 200,000 persons with their related data at a
time. An error during one batch affects only that batch; the other batches continue
unless the error is severe enough to stop the whole process. We fix the error and
restart at the failed batch. We recently had a hardware failure causing a delay; the
system is robust enough that we simply restarted at the point of the failure.
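The restart-at-failure design can be sketched as follows. This is an illustrative
Python sketch, not the production process; `run_batches`, `load_batch`, and the
checkpoint convention are hypothetical names, and the real system was implemented
in the database layer.

```python
BATCH_SIZE = 200_000  # persons per batch, as described above

def run_batches(person_ids, load_batch, checkpoint=0):
    """Process persons in fixed-size batches, resuming from `checkpoint`.

    An error in one batch stops processing at that batch only; rerunning
    with the returned checkpoint restarts at the point of failure rather
    than at the beginning.
    """
    batches = [person_ids[i:i + BATCH_SIZE]
               for i in range(0, len(person_ids), BATCH_SIZE)]
    for index in range(checkpoint, len(batches)):
        try:
            load_batch(batches[index])
        except Exception:
            return index  # resume here on the next run
    return len(batches)  # all batches completed
```

A rerun after a failure passes the returned index back in as `checkpoint`, so only
the failed and remaining batches are reprocessed.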
We also created a set of queries to gather information over a period of time. This
windowing system aggregates the same condition recorded for the same person over a
period of time, so the condition is treated as a single era rather than several
separate occurrences in the data.
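The era-building idea can be sketched as an interval merge over one person's records
for one condition. This is an illustrative Python sketch; the 30-day persistence gap
is an assumed example value, not necessarily the window the warehouse used.

```python
from datetime import date, timedelta

def build_eras(records, gap_days=30):
    """Collapse dated occurrences of one condition for one person into eras.

    `records` is a list of (start_date, end_date) tuples; occurrences whose
    gap is at most `gap_days` apart are merged into a single continuous era.
    The 30-day default is illustrative only.
    """
    eras = []
    for start, end in sorted(records):
        if eras and start - eras[-1][1] <= timedelta(days=gap_days):
            # Within the persistence window: extend the current era.
            eras[-1] = (eras[-1][0], max(eras[-1][1], end))
        else:
            # Gap too large: this occurrence starts a new era.
            eras.append((start, end))
    return eras
```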
Another feature of this system is automated testing and validation. With the test
team, we developed a system to ensure the records from the source were inserted
without loss into the new OMOP warehouse. We automated this process so the tests run
during the batch process at the appropriate time.
The previous project was to create a data mart that pulled data from multiple
sources and output to several different pipes. It was built in SQL Server 2012. The
data mart consists of several databases, each with a specific function. The most
interesting took common functionality from the feeding sources; each source came
from a separate entity, and no entity had ownership ties with my company. I found
three threads that each entity's data feeds had in common and created master threads
for each area. I used my company's model for the column naming convention to make
internal feeds easier, and created mapping tables for the other entities, making the
feed pipes easier to translate. The data tracked clinical trial outcomes for the
drug industry.
I architected and developed a software tool called The Data Steward. This tool
displays data from shared databases and publishes it to the correct server using the
Web API. The Data Steward handles security via Active Directory; clients can access
only the data they are authorized for across several different sources. An XML
payload is built on the fly and sent to the controlling database (the controller).
The controller runs SSIS packages and creates the dataset on the server the user has
rights to. The Data Steward then displays to the user a report of the data
transferred.
I wrote an executable, run daily, that polls the Clinical Trials government web
site, commonly called CT Gov, for changes in the last number of days (as set by the
administrative interface) and updates the controlling database with the detail. The
controller then publishes an XML file to subscribing databases. The subscribers
compare the XML file to their data and return an XML document to the controlling
database with a list of ids that have changed or are new. The controller returns a
third XML document containing just the changes the subscriber requested. Thus, if a
database goes offline, it is fully refreshed with whatever it missed while it was
down. If the controller goes down, it recovers the latest and changed files from the
time it went down until it is recovered. This is self-healing at its best. Currently
there are 4 SQL Server and 3 Oracle subscribers to the controller.
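The publish/compare/patch exchange can be sketched with dictionaries standing in for
the XML payloads. This is an illustrative Python sketch with hypothetical names; the
real system exchanged XML documents between the controller and subscriber databases.

```python
def subscriber_diff(published, local):
    """Second document: ids the subscriber is missing or holds stale.

    `published` and `local` map a trial id to a record version (a hash or
    last-updated stamp); this stands in for the XML comparison step.
    """
    return sorted(tid for tid, version in published.items()
                  if local.get(tid) != version)

def controller_patch(full_data, requested_ids):
    """Third document: only the records the subscriber asked for."""
    return {tid: full_data[tid] for tid in requested_ids}
```

Because the diff is computed from whatever the subscriber currently holds, a
database that was offline simply reports a larger set of stale ids and is refreshed
in full for what it missed.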
Built CLR SQL median and percentile functions for SQL Server 2008.
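The logic such aggregates supply, median and interpolated percentile, can be
sketched in plain Python; the actual implementation was a CLR assembly registered in
SQL Server 2008, which at that time had no built-in median aggregate.

```python
def median(values):
    """Midpoint of the sorted values; the mean of the middle pair when the
    count is even."""
    ordered = sorted(values)
    n = len(ordered)
    mid = n // 2
    if n % 2:
        return ordered[mid]
    return (ordered[mid - 1] + ordered[mid]) / 2

def percentile(values, p):
    """Inclusive percentile with linear interpolation between neighboring
    ranks (the PERCENTILE_CONT convention), for p in [0, 1]."""
    ordered = sorted(values)
    rank = p * (len(ordered) - 1)
    lower = int(rank)
    frac = rank - lower
    if frac == 0:
        return ordered[lower]
    return ordered[lower] + frac * (ordered[lower + 1] - ordered[lower])
```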
In addition to all my other duties, I maintained the support ticket system and
coordinated our teams' efforts with DBAs, change management teams, and other IT
support organizations.
MF Global
Data Architect
March 2008 to March 2009:
Completed several projects, all using n-layer design patterns and agile
methodologies. The projects mostly centered on handling risk for MF Global's trading
and options business and included web sites, dynamic link libraries (DLLs), and
WinForms applications. Projects were built with the .Net Framework 3.5, C# 3.0,
Ajax, and the DevExpress controls and components for .Net. I used the factory
pattern to create entity classes (one per table) and collection classes (one per
entity class) for CRUD operations. All projects were built using OOD methodologies,
Visual Studio 2008, SQL Server 2005, C# 3.0, and the .Net Framework 3.0 or 3.5.
Built web sites using Microsoft .Net, Ajax, and SQL Server 2005. These web sites
focused on mitigating risk to MF Global: they displayed MF Global's exposure to
counterparties, both at the start of day and for intraday trades. The intraday data
refreshed every 5 minutes, and warning and caution flags display when a counterparty
exceeds its notional limit.
Created a reusable DLL in C# for any application's data layer communicating with SQL
Server. The helper classes wrapped CreateParameter so that the method signature
chose the correct data type, and a class of methods takes a value from SQL Server
and returns the value type C# expects, preventing the return of DBNull, which can
throw an exception. This DLL allowed the company to create a standard portal for
Risk Management and loosely couple the various sites onto one standard.
Developed, designed, and deployed an order tracking system for the IT department.
The UI is web based and the data is stored on SQL Server 2005. The design was
originally based on the beta ASP.Net MVC provided by Microsoft; I modified it to
make extensive use of data repeater controls for CRUD as required. MF Global has
contract relationships with some of its brokers, and these relationships carry
business rules requiring the IT costs concerning those brokers to be prorated
between brokers and cost centers. Allocation therefore became critical and the focal
point of a successful deployment. We solved the issue with a combination of
many-to-many relationships in the database design and use of the repeater grid
control.
Wrapped the functionality of the Jira web service into a standard reusable DLL so
that users could create automated issues on the fly. Once an issue is created, it
can be reused many times: users can now insert, update, and close one issue or
multiple related issues from one web form in one pass, instead of opening and
closing several related issues individually. MF Global uses Jira to track ongoing IT
issues such as database maintenance, FTP maintenance, etc.
Wrote several file-to-database conversion projects using C# and SSIS. The files load
into an FTP directory from various sources. When files appear in the directory, an
event fires in the service monitoring the directory, and the appropriate executable
is launched to move the data in the file to the appropriate database. The
information is used by the Risk-Look portal.
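The dispatch step can be sketched as a loop that routes newly arrived files to
per-source handlers. This is an illustrative Python sketch with hypothetical names
(a prefix-keyed handler table); the real service was event-driven and launched
per-feed executables rather than in-process functions.

```python
import os

def dispatch_new_files(directory, seen, handlers):
    """Scan `directory` and route each not-yet-seen file to a handler
    chosen by filename prefix. `seen` is the set of already-processed
    file names; `handlers` maps a prefix to a callable taking a path.
    """
    loaded = []
    for name in sorted(os.listdir(directory)):
        if name in seen:
            continue
        seen.add(name)
        for prefix, handler in handlers.items():
            if name.startswith(prefix):
                handler(os.path.join(directory, name))
                loaded.append(name)
                break
    return loaded
```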
Built several applications for various clients from 1994 to 2008 using the latest
SQL Server and the equivalent Visual Studio language of the day, including C# and
Visual Basic 2.0 through 6.0. Achievements will be provided upon request.
Other Employment History:
FNMC (Fannie Mae) February 1992 – February 1994
Lender Representative
Ensured lenders complied with Fannie Mae guidelines when selling mortgages to FNMC.
Boy Scouts of America April 1986 – November 1990
Senior District Executive
Led over 400 volunteers in achieving goals that significantly exceeded expectations.
Directed summer camps successfully, always under budget, and received the highest
camping rating from every inspection team.
United States Army May 1978 – November 1984
Captain
Successfully commanded a 123-member unit responsible for Basic Rifle Marksmanship.
TRAINING:
Scrum Master with Ken Schwaber:
Two-day course covering Scrum development as an agile methodology:
how successive sprints deliver software that works and is documented,
debugged, and tested, developed faster and better than with the traditional
waterfall methodology.
Self-taught in everything SQL and C#.
HIGHER EDUCATION:
Bachelor of Science, Missouri Western State College, St. Joseph, MO, 1978.
Accounting and related business courses, University of Louisville, Louisville, Kentucky, 1984.
Work toward an MS in Accounting, UMKC, Kansas City, MO, 1985.