● Who is CodeScience?
● Introductions
● Let’s Define “Test Data”
● Methods of Loading Test Data
● Recap
Agenda
● Founding partner of the Salesforce Product
Development Organization (PDO) Program since
2008 - named Master PDO in 2017
● PDO Program provides app development
services to ISVs for the Salesforce AppExchange
● Partner with clients in various industries to
assist in building over 220 apps on the
AppExchange
● From design to build to implementation, we
support through the full lifecycle
Who is CodeScience?
Client success: 10% of the AppExchange
CodeScience client focus:
Today’s Speakers:
Bobby Tamburrino
Lead Salesforce Developer
CodeScience
Let’s Define “Test Data”
What Do We Mean When We Say “Test Data”?
● Records that reflect what production data could look like
● Each record vetted by either QA or an SME
● Small subset of records to allow for rapid loading into new
environments
● Records that have fleshed out and accurate relationships with other
records
What Do We NOT Mean When We Say “Test Data”?
● A full copy of a production instance with user-created data
● Data that has no relevance to the app’s functionality or intended use
● Data for Unit Tests
Those Who Benefit From Proper Test Data
● QA - Quicker startup time on end-to-end or regression testing with
vetted data
● DevOps - Easier spin-up of orgs with relevant data
● Developers - Able to develop against realistic data without having to
create it themselves
Dirty Little Secret About Developers
When To Define Your Test Data
● At the time of object creation
● As new features are developed
● As new test cases are defined
● As features or objects are deprecated
Methods of Loading Test Data
Data Loader
● Windows or macOS application
● Utilizes CSV Files
● Needs no custom development
● UI for mapping CSV fields to Salesforce fields
Downsides:
● Requires user input each time
● Not automated, so it can't be used in Continuous Integration
○ There is a command-line tool for Windows, but most CI runners are *nix-based
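When the command-line version of Data Loader is used, the UI field mapping is replaced by a properties-style .sdl mapping file, with the CSV column header on the left and the Salesforce API field name on the right (the column and field names below are hypothetical):

```properties
# accountmap.sdl - maps CSV columns to Salesforce fields
NAME=Name
ACCOUNT_NUMBER=AccountNumber__c
PHONE=Phone
```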
Bulk API (2.0)
● Can be called with SalesforceDX, Workbench, or directly via the REST API
● Utilizes CSV files
● Acts asynchronously as a batch job
● Can handle large data sets (up to 150MB)
Downsides:
● Requires custom development to create the job and poll the API for success/failure
● CSV is sometimes harder to read for developers used to JSON data structures
● Master-Detail relationships require editing CSV files with record IDs
Bulk API (2.0)
Processing data typically consists of the following steps:
1. Create a new job that specifies the object and action.
2. Send data to the server in a number of batches.
3. Once all data has been submitted, close the job. Once closed, no more batches can be sent
as part of the job.
4. Check status of all batches at a reasonable interval. Each status check returns the state of
each batch.
5. When all batches have either completed or failed, retrieve the result for each batch.
6. Match the result sets with the original data set to determine which records failed and
succeeded, and take appropriate action.
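The six steps above can be sketched in Python. Nothing here actually calls Salesforce; the functions only build the request payloads and interpret the responses, with the HTTP verbs and endpoints noted in comments (the API version and field names are assumptions):

```python
import csv
import io
import json

def create_job_payload(sobject, operation="insert"):
    # Step 1: body for POST {instance}/services/data/v59.0/jobs/ingest
    return json.dumps({"object": sobject, "operation": operation,
                       "contentType": "CSV", "lineEnding": "LF"})

def records_to_csv(records):
    # Step 2: Bulk API 2.0 ingests CSV; PUT this text to
    # .../jobs/ingest/{jobId}/batches with Content-Type: text/csv
    buf = io.StringIO()
    writer = csv.DictWriter(buf, fieldnames=list(records[0].keys()),
                            lineterminator="\n")
    writer.writeheader()
    writer.writerows(records)
    return buf.getvalue()

def close_job_payload():
    # Step 3: PATCH .../jobs/ingest/{jobId}; no more batches after this
    return json.dumps({"state": "UploadComplete"})

def is_done(job_info):
    # Steps 4-5: poll GET .../jobs/ingest/{jobId} until a terminal state
    return job_info.get("state") in ("JobComplete", "Failed", "Aborted")

def match_results(original, result_rows, key):
    # Step 6: result rows (from .../successfulResults and .../failedResults)
    # echo the submitted columns, so pair them back to the source records
    # by a caller-chosen key column, e.g. an external ID field
    by_key = {row[key]: row for row in result_rows}
    return [(rec, by_key.get(rec[key])) for rec in original]
```

In a real pipeline these payload builders would be wired to an HTTP client plus a polling loop with a backoff between status checks.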
SalesforceDX Data Tree Import/Export
● Called with SalesforceDX Commands
● Utilizes JSON files
● Can handle up to 200 records with Master-Detail and Lookup relationships at once
● Can insert multiple objects at once with a plan file
Downsides:
● Complex relationships between records can push past the record limit
● Must be able to query all data with one SOQL statement from top-down
● Does not handle records that have lookups to records in same object
● Does not handle lookups to User records
● Cannot import records with different Record Types
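The JSON files consumed by the data tree import can be generated rather than hand-written. A minimal sketch, assuming the standard record-file shape (an attributes block with the object type and a referenceId that other files can point at as "@RefName") and the plan-file shape used by SalesforceDX; object and file names here are examples:

```python
import json

def tree_file(sobject, rows, ref_prefix):
    # Build the records file consumed by the data tree import. Each record
    # carries an attributes block with its type and a unique referenceId.
    records = []
    for i, row in enumerate(rows, start=1):
        rec = {"attributes": {"type": sobject,
                              "referenceId": f"{ref_prefix}{i}"}}
        rec.update(row)
        records.append(rec)
    return {"records": records}

def plan_file(entries):
    # Plan file listing each object's records file. saveRefs/resolveRefs let
    # later files reference the referenceIds created by earlier ones.
    return [{"sobject": sobject, "saveRefs": True, "resolveRefs": True,
             "files": [filename]} for sobject, filename in entries]

# Example: Accounts first, then Contacts whose AccountId is "@AccountRef1"
accounts = tree_file("Account", [{"Name": "Acme"}], "AccountRef")
contacts = tree_file("Contact",
                     [{"LastName": "Smith", "AccountId": "@AccountRef1"}],
                     "ContactRef")
plan = plan_file([("Account", "Accounts.json"), ("Contact", "Contacts.json")])
```

Writing each structure out with `json.dump` yields files ready for the import command, which is where the record-count and relationship limits above come into play.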
Custom Solution
● Apex Class that reads from a Static Resource of JSON files
○ Info file denotes which namespace an object lives in and which object each lookup points to
● External ID field on all records to upsert against
● Namespace appending on fields at runtime, if needed
● Queries at runtime to match External IDs to their fields:
○ External ID field on the target object of a Lookup or Master-Detail relationship
○ Federation ID field on the User object for a Lookup to User
○ DeveloperName field on RecordType to assign a Record Type
● Upsert, not insert, so rerunning doesn’t create duplicate data
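The custom solution described above is Apex, but its two core moves, appending the package namespace at runtime and swapping external-ID values for real record IDs before the upsert, can be modeled in a short language-neutral sketch (Python here; all field names are hypothetical):

```python
def apply_namespace(field, namespace):
    # Prefix custom fields with the package namespace at runtime, if one
    # exists; standard fields like Name pass through untouched
    if namespace and field.endswith("__c"):
        return f"{namespace}__{field}"
    return field

def resolve_lookups(records, lookup_field, ext_to_id):
    # Replace external-ID values in lookup_field with real record IDs.
    # ext_to_id would come from a runtime query against the target object,
    # mapping its external ID field to Id. A missing parent raises a
    # KeyError so bad test data fails loudly instead of inserting silently.
    resolved = []
    for rec in records:
        rec = dict(rec)
        if rec.get(lookup_field) is not None:
            rec[lookup_field] = ext_to_id[rec[lookup_field]]
        resolved.append(rec)
    return resolved
```

Because the final DML is an upsert keyed on the external ID field, running this load repeatedly updates the same records instead of creating duplicates.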
In Summary...
● Create and Identify Test Data early
● Iterate on the Test Data as development continues
● If possible, add to Continuous Integration (or org creation scripts) to
give developers access to good data
● Complex data might require a custom data loading solution - budget that into your estimates!
Contact Us:
Thank You
info@codescience.com

WEBINAR: Proven Patterns for Loading Test Data for Managed Package Testing
