Scribe Insight Tutorials
www.scribesoft.com
12/30/2015
Legal Information
© 1996-2015 Scribe Software Corporation. All rights reserved.
Complying with all applicable copyright laws is the responsibility of the user. No
part of this document may be reproduced or transmitted in any form or by any
means, electronic or mechanical, for any purpose, without the express written
permission of Scribe Software Corporation.
Trademarks
Scribe Adapter, Scribe Console, Scribe Insight, Scribe Integrate, Scribe Publisher,
and Scribe Workbench are all trademarks of Scribe Software Corporation. All other
names are recognized as trademarks, registered trademarks, service marks, or
registered service marks of their respective owners.
The names of companies, products, people, characters, or data mentioned herein
are fictitious and are in no way intended to represent any real individual, company,
product, or event, unless otherwise noted.
Scribe Software Corporation may have patents, patent applications, trademarks,
copyrights, or other intellectual property rights covering subject matter in this
document. Except as expressly provided in any written license agreement from
Scribe Software Corporation, the furnishing of this document does not give you any
license to these patents, trademarks, copyrights, or other intellectual property.
Licensing
The Unicode Text Adapter uses the LumenWorks.Framework.IO.Csv Fast CSV
Reader which is distributed under the MIT License.
Disclaimer
Scribe Software Corporation makes no representations or warranties with respect
to the adapter or the contents or use of this document, and disclaims any express or
implied merchantability or fitness for any specific purpose. Information in this
document is subject to change without notice. Scribe Software Corporation reserves
the right to revise this document or to change its contents at any time without
obligation to notify any individual or corporation about the changes.
Regardless of whether any remedy set forth herein fails of its essential purpose, in
no event will Scribe Software Corporation be liable to you for any special,
consequential, indirect or similar damages, including any lost profits or lost data
arising out of the use or inability to use the software or documentation, even if
Scribe has been advised of the possibility of such damages. Some states do not
allow the limitation or exclusion of liability for incidental or consequential damages
so the above limitation or exclusion may not apply to you.
Table Of Contents
Overview
  Concepts
  About The Tutorials
  Requirements
  Other Prerequisites
Tutorial 1: Migrating Account Information
  Objectives
  One: Create Connections
  Two: Configure The Source
  Three: Configure The Target Steps
  Four: Create Data Links Between Source And Target Fields
  Five: Add A Function
  Six: Create A Lookup Link
  Seven: Test The Data
  Eight: Run The Job
  Nine: Find Errors
  Ten: Correct And Test The DTS
  Eleven: Re-Run The Job
Tutorial 2: Creating A DTS File With Multiple Steps
  Overview Of This Tutorial
  One: Configure The Source
  Two: Configure Target
  Three: Add A Pre-Operation Step Flow Control Formula
  Four: Create A Lookup Link
  Five: Create Data Links And Formulas
  Six: Check The Automatic Foreign Key Assignment
  Seven: Test The Job
  Eight: Run The Job
  Nine: Review The Data
  What's Next
Tutorial 3: Creating An Integration In Scribe Console
  Objectives
  One: Create A Collaboration
  Two: Add An Integration Process
  Three: Prepare The Scribe Sample Database
  Four: Check The DTS File
  Five: Run The Integration
Tutorial 4: More Console Techniques
  Objectives
  One: Introduce An Error In The DTS File
  Two: Create The Rejected Rows Table
  Three: Create A Data View
  Four: Create A Monitor
  Five: Run The Integration Process
  Six: Check The RR_ACCOUNTS Table
  Seven: Check The Monitor
  What's Next?
Overview
Try the tutorials included in this guide to get started using Scribe Insight.
Concepts
Some concepts these tutorials introduce are:
• Data translation specification (DTS) — A file created in Scribe Workbench that stores the information required to migrate or integrate data between source and target data stores. This file consists of:
  ◦ Source and target data stores.
  ◦ Data processing logic to use when the DTS file is run.
  ◦ Formulas to link source fields to target fields, set constant values in target fields, or define matching criteria.
  ◦ Formulas used to convert, parse, or import selected source fields.
• Integration Process (IP) — Detects an event and runs a DTS to modify and integrate your data. Events that an IP detects include:
  ◦ A message being written into a queue.
  ◦ The results of a SQL query.
  ◦ A file being saved in a folder.
  ◦ A specific time.
• Collaboration — A set of Integration Processes, related files, and reports that enables you to organize IPs into meaningful abstractions of business processes.
You use Scribe Workbench to create DTS files and Scribe Console to include those DTS files in IPs and Collaborations.
About The Tutorials
The tutorials involve migrating account, address, and contact data from a source text file to
a target SQL Server database, creating a collaboration that allows you to run the DTS file
automatically, and monitoring the run for errors.
The first two tutorials highlight the major features of the Scribe Workbench and show you
how to:
• Create database connections
• Configure a source and target
• Link source and target data fields
• Set up insert and update steps to control the data migration
• Save your work as a Data Translation Specification (DTS) file
• Test and run the DTS
• Identify and fix errors using the Transaction Error Report
The second two tutorials introduce you to some of the features of the Scribe Console and
show you how to:
• Create a collaboration
• Create and run an Integration Process within the collaboration
• Add a rejected rows table to your DTS file
• Add a data view for the rejected rows table
• Add a monitor and an alert
• Review alert details
Throughout this guide, there are references to advanced functionality that is not included in
the tutorials. For more information, see the Scribe Insight online help.
• The figures in this tutorial are based on Insight 7.9.1. If you are using an earlier version of Insight, your Insight interface may differ from these figures. However, most of the instructions are the same.
• You can download the latest version of the tutorial from https://openmind.scribesoft.com/download/ScribeInsightTutorial. The revision date is on the title page.
Requirements
• Scribe Insight 7.9.1 or higher — The Scribe Insight Workbench must be installed before you use these tutorials. For information about installing Scribe Insight, see the Scribe Insight Installation Guide, which you can download from https://openmind.scribesoft.com/html/insight_download.
• Scribe Sample Text — When you install Scribe Insight, the sample text files are installed in C:\Users\Public\Documents\Scribe\Samples\Textdata and the Scribe Sample Text ODBC Data Source Name is created.
• Scribe Sample SQL Server database — When you install Insight, you have the option to install the SCRIBESAMPLE SQL Server database and to create the corresponding ODBC DSN Scribe Sample.
  If this sample database is not installed, the ODBC DSN is not displayed. Before you start the tutorial, you must install the sample database.
Install The Scribe Sample SQL Server Database
1. Navigate to the Scribe program folder. By default, this is C:\Program Files (x86)\Scribe.
2. Double-click the InternalDB application. The Scribe Internal Database
Maintenance Utility opens.
Figure 1: Scribe Internal Database Maintenance Utility
3. Click the Sample Database tab.
Figure 2: Scribe Internal Database Maintenance Utility
4. Click Install Sample Database and follow the prompts to create the
database.
If you previously ran the tutorial and put data into the sample database, click
Refresh Sample Data to delete that data and restore the sample database
to its original, empty state.
Other Prerequisites
After you install Scribe Insight and make sure the sample database is available, there are
still a couple of tasks you need to do before beginning Tutorial 3:
• Verify your Console is configured correctly — Follow the directions on configuring Scribe Insight in the Scribe Insight Installation Guide.
  File security settings are located under Administration, on the File Management tab of the Security pane.
  When configuring the security settings, ensure at a minimum Scribe has access to the following folders:
  ◦ C:\Program Files (x86)\Scribe\Utilities
  ◦ C:\Users\Public\Documents\Scribe
  ◦ C:\Users\Public\Documents\Scribe\Collaborations
  ◦ C:\Users\Public\Documents\Scribe\Samples
  ◦ C:\Users\Public\Documents\Scribe\Templates
  ◦ C:\Users\Public\Documents\Scribe\Tracing
Figure 3: Console Security Settings
• Verify Accounts.dts is available — To use Tutorials 3 and 4, the Accounts.dts file created in Tutorial 1: Migrating Account Information must be available.
  ◦ Scribe Insight 6.5.2 or earlier — Follow the instructions in Tutorial 1 to create Accounts.dts.
  ◦ Scribe Insight 7.0 or later — Either follow the instructions in Tutorial 1 to create Accounts.dts, or use the Accounts.dts file installed in C:\Users\Public\Documents\Scribe\Samples\Tutorials. If you use this file, you must reconnect the Scribe Sample database.
Reconnect To The Scribe Sample Database
1. Open Scribe Workbench.
2. Click File > Open. The Select DTS dialog box appears.
3. Browse to the C:\Users\Public\Documents\Scribe\Samples\Tutorials folder.
Figure 4: Select DTS Dialog Box
4. Double-click Accounts.dts. The Resolve Connection dialog box appears.
Connections are saved in the Internal Database. You are prompted to specify either
a name to use for the new connection in the database, or an existing connection with
similar properties you want the DTS to use.
5. Click OK when you are prompted to resolve the following connections:
• Scribe Sample
• Scribe Sample Text
Figure 5: Resolve Connection
The Connection Manager displays. Scribe Sample displays in red italics, which
indicates it is currently disconnected:
Figure 6: Connection Manager With Disconnected Datastore
6. In the Connection Manager, select Scribe Sample and click Edit. The Connection
Settings dialog appears.
7. Click the Global Connection Settings tab.
Figure 7: Connection Tab
8. Click Connect. The Scribe Sample Connection Information dialog box appears.
Figure 8: Sample Connection Information
9. Expand ODBC Data Sources, select Scribe Sample, and click OK. The SQL Server
Login dialog box appears.
Figure 9: Sample Connection Information
10. In Password, enter the default Scribe password, integr8!, and click OK. You return
to the Connection Settings dialog box.
11. Click OK again. You return to the Connection Manager.
The Scribe Sample connection now displays in black, which indicates you have
successfully reconnected. The Accounts.dts file is ready to use.
Figure 10: Connection Manager With Reconnected Data Source
12. Click Close.
If any other DTS files on your system use the Scribe Sample connection, you are
prompted to update those connections in other DTS files. Click No.
Figure 11: Globally Update The Connection
The Scribe Sample connection you defined is saved and only the Account DTS is
updated to use that connection.
13. Select File > Save.
14. If you are prompted to create a backup file, click No.
15. Select File > Exit.
Tutorial 1: Migrating Account Information
Objectives
Use this tutorial to learn how to:
• Connect to a source text file and a target SQL Server database.
• Map fields from the source to the target.
• Save your setup as a DTS file.
• Run the DTS to verify the migration is working.
• Review and fix any errors.
• Rerun the DTS to verify your fix worked.
Start The Tutorial
To get started with this tutorial, on your desktop, double-click the Scribe
Workbench icon.
Figure 12: Scribe Workbench Desktop Icon
The Scribe Workbench main window displays.
Figure 13: Scribe Workbench Main Window
One: Create Connections
The first step in creating a DTS file is to specify your connections. For any data integration
or migration, you need a source that contains the data you want to move, and one or more
targets where you want to move that data. If you decide later you need more connections,
you can add them at any time.
For this tutorial, you create two connections: one for the source and one for the target.
Select Your Connections For This Tutorial
1. In the Scribe Workbench main window, select View > Connections. The
Connection Manager dialog box displays.
Figure 14: Connection Manager
2. Click the New button. The Add a Connection dialog box opens.
Figure 15: Add A Connection
3. Click the plus sign ( + ) next to ODBC Data Sources to expand the tree, then select
Scribe Sample Text. This is the source connection in the DTS file.
Figure 16: Select Scribe Sample Text
The list of data sources included in this tree depends on your specific
environment. For example, in your working environment you may
have only a few ODBC data sources displayed.
4. Click OK. The Connection Settings dialog box displays with the Connection name
filled in on the Global Connection Settings tab.
Because you already have a Scribe Sample Text connection, this connection is given
a unique name, Scribe Sample Text (1).
Figure 17: Connection Settings Property Sheet
5. Click OK to save the connection and close the dialog box.
6. In the Connection Manager, click New again.
7. Expand ODBC Data Sources, then select Scribe Sample. This is the target
connection in your DTS file.
Figure 18: Select Scribe Sample
8. Click OK. The SQL Server login dialog box appears.
Figure 19: SQL Server Login
9. In Login ID, enter SCRIBE.
10. In Password, enter integr8!.
11. Click OK. You return to the Connection Settings dialog box.
12. Click OK to close the Connection Settings dialog box. The Connection Manager
displays with the Scribe Sample (1) and Scribe Sample Text (1) connections.
Figure 20: Connection Manager
13. Click Close to close the Connection Manager.
Two: Configure The Source
The next step is to configure your source connection. In this tutorial, you select Scribe
Sample Text as the source and select a single table.
A source is a data set or a group of rows and columns. The data set can be any of the
following:
• A single table
• Multiple tables joined by a SQL query
• A single text file
• A result returned from a SQL query
• Results returned from a stored procedure
• An adapter object or related adapter objects
Configure The Source
1. From the Scribe Workbench main window, click Configure Source. The Configure
Source dialog box appears.
Figure 21: Configure Source Button
2. Click the Connections drop-down and select Scribe Sample Text.
Figure 22: Select Scribe Sample Text
3. In the Configure Source Data Objects Explorer, expand Tables, then select Leads.
Figure 23: Select The Source — Leads Table
After you select the Scribe Sample Text connection, the Custom Query option
displays in this dialog box. While the tutorial uses only a single table as the source
data object, a powerful feature of Scribe Workbench is the variety of sources you
can configure through the Custom Query option.
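For example, if you only wanted to migrate leads from a single state, a custom query might look something like the following. This is a hypothetical sketch; the tutorial itself uses the whole Leads table, and the state value shown here is illustrative only.
-- Hypothetical custom query against the Scribe Sample Text source:
-- return only the leads whose STATE column matches a given value.
SELECT *
FROM Leads
WHERE STATE = 'NH'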
4. Click OK. The list of source data fields displays in the source pane.
Figure 24: Leads Fields In Scribe Workbench Source Pane
Each field has an associated reference number shown in the Ref column. Use this
reference number when you work with functions and formulas.
Figure 25: Source Reference Values
Three: Configure The Target Steps
Next, you configure the target — the location where you integrate the data. Although you
can select multiple targets, you select only one target in this tutorial.
A step is an operation performed on a target data object. Each target can have multiple
steps. To define a step, in the Configure Steps dialog box, select a target data object and an
operation to perform on that target. Steps are performed in the order they are listed:
• Steps can be performed on tables, views, stored procedures, XML objects, and adapter objects.
• The default step operation is Insert, which inserts the source record into the target. You can specify a different operation.
• Steps are performed once for each source row.
This tutorial integrates data from the Leads table in the source to the Accounts table in the
target SQL database.
Configure The Target Steps
1. From the Scribe Workbench main window, click Configure Steps. The Configure
Steps dialog box displays.
Figure 26: Configure Steps Button
2. Click the Data Objects tab.
Figure 27: Add Button On The Data Objects Tab
3. Click the Add button. The Add Target Connection dialog box appears.
Figure 28: Select Scribe Sample
4. Click the Connection drop-down and select Scribe Sample.
This is the target connection you will use for this tutorial.
5. Click OK. You return to the Configure Steps dialog box with the Scribe Sample target
data objects displayed.
6. On the Data Objects tab, expand Tables and select the ACCOUNT table.
Figure 29: Select A Target — Account Table
7. At the bottom of the pane, select Update/Insert from the Operation drop-down list.
Figure 30: Update/Insert Operation
Make sure you select Update/Insert, not Insert/Update. The difference
is explained in the Insight Help.
8. Click Add Update/Insert Step.
Figure 31: Add Update/Insert Step
9. Click Close. You return to the Scribe Workbench main window, where the fields
from the ACCOUNT table of the Scribe Sample database appear in the target pane.
Figure 32: Field Information From Source And Target
Four: Create Data Links Between Source And Target Fields
At this point, you need to create data links to map source fields to target fields. Data links
enable you to set values on target fields. One or more source fields can be linked to one or
more fields in the target. For example, you may want to link a CONTACT_NAME field that
contains a full name in the source to both ContactFirstName and ContactLastName in the
target.
The Data Link button enables you to map source fields to target fields. The button is in the
center of the Scribe Workbench window, between the two panes.
Figure 33: Data Link Button
A check mark indicates the fields are linked.
Figure 34: Data Link
Create Data Links
1. In the Scribe Workbench, from the source field list, select the UNIQUE_ID field.
This field has the Source Reference number S1.
2. In the target field list, select the XREF field.
3. Click the Data Link button. A check mark indicates the fields are linked.
Figure 35: Data Link
4. Create the following additional Data Links:
• PHONE to PHONE
• BUSINESS_NAME to ACCOUNTNAME
5. Click the Data Formulas tab at the bottom of the Scribe Workbench main window
to see the data formulas you have created.
Figure 36: Data Formulas Tab Showing Data Links
If you do not see the Data Formulas tab, click the Expand Links Pane button,
located below the Formula button.
Figure 37: Expand Links Pane Button
Five: Add A Function
Next, you need to ensure the required field, ACCOUNTID, is always unique.
Insight indicates a required field by bolding and underlining the field name in the target
pane. In this tutorial, ACCOUNTID is a required field in the target and you must make sure
it is set to a unique value whenever you insert a new record. To do this, you use the Function Browser to add the GUID function, which generates a unique value, to the ACCOUNTID field in the target.
When you connect with a Scribe adapter, unique IDs are usually generated by the adapter or the application's API.
The Formula button enables you to access the Function Browser.
Figure 38: Formula Button
The Function Browser contains over 180 functions you can use to create simple and
complex formulas, which can use multiple functions and logical IF statements. If you
create your own formula, you can save it and use it in future DTS files. Formulas you save
are included in the Function Browser under the category User Defined Formulas.
Add A Function
1. In the Scribe Workbench, in the Target pane, select ACCOUNTID.
2. Click the Formula button in the center of the Scribe Workbench window.
The Edit Formula window displays.
Figure 39: Edit Formula
3. In the Function Browser, expand Functions by Category, expand System
Functions, then double-click GUID. The GUID function is added to the Formula
Editor.
4. Click OK to close the Edit Formula window and return to the Scribe Workbench main
window. The GUID() formula now displays in the Formula column of the ACCOUNTID row of the Data Formulas tab.
Figure 40: GUID Formula Added To Data Formulas Tab
By default, an Update overwrites data every time you run a job.
Since you are working with an Update/Insert step, you want to ensure the ACCOUNT
ID is assigned only when a new account is inserted, not when an existing account is
updated. To do this, you must change the overwrite status of the ACCOUNTID field.
The overwrite status is displayed in the Overwrite field of the Data Formulas tab.
Figure 41: Overwrite Column In Data Formulas Tab
5. On the Data Formulas tab, in the ACCOUNTID row, double-click the asterisk (*) in the Overwrite column. The asterisk disappears, indicating the overwrite feature has been turned off. When overwrite is turned off:
• When a target record is inserted — The data link is used and a new GUID value is generated for ACCOUNTID.
• When a target record is updated — The data link is not used and the existing GUID value for ACCOUNTID is not changed.
• In general, you want to turn off Overwrite for any field you use as a lookup link when you are performing an Update/Insert or an Insert/Update step.
• In Insight 7.0.0 and later, the name of the target connection is included in the step name, providing more information in multitarget DTS files.
Six: Create A Lookup Link
A lookup link enables you to define the match criteria for seek, update, or delete steps. In
this tutorial, you create a lookup link between the UNIQUE_ID field in the source and the
XREF field in the target. The update/insert step you created uses this link to locate the
account to be updated.
Because there is already a data link between these fields, the UNIQUE_ID from the source is inserted into the XREF field in the target. If you run the DTS more than once, the lookup link between these fields uses the value inserted into XREF to identify the target records to update.
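If it helps to think of this in database terms, the behavior of the Update/Insert step with this lookup link is roughly equivalent to the following SQL. This is only a conceptual sketch with illustrative values; Insight generates its own statements through the ODBC connection, and NEWID() stands in here for the Scribe GUID function.
-- 1. The lookup link finds a matching account ('101' stands in for a source UNIQUE_ID):
SELECT ACCOUNTID FROM ACCOUNT WHERE XREF = '101'
-- 2a. If a match is found, the row is updated. ACCOUNTID is left alone because
--     Overwrite is turned off for that field.
UPDATE ACCOUNT
SET ACCOUNTNAME = 'Sample Name', PHONE = '555-0100', XREF = '101'
WHERE XREF = '101'
-- 2b. If no match is found, a new row is inserted with a generated ACCOUNTID.
INSERT INTO ACCOUNT (ACCOUNTID, ACCOUNTNAME, PHONE, XREF)
VALUES (NEWID(), 'Sample Name', '555-0100', '101')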
You use the Lookup Link button in the middle of the window to create a lookup link.
Figure 42: Lookup Link Button
Symbols in the source and target panes indicate links and indexes:
• Checkmark (target and source panes) — The field is linked.
• Checkmark inside a square (target pane) — The field is part of a lookup link. A graphical display of the link displays in the Links tab.
• Star (target pane) — The field is part of an index.
• Star inside a circle (target pane) — The field is part of a unique index.
Figure 43: Linked Source And Target Fields
Create A Lookup Link
1. In the source pane, select the UNIQUE_ID field.
2. In the target pane, select the XREF field.
3. Click the Lookup Link button.
4. Click the Links tab at the bottom of the Scribe Workbench main window.
5. Select Show links to Source field. A graphic representation of the lookup link
displays on the tab.
Figure 44: Link Representation
Seven: Test The Data
Testing data enables you to preview the results of your job without writing any rows to the
target connection. In the Test window, you can:
• Step through each row that will be written to the target connection
• Browse the source data
• Verify links and formula results
The Test job button is on the toolbar on the top left of the Scribe Workbench window.
Figure 45: Test Job Button
Test Your Data
1. On the Scribe Workbench toolbar, click the Test Job button.
The Test window appears, showing the source field names and values, as well as
data links, lookup links, and step results for each record.
Figure 46: Test Window
2. Click Next to scroll through each of the source rows.
Notice the business names on the Data Links tab are all uppercase. This would be
unattractive when you print addresses.
3. Click Close to close the Test window.
Before you actually run the job, let’s change the business names to mixed case.
To Change A Field Property
1. In the Scribe Workbench main window, click the Data Formulas tab.
2. Double-click the ACCOUNTNAME row. The Edit Formula window opens.
3. In the Formula Editor field, highlight S3.
4. In the Function Browser, expand Functions by Category, expand Text, and
double-click the PROPER function. In the Formula Editor field, PROPER(S3) appears.
5. Click OK to close the Edit Formula window.
6. Verify PROPER(S3) appears in the Formula column next to ACCOUNTNAME.
Figure 47: Data Formulas Pane Showing Proper Function
7. Save the DTS as Accounts.dts in the C:\Users\Public\Documents\Scribe\Samples\Tutorials directory. If Accounts.dts already exists,
you can replace it.
8. Click the Test Job button. The Test window appears.
9. Click Next to scroll through each source row and verify the business names on the
Data Links tab are title case, for example, George Tel, not GEORGE TEL.
Figure 48: Test Window With Modified AccountName Values
10. Click Close.
Eight: Run The Job
Now, you try running the job. When your job runs, it updates your target data.
Run Your Job
1. In the Scribe Workbench main window, click the Run button.
Figure 49: Run Job Button
The Run Complete window appears, showing 15 successful inserts and 1 failed insert.
You need to determine why one row failed.
Figure 50: Run Complete Window
2. Click the Transaction Errors button. The Execution Log Viewer appears,
summarizing the errors.
Figure 51: Execution Log Viewer
You need to view the row information in a more readable and printable format.
3. Click Transaction Errors. The Transaction Errors report appears.
Figure 52: Transaction Errors Report
You can see the failure occurred on row 14 and was caused by a blank account name.
Figure 53: Errors In Transaction Errors Report
4. Close the report.
5. Close the Execution Log Viewer.
6. Close the Run Complete window.
Nine: Find Errors
You know from the Transaction Errors report the error is on row 14. Now you need to test
the job again to find the exact record where the error occurs.
See The Errors
1. In the Scribe Workbench, click the Test Job button. The Test window appears.
2. Click Next until source row 14 displays.
3. On the Data Links tab, locate the value of ACCOUNTNAME.
Figure 54: Test Window Showing ACCOUNTNAME Error
The value of #NULL! is causing the error.
4. Close the Test window.
Ten: Correct And Test The DTS
When you ran the job, you discovered the job cannot insert a null value into the account
name. One solution is to add error checking and error handling to the ACCOUNTNAME field.
This is easily done through the Edit Formula window.
Correct A Formula
1. In the Scribe Workbench, on the Target pane, select the ACCOUNTNAME row and
click the Formula button. The Formula Editor appears.
2. In the Edit Formula pane, PROPER(S3) displays in the formula editor. Replace it with the following formula:
IF(ISERROR(S3),"Unknown"&S1,PROPER(S3))
This formula looks at the BUSINESS_NAME field (Ref S3) in each row:
• If the field is null, the value is changed from null to "Unknown" with the UNIQUE_ID from the source appended to it, and that value is inserted into ACCOUNTNAME. When an account name is not provided in the source data, the UNIQUE_ID lets you relate the missing data back to the source data.
• If the field is not null, PROPER(S3) converts the business name to mixed case, as before.
3. Click OK. You return to the Scribe Workbench.
The updated formula appears on the Data Formulas tab.
Figure 55: Data Formulas Pane Showing The Corrected Formula
4. Save the Accounts DTS file.
5. Click the Test Job button.
Verify your mappings, formulas, and functions are returning the expected values.
For example, make sure the target ACCOUNTNAME is correct. For record 14, the
value in the ACCOUNTNAME field should be Unknown114.
6. Click the Close button.
Eleven: Re-Run The Job
Now that you have changed the DTS file to check for and fix null business names, run the
job again.
Re-Run The Job
1. In the Scribe Workbench main window, click the Run Job button.
2. Verify all rows succeeded and no rows failed. The Run Complete window shows:
• 1 insert was performed on the corrected row. The ACCOUNTNAME value for this row reads Unknown114.
• 15 updates were performed. Because of the lookup link between UNIQUE_ID and XREF, 15 rows were updated rather than inserted again.
Figure 56: Run Complete Showing Successful Run
3. Click Close.
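If you have a SQL client available, you can also spot-check the corrected row in the SCRIBESAMPLE database. A minimal sketch, using the Unknown114 value produced by the formula you added earlier:
-- The previously failing row should now exist with the placeholder account name.
SELECT ACCOUNTNAME, XREF
FROM ACCOUNT
WHERE ACCOUNTNAME LIKE 'Unknown%'
-- Expect one row, with ACCOUNTNAME = 'Unknown114'.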
This completes the first tutorial. You can proceed to the second tutorial in the next section
to learn more about the Scribe Workbench.
Tutorial 2 assumes Tutorial 1 has correctly populated the Scribe Sample
database. Without this data, Tutorial 2 does not run correctly.
Tutorial 2: Creating A DTS File With Multiple Steps
This tutorial requires you to have completed Tutorial 1, which introduced the Scribe Workbench and some of the concepts discussed here, and populated the Scribe Sample database with the data you use in Tutorial 2.
Overview Of This Tutorial
This tutorial introduces you to these concepts:
• Creating a multiple-step job
• Using the update operation and lookup links
• Using the SKIPSTEP function
To get started with this tutorial, you create a new DTS. If the Accounts DTS from the first tutorial is still open:
1. In the Scribe Workbench, save the DTS.
2. Click File > New.
One: Configure The Source
Tutorial 2 assumes Tutorial 1 has correctly populated the Scribe Sample
database. Without this data, Tutorial 2 does not run correctly.
As in the first tutorial, your first step is to configure the source.
Configure The Source
1. In the Scribe Workbench, click Configure Source.
2. Select Scribe Sample Text as the source connection.
3. In the database tree, expand All Data Objects, expand Tables, and select Leads.
4. Click OK to close the Configure Source dialog box.
Two: Configure Target
Next, you configure steps for the target.
Configure The Target
1. In the Scribe Workbench, click Configure Steps.
2. Click the Data Objects tab.
3. Click Add.
4. In Connection, select Scribe Sample and click OK.
5. Add the following steps:
• Select the ACCOUNT table and add a Seek step. A Seek step uses lookup links to find rows in the target connection.
• Select the ADDRESS table and add an Insert step.
• Select the CONTACT table and add an Insert step.
• Select the CONTACT table again and add another Insert step. The source data has a contact and an alternate contact, so you need two CONTACT Insert steps to create a contact record in the target for the contact, and for the alternate contact when one exists.
Figure 57: Configuring Multiple Target Steps
6. Click Close to close the Configure Steps window.
Three: Add A Pre-Operation Step Flow Control Formula
If you look at the fields in the Source pane, notice there is both a CONTACT_NAME field (Ref
S2), and an ALT_CONTACT field (Ref S16). Sometimes the ALT_CONTACT field is null. If
there is a row for which the ALT_CONTACT field is null, you want to skip the CONTACT
Insert step associated with ALT_CONTACT. To do this, you must add a pre-operation step
flow control formula on the second CONTACT step.
Add A Step Flow Control Formula
1. In the Scribe Workbench, click Configure Steps.
2. Click the Flow Control tab.
3. In the Step Order pane, select Step 4 — Scribe Sample.CONTACT(2) Insert.
4. In the Pre-Operation Step Flow Control Formula text box, enter the following
formula:
IF(ISERROR(S16), SKIPSTEP(), TRUE())
This formula checks to see if there is a null value in field S16, ALT_CONTACT:
• If ALT_CONTACT is null, the SKIPSTEP function skips the step.
• If ALT_CONTACT is not null, the step is executed and the value in ALT_CONTACT is inserted into the target.
Figure 58: Add A Pre-Operation Step Flow Control Formula
5. Click Close to close the Configure Steps window.
Four: Create A Lookup Link
Now you want to create a lookup link. The lookup link on the ACCOUNT Seek step allows
you to find a matching account in the target before inserting new contacts.
For DTS files with multiple steps, you can use the drop-down list above the target pane to
display only the fields from an individual step:
Figure 59: Step Selection Menu
For Insight 7.0.0 and later, the name of the target connection is included in the step name.
The first link you want to create is a lookup link on the ACCOUNT Seek step.
Create A Lookup Link
1. From the Configure Steps drop-down list, select Scribe Sample.ACCOUNT Seek.
2. Create a lookup link between UNIQUE_ID in the source and XREF in the target.
Five: Create Data Links And Formulas
Now, you create data links and formulas to process the data before it is inserted.
Create The Needed Data Links And Formulas
1. From the Configure Steps drop-down list, select Scribe Sample.ADDRESS
Insert.
2. Create the following data links from the source to the target:
Source Field → Target Field
ADDRESS → ADDRESSLINE1
CITY → CITY
STATE → STATE
ZIP_CODE → ZIP
COUNTRY → COUNTRYCODE
3. From the Target pane, select each field listed below and add the specified formula:
• ADDRESSID — To ensure the address for this company is unique, click the Formula button and add the GUID() function.
• ADDRESSLINE1, CITY — To format the address and city names in mixed case, add the PROPER function to these fields.
• COUNTRYCODE — Ensure you have the correct country code by adding the following formula to this field:
IF(ISERROR(S8), "US", DBLOOKUP(S8, "Scribe Internal Database", "COUNTRY", "COUNTRYNAME", "COUNTRYCODE"))
If the country code is null, the country code is set to the default value of "US". Otherwise, this formula performs a database lookup to find the COUNTRYCODE value.
In SQL, the formula would be similar to:
SELECT COUNTRYCODE FROM COUNTRY WHERE COUNTRYNAME = (value from S8)
4. Select CONTACT Insert from the Configure Steps drop-down list and create the
following data links:
Source Field → Target Field
CONTACT_NAME → CONTACTNAME
CONTACT_NAME → FIRSTNAME
CONTACT_NAME → LASTNAME
As with the Address ID, you want to ensure the CONTACTID is unique. In addition,
the CONTACT_NAME (Ref S2) field in the source needs to be split into two fields:
FIRSTNAME and LASTNAME. However, to make sure the name is right, you also want
to insert the whole contact name.
5. Create a formula to add the GUID() function to CONTACTID.
6. Add the PROPER() function to CONTACTNAME.
For the FIRSTNAME and LASTNAME fields, you want to parse the contact name and
change the case to mixed case.
7. For FIRSTNAME, add the following formula:
PROPER(PARSENAME(S2, "F"))
8. For LASTNAME, add the following formula to specify an “L” rather than an “F”:
PROPER(PARSENAME(S2, "L"))
9. From the Configure Steps drop-down list, select CONTACT(2) Insert and
configure the data links for the CONTACT(2) step the same way you configured the
data links for the CONTACT step.
For CONTACT(2), be sure to use the source field ALT_CONTACT (S16) in your data
links and formulas.
10. When you are done, set the Configure Steps drop-down list back to <View All Steps>. Your Data Formulas tab should look similar to the one below.
Figure 60: Completed Target Configuration
• In the Data Formulas tab, you can sort columns by double-clicking a column header. In this example, the Formula field has been sorted alphabetically.
• To change the column that is sorted, click View > Sort > Data Links, then select the column name you want to sort.
11. Save the DTS as ContactsandAddresses.dts. If this file already exists in the ..\Scribe\Samples\Tutorials directory, you can overwrite it.
Six: Check The Automatic Foreign Key Assignment
When relationships are defined between tables or objects, Insight automatically fills in
values for the related fields. Because the ACCOUNT table and the CONTACT table are
related by the ACCOUNTID field, Insight fills in the ACCOUNTID in the CONTACT table when
a contact is inserted.
Autolinks are indicated by a diamond icon and italic text.
Figure 61: Automatic Link Indicators
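In SQL terms, the effect is roughly the following. This is a conceptual sketch with placeholder names and values; the key propagation is handled automatically by Insight, not by SQL you write, and NEWID() stands in for the Scribe GUID function.
-- The Seek step matched an account by XREF; its ACCOUNTID is then reused
-- as the foreign key when the contact is inserted.
INSERT INTO CONTACT (CONTACTID, CONTACTNAME, FIRSTNAME, LASTNAME, ACCOUNTID)
SELECT NEWID(), 'Sample Contact', 'Sample', 'Contact', a.ACCOUNTID
FROM ACCOUNT a
WHERE a.XREF = '101'   -- '101' stands in for the source UNIQUE_ID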
Use Automatic Foreign Key Assignment
1. From the Configure Steps drop-down list, select CONTACT Insert.
2. Right-click ACCOUNTID and select Field Properties to display the Properties
window.
Figure 62: ACCOUNTID Properties Window
Notice there is an Auto foreign key assignment on the ACCOUNTID field value in
the CONTACT step inherited from Step 1, which is a Seek on ACCOUNT for this
example. This is how Insight fills in the foreign key to a parent table.
3. Close the ACCOUNTID Properties window.
Seven: Test The Job
Now you can test your job.
Test The Job
1. Click Test Job.
2. Go to record 2 and click the Step Results tab.
Figure 63: Record 2 Step Results
Notice in the Source Value pane, the ALT_CONTACT field is null. Also, in the Step
Results tab, Steps 2 and 3 are inserted, but Step 4 is grayed out and arrows indicate
this Insert step is skipped.
3. Go to record 3, which does have an alternate contact, to see how the step results are
different.
Figure 64: Record 3 Step Results
In this tutorial, the Jump To and Previous buttons are unavailable. While
these functions are supported by many Scribe adapters, they are not
supported by the ODBC text driver.
4. Click the Lookup Links tab. Here you can see this DTS uses a single lookup link
created during the ACCOUNT Seek step.
Figure 65: Record 3 Lookup Links
5. When you are done, close the Test window.
Eight: Run The Job
After you are satisfied the data is correct, you can run the job.
Run The Job
1. Click Run > Run Job.
2. When the job finishes, the Run Complete window displays:
Figure 66: Run Complete Window
Notice the window shows 16 rows were processed, with 40 successful inserts and 8
skipped inserts.
If your results show 16 successful, 16 failed, and 16 skipped records,
you may not have run Tutorial 1 first or you may have refreshed the
database before starting this tutorial. Complete Tutorial 1 before you
run Tutorial 2 to get the correct results.
3. Close the Run Complete window.
4. In the Scribe Workbench, click Test Job, and view all the source rows.
Notice there are 8 records for which the ALT_CONTACT value is #NULL!. This
matches the number of skipped records.
Nine: Review The Data
One way to check your data is to use Microsoft SQL Server Management Studio™.
If you do not have Microsoft SQL Server Management Studio installed, you can
download it from the microsoft.com site.
Review The Data
1. Open Microsoft SQL Server Management Studio.
2. Open the SCRIBESAMPLE database, then select and open the CONTACT table.
Notice there are 24 rows in the CONTACT table. There are 16 primary contacts and 8
alternate contacts. The CONTACT table only has a CONTACTNAME field and does not
differentiate between the types of contacts. Therefore, your design requires you to
create a new record for each contact, whether it is primary or alternate.
In addition, as discussed earlier, the ACCOUNTID field has been inserted.
3. Look at the records for John Thibideau and Scott Berman. You'll notice the CONTACTIDs are different, because you assigned the GUID function to CONTACTID in both the primary (CONTACT) and alternate (CONTACT(2)) steps, but the ACCOUNTID is the same.
4. Going back to the Test window, you can see John Thibideau is the primary contact for
George Tel, Inc., and Scott Berman is the alternate. Therefore, the ACCOUNTID has
correctly been assigned to both contacts.
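If you prefer a query to browsing the tables, something like the following sketch (using the column names described above) lists each contact with its account and should return 24 rows:
-- List every contact with the account it was linked to.
SELECT c.CONTACTNAME, c.FIRSTNAME, c.LASTNAME, a.ACCOUNTNAME
FROM CONTACT AS c
JOIN ACCOUNT AS a ON a.ACCOUNTID = c.ACCOUNTID
ORDER BY a.ACCOUNTNAME
-- Expect 24 rows: 16 primary contacts plus 8 alternate contacts.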
What's Next
Congratulations! You successfully completed the Scribe Workbench tutorials! At this point,
you should be ready to apply what you learned to your own data integration scenario. Use
the Scribe Insight Help for reference as you create your own DTS files.
The tutorials you just completed have provided an introduction to Scribe Workbench. If you
want to learn about using Scribe Console, continue on to the next tutorials.
At any point, you can re-run either tutorial. However, before you do, you need to refresh
the sample database to ensure you get the correct results.
Reset The Sample Database
1. Navigate to the Scribe program folder, C:\Program Files (x86)\Scribe.
2. Double-click the InternalDB.exe file to open the Scribe Internal Database
Maintenance Utility.
3. Click the Sample Database tab.
4. Click Refresh Sample Data. The sample database is reset for the tutorials.
5. Click OK on the Refresh Complete dialog box.
6. Click OK to close the utility window.
The sample database has been returned to its original state.
Tutorial 3: Creating An Integration In Scribe Console
This tutorial shows how to create a simple integration in the Scribe Console.
The scenario for this tutorial is that every evening at 10 pm, your Scribe site receives a
tab-delimited text file, which contains customer account data used to update the existing
account data in a SQL Server database.
Your job is to create an integration that runs every night at 10 pm. This integration must
add new customer data and update existing customer data, without changing any other
customer information.
For this tutorial, the Accounts.dts file created in Tutorial 1 must be available. DTS (data
translation specification) files are created in Scribe Workbench to store the translation
settings for migrating or integrating data between source and target datastores.
• Scribe Insight 6.5.2 or earlier — Follow the instructions in Tutorial 1: Migrating Account Information to create the Accounts.dts.
• Scribe Insight 7.0 or later — Either follow the instructions in Tutorial 1: Migrating Account Information to create the Accounts.dts, or use the Accounts.dts file installed in C:\Users\Public\Documents\Scribe\Samples\Tutorials. If you use this file, you must reconnect the Scribe Sample database.
Objectives
In this tutorial, you:
• Create a new collaboration
• Add a time-based integration process
• Check the DTS file from Scribe Console
• Run the integration
One: Create A Collaboration
1. From the Start menu, open the Scribe Console.
2. In the Console tree, expand Scribe Console and expand the Local server:
Figure 67: Collaborations Folder In The Console Tree
3. Select the Collaborations folder, right-click the folder, then select New
Collaboration. The New Collaboration Wizard appears.
4. Click Next. The Collaboration Settings screen appears.
5. In the Collaboration Name text box, enter Account Integration.
Right now, you only need to name the new collaboration before you create it.
6. Click Finish. The new Collaboration appears under the Collaborations root folder in
the Console.
Figure 68: Collaborations Folder Expanded
7. Use Windows Explorer to move Accounts.dts from its original location to the Collaborations folder:
• The default folder is C:\Users\Public\Documents\Scribe\Samples\Tutorials
• Move it to C:\Users\Public\Documents\Scribe\Collaborations\Account Integration
Figure 69: Windows Explorer — Account Integration
Two: Add An Integration Process
Now that you have created your Collaboration, the next step is to add a time-based
Integration Process (IP).
Add An Integration Process
1. In the Collaborations folder, expand Account Integration, and select
Integration Processes. The Integration Processes pane displays.
Figure 70: Integration Process Pane
2. Click Add. The Add New Integration Process Wizard appears.
Figure 71: Integration Process Wizard — Step 1
You want to define a time-based event so the integration runs every evening at 10
pm. In Step 1, you begin creating the integration.
3. Under Process Event, click Time to specify this is a time-based event.
4. In Process name, enter Account Import.
5. Under Data Translation Specification, click Browse. The Select File dialog box
appears.
6. Select Accounts.dts and click OK.
Figure 72: Step 1 — Select File
You are not adding any DTS parameters. If you want to learn more about the steps in
this process, more information is available in the Scribe Insight Help.
7. Click Step 2 — Pre/Post Processing Commands.
You are not adding any pre- or post-processing commands.
8. Click Step 3 — Event Settings.
The settings available in this step depend on the Process Event type you selected in Step 1. The settings that appear now apply to Time Integration Processes.
9. Under Time Event Settings, select Run DTS every day.
10. Under Starting, change the time to 10:00:00 PM.
Figure 73: Step 3 — Event Settings
11. Click Step 4 — Activation.
You do not want to activate this Integration Process quite yet.
12. Under Status, select Paused.
Figure 74: Step 4 — Activation
There are no changes required for the alerts, so you can skip Step 5.
13. Click Finish.
Figure 75: Time-based Integration Process
You have just created your first time-based IP!
Three: Prepare The Scribe Sample Database
Before you start the first integration, you want to make sure the target, Scribe Sample
database, is empty so the integration can insert the records from the Scribe Sample Text
datastore when it runs for the first time. To clear the database, you use the InternalDB.exe
Scribe utility.
Prepare The Scribe Sample Database
1. In C:\Program Files (x86)\Scribe, double-click InternalDB.exe. The Scribe
Internal Database Maintenance Utility opens.
2. Click the Sample Database tab.
Figure 76: Scribe Internal Database Maintenance Utility
3. Click Refresh Sample Data.
Refreshing the sample data clears the contents of the Sample database.
4. When the Refresh complete! message appears, click OK, then click OK again to
close the utility.
Four: Check The DTS File
Before you run the integration, verify what the DTS file does. Remember a DTS file created
in Scribe Insight contains the instructions, including data transformation rules, to perform
an integration or migration.
You can open and examine the DTS file either directly from Scribe Workbench or from
within Scribe Console, which opens Workbench for you.
If you created this DTS file following the instructions in Tutorial 1: Migrating
Account Information, you already know what is in it, and you can skip this
step.
Examine The DTS File
1. From Scribe Console, expand Collaborations, expand Account Integration, and
select File Browser.
Figure 77: Collaborations File Browser
2. Double-click Accounts.dts in the browser pane. Scribe Workbench opens.
Figure 78: Accounts.dts File
As mentioned in the Prerequisites section, this DTS file integrates data from the
Scribe Sample Text source to the Scribe Sample target. See Connection Manager
With Disconnected Datastore.
To see what this DTS file does, you want to examine the steps. Start by looking at
the target steps.
3. On the Insight main window, click Configure Steps.
Figure 79: Configure Steps Button
The Configure Steps dialog box appears.
Figure 80: Configure Steps
4. In the Step Order pane, notice the DTS contains a single Update/Insert step on the
Account object.
Figure 81: Update/Insert Step In Configure Steps
5. Close the Configure Steps dialog box.
6. Click the Data Formulas tab in the main window:
Figure 82: Data Formulas Tab
Whenever you have an update step, a lookup link is required. This allows Insight to
look up the record it needs to update. For an Update/Insert operation, Insight looks
up the record, updates it if it exists, or inserts it if it is not already in the target.
On the Data Formulas tab, there is a link between source field 1 (S1),
called UNIQUE_ID, and the XREF field in the target.
7. Scroll the target pane down to view XREF.
There are two links between UNIQUE_ID and XREF, a data link and a lookup link.
Figure 83: Lookup Criteria Tab
8. Close Scribe Workbench.
9. When you are prompted to save your changes to Accounts.dts, click Yes.
Five: Run The Integration
When you created the Integration Process, you added a time-based event that runs every
night. To test the integration, you don't have to wait until 10 pm tonight. This exercise shows
you how to run an integration before the event criteria are met.
Run Your New Integration
1. In Scribe Console, browse to Collaborations > Account Integration >
Integration Processes.
In the Integration Processes pane, you can see the status of the Account Import process is currently Paused.
Figure 84: Paused Integration Process
2. Select the Account Import integration process and click Run Process.
This forces the integration process to run now.
You can also click Resume, which runs the paused integration, then queues the
integration to run on schedule at 10 pm tonight.
Figure 85: Queued Integration Process
3. Expand Administration and select Execution Log. The Execution Log Viewer opens,
where you can verify your integration ran.
4. Click Refresh.
5. Verify the Result column shows Success, and the Source Rows column shows 16.
This tells you 16 source rows were successfully inserted into your target.
Figure 86: IP Result In Execution Log
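You can also confirm the result directly against the SCRIBESAMPLE database with a simple query, a sketch assuming SQL Server Management Studio or another SQL client is available:
-- After the first run, the ACCOUNT table should contain one row per source row.
SELECT COUNT(*) AS AccountRows FROM ACCOUNT
-- Expected result: 16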
At this point, your integration is done. You can stop here, or continue to the next
tutorial.
Tutorial 4: More Console Techniques
The previous tutorial provided an introduction to the Scribe Console and showed you how to
create and run a simple integration. In this tutorial, you learn some other tools and
techniques to help you use Scribe Console efficiently.
Before beginning this tutorial, you must have completed Tutorial 3: Creating
an Integration in Scribe Console.
In this tutorial, you are going to change the DTS file to introduce an error, then create a
method for handling errors in your real data. This tutorial walks you through the process of
creating a monitor to track errors during the DTS run, moving the failed data into a
rejected rows table, and creating a data view that allows you to see rejected rows.
Objectives
In this tutorial you:
• Create a Rejected Rows table in the Scribe Internal database
• Create and run a monitor
• Create a data view
• Check the data view, monitor, and alerts
One: Introduce An Error In The DTS File
Because the Integration Process you defined runs successfully, you cannot see how to
process errors using Insight. To do this, you must break the Accounts.dts file you used in
Tutorial 3.
Change The DTS File
1. If the Scribe Console is not already open, open it.
2. Expand Collaborations, expand Account Integration, and select File Browser.
3. Double-click Accounts.dts to open the DTS in Scribe Workbench.
4. From the Scribe Workbench main window, click the Data Formulas tab and select
the line that contains the formula:
IF(ISERROR(S3),"Unknown"&S1, PROPER( S3 ))
If you remember, this formula writes a placeholder name to the target when the BUSINESS_NAME field in the source is empty, and otherwise changes the name of the business from all capital letters to initial caps.
Figure 87: Formula On Data Formula Tab
5. Double-click the formula to open it in the Formula Editor.
Figure 88: Formula Editor With Formula
6. Replace the entire formula with: S3
This deletes the formula but retains the data link between BUSINESS_NAME in the
source and ACCOUNTNAME in the target.
7. Click OK, then save this DTS file.
Two: Create The Rejected Rows Table
Data rows flagged as having errors are called Rejected Rows because they contain data
that causes Insight to reject an insert or update. If you want to track these rows, you
need a place to record them.
To see which rows have been rejected, you create a Rejected Rows table for them. In this
tutorial, you create a table, RR_ACCOUNTS, in the Scribe Internal Database using the
DTS Settings dialog box. The DTS Settings button displays the DTS Settings dialog box.
Figure 89: DTS Settings Button
Create A Rejected Rows Table
1. In the Scribe Workbench, click the DTS Settings button. The DTS Settings dialog
box appears.
2. Click the Rejected Source Rows tab.
3. Select the Output Rejected Source Rows checkbox.
The Scribe Internal Database is the default connection. For most purposes, you can
use this table to store rejected rows records.
4. Select Always append rejected rows to the same table.
5. In the Table field, enter RR_ACCOUNTS for the table name, then click Create
Table Now.
Figure 90: Rejected Source Rows Tab
6. Click Yes to save your changes before creating the table. The Rejected row table
created message appears.
7. Click OK in the message.
8. Click OK again to close the DTS Settings dialog box.
To see the table you just created, you can use SQL Server Management Studio. (A query
that lists the same columns appears after this procedure.)
a. Open SQL Server Management Studio and browse to Databases >
SCRIBEINTERNAL > Tables > SCRIBE.RR_ACCOUNTS.
b. Expand Columns. Verify the RR_ACCOUNTS table contains the same fields as
your source data, as well as some extra fields for error handling.
Figure 91: Scribe Sample Text And RR_ACCOUNTS Tables
Before you continue, refresh the sample database so the next DTS run can insert
data.
9. Navigate to the Program Files (x86)\Scribe folder and double-click
InternalDB.
10. Click the Sample Database tab, then click Refresh Sample Data.
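As noted above, if you prefer a query to browsing the object tree in Management Studio, the following sketch lists the columns Scribe generated for the rejected rows table. It assumes the SCRIBEINTERNAL database and SCRIBE schema defaults used in this tutorial.
-- List the columns of the RR_ACCOUNTS rejected rows table created earlier in this procedure.
USE SCRIBEINTERNAL;
SELECT COLUMN_NAME, DATA_TYPE
FROM INFORMATION_SCHEMA.COLUMNS
WHERE TABLE_SCHEMA = 'SCRIBE'
  AND TABLE_NAME = 'RR_ACCOUNTS'
ORDER BY ORDINAL_POSITION;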
Three: Create A Data View
You have created a rejected rows table and have viewed it using SQL Server Management
Studio. However, it might be easier to view and repair the rejected rows within Scribe
Console. To do this, you need to create a data view.
Create A Data View
1. In the Scribe Workbench, save the DTS, then close the Workbench.
2. In the Scribe Console, expand Collaborations, expand Account Integration, and
select Data Views. The Data Views page appears.
You created the Account Integration Collaboration in the previous tutorial. Now, you
are going to create a data view named Rejected Accounts within that collaboration.
3. Click Add.
Figure 92: Add New Data View – Step 1
4. In the View Name and View Title fields, enter Rejected Accounts.
If you were creating multiple data views for this collaboration, you would enter the
name of a folder in the Folder field. Insight creates a folder and stores your data
view in the specified folder. Because you are only creating one data view, you can
move on to the next step.
5. Click Step 2 — Connect to Source to select a source connection.
As in the previous step, the rejected rows table is in the Scribe Internal database.
Scribe Internal needs to be the source.
6. Click Source Connect, expand ODBC Data Sources, then select ScribeInternal_MS.
Figure 93: Add New Data View – Step 2
7. Click OK. The Connect to DSN dialog box appears.
8. Enter the connection information, then click OK.
9. Verify User ID is set to SCRIBE.
10. In Password, enter integr8!.
11. Click OK.
12. Click Step 3 — Configure Source to configure the source.
You want to use the RR_ACCOUNTS table you created earlier.
13. Click Source Configure. The Configure Source dialog box appears.
14. Expand All Data Objects (by Type), expand Tables, and select RR_ACCOUNTS.
15. Click OK.
16. Verify the default View Presentation is set to Table.
Figure 94: Add New Data View — Step 3
17. Click Step 4 — Set Field Properties to configure the field properties.
The only change you need to make here is to allow updates and deletes.
18. In the Allowable Operations box, select Updates and Deletes:
Figure 95: Add New Data View – Step 4
19. Click Finish to save your new data view.
Now, run the view and see what happens.
20. In the Console tree, browse down to the Rejected Accounts data view and click the
data view name.
Figure 96: Running A Data View
Because you have not run the DTS file, the Rejected Accounts Data View is currently
empty. Notice the field names are the same as the fields you saw using SQL Server
Management Studio.
Figure 97: Rejected Accounts Data View
In the next step, you will create a monitor, then rerun the collaboration and check the data
view again.
Four: Create A Monitor
A monitor enables you to oversee system issues, business activities, and Integration
Processes. You can set a monitor to raise an alert.
Create A Monitor
1. In Scribe Console, expand Collaborations, expand Account Integration, and
select Monitoring. The Monitoring pane appears.
2. Click Add to start the Add New Monitor Wizard.
Figure 98: Adding A Monitor – Step 1
3. Verify the Monitor type field is set to Query.
4. Name the monitor New Rejected Accounts and add a comment in the Comment
field.
Figure 99: Monitor Name And Comment
5. Click Step 2 — Source Connection to set the source.
6. Click Source Connect. Select the ScribeInternal_MS ODBC Data Source, click
OK, and enter integr8! if you are prompted to enter a password.
Figure 100: Monitor Connection Properties
For the data view, you included the entire RR_ACCOUNTS table. When you created
the RR_ACCOUNTS table, Scribe added a timestamp field called KSSTARTTIME.
For this monitor, you want to use specific data. You can create a custom query to
compare KSSTARTTIME to the values of the LastRunDateTime and ThisRunDateTime
system variables. If rows have been added to the RR_ACCOUNTS table since the last run
and before the current run, this custom query causes the monitor to raise an alert.
In Step 3, you are going to create the query this monitor uses.
7. Click Step 3 — Alert Criteria to define the alert criteria.
8. Click Source Configure. The Configure Source dialog box appears.
This is where you define a custom query.
9. Select Custom Query from the dialog box.
10. In the SQL Query window, enter the following query:
SELECT * FROM SCRIBE.RR_ACCOUNTS
WHERE KSSTARTTIME >= :LastRunDateTime AND
KSSTARTTIME < :ThisRunDateTime
In this query, the ">=" operator causes the monitor to raise an alert if the timestamp
field, KSSTARTTIME, is equal to or later than the last run date. (A standalone version of
this query that you can run in SQL Server Management Studio appears after this procedure.)
Figure 101: Monitor SQL Custom Query
11. Click Test/Requery to test your query.
When the query executes without any errors, an execution time appears next to the
Test/Requery button.
Figure 102: Monitor SQL Test Time
12. Click OK to close the Configure Source dialog box.
If there are errors, correct them, then close the dialog box.
As part of Step 3, you also need to raise an alert when one or more new rows are
found in the RR_ACCOUNTS table.
13. In the Alert Conditions box, select:
l Row count
l Operator = Greater Than or Equal
l Row(s) = 1
Leave the Create an Alert for each matching result row checkbox clear. You
do not need to create an alert for each matching row.
14. In the Alert Recipients box, select Fixed.
This option always sends alerts to the same Fixed list. At this point, you do not need
to worry about the recipients.
Figure 103: Monitor Alert Conditions
When you create a real collaboration, add recipients and recipient
groups under Alert Recipients in the Administration node before adding
monitors. If you add the recipients or groups in this pane, an email is
sent to the recipients whenever an alert is raised.
For information, see Managing Alert Recipients in the Scribe Insight
Help.
15. Click Step 4 — Monitor Interval.
You can use the default Monitoring Interval Settings, which run the monitor every 15
minutes.
16. Click Step 5 — Activation. Verify the Status is set to Active.
Use the default values for the other settings in this step.
17. Click Step 6 — Alerting.
On this step, you want to create a Warning alert.
18. In Alert Type, select Warning.
19. In Alert description, enter a meaningful description, such as New rejected
Accounts.
20. In Alert number, enter 3001.
The alert number is for your own purposes. Insight displays the Alert number but
does not use this number to perform any error handling.
21. In Alert message, enter a meaningful message.
For example: There are new rejected accounts from the Account Import process.
22. In Alert Message Options, verify the following are selected:
l Include results
l Include row count/value
l Attach results as XML.
Having an XML file makes it easier to figure out what went wrong.
Figure 104: Monitor Activation
23. Click Finish to save and close your new monitor.
The new monitor appears in the Monitoring pane.
Figure 105: New Monitor On Monitoring Pane
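As mentioned in Step 3 above, you can also run the monitor's custom query by hand in SQL Server Management Studio by substituting literal datetimes for the :LastRunDateTime and :ThisRunDateTime system variables. The values in this sketch are placeholders only; adjust them to bracket the run you are interested in.
-- Ad hoc version of the monitor query, with placeholder datetimes standing in
-- for the :LastRunDateTime and :ThisRunDateTime system variables.
SELECT *
FROM SCRIBE.RR_ACCOUNTS
WHERE KSSTARTTIME >= '2015-12-30 09:00:00'
  AND KSSTARTTIME <  '2015-12-30 10:00:00';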
Five: Run The Integration Process
At this point, you have:
l Created an Integration Process (IP)
l Defined a DTS file that you know creates a rejected row
l Defined a data view for your Rejected Rows table, RR_ACCOUNTS
l Defined a monitor that monitors the IP and raises an alert if changes are made to
the RR_ACCOUNTS table
Now, you can run the IP and see what happens.
Run The Integration Process
1. In the Console tree, expand Collaborations, expand Account Integration, and
select Integration Processes.
2. Select the Account Import Integration Process that you created in Tutorial 3.
Figure 106: Account Integration IP On Integration Processes Pane
3. Click Run Process. This forces the Integration Process to run now.
4. Expand Administration and select Execution Log.
5. In the Execution Log Viewer, click Refresh to update the results.
Figure 107: Execution Log Viewer Showing Failed Row
The Execution Log Viewer includes two grids:
l The Execution grid, on the top, displays information about the execution.
l The Transaction grid, on the bottom, displays information about the transaction
selected in the Execution grid.
6. Select the top row in the Execution grid, which indicates a failed transaction. The
Transaction grid displays more information about this transaction.
For example, you can see source row #14 failed with Error Code 1005.
7. Double-click the message in the Transaction grid. The Transaction Detail report
displays, providing more information about the error.
Figure 108: Transaction Detail Report
According to the error message, the ACCOUNTNAME column does not allow nulls.
Six: Check The RR_ACCOUNTS Table
After checking the Execution Log, you know there is an error. The next step is to figure out
exactly which row you need to repair. There are a couple of ways to do this — you can open
the RR_ACCOUNTS table in SQL Server Management Studio, or you can use the data view
you created earlier in this tutorial. You are going to take a look at the table using the data
view.
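If you choose the Management Studio route instead, a query along these lines surfaces the same rejected row. This is a sketch: the column names are the ones copied from the source fields into RR_ACCOUNTS, and the filter assumes a blank business name is what caused the rejection.
-- Find rejected rows whose business name is blank (columns mirror the source fields).
SELECT UNIQUE_ID, BUSINESS_NAME, PHONE, KSSTARTTIME
FROM SCRIBEINTERNAL.SCRIBE.RR_ACCOUNTS
WHERE BUSINESS_NAME IS NULL
   OR LTRIM(RTRIM(BUSINESS_NAME)) = '';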
Check The RR_ACCOUNTS Table
1. Expand Collaborations, expand Account Integration, expand Data Views, and
open the Rejected Accounts Data View.
This time, you see there is a row in this data view.
Figure 109: Rejected Accounts Data View
The row shows that for the company with the unique ID of 114, the Business Name is
blank.
Although doing so is beyond the scope of this tutorial, you have two options:
l Update the source to correct the error and rerun the collaboration — Since the
DTS file has lookup links and an Update/Insert step, when you rerun the
collaboration, the new data is inserted without creating duplicate records.
l Replace the error-processing formula you removed — Edit the DTS to restore the
error-processing formula you removed at the beginning of this tutorial, which
inserts a placeholder value into the target when an error is found:
IF(ISERROR(S3),"Unknown"&S1, PROPER(S3))
Seven: Check The Monitor
Finally, check the monitor. As discussed in Step Four of this tutorial, for real monitors you
add Alert recipients so an email is sent whenever an alert is raised. For the purposes of
this tutorial, however, you check the monitor manually.
Check The Monitor
1. Expand Collaborations, expand Account Integration, and select Monitoring to
open the Monitoring pane.
You can also view all monitors on your system by selecting
Monitoring from the Integration Server node.
2. Click Resume, if needed, and then click Run Monitor.
3. Click Refresh.
You see an alert has been created.
Figure 110: Checking The Monitor
4. To view the alert, select Alert Log under Account Integration:
Figure 111: Viewing The Alert Log
Note there are two alerts: a system monitor raised one alert on the process ID, and
the monitor you created raised another alert.
5. Find the Monitor entry in the Category column and select the alert generated by the
monitor.
6. Under Detail for Alert, click the Message tab for more information:
Figure 112: Alert Log Message Details — Top
Here, you see all of the information you need to correct the data and rerun the
collaboration. The data in this message is generated from the RR_ACCOUNTS table.
At the top, as shown above, you see the UNIQUE_ID field, which allows you to find
the record.
7. Scroll to the bottom of this information, where you can see the exact error message
along with the timestamp and other useful data.
Figure 113: Alert Log Message Details — Bottom
Congratulations! You have finished this tutorial.
What's Next? 
With these two tutorials, you have seen some basic tasks you need to perform using Scribe
Console. There are other tools in the Console that you may need to use, but these tutorials
should help get you started.
For more information, there are many resources available, including the Scribe Insight
Help, Scribe Installation Guide, and Scribe's Forums and Knowledgebase at
https://openmind.scribesoft.com/forums.
More Related Content

What's hot

Power BI: Types of gateways in Power BI
Power BI: Types of gateways in Power BIPower BI: Types of gateways in Power BI
Power BI: Types of gateways in Power BIAmit Kumar ☁
 
Migrate a successful transactional database to azure
Migrate a successful transactional database to azureMigrate a successful transactional database to azure
Migrate a successful transactional database to azureIke Ellis
 
Azure data analytics platform - A reference architecture
Azure data analytics platform - A reference architecture Azure data analytics platform - A reference architecture
Azure data analytics platform - A reference architecture Rajesh Kumar
 
Develop scalable analytical solutions with Azure Data Factory & Azure SQL Dat...
Develop scalable analytical solutions with Azure Data Factory & Azure SQL Dat...Develop scalable analytical solutions with Azure Data Factory & Azure SQL Dat...
Develop scalable analytical solutions with Azure Data Factory & Azure SQL Dat...Microsoft Tech Community
 
Alteryx Architecture
Alteryx ArchitectureAlteryx Architecture
Alteryx ArchitectureVivek Mohan
 
J1 T1 4 - Azure Data Factory vs SSIS - Regis Baccaro
J1 T1 4 - Azure Data Factory vs SSIS - Regis BaccaroJ1 T1 4 - Azure Data Factory vs SSIS - Regis Baccaro
J1 T1 4 - Azure Data Factory vs SSIS - Regis BaccaroMS Cloud Summit
 
Azure Data Factory V2; The Data Flows
Azure Data Factory V2; The Data FlowsAzure Data Factory V2; The Data Flows
Azure Data Factory V2; The Data FlowsThomas Sykes
 
Power BI Report Server & Office Online Server
Power BI Report Server & Office Online ServerPower BI Report Server & Office Online Server
Power BI Report Server & Office Online ServerIsabelle Van Campenhoudt
 
Building a Turbo-fast Data Warehousing Platform with Databricks
Building a Turbo-fast Data Warehousing Platform with DatabricksBuilding a Turbo-fast Data Warehousing Platform with Databricks
Building a Turbo-fast Data Warehousing Platform with DatabricksDatabricks
 
Azure Data Factory presentation with links
Azure Data Factory presentation with linksAzure Data Factory presentation with links
Azure Data Factory presentation with linksChris Testa-O'Neill
 
Azure Cosmos DB + Gremlin API in Action
Azure Cosmos DB + Gremlin API in ActionAzure Cosmos DB + Gremlin API in Action
Azure Cosmos DB + Gremlin API in ActionDenys Chamberland
 
Introduction to Cortana Analytics
Introduction to Cortana AnalyticsIntroduction to Cortana Analytics
Introduction to Cortana AnalyticsChris Testa-O'Neill
 
Optimize SQL server performance for SharePoint
Optimize SQL server performance for SharePointOptimize SQL server performance for SharePoint
Optimize SQL server performance for SharePointserge luca
 
Relational data modeling trends for transactional applications
Relational data modeling trends for transactional applicationsRelational data modeling trends for transactional applications
Relational data modeling trends for transactional applicationsIke Ellis
 
Overview on Azure Machine Learning
Overview on Azure Machine LearningOverview on Azure Machine Learning
Overview on Azure Machine LearningJames Serra
 
Move a successful onpremise oltp application to the cloud
Move a successful onpremise oltp application to the cloudMove a successful onpremise oltp application to the cloud
Move a successful onpremise oltp application to the cloudIke Ellis
 
Intro to Azure Data Factory v1
Intro to Azure Data Factory v1Intro to Azure Data Factory v1
Intro to Azure Data Factory v1Eric Bragas
 
Microsoft Flow session : tips, pitfalls, warnings to be known before starting...
Microsoft Flow session : tips, pitfalls, warnings to be known before starting...Microsoft Flow session : tips, pitfalls, warnings to be known before starting...
Microsoft Flow session : tips, pitfalls, warnings to be known before starting...serge luca
 

What's hot (20)

Power BI: Types of gateways in Power BI
Power BI: Types of gateways in Power BIPower BI: Types of gateways in Power BI
Power BI: Types of gateways in Power BI
 
Migrate a successful transactional database to azure
Migrate a successful transactional database to azureMigrate a successful transactional database to azure
Migrate a successful transactional database to azure
 
Azure data analytics platform - A reference architecture
Azure data analytics platform - A reference architecture Azure data analytics platform - A reference architecture
Azure data analytics platform - A reference architecture
 
Develop scalable analytical solutions with Azure Data Factory & Azure SQL Dat...
Develop scalable analytical solutions with Azure Data Factory & Azure SQL Dat...Develop scalable analytical solutions with Azure Data Factory & Azure SQL Dat...
Develop scalable analytical solutions with Azure Data Factory & Azure SQL Dat...
 
Alteryx Architecture
Alteryx ArchitectureAlteryx Architecture
Alteryx Architecture
 
J1 T1 4 - Azure Data Factory vs SSIS - Regis Baccaro
J1 T1 4 - Azure Data Factory vs SSIS - Regis BaccaroJ1 T1 4 - Azure Data Factory vs SSIS - Regis Baccaro
J1 T1 4 - Azure Data Factory vs SSIS - Regis Baccaro
 
Azure Data Factory V2; The Data Flows
Azure Data Factory V2; The Data FlowsAzure Data Factory V2; The Data Flows
Azure Data Factory V2; The Data Flows
 
Tableau course in delhi
Tableau course in delhiTableau course in delhi
Tableau course in delhi
 
Hosting Tableau on AWS
Hosting Tableau on AWSHosting Tableau on AWS
Hosting Tableau on AWS
 
Power BI Report Server & Office Online Server
Power BI Report Server & Office Online ServerPower BI Report Server & Office Online Server
Power BI Report Server & Office Online Server
 
Building a Turbo-fast Data Warehousing Platform with Databricks
Building a Turbo-fast Data Warehousing Platform with DatabricksBuilding a Turbo-fast Data Warehousing Platform with Databricks
Building a Turbo-fast Data Warehousing Platform with Databricks
 
Azure Data Factory presentation with links
Azure Data Factory presentation with linksAzure Data Factory presentation with links
Azure Data Factory presentation with links
 
Azure Cosmos DB + Gremlin API in Action
Azure Cosmos DB + Gremlin API in ActionAzure Cosmos DB + Gremlin API in Action
Azure Cosmos DB + Gremlin API in Action
 
Introduction to Cortana Analytics
Introduction to Cortana AnalyticsIntroduction to Cortana Analytics
Introduction to Cortana Analytics
 
Optimize SQL server performance for SharePoint
Optimize SQL server performance for SharePointOptimize SQL server performance for SharePoint
Optimize SQL server performance for SharePoint
 
Relational data modeling trends for transactional applications
Relational data modeling trends for transactional applicationsRelational data modeling trends for transactional applications
Relational data modeling trends for transactional applications
 
Overview on Azure Machine Learning
Overview on Azure Machine LearningOverview on Azure Machine Learning
Overview on Azure Machine Learning
 
Move a successful onpremise oltp application to the cloud
Move a successful onpremise oltp application to the cloudMove a successful onpremise oltp application to the cloud
Move a successful onpremise oltp application to the cloud
 
Intro to Azure Data Factory v1
Intro to Azure Data Factory v1Intro to Azure Data Factory v1
Intro to Azure Data Factory v1
 
Microsoft Flow session : tips, pitfalls, warnings to be known before starting...
Microsoft Flow session : tips, pitfalls, warnings to be known before starting...Microsoft Flow session : tips, pitfalls, warnings to be known before starting...
Microsoft Flow session : tips, pitfalls, warnings to be known before starting...
 

Similar to Insight

Business Intelligence tools comparison
Business Intelligence tools comparisonBusiness Intelligence tools comparison
Business Intelligence tools comparisonStratebi
 
Sql server 2012 tutorials reporting services
Sql server 2012 tutorials   reporting servicesSql server 2012 tutorials   reporting services
Sql server 2012 tutorials reporting servicesSteve Xu
 
Learn to Effectively Script in ACL – The Keys To Getting Started and Fully Au...
Learn to Effectively Script in ACL – The Keys To Getting Started and Fully Au...Learn to Effectively Script in ACL – The Keys To Getting Started and Fully Au...
Learn to Effectively Script in ACL – The Keys To Getting Started and Fully Au...Jim Kaplan CIA CFE
 
Sql server 2012 tutorials writing transact-sql statements
Sql server 2012 tutorials   writing transact-sql statementsSql server 2012 tutorials   writing transact-sql statements
Sql server 2012 tutorials writing transact-sql statementsSteve Xu
 
November 2022 CIAOPS Need to Know Webinar
November 2022 CIAOPS Need to Know WebinarNovember 2022 CIAOPS Need to Know Webinar
November 2022 CIAOPS Need to Know WebinarRobert Crane
 
Cryptography is the application of algorithms to ensure the confiden.docx
Cryptography is the application of algorithms to ensure the confiden.docxCryptography is the application of algorithms to ensure the confiden.docx
Cryptography is the application of algorithms to ensure the confiden.docxmydrynan
 
Managing SQLserver for the reluctant DBA
Managing SQLserver for the reluctant DBAManaging SQLserver for the reluctant DBA
Managing SQLserver for the reluctant DBAConcentrated Technology
 
Great news! Executive leadership has reviewed and approved the C
Great news! Executive leadership has reviewed and approved the CGreat news! Executive leadership has reviewed and approved the C
Great news! Executive leadership has reviewed and approved the Cjesseniasaddler
 
MS SQL SERVER: Using the data mining tools
MS SQL SERVER: Using the data mining toolsMS SQL SERVER: Using the data mining tools
MS SQL SERVER: Using the data mining toolsDataminingTools Inc
 
MS SQL SERVER: Using the data mining tools
MS SQL SERVER: Using the data mining toolsMS SQL SERVER: Using the data mining tools
MS SQL SERVER: Using the data mining toolssqlserver content
 
Azure presentation nnug dec 2010
Azure presentation nnug  dec 2010Azure presentation nnug  dec 2010
Azure presentation nnug dec 2010Ethos Technologies
 
Sql interview question part 9
Sql interview question part 9Sql interview question part 9
Sql interview question part 9kaashiv1
 
Sql interview-question-part-9
Sql interview-question-part-9Sql interview-question-part-9
Sql interview-question-part-9kaashiv1
 
CPSC 50900 Database Systems ProjectAll your efforts this semeste
CPSC 50900 Database Systems ProjectAll your efforts this semesteCPSC 50900 Database Systems ProjectAll your efforts this semeste
CPSC 50900 Database Systems ProjectAll your efforts this semesteCruzIbarra161
 
Cis407 a ilab 6 web application development devry university
Cis407 a ilab 6 web application development devry universityCis407 a ilab 6 web application development devry university
Cis407 a ilab 6 web application development devry universitylhkslkdh89009
 

Similar to Insight (20)

Business Intelligence tools comparison
Business Intelligence tools comparisonBusiness Intelligence tools comparison
Business Intelligence tools comparison
 
Sql server 2012 tutorials reporting services
Sql server 2012 tutorials   reporting servicesSql server 2012 tutorials   reporting services
Sql server 2012 tutorials reporting services
 
Learn to Effectively Script in ACL – The Keys To Getting Started and Fully Au...
Learn to Effectively Script in ACL – The Keys To Getting Started and Fully Au...Learn to Effectively Script in ACL – The Keys To Getting Started and Fully Au...
Learn to Effectively Script in ACL – The Keys To Getting Started and Fully Au...
 
Sql server 2012 tutorials writing transact-sql statements
Sql server 2012 tutorials   writing transact-sql statementsSql server 2012 tutorials   writing transact-sql statements
Sql server 2012 tutorials writing transact-sql statements
 
November 2022 CIAOPS Need to Know Webinar
November 2022 CIAOPS Need to Know WebinarNovember 2022 CIAOPS Need to Know Webinar
November 2022 CIAOPS Need to Know Webinar
 
SB Support System
SB Support SystemSB Support System
SB Support System
 
Cryptography is the application of algorithms to ensure the confiden.docx
Cryptography is the application of algorithms to ensure the confiden.docxCryptography is the application of algorithms to ensure the confiden.docx
Cryptography is the application of algorithms to ensure the confiden.docx
 
Managing SQLserver for the reluctant DBA
Managing SQLserver for the reluctant DBAManaging SQLserver for the reluctant DBA
Managing SQLserver for the reluctant DBA
 
Dbi h315
Dbi h315Dbi h315
Dbi h315
 
Great news! Executive leadership has reviewed and approved the C
Great news! Executive leadership has reviewed and approved the CGreat news! Executive leadership has reviewed and approved the C
Great news! Executive leadership has reviewed and approved the C
 
MS SQL SERVER: Using the data mining tools
MS SQL SERVER: Using the data mining toolsMS SQL SERVER: Using the data mining tools
MS SQL SERVER: Using the data mining tools
 
MS SQL SERVER: Using the data mining tools
MS SQL SERVER: Using the data mining toolsMS SQL SERVER: Using the data mining tools
MS SQL SERVER: Using the data mining tools
 
Azure presentation nnug dec 2010
Azure presentation nnug  dec 2010Azure presentation nnug  dec 2010
Azure presentation nnug dec 2010
 
Ebook9
Ebook9Ebook9
Ebook9
 
Sql interview question part 9
Sql interview question part 9Sql interview question part 9
Sql interview question part 9
 
Sql interview-question-part-9
Sql interview-question-part-9Sql interview-question-part-9
Sql interview-question-part-9
 
Ebook9
Ebook9Ebook9
Ebook9
 
CPSC 50900 Database Systems ProjectAll your efforts this semeste
CPSC 50900 Database Systems ProjectAll your efforts this semesteCPSC 50900 Database Systems ProjectAll your efforts this semeste
CPSC 50900 Database Systems ProjectAll your efforts this semeste
 
Cis407 a ilab 6 web application development devry university
Cis407 a ilab 6 web application development devry universityCis407 a ilab 6 web application development devry university
Cis407 a ilab 6 web application development devry university
 
12363 database certification
12363 database certification12363 database certification
12363 database certification
 

Recently uploaded

Escort Service in Abu Dhabi +971509530047 UAE
Escort Service in Abu Dhabi +971509530047 UAEEscort Service in Abu Dhabi +971509530047 UAE
Escort Service in Abu Dhabi +971509530047 UAEvecevep119
 
Teepee Curios, Curio shop, Tucumcari, NM
Teepee Curios, Curio shop, Tucumcari, NMTeepee Curios, Curio shop, Tucumcari, NM
Teepee Curios, Curio shop, Tucumcari, NMroute66connected
 
My Morning Routine - Storyboard Sequence
My Morning Routine - Storyboard SequenceMy Morning Routine - Storyboard Sequence
My Morning Routine - Storyboard Sequenceartbysarahrodriguezg
 
Escort Service in Al Jaddaf +971509530047 UAE
Escort Service in Al Jaddaf +971509530047 UAEEscort Service in Al Jaddaf +971509530047 UAE
Escort Service in Al Jaddaf +971509530047 UAEvecevep119
 
Lindy's Coffee Shop, Restaurants-cafes, Albuquerque, NM
Lindy's Coffee Shop, Restaurants-cafes, Albuquerque, NMLindy's Coffee Shop, Restaurants-cafes, Albuquerque, NM
Lindy's Coffee Shop, Restaurants-cafes, Albuquerque, NMroute66connected
 
Vocal Music of the Romantic Period ~ MAPEH.pptx
Vocal Music of the Romantic Period ~ MAPEH.pptxVocal Music of the Romantic Period ~ MAPEH.pptx
Vocal Music of the Romantic Period ~ MAPEH.pptxMikaelaKaye
 
UNIT 5-6 anh văn chuyên nganhhhhhhh.docx
UNIT 5-6 anh văn chuyên nganhhhhhhh.docxUNIT 5-6 anh văn chuyên nganhhhhhhh.docx
UNIT 5-6 anh văn chuyên nganhhhhhhh.docxssuser519b4b
 
New_Cross_Over (Comedy storyboard sample)
New_Cross_Over (Comedy storyboard sample)New_Cross_Over (Comedy storyboard sample)
New_Cross_Over (Comedy storyboard sample)DavonBrooks
 
怎么办理美国UC Davis毕业证加州大学戴维斯分校学位证书一手渠道
怎么办理美国UC Davis毕业证加州大学戴维斯分校学位证书一手渠道怎么办理美国UC Davis毕业证加州大学戴维斯分校学位证书一手渠道
怎么办理美国UC Davis毕业证加州大学戴维斯分校学位证书一手渠道7283h7lh
 
Value Aspiration And Culture Theory of Architecture
Value Aspiration And Culture Theory of ArchitectureValue Aspiration And Culture Theory of Architecture
Value Aspiration And Culture Theory of ArchitectureDarrenMasbate
 
ReverseEngineerBoards_StarWarsEpisodeIII
ReverseEngineerBoards_StarWarsEpisodeIIIReverseEngineerBoards_StarWarsEpisodeIII
ReverseEngineerBoards_StarWarsEpisodeIIIartbysarahrodriguezg
 
Olivia Cox HITCS final lyric booklet.pdf
Olivia Cox HITCS final lyric booklet.pdfOlivia Cox HITCS final lyric booklet.pdf
Olivia Cox HITCS final lyric booklet.pdfLauraFagan6
 
Escort Service in Al Nahda +971509530047 UAE
Escort Service in Al Nahda +971509530047 UAEEscort Service in Al Nahda +971509530047 UAE
Escort Service in Al Nahda +971509530047 UAEvecevep119
 
Roadrunner Motel, Motel/Residence. Tucumcari, NM
Roadrunner Motel, Motel/Residence. Tucumcari, NMRoadrunner Motel, Motel/Residence. Tucumcari, NM
Roadrunner Motel, Motel/Residence. Tucumcari, NMroute66connected
 
layered-cardboard-sculptures-miika-nyyssonen.pdf
layered-cardboard-sculptures-miika-nyyssonen.pdflayered-cardboard-sculptures-miika-nyyssonen.pdf
layered-cardboard-sculptures-miika-nyyssonen.pdfbaroquemodernist
 
Escort Service in Al Barsha +971509530047 UAE
Escort Service in Al Barsha +971509530047 UAEEscort Service in Al Barsha +971509530047 UAE
Escort Service in Al Barsha +971509530047 UAEvecevep119
 
Escort Service in Al Rigga +971509530047 UAE
Escort Service in Al Rigga +971509530047 UAEEscort Service in Al Rigga +971509530047 UAE
Escort Service in Al Rigga +971509530047 UAEvecevep119
 
FORTH QUARTER MAPEH7-PHILIPPINE FESTIVALS.pptx
FORTH QUARTER MAPEH7-PHILIPPINE FESTIVALS.pptxFORTH QUARTER MAPEH7-PHILIPPINE FESTIVALS.pptx
FORTH QUARTER MAPEH7-PHILIPPINE FESTIVALS.pptxJadeTamme
 
STAR Scholars Program Brand Guide Presentation
STAR Scholars Program Brand Guide PresentationSTAR Scholars Program Brand Guide Presentation
STAR Scholars Program Brand Guide Presentationmakaiodm
 

Recently uploaded (20)

Escort Service in Abu Dhabi +971509530047 UAE
Escort Service in Abu Dhabi +971509530047 UAEEscort Service in Abu Dhabi +971509530047 UAE
Escort Service in Abu Dhabi +971509530047 UAE
 
Teepee Curios, Curio shop, Tucumcari, NM
Teepee Curios, Curio shop, Tucumcari, NMTeepee Curios, Curio shop, Tucumcari, NM
Teepee Curios, Curio shop, Tucumcari, NM
 
My Morning Routine - Storyboard Sequence
My Morning Routine - Storyboard SequenceMy Morning Routine - Storyboard Sequence
My Morning Routine - Storyboard Sequence
 
Escort Service in Al Jaddaf +971509530047 UAE
Escort Service in Al Jaddaf +971509530047 UAEEscort Service in Al Jaddaf +971509530047 UAE
Escort Service in Al Jaddaf +971509530047 UAE
 
Lindy's Coffee Shop, Restaurants-cafes, Albuquerque, NM
Lindy's Coffee Shop, Restaurants-cafes, Albuquerque, NMLindy's Coffee Shop, Restaurants-cafes, Albuquerque, NM
Lindy's Coffee Shop, Restaurants-cafes, Albuquerque, NM
 
Vocal Music of the Romantic Period ~ MAPEH.pptx
Vocal Music of the Romantic Period ~ MAPEH.pptxVocal Music of the Romantic Period ~ MAPEH.pptx
Vocal Music of the Romantic Period ~ MAPEH.pptx
 
UNIT 5-6 anh văn chuyên nganhhhhhhh.docx
UNIT 5-6 anh văn chuyên nganhhhhhhh.docxUNIT 5-6 anh văn chuyên nganhhhhhhh.docx
UNIT 5-6 anh văn chuyên nganhhhhhhh.docx
 
New_Cross_Over (Comedy storyboard sample)
New_Cross_Over (Comedy storyboard sample)New_Cross_Over (Comedy storyboard sample)
New_Cross_Over (Comedy storyboard sample)
 
怎么办理美国UC Davis毕业证加州大学戴维斯分校学位证书一手渠道
怎么办理美国UC Davis毕业证加州大学戴维斯分校学位证书一手渠道怎么办理美国UC Davis毕业证加州大学戴维斯分校学位证书一手渠道
怎么办理美国UC Davis毕业证加州大学戴维斯分校学位证书一手渠道
 
Value Aspiration And Culture Theory of Architecture
Value Aspiration And Culture Theory of ArchitectureValue Aspiration And Culture Theory of Architecture
Value Aspiration And Culture Theory of Architecture
 
School :)
School                                 :)School                                 :)
School :)
 
ReverseEngineerBoards_StarWarsEpisodeIII
ReverseEngineerBoards_StarWarsEpisodeIIIReverseEngineerBoards_StarWarsEpisodeIII
ReverseEngineerBoards_StarWarsEpisodeIII
 
Olivia Cox HITCS final lyric booklet.pdf
Olivia Cox HITCS final lyric booklet.pdfOlivia Cox HITCS final lyric booklet.pdf
Olivia Cox HITCS final lyric booklet.pdf
 
Escort Service in Al Nahda +971509530047 UAE
Escort Service in Al Nahda +971509530047 UAEEscort Service in Al Nahda +971509530047 UAE
Escort Service in Al Nahda +971509530047 UAE
 
Roadrunner Motel, Motel/Residence. Tucumcari, NM
Roadrunner Motel, Motel/Residence. Tucumcari, NMRoadrunner Motel, Motel/Residence. Tucumcari, NM
Roadrunner Motel, Motel/Residence. Tucumcari, NM
 
layered-cardboard-sculptures-miika-nyyssonen.pdf
layered-cardboard-sculptures-miika-nyyssonen.pdflayered-cardboard-sculptures-miika-nyyssonen.pdf
layered-cardboard-sculptures-miika-nyyssonen.pdf
 
Escort Service in Al Barsha +971509530047 UAE
Escort Service in Al Barsha +971509530047 UAEEscort Service in Al Barsha +971509530047 UAE
Escort Service in Al Barsha +971509530047 UAE
 
Escort Service in Al Rigga +971509530047 UAE
Escort Service in Al Rigga +971509530047 UAEEscort Service in Al Rigga +971509530047 UAE
Escort Service in Al Rigga +971509530047 UAE
 
FORTH QUARTER MAPEH7-PHILIPPINE FESTIVALS.pptx
FORTH QUARTER MAPEH7-PHILIPPINE FESTIVALS.pptxFORTH QUARTER MAPEH7-PHILIPPINE FESTIVALS.pptx
FORTH QUARTER MAPEH7-PHILIPPINE FESTIVALS.pptx
 
STAR Scholars Program Brand Guide Presentation
STAR Scholars Program Brand Guide PresentationSTAR Scholars Program Brand Guide Presentation
STAR Scholars Program Brand Guide Presentation
 

Insight

  • 2. Legal Information © 1996-2015 Scribe Software Corporation. All rights reserved. Complying with all applicable copyright laws is the responsibility of the user. No part of this document may be reproduced or transmitted in any form or by any means, electronic or mechanical, for any purpose, without the express written permission of Scribe Software Corporation. Trademarks Scribe Adapter, Scribe Console, Scribe Insight, Scribe Integrate, Scribe Publisher, and Scribe Workbench are all trademarks of Scribe Software Corporation. All other names are recognized as trademarks, registered trademarks, service marks, or registered service marks of their respective owners. The names of companies, products, people, characters, or data mentioned herein are fictitious and are in no way intended to represent any real individual, company, product, or event, unless otherwise noted. Scribe Software Corporation may have patents, patent applications, trademarks, copyrights, or other intellectual property rights covering subject matter in this document. Except as expressly provided in any written license agreement from Scribe Software Corporation, the furnishing of this document does not give you any license to these patents, trademarks, copyrights, or other intellectual property. Licensing The Unicode Text Adapter uses the LumenWorks.Framework.IO.Csv Fast CSV Reader which is distributed under the MIT License. Disclaimer Scribe Software Corporation makes no representations or warranties with respect to the adapter or the contents or use of this document, and disclaims any express or implied merchantability or fitness for any specific purpose. Information in this document is subject to change without notice. Scribe Software Corporation reserves the right to revise this document or to change its contents at any time without obligation to notify any individual or corporation about the changes. Regardless of whether any remedy set forth herein fails of its essential purpose, in no event will Scribe Software Corporation be liable to you for any special, consequential, indirect or similar damages, including any lost profits or lost data arising out of the use or inability to use the software or documentation, even if Scribe has been advised of the possibility of such damages. Some states do not allow the limitation or exclusion of liability for incidental or consequential damages so the above limitation or exclusion may not apply to you.
  • 3. Table Of Contents Overview 1 Concepts 1 About The Tutorials 2 Requirements 3 Other Prerequisites 5 Tutorial 1: Migrating Account Information 11 Objectives 11 One: Create Connections 12 Two: Configure The Source 16 Three: Configure The Target Steps 19 Four: Create Data Links Between Source And Target Fields 23 Five: Add A Function 26 Six: Create A Lookup Link 29 Seven: Test The Data 31 Eight: Run The Job 34 Nine: Find Errors 37 Ten: Correct And Test The DTS 38 Eleven: Re-Run The Job 39 Tutorial 2: Creating A DTS File With Multiple Steps 40 Overview Of This Tutorial 40 One: Configure The Source 40 Two: Configure Target 41 Three: Add A Pre-Operation Step Flow Control Formula 42 Four: Create A Lookup Link 43 Five: Create Data Links And Formulas 44 Six: Check The Automatic Foreign Key Assignment 47
  • 4. Seven: Test The Job 49 Eight: Run The Job 52 Nine: Review The Data 53 What's Next 54 Tutorial 3: Creating An Integration In Scribe Console 55 Objectives 55 One: Create A Collaboration 56 Two: Add An Integration Process 58 Three: Prepare The Scribe Sample Database 63 Four: Check The DTS File 64 Five: Run The Integration 70 Tutorial 4: More Console Techniques 72 Objectives 72 One: Introduce An Error In The DTS File 73 Two: Create The Rejected Rows Table 75 Three: Create A Data View 77 Four: Create A Monitor 81 Five: Run The Integration Process 86 Six: Check The RR_ACCOUNTS Table 89 Seven: Check The Monitor 90 What's Next? 92
  • 5. Overview Try the tutorials included in this guide to get started using Scribe Insight. Concepts Some concepts these tutorials introduce are: l Data translation specification (DTS) — A file created in Scribe Workbench that stores the information required to migrate or integrate data between source and target data stores. This file consists of: o Source and target data stores. o Data processing logic to use when the DTS file is run. o Formulas to link source fields to target fields, set constant values in target fields, or define matching criteria. o Formulas used to convert, parse, or import selected source fields. l Integration Process (IP) — Detects an event and runs a DTS to modify and integrate your data. Events that an IP detects include: o A message being written into a queue. o The results of a SQL query. o A file being saved in a folder. o A specific time. l Collaboration — A set of Integration Processes, related files, and reports that enables you to organize IPs into meaningful abstractions of business processes. You use Scribe Workbench to create DTS files and use Scribe Console to include those DTS files in an IP and Collaborations. Page 1
  • 6. Page 2 Overview About The Tutorials The tutorials involve migrating account, address, and contact data from a source text file to a target SQL Server database, creating a collaboration that allows you to run the DTS file automatically, and monitoring the run for errors. The first two tutorials highlight the major features of the Scribe Workbench and show you how to: l Create database connections l Configure a source and target l Link source and target data fields l Set up insert and update steps to control the data migration l Save your work as a Data Translation Specification (DTS) file l Test and run the DTS l Identify and fix errors using the Transaction Error Report The second two tutorials introduce you to some of the features of the Scribe Console and show you how to: l Create a collaboration l Create and run an Integration Process within the collaboration l Add a rejected rows table to your DTS file l Add a data view for the rejected rows table l Add a monitor and an alert l Review alert details Throughout this guide, there are references to advanced functionality that is not included in the tutorials. For more information, see the Scribe Insight online help. l The figures in this tutorial are based on Insight 7.9.1. If you are using an earlier version of Insight, your Insight interface may differ from these figures. However, most of the instructions are the same. l You can download the latest version of the tutorial from https://openmind.scribesoft.com/download/ScribeInsightTutorial. The revision date is on the title page. About The Tutorials
  • 7. Requirements Overview Page 3 Requirements l Scribe Insight 7.9.1 or higher — The Scribe Insight Workbench must be installed before you use these tutorials. For information about installing Scribe Insight, see the Scribe Insight Installation Guide, which you can download from https://openmind.scribesoft.com/html/insight_download. l Scribe Sample Text — When you install Scribe Insight, the sample text files are installed in C:UsersPublicDocumentsScribeSamplesTextdata and the Scribe Sample Text ODBC Data Source Name is created. l Scribe Sample SQL Server database — When you install Insight, you have the option to install the SCRIBESAMPLE SQL Server database and to create the corresponding ODBC DSN Scribe Sample. If this sample database is not installed, the ODBC DSN is not displayed. Before you start the tutorial, you must install the sample database. Install The Scribe Sample SQL Server Database 1. Navigate to the Scribe program folder. By default, this is C:Program Files (x86)Scribe. 2. Double-click the InternalDB application. The Scribe Internal Database Maintenance Utility opens. Figure 1: Scribe Internal Database Maintenance Utility
  • 8. Page 4 Overview 3. Click the Sample Database tab. Figure 2: Scribe Internal Database Maintenance Utility 4. Click Install Sample Database and follow the prompts to create the database. If you previously ran the tutorial and put data into the sample database, click Refresh Sample Data to delete that data and restore the sample database to its original, empty state. Requirements
  • 9. Other Prerequisites Overview Page 5 Other Prerequisites After you install Scribe Insight and make sure the sample database is available, there are still a couple of tasks you need to do before beginning Tutorial 3: l Verify your Console is configured correctly — Follow the directions on configuring Scribe Insight in the Scribe Insight Installation Guide. File security settings are located under Administration, on the File Management tab of the Security pane. When configuring the security settings, ensure at a minimum Scribe has access to the following folders: o C:Program Files (x86)ScribeUtilities o C:UsersPublicDocumentsScribe o C:UsersPublicDocumentsScribeCollaborations o C:UsersPublicDocumentsScribeSamples o C:UsersPublicDocumentsScribeTemplates o C:UsersPublicDocumentsScribeTracing Figure 3: Console Security Settings
  • 10. Page 6 Overview l Verify Accounts.dts is available — To use tutorials 3 and 4, the Accounts.dts file created in Tutorial 1: Migrating Account Information, must be available. o Scribe Insight 6.5.2 or earlier — Follow the instructions in Tutorial 1 to create Accounts.dts. o Scribe Insight 7.0 or later — Either follow the instructions in Tutorial 1 to create Accounts.dts, or use the Accounts.dts file installed in C:UsersPublicDocumentsScribeSamplesTutorials. If you use this file, you must reconnect the Scribe Sample database. Reconnect To The Scribe Sample Database 1. Open Scribe Workbench. 2. Click File > Open. The Select DTS dialog box appears. 3. Browse to the C:UsersPublicDocumentsScribeSamplesTutorials folder. Figure 4: Select DTS Dialog Box 4. Double-click Accounts.dts. The Resolve Connection dialog box appears. Connections are saved in the Internal Database. You are prompted to specify either a name to use for the new connection in the database, or an existing connection with similar properties you want the DTS to use. Other Prerequisites
  • 11. Other Prerequisites Overview Page 7 5. Click OK when you are prompted to resolve the following connections: l Scribe Sample l Scribe Sample Text Figure 5: Resolve Connection The Connection Manager displays. Scribe Sample displays in red italics, which indicates it is currently disconnected: Figure 6: Connection Manager With Disconnected Datastore 6. In the Connection Manager, select Scribe Sample and click Edit. The Connection Settings dialog appears.
  • 12. Page 8 Overview 7. Click the Global Connection Settings tab. Figure 7: Connection Tab 8. Click Connect. The Scribe Sample Connection Information dialog box appears. Figure 8: Sample Connection Information Other Prerequisites
  • 13. Other Prerequisites Overview Page 9 9. Expand ODBC Data Sources, select Scribe Sample, and click OK. The SQL Server Login dialog box appears. Figure 9: Sample Connection Information 10. In Password, enter the default Scribe password, integr8!, and click OK. You return to the Connection Settings dialog box. 11. Click OK again. You return to the Connection Manager. The Scribe Sample connection now displays in black, which indicates you have successfully reconnected. The Accounts.dts file is ready to use. Figure 10: Connection Manager With Reconnected Data Source
  • 14. Page 10 Overview 12. Click Close. If any other DTS files on your system use the Scribe Sample connection, you are prompted to update those connections in other DTS files. Click No. Figure 11: Globally Update The Connection The Scribe Sample connection you defined is saved and only the Account DTS is updated to use that connection. 13. Select File > Save. 14. If you are prompted to create a backup file, click No. 15. Select File > Exit. Other Prerequisites
  • 15. Objectives Tutorial 1: Migrating Account Information Page 11 Tutorial 1: Migrating Account Information Objectives Use this tutorial to learn how to: l Connect to a source text file and a target SQL Server database. l Map fields from the source to the target. l Save your setup as a DTS file. l Run the DTS to verify the migration is working. l Review and fix any errors. l Rerun the DTS to verify your fix worked. Start The Tutorial To get started with this tutorial, on your desktop, double-click the Scribe Workbench icon. Figure 12: Scribe Workbench Desktop Icon The Scribe Workbench main window displays. Figure 13: Scribe Workbench Main Window
  • 16. Page 12 Tutorial 1: Migrating Account Information One: Create Connections The first step in creating a DTS file is to specify your connection. For any data integration or migration, you need a source that contains the data you want to move, and one or more targets where you want to move that data. If you decide later you need more connections, you can add them at any time. For this tutorial, you create two connections: one for the source and one for the target. Select Your Connections For This Tutorial 1. In the Scribe Workbench main window, select View > Connections. The Connection Manager dialog box displays. Figure 14: Connection Manager 2. Click the New button. The Add a Connection dialog box opens. Figure 15: Add A Connection One: Create Connections
  • 17. One: Create Connections Tutorial 1: Migrating Account Information Page 13 3. Click the plus sign ( + ) next to ODBC Data Sources to expand the tree, then select Scribe Sample Text. This is the source connection in the DTS file. Figure 16: Select Scribe Sample Text The list of data sources included in this tree depends on your specific environment. For example, in your working environment you may have only a few ODBC data sources displayed. 4. Click OK. The Connection Settings dialog box displays with the Connection name filled in on the Global Connection Settings tab. Because you already have a Scribe Sample Text connection, this connection is given a unique name, Scribe Sample Text (1). Figure 17: Connection Settings Property Sheet
  • 18. Page 14 Tutorial 1: Migrating Account Information 5. Click OK to save the connection and close the dialog box. 6. In the Connection Manager, click New again. 7. Expand ODBC Data Sources, then select Scribe Sample. This is the target connection in your DTS file. Figure 18: Select Scribe Sample 8. Click OK. The SQL Server login dialog box appears. Figure 19: SQL Server Login One: Create Connections
  • 19. One: Create Connections Tutorial 1: Migrating Account Information Page 15 9. In Login ID, enter SCRIBE. 10. In Password, enter integr8!. 11. Click OK. You return to the Connection Settings dialog box. 12. Click OK to close the Connection Settings dialog box. The Connection Manager displays with the Scribe Sample (1) and Scribe Sample Text (1) connections. Figure 20: Connection Manager 13. Click Close to close the Connection Manager.
  • 20. Page 16 Tutorial 1: Migrating Account Information Two: Configure The Source The next step is to configure your source connection. In this tutorial, you select Scribe Sample Text as the source and select a single table. A source is a data set or a group of rows and columns. The data set can be any of the following: l A single table l Multiple tables joined by a SQL query l A single text file l A result returned from a SQL query l Results returned from a stored procedure l An adapter object or related adapter objects Configure The Source 1. From the Scribe Workbench main window, click Configure Source. The Configure Source dialog box appears. Figure 21: Configure Source Button 2. Click the Connections drop-down and select Scribe Sample Text. Figure 22: Select Scribe Sample Text Two: Configure The Source
  • 21. Two: Configure The Source Tutorial 1: Migrating Account Information Page 17 3. In the Configure Source Data Objects Explorer, expand Tables, then select Leads. Figure 23: Select The Source — Leads Table After you select the Scribe Sample Text connection, the Custom Query option displays in this dialog box. While the tutorial uses only a single table as the source data object, a powerful feature of Scribe Workbench is the variety of sources you can configure through the Custom Query option.
  • 22. Page 18 Tutorial 1: Migrating Account Information 4. Click OK. The list of source data fields displays in the source pane. Figure 24: Leads Fields In Scribe Workbench Source Pane Each field has an associated reference number shown in the Ref column. Use this reference number when you work with functions and formulas. Figure 25: Source Reference Values Two: Configure The Source
  • 23. Three: Configure The Target Steps Tutorial 1: Migrating Account Information Page 19 Three: Configure The Target Steps Next, you configure the target — the location where you integrate the data. Although you can select multiple targets, you select only one target in this tutorial. A step is an operation performed on a target data object. Each target can have multiple steps. To define a step, in the Configure Steps dialog box, select a target data object and an operation to perform on that target. Steps are performed in the in the order they are listed: l Steps can be performed on tables, views, stored procedures, XML objects and adapter objects. l The default step operation is Insert, which inserts the source record into the target. You can specify a different operation. l Steps are performed once for each source row. This tutorial integrates data from the Leads table in the source to the Accounts table in the target SQL database. Configure The Target Steps 1. From the Scribe Workbench main window, click Configure Steps. The Configure Steps dialog box displays. Figure 26: Configure Steps Button 2. Click the Data Objects tab. Figure 27: Add Button On The Data Objects Tab
  • 24. Page 20 Tutorial 1: Migrating Account Information 3. Click the Add button. The Add Target Connection dialog box appears. Figure 28: Select Scribe Sample 4. Click the Connection drop-down and select Scribe Sample. This is the target connection you will use for this tutorial. 5. Click OK. You return to the Configure Steps dialog box with the Scribe Sample target data objects displayed. 6. On the Data Objects tab, expand Tables and select the ACCOUNT table. Figure 29: Select A Target — Account Table Three: Configure The Target Steps
  • 25. Three: Configure The Target Steps Tutorial 1: Migrating Account Information Page 21 7. At the bottom of the pane, select Update/Insert from the Operation drop-down list. Figure 30: Update/Insert Operation Make sure you select Update/Insert, not Insert/Update. The difference is explained in the Insight Help. 8. Click Add Update/Insert Step. Figure 31: Add Update/Insert Step
  • 26. Page 22 Tutorial 1: Migrating Account Information 9. Click Close. You return to the Scribe Workbench main window, where the fields from the ACCOUNT table of the Scribe Sample database appear in the target pane. Figure 32: Field Information From Source And Target Three: Configure The Target Steps
  • 27. Four: Create Data Links Between Source And Target Fields Tutorial 1: Migrating Account Information Page 23 Four: Create Data Links Between Source And Target Fields At this point, you need to create data links to map source fields to target fields. Data links enable you to set values on target fields. One or more source fields can be linked to one or more fields in the target. For example, you may want to link a CONTACT_NAME field that contains a full name in the source to both ContactFirstName and ContactLastName in the target. The Data Link button enables you to map source fields to target fields. The button is in the center of the Scribe Workbench window, between the two panes. Figure 33: Data Link Button A check mark indicates the fields are linked. Figure 34: Data Link
  • 28. Page 24 Tutorial 1: Migrating Account Information Create Data Links 1. In the Scribe Workbench, from the source field list, select the UNIQUE_ID field. This field has the Source Reference number S1. 2. In the target field list, select the XREF field. 3. Click the Data Link button. A check mark indicates the fields are linked. Figure 35: Data Link 4. Create the following additional Data Links: l PHONE to PHONE l BUSINESS_NAME to ACCOUNTNAME Four: Create Data Links Between Source And Target Fields
  • 29. Four: Create Data Links Between Source And Target Fields Tutorial 1: Migrating Account Information Page 25 5. Click the Data Formulas tab at the bottom of the Scribe Workbench main window to see the data formulas you have created. Figure 36: Data Formulas Tab Showing Data Links If you do not see the Data Formulas tab, click the Expand Links Pane button, located below the Formula button. Figure 37: Expand Links Pane Button
  • 30. Page 26 Tutorial 1: Migrating Account Information Five: Add A Function Next, you need to ensure the required field, ACCOUNTID, is always unique. Insight indicates a required field by bolding and underlining the field name in the target pane. In this tutorial, ACCOUNTID is a required field in the target and you must make sure it is set to a unique number whenever you insert a new record. To do this, you use the Formula Browser to add the GUID function, which generates a unique number to the ACCOUNTID field in the target. When you connect with a Scribe adapter, unique IDs are usually generated by the adapter or the applications API. The Formula Button enables you to access the Function Browser. Figure 38: Formula Button The Function Browser contains over 180 functions you can use to create simple and complex formulas, which can use multiple functions and logical IF statements. If you create your own formula, you can save it and use it in future DTS files. Formulas you save are included in the Function Browser under the category User Defined Formulas. Five: Add A Function
  • 31. Five: Add A Function Tutorial 1: Migrating Account Information Page 27 Add A Function 1. In the Scribe Workbench, in the Target pane, select ACCOUNTID. 2. Click the Formula button in the center of the Scribe Workbench window. The Edit Formula window displays. Figure 39: Edit Formula 3. In the Function Browser, expand Functions by Category, expand System Functions, then double-click GUID. The GUID function is added to the Formula Editor.
  • 32. Page 28 Tutorial 1: Migrating Account Information 4. Click OK to close the Edit Formula window and return to the Scribe Workbench main window. The GUID() formula now displays in the Formula column of the ACCOUNTID row of the Data Formulas tab. Figure 40: GUID Formula Added To Data Formulas Tab By default, an Update overwrites data every time you run a job. Since you are working with an Update/Insert step, you want to ensure the ACCOUNTID is assigned only when a new account is inserted, not when an existing account is updated. To do this, you must change the overwrite status of the ACCOUNTID field. The overwrite status is displayed in the Overwrite column of the Data Formulas tab. Figure 41: Overwrite Column In Data Formulas Tab 5. On the Data Formulas tab, in the ACCOUNTID row, double-click the asterisk (*). The asterisk disappears, indicating the overwrite feature has been turned off. When overwrite is turned off: l When a target record is inserted — The data link is used and a new GUID value is generated for ACCOUNTID. l When a target record is updated — The data link is not used and the existing GUID value for ACCOUNTID is not changed. l In general, you want to turn off Overwrite for any field you use as a lookup link when you are performing an Update/Insert or an Insert/Update step. l In Insight 7.0.0 and later, the name of the target connection is included in the step name, providing more information in multitarget DTS files. Five: Add A Function
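The effect of turning Overwrite off can be pictured with a small Python sketch. This is only an analogy (uuid4 stands in for the GUID function, and the record is a plain dictionary), not how Insight stores the setting.
import uuid

def assign_account_id(existing_record, overwrite=False):
    # Insert: there is no existing record, so the data link runs and a new ID is generated.
    if existing_record is None:
        return str(uuid.uuid4())
    # Update with Overwrite on: the existing ID would be replaced (not what you want here).
    if overwrite:
        return str(uuid.uuid4())
    # Update with Overwrite off: the existing ID is left untouched.
    return existing_record["ACCOUNTID"]

print(assign_account_id(None))                                  # new GUID on insert
print(assign_account_id({"ACCOUNTID": "existing-guid-value"}))  # unchanged on update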
  • 33. Six: Create A Lookup Link Tutorial 1: Migrating Account Information Page 29 Six: Create A Lookup Link A lookup link enables you to define the match criteria for seek, update, or delete steps. In this tutorial, you create a lookup link between the UNIQUE_ID field in the source and the XREF field in the target. The Update/Insert step you created uses this link to locate the account to be updated. Because there is already a data link between these fields, the UNIQUE_ID value from the source is inserted into the XREF field in the target. If you run the DTS more than once, the lookup link between these fields uses the value stored in XREF to identify the target records to update. You use the Lookup Link button in the middle of the window to create a lookup link. Figure 42: Lookup Link Button Symbols in the source and target panes indicate links and indexes: l Checkmark (target and source panes) — The field is linked. l Checkmark inside a square (target pane) — The field is part of a lookup link. A graphical display of the link appears in the Links tab. l Star (target pane) — The field is part of an index. l Star inside a circle (target pane) — The field is part of a unique index. Figure 43: Linked Source And Target Fields
  • 34. Page 30 Tutorial 1: Migrating Account Information Create A Lookup Link 1. In the source pane, select the UNIQUE_ID field. 2. In the target pane, select the XREF field. 3. Click the Lookup Link button. 4. Click the Links tab at the bottom of the Scribe Workbench main window. 5. Select Show links to Source field. A graphic representation of the lookup link displays on the tab. Figure 44: Link Representation Six: Create A Lookup Link
  • 35. Seven: Test The Data Tutorial 1: Migrating Account Information Page 31 Seven: Test The Data Testing data enables you to preview the results of your job without writing any rows to the target connection. In the Test window, you can: l Step through each row that will be written to the target connection l Browse the source data l Verify links and formula results The Test job button is on the toolbar on the top left of the Scribe Workbench window. Figure 45: Test Job Button Test Your Data 1. On the Scribe Workbench toolbar, click the Test Job button. The Test window appears, showing the source field names and values, as well as data links, lookup links, and step results for each record. Figure 46: Test Window
  • 36. Page 32 Tutorial 1: Migrating Account Information 2. Click Next to scroll through each of the source rows. Notice the business names on the Data Links tab are all uppercase. This would be unattractive when you print addresses. 3. Click Close to close the Test window. Before you actually run the job, let’s change the business names to mixed case. To Change A Field Property 1. In the Scribe Workbench main window, click the Data Formulas tab. 2. Double-click the ACCOUNTNAME row. The Edit Formula window opens. 3. In the Formula Editor field, highlight S3. 4. In the Function Browser, expand Functions by Category, expand Text, and double-click the PROPER function. In the Formula Editor field, PROPER(S3) appears. 5. Click OK to close the Edit Formula window. 6. Verify PROPER(S3) appears in the Formula column next to ACCOUNTNAME. Figure 47: Data Formulas Pane Showing Proper Function Seven: Test The Data
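As a rough illustration of what the PROPER function does to the uppercase business names, the Python sketch below uses str.title() as a stand-in; it approximates PROPER's behavior for simple names but is not the same implementation.
def proper(value):
    # Rough stand-in for PROPER(): convert an all-uppercase name to mixed case.
    return value.title() if value else value

print(proper("GEORGE TEL"))    # George Tel
print(proper("ACME WIDGETS"))  # Acme Widgets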
  • 37. Seven: Test The Data Tutorial 1: Migrating Account Information Page 33 7. Save the DTS as Accounts.dts in the \Users\Public\Public Documents\Scribe\Samples\Tutorials directory. If Accounts.dts already exists, you can replace it. 8. Click the Test Job button. The Test window appears. 9. Click Next to scroll through each source row and verify the business names on the Data Links tab are title case, for example, George Tel, not GEORGE TEL. Figure 48: Test Window With Modified AccountName Values 10. Click Close.
  • 38. Page 34 Tutorial 1: Migrating Account Information Eight: Run The Job Now, try running the job. When your job runs, it updates your target data. Run Your Job 1. In the Scribe Workbench main window, click the Run button. Figure 49: Run Job Button The Run Complete window appears, showing 15 successful inserts and 1 failed insert. You need to determine why one row failed. Figure 50: Run Complete Window Eight: Run The Job
  • 39. Eight: Run The Job Tutorial 1: Migrating Account Information Page 35 2. Click the Transaction Errors button. The Execution Log Viewer appears, summarizing the errors. Figure 51: Execution Log Viewer You need to view the row information in a more readable and printable format.
  • 40. Page 36 Tutorial 1: Migrating Account Information 3. Click Transaction Errors. The Transaction Errors report appears. Figure 52: Transaction Errors Report You can see the failure occurred on row 14 and was caused by a blank account name. Figure 53: Errors In Transaction Errors Report 4. Close the report. 5. Close the Execution Log Viewer. 6. Close the Run Complete window. Eight: Run The Job
  • 41. Nine: Find Errors Tutorial 1: Migrating Account Information Page 37 Nine: Find Errors You know from the Transaction Errors report the error is on row 14. Now you need to test the job again to find the exact record where the error occurs. See The Errors 1. In the Scribe Workbench, click the Test Job button. The Test window appears. 2. Click Next until source row 14 displays. 3. On the Data Links tab, locate the value of ACCOUNTNAME. Figure 54: Test Window Showing ACCOUNTNAME Error The value of #NULL! is causing the error. 4. Close the Test window.
  • 42. Page 38 Tutorial 1: Migrating Account Information Ten: Correct And Test The DTS When you ran the job, you discovered the job cannot insert a null value into the account name. One solution is to add error checking and error handling to the ACCOUNTNAME field. This is easily done through the Edit Formula window. Correct A Formula 1. In the Scribe Workbench, in the Target pane, select the ACCOUNTNAME row and click the Formula button. The Formula Editor appears. 2. In the Edit Formula pane, where PROPER(S3) currently displays, enter the following formula: IF(ISERROR(S3),"Unknown"&S1,PROPER(S3)) This formula looks at the BUSINESS_NAME (Ref S3) field in each row: l If the field is null, the value is changed from null to "Unknown" with the UNIQUE_ID from the source appended to it, and that value is inserted into ACCOUNTNAME. l If an account name is not provided in the source data, you can use the UNIQUE_ID to relate the missing data back to the source data. 3. Click OK. You return to the Scribe Workbench. The updated formula appears on the Data Formulas tab. Figure 55: Data Formulas Pane Showing The Corrected Formula 4. Save the Accounts DTS file. 5. Click the Test Job button. Verify your mappings, formulas, and functions are returning the expected values. For example, make sure the target ACCOUNTNAME is correct. For record 14, the value in the ACCOUNTNAME field should be Unknown114. 6. Click the Close button. Ten: Correct And Test The DTS
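If it helps to see the logic of the corrected formula outside the Workbench, here is a minimal Python sketch of IF(ISERROR(S3),"Unknown"&S1,PROPER(S3)); None stands in for a null (#NULL!) source value, and str.title() approximates PROPER.
def account_name(s3_business_name, s1_unique_id):
    # Null business name: substitute a placeholder that can be traced back to the source row.
    if s3_business_name is None:
        return "Unknown" + str(s1_unique_id)
    # Otherwise, convert the uppercase business name to mixed case.
    return s3_business_name.title()

print(account_name("GEORGE TEL", 101))  # George Tel
print(account_name(None, 114))          # Unknown114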
  • 43. Eleven: Re-Run The Job Tutorial 1: Migrating Account Information Page 39 Eleven: Re-Run The Job Now that you have changed the DTS file to check for and fix null business names, run the job again. Re-Run The Job 1. In the Scribe Workbench main window, click the Run Job button. 2. Verify all rows succeeded and no rows failed. The Run Complete window shows: l 1 insert was performed on the corrected row. The ACCOUNTNAME value for this row reads Unknown114. l 15 updates were performed. Because of the lookup link between UNIQUE_ID and XREF, 15 rows were updated rather than inserted again. Figure 56: Run Complete Showing Successful Run 3. Click Close. This completes the first tutorial. You can proceed to the second tutorial in the next section to learn more about the Scribe Workbench. Tutorial 2 assumes Tutorial 1 has correctly populated the Scribe Sample database. Without this data, Tutorial 2 does not run correctly.
  • 44. Tutorial 2: Creating A DTS File With Multiple Steps This tutorial requires you to have completed Tutorial 1, which introduced the Scribe Workbench and some of the concepts discussed here, and which populated the Scribe Sample database with the data you use in Tutorial 2. Overview Of This Tutorial This tutorial introduces you to these concepts: l Creating a multiple-step job l Using the update operation and lookup links l Using the SKIPSTEP function If the Accounts DTS from the first tutorial is still open: 1. In the Scribe Workbench, save the DTS. 2. Click File > New. To get started with this tutorial, you create a new DTS. One: Configure The Source Tutorial 2 assumes Tutorial 1 has correctly populated the Scribe Sample database. Without this data, Tutorial 2 does not run correctly. As in the first tutorial, your first step is to add connections for the source. Configure The Source 1. In the Scribe Workbench, click Configure Source. 2. Select Scribe Sample Text as the source connection. 3. In the database tree, expand All Data Objects, expand Tables, and select Leads. 4. Click OK to close the Configure Source dialog box. Page 40
  • 45. Page 41 Tutorial 2: Creating A DTS File with Multiple Steps Two: Configure Target Next, you configure steps for the target. Configure The Target 1. In the Scribe Workbench, click Configure Steps. 2. Click the Data Objects tab. 3. Click Add. 4. In Connection, select Scribe Sample and click OK. 5. Add the following steps: l Select the ACCOUNT table and add a Seek step. A Seek step uses lookup links to find rows in the target connection. l Select the ADDRESS table and add an Insert step. l Select the CONTACT table and add an Insert step. l Select the CONTACT table again and add another Insert step. The source data has a contact and an alternate contact, so you need two CONTACT Insert steps: one to create a target contact record for the primary contact, and one for the alternate contact when it exists. Figure 57: Configuring Multiple Target Steps 6. Click Close to close the Configure Steps window.
  • 46. Three: Add A Pre-Operation Step Flow Control Formula Tutorial 2: Creating A DTS File with Multiple Steps Page 42 Three: Add A Pre-Operation Step Flow Control Formula If you look at the fields in the Source pane, notice there are both a CONTACT_NAME field (Ref S2) and an ALT_CONTACT field (Ref S16). Sometimes the ALT_CONTACT field is null. If there is a row for which the ALT_CONTACT field is null, you want to skip the CONTACT Insert step associated with ALT_CONTACT. To do this, you must add a pre-operation step flow control formula on the second CONTACT step. Add A Step Flow Control Formula 1. In the Scribe Workbench, click Configure Steps. 2. Click the Flow Control tab. 3. In the Step Order pane, select Step 4 — Scribe Sample.CONTACT(2) Insert. 4. In the Pre-Operation Step Flow Control Formula text box, enter the following formula: IF(ISERROR(S16), SKIPSTEP(), TRUE()) This formula checks to see if there is a null value in field S16, ALT_CONTACT: l If ALT_CONTACT is null, the SKIPSTEP function skips the step. l If ALT_CONTACT is not null, the step is executed and the value in ALT_CONTACT is inserted into the target. Figure 58: Add A Pre-Operation Step Flow Control Formula 5. Click Close to close the Configure Steps window.
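The following Python sketch shows the effect of the pre-operation formula on each source row. It is an analogy only, with None standing in for a null ALT_CONTACT value, not the way Insight evaluates SKIPSTEP internally.
def run_alt_contact_step(s16_alt_contact):
    # Mirrors IF(ISERROR(S16), SKIPSTEP(), TRUE()):
    # skip the second CONTACT Insert step when ALT_CONTACT is null.
    return s16_alt_contact is not None

for row in [{"ALT_CONTACT": "Scott Berman"}, {"ALT_CONTACT": None}]:
    if run_alt_contact_step(row["ALT_CONTACT"]):
        print("insert alternate contact:", row["ALT_CONTACT"])
    else:
        print("step skipped")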
  • 47. Page 43 Tutorial 2: Creating A DTS File with Multiple Steps Four: Create A Lookup Link Now you want to create a lookup link. The lookup link on the ACCOUNT Seek step allows you to find a matching account in the target before inserting new contacts. For DTS files with multiple steps, you can use the drop-down list above the target pane to display only the fields from an individual step: Figure 59: Step Selection Menu For Insight 7.0.0 and later, the name of the target connection is included in the step name. The first link you want to create is a lookup link on the ACCOUNT Seek step. Create A Lookup Link 1. From the Configure Steps drop-down list, select Scribe Sample.ACCOUNT Seek. 2. Create a lookup link between UNIQUE_ID in the source and XREF in the target. Four: Create A Lookup Link
  • 48. Five: Create Data Links And Formulas Tutorial 2: Creating A DTS File with Multiple Steps Page 44 Five: Create Data Links And Formulas Now, you create data links and formulas to process the data before it is inserted. Create The Needed Data Links And Formulas 1. From the Configure Steps drop-down list, select Scribe Sample.ADDRESS Insert. 2. Create the following data links from the source to the target: ADDRESS to ADDRESSLINE1, CITY to CITY, STATE to STATE, ZIP_CODE to ZIP, and COUNTRY to COUNTRYCODE. 3. From the Target pane, select each field listed below and add the specified formula: l ADDRESSID — To ensure the address for this company is unique, click the Formula button and add the GUID() function. l ADDRESSLINE1, CITY — To format the address and city names in mixed case, add the PROPER function to these fields. l COUNTRYCODE — Ensure you have the correct country code by adding the following formula to this field: IF(ISERROR(S8), "US", DBLOOKUP(S8, "Scribe Internal Database", "COUNTRY", "COUNTRYNAME", "COUNTRYCODE")) If the source country (S8) is null, the country code is set to the default value of "US". Otherwise, this formula performs a database lookup to find the COUNTRYCODE value. In SQL, the formula would be similar to: SELECT COUNTRYCODE FROM COUNTRY WHERE COUNTRYNAME = (value from S8)
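To picture what the COUNTRYCODE formula does, the sketch below models the lookup in Python; the small dictionary stands in for the COUNTRY table in the Scribe Internal Database, and the country names in it are examples, not the table's actual contents.
# Example mapping standing in for the COUNTRY table (COUNTRYNAME -> COUNTRYCODE).
COUNTRY_CODES = {"United States": "US", "Canada": "CA", "United Kingdom": "GB"}

def country_code(s8_country_name):
    # Null country in the source: fall back to the default code.
    if s8_country_name is None:
        return "US"
    # Otherwise, look up the code by country name, as DBLOOKUP does against the COUNTRY table.
    return COUNTRY_CODES.get(s8_country_name)

print(country_code(None))      # US
print(country_code("Canada"))  # CA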
  • 49. Page 45 Tutorial 2: Creating A DTS File with Multiple Steps 4. Select CONTACT Insert from the Configure Steps drop-down list and create the following data links: CONTACT_NAME to CONTACTNAME, CONTACT_NAME to FIRSTNAME, and CONTACT_NAME to LASTNAME. As with the Address ID, you want to ensure the CONTACTID is unique. In addition, the CONTACT_NAME (Ref S2) field in the source needs to be split into two fields: FIRSTNAME and LASTNAME. However, to make sure the name is right, you also want to insert the whole contact name. 5. Create a formula to add the GUID() function to CONTACTID. 6. Add the PROPER() function to CONTACTNAME. For the FIRSTNAME and LASTNAME fields, you want to parse the contact name and change the case to mixed case. 7. For FIRSTNAME, add the following formula: PROPER(PARSENAME(S2, "F")) 8. For LASTNAME, add the following formula to specify an "L" rather than an "F": PROPER(PARSENAME(S2, "L")) 9. From the Configure Steps drop-down list, select CONTACT(2) Insert and configure the data links for the CONTACT(2) step the same way you configured the data links for the CONTACT step. For CONTACT(2), be sure to use the source field ALT_CONTACT (S16) in your data links and formulas. Five: Create Data Links And Formulas
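As a simple picture of how the FIRSTNAME and LASTNAME formulas split CONTACT_NAME, the sketch below uses a naive whitespace split in place of PARSENAME and str.title() in place of PROPER; real name parsing is more involved, so treat this only as an illustration.
def parse_name(full_name, part):
    # Naive stand-in for PARSENAME(S2, "F") and PARSENAME(S2, "L"):
    # first word = first name, last word = last name.
    words = full_name.split()
    return words[0] if part == "F" else words[-1]

contact = "JOHN THIBIDEAU"
print(parse_name(contact, "F").title())  # John
print(parse_name(contact, "L").title())  # Thibideau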
  • 50. Five: Create Data Links And Formulas Tutorial 2: Creating A DTS File with Multiple Steps Page 46 10. When you are done, set the Configure Steps drop-down list back to <View All Steps>. Your Data Formulas tab should look similar to the one below. Figure 60: Completed Target Configuration l In the Data Formulas tab, you can sort columns by double-clicking a column header. In this example, the Formula column has been sorted alphabetically. l To change the column that is sorted, click View > Sort > Data Links, then select the column name you want to sort. 11. Save the DTS as ContactsandAddresses.dts. If this file already exists in the ..\Scribe\Samples\Tutorials directory, you can overwrite it.
  • 51. Page 47 Tutorial 2: Creating A DTS File with Multiple Steps Six: Check The Automatic Foreign Key Assignment When relationships are defined between tables or objects, Insight automatically fills in values for the related fields. Because the ACCOUNT table and the CONTACT table are related by the ACCOUNTID field, Insight fills in the ACCOUNTID in the CONTACT table when a contact is inserted. Autolinks are indicated by a diamond icon and italic text. Figure 61: Automatic Link Indicators Six: Check The Automatic Foreign Key Assignment
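A short Python sketch may make the automatic foreign key assignment easier to picture: the ACCOUNTID found by the ACCOUNT Seek step is carried forward into each CONTACT row that is inserted. The dictionary-based data and field values are illustrative assumptions, not Insight's internal mechanism.
def insert_contact(accounts_by_xref, source_row, contact_name):
    # Step 1 (Seek): find the parent account whose XREF matches the source UNIQUE_ID.
    parent = accounts_by_xref.get(source_row["UNIQUE_ID"])
    if parent is None:
        return None  # no matching account, so there is nothing to attach the contact to
    # Later Insert steps: the parent's ACCOUNTID is copied into the new CONTACT row.
    return {"CONTACTNAME": contact_name, "ACCOUNTID": parent["ACCOUNTID"]}

accounts = {"101": {"ACCOUNTID": "A-101", "ACCOUNTNAME": "George Tel, Inc."}}
print(insert_contact(accounts, {"UNIQUE_ID": "101"}, "John Thibideau"))
print(insert_contact(accounts, {"UNIQUE_ID": "101"}, "Scott Berman"))
Both contacts end up with the same ACCOUNTID, which is what you verify in SQL Server Management Studio in the Review The Data step.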
  • 52. Six: Check The Automatic Foreign Key Assignment Tutorial 2: Creating A DTS File with Multiple Steps Page 48 Use Automatic Foreign Key Assignment 1. From the Configure Steps drop-down list, select CONTACT Insert. 2. Right-click ACCOUNTID and select Field Properties to display the Properties window. Figure 62: ACCOUNTID Properties Window Notice there is an Auto foreign key assignment on the ACCOUNTID field value in the CONTACT step inherited from Step 1, which is a Seek on ACCOUNT for this example. This is how Insight fills in the foreign key to a parent table. 3. Close the ACCOUNTID Properties window.
  • 53. Page 49 Tutorial 2: Creating A DTS File with Multiple Steps Seven: Test The Job Now you can test your job. Test The Job 1. Click Test Job. 2. Go to record 2 and click the Step Results tab. Figure 63: Record 2 Step Results Notice in the Source Value pane, the ALT_CONTACT field is null. Also, in the Step Results tab, Steps 2 and 3 are inserted, but Step 4 is grayed out and arrows indicate this Insert step is skipped. Seven: Test The Job
  • 54. Seven: Test The Job Tutorial 2: Creating A DTS File with Multiple Steps Page 50 3. Go to record 3, which does have an alternate contact, to see how the step results are different. Figure 64: Record 3 Step Results In this tutorial, the Jump To and Previous buttons are unavailable. While these functions are supported by many Scribe adapters, they are not supported by the ODBC text driver.
  • 55. Page 51 Tutorial 2: Creating A DTS File with Multiple Steps 4. Click the Lookup Links tab. Here you can see this DTS uses a single lookup link created during the ACCOUNT Seek step. Figure 65: Record 3 Lookup Links 5. When you are done, close the Test window. Seven: Test The Job
  • 56. Eight: Run The Job Tutorial 2: Creating A DTS File with Multiple Steps Page 52 Eight: Run The Job After you are satisfied the data is correct, you can run the job. Run The Job 1. Click Run > Run Job. 2. When the job finishes, the Run Complete window displays: Figure 66: Run Complete Window Notice the window shows 16 rows were processed, with 40 successful inserts and 8 skipped inserts. If your results show 16 successful, 16 failed, and 16 skipped records, you may not have run Tutorial 1 first or you may have refreshed the database before starting this tutorial. Complete Tutorial 1 before you run Tutorial 2 to get the correct results. 3. Close the Run Complete window. 4. In the Scribe Workbench, click Test Job, and view all the source rows. Notice there are 8 records for which the ALT_CONTACT value is #NULL!. This matches the number of skipped records.
  • 57. Page 53 Tutorial 2: Creating A DTS File with Multiple Steps Nine: Review The Data One way to check your data is to use Microsoft SQL Server Management Studio™. If you do not have Microsoft SQL Server Management Studio installed, you can download it from the microsoft.com site. Review The Data 1. Open Microsoft SQL Server Management Studio. 2. Open the SCRIBESAMPLE database, then select and open the CONTACT table. Notice there are 24 rows in the CONTACT table. There are 16 primary contacts and 8 alternate contacts. The CONTACT table only has a CONTACTNAME field and does not differentiate between the types of contacts. Therefore, your design requires you to create a new record for each contact, whether it is primary or alternate. In addition, as discussed earlier, the ACCOUNTID field has been inserted. 3. Look at the records for John Thibideau and Scott Berman. You’ll notice the CONTACTIDs are different, because you assigned the GUID function to CONTACTID in both the primary (CONTACT) and alternate (CONTACT(2)) steps, but the ACCOUNTID is the same, because both contacts are linked to the same account by the automatic foreign key assignment. 4. Going back to the Test window, you can see John Thibideau is the primary contact for George Tel, Inc., and Scott Berman is the alternate. Therefore, the ACCOUNTID has correctly been assigned to both contacts.
  • 58. What's Next Tutorial 2: Creating A DTS File with Multiple Steps Page 54 What's Next Congratulations! You successfully completed the Scribe Workbench tutorials! At this point, you should be ready to apply what you learned to your own data integration scenario. Use the Scribe Insight Help for reference as you create your own DTS files. The tutorials you just completed have provided an introduction to Scribe Workbench. If you want to learn about using Scribe Console, continue on to the next tutorials. At any point, you can re-run either tutorial. However, before you do, you need to refresh the sample database to ensure you get the correct results. Reset The Sample Database 1. Navigate to the Scribe program folder, C:\Program Files (x86)\Scribe. 2. Double-click the InternalDB.exe file to open the Scribe Internal Database Maintenance Utility. 3. Click the Sample Database tab. 4. Click Refresh Sample Data. The sample database is reset for the tutorials. 5. Click OK on the Refresh Complete dialog box. 6. Click OK to close the utility window. The sample database has been returned to its original state.
  • 59. Tutorial 3: Creating An Integration In Scribe Console This tutorial shows how to create a simple integration in the Scribe Console. The scenario for this tutorial is that every evening at 10 pm, your Scribe site receives a tab-delimited text file, which contains customer account data used to update the existing account data in a SQL Server database. Your job is to create an integration that runs every night at 10 pm. This integration must add new customer data and update existing customer data, without changing any other customer information. For this tutorial, the Accounts.dts file created in Tutorial 1 must be available. DTS (data translation specification) files are created in Scribe Workbench to store the translation settings for migrating or integrating data between source and target datastores. l Scribe Insight 6.5.2 or earlier — Follow the instructions in Tutorial 1: Migrating Account Information to create the Accounts.dts. l Scribe Insight 7.0 or later — Either follow the instructions in Tutorial 1: Migrating Account Information to create the Accounts.dts, or use the Accounts.dts file installed in C:\Users\Public\Documents\Scribe\Samples\Tutorials. If you use this file, you must reconnect the Scribe Sample database. Objectives In this tutorial, you: l Create a new collaboration l Add a time-based integration process l Check the DTS file from Scribe Console l Run the integration Page 55
  • 60. Page 56 Tutorial 3: Creating An Integration In Scribe Console One: Create A Collaboration 1. From the Start menu, open the Scribe Console. 2. In the Console tree, expand Scribe Console and expand the Local server: Figure 67: Collaborations Folder In The Console Tree 3. Select the Collaborations folder, right-click the folder, then select New Collaboration. The New Collaboration Wizard appears. 4. Click Next. The Collaboration Settings screen appears. 5. In the Collaboration Name text box, enter Account Integration. Right now, you only need to name the new collaboration before you create it. 6. Click Finish. The new Collaboration appears under the Collaborations root folder in the Console. Figure 68: Collaborations Folder Expanded One: Create A Collaboration
  • 61. One: Create A Collaboration Tutorial 3: Creating An Integration In Scribe Console Page 57 7. Use Windows Explorer to move Accounts.dts from its original location to the Collaborations folder: l The default folder is C:\Users\Public\Documents\Scribe\Samples\Tutorials l Move it to C:\Users\Public\Documents\Scribe\Collaborations\Account Integration Figure 69: Windows Explorer — Account Integration
  • 62. Page 58 Tutorial 3: Creating An Integration In Scribe Console Two: Add An Integration Process Now that you have created your Collaboration, the next step is to add a time-based Integration Process (IP). Add An Integration Process 1. In the Collaborations folder, expand Account Integration, and select Integration Processes. The Integration Processes pane displays. Figure 70: Integration Process Pane Two: Add An Integration Process
  • 63. Two: Add An Integration Process Tutorial 3: Creating An Integration In Scribe Console Page 59 2. Click Add. The Add New Integration Process Wizard appears. Figure 71: Integration Process Wizard — Step 1 You want to define a time-based event so the integration runs every evening at 10 pm. In Step 1, you begin creating the integration. 3. Under Process Event, click Time to specify this is a time-based event. 4. In Process name, enter Account Import. 5. Under Data Translation Specification, click Browse. The Select File dialog box appears.
  • 64. Page 60 Tutorial 3: Creating An Integration In Scribe Console 6. Select Accounts.dts and click OK. Figure 72: Step 1 — Select File You are not adding any DTS parameters. If you want to learn more about the steps in this process, more information is available in the Scribe Insight Help. 7. Click Step 2 — Pre/Post Processing Commands. You are not adding any pre- or post-processing commands. 8. Click Step 3 — Event Settings. The settings available in this step depend on the Process Event type you selected in Step 1. The settings that appear now apply to Time Integration Processes. 9. Under Time Event Settings, select Run DTS every day. Two: Add An Integration Process
  • 65. Two: Add An Integration Process Tutorial 3: Creating An Integration In Scribe Console Page 61 10. Under Starting, change the time to 10:00:00 PM. Figure 73: Step 3 — Event Settings 11. Click Step 4 — Activation. You do not want to activate this collaboration quite yet. 12. Under Status, select Paused. Figure 74: Step 4 — Activation There are no changes required for the alerts, so you can skip Step 5.
  • 66. Page 62 Tutorial 3: Creating An Integration In Scribe Console 13. Click Finish. Figure 75: Time-based Integration Process You have just created your first time-based IP! Two: Add An Integration Process
  • 67. Three: Prepare The Scribe Sample Database Tutorial 3: Creating An Integration In Scribe Console Page 63 Three: Prepare The Scribe Sample Database Before you start the first integration, you want to make sure the target, the Scribe Sample database, is empty so the integration can insert the records from the Scribe Sample Text datastore when it runs for the first time. To clear the database, you use the InternalDB.exe Scribe utility. Prepare The Scribe Sample Database 1. In C:\Program Files (x86)\Scribe, double-click InternalDB.exe. The Scribe Internal Database Maintenance Utility opens. 2. Click the Sample Database tab. Figure 76: Scribe Internal Database Maintenance Utility 3. Click Refresh Sample Data. Refreshing the sample data clears the contents of the Sample database. 4. When the Refresh complete! message appears, click OK, then click OK again to close the utility.
  • 68. Page 64 Tutorial 3: Creating An Integration In Scribe Console Four: Check The DTS File Before you run the integration, verify what the DTS file does. Remember a DTS file created in Scribe Insight contains the instructions, including data transformation rules, to perform an integration or migration. You can open and examine the DTS file either directly from Scribe Workbench or from within Scribe Console, which opens Workbench for you. If you created this DTS file following the instructions in Tutorial 1: Migrating Account Information, you already know what is in it, and you can skip this step. Examine The DTS File 1. From Scribe Console, expand Collaborations, expand Account Integration, and select File Browser. Figure 77: Collaborations File Browser Four: Check The DTS File
  • 69. Four: Check The DTS File Tutorial 3: Creating An Integration In Scribe Console Page 65 2. Double-click Accounts.dts in the browser pane. Scribe Workbench opens. Figure 78: Accounts.dts File As mentioned in the Prerequisites section, this DTS file integrates data from the Scribe Sample Text source to the Scribe Sample target. See Connection Manager With Disconnected Datastore. To see what this DTS file does, you want to examine the steps. Start by looking at the target steps.
  • 70. Page 66 Tutorial 3: Creating An Integration In Scribe Console 3. On the Insight main window, click Configure Steps. Figure 79: Configure Steps Button The Configure Steps dialog box appears. Figure 80: Configure Steps Four: Check The DTS File
  • 71. Four: Check The DTS File Tutorial 3: Creating An Integration In Scribe Console Page 67 4. In the Step Order pane, notice the DTS contains a single Update/Insert step on the Account object. Figure 81: Update/Insert Step In Configure Steps
  • 72. Page 68 Tutorial 3: Creating An Integration In Scribe Console 5. Close the Configure Steps dialog box. 6. Click the Data Formulas tab in the main window: Figure 82: Data Formulas Tab Whenever you have an update step, a lookup link is required. This allows Insight to look up the record it needs to update. For an Update/Insert operation, Insight looks up the record, updates it if it exists, or inserts it if it is not already in the target. On the Data Formulas tab, there is a link between source field 1 (S1), called UNIQUE_ID, and the XREF field in the target. Four: Check The DTS File
  • 73. Four: Check The DTS File Tutorial 3: Creating An Integration In Scribe Console Page 69 7. Scroll the target pane down to view XREF. There are two links between UNIQUE_ID and XREF: a data link and a lookup link. Figure 83: Lookup Criteria Tab 8. Close Scribe Workbench. 9. When you are prompted to save your changes to Accounts.dts, click Yes.
  • 74. Page 70 Tutorial 3: Creating An Integration In Scribe Console Five: Run The Integration When you created the Integration Process, you added a time-based event that runs every night. To test the integration, you don't have to wait until 10 pm tonight. This exercise shows you how to run an integration before the event criteria are met. Run Your New Integration 1. In Scribe Console, browse to Collaborations > Account Integration > Integration Processes. In the Integration Processes pane, you can see the status of the Account Import integration process is currently Paused. Figure 84: Paused Integration Process 2. Select the Account Import integration process and click Run Process. This forces the integration process to run now. You can also click Resume, which runs the paused integration, then queues the integration to run on schedule at 10 pm tonight. Figure 85: Queued Integration Process Five: Run The Integration
  • 75. Five: Run The Integration Tutorial 3: Creating An Integration In Scribe Console Page 71 3. Expand Administration and select Execution Log. The Execution Log Viewer opens, where you can verify your integration ran. 4. Click Refresh. 5. Verify the Result column shows Success, and the Source Rows column shows 16. This tells you 16 source rows were successfully inserted into your target. Figure 86: IP Result In Execution Log At this point, your integration is done. You can stop here, or continue to the next tutorial.
  • 76. Tutorial 4: More Console Techniques The previous tutorial provided an introduction to the Scribe Console and showed you how to create and run a simple integration. In this tutorial, you learn some other tools and techniques to help you use Scribe Console efficiently. Before beginning this tutorial, you must have completed Tutorial 3: Creating an Integration in Scribe Console. In this tutorial, you are going to change the DTS file to introduce an error, then create a method for handling errors in your real data. This tutorial walks you through the process of creating a monitor to track errors during the DTS run, moving the failed data into a rejected rows table, and creating a data view that allows you to see rejected rows. Objectives In this tutorial you: l Create a Rejected Rows table in the Scribe Internal database l Create and run a monitor l Create a data view l Check the data view, monitor, and alerts Page 72
  • 77. Page 73 Tutorial 4: More Console Techniques One: Introduce An Error In The DTS File Because the Integration Process you defined runs successfully, you cannot see how to process errors using Insight. To do this, you must break the Accounts.dts file you used in Tutorial 3. Change The DTS File 1. If the Scribe Console is not already open, open it. 2. Expand Collaborations, expand Account Integration, and select File Browser. 3. Double-click Accounts.dts to open the DTS in Scribe Workbench. 4. From the Scribe Workbench main window, click the Data Formulas tab and select the line that contains the formula: IF(ISERROR(S3),"Unknown"&S1,PROPER(S3)) If you remember, this formula inserts a placeholder value into the target when the BUSINESS_NAME field in the source is empty, and changes the name of the business from all capital letters to initial caps. Figure 87: Formula On Data Formula Tab One: Introduce An Error In The DTS File
  • 78. One: Introduce An Error In The DTS File Tutorial 4: More Console Techniques Page 74 5. Double-click the formula to open it in the Formula Editor. Figure 88: Formula Editor With Formula 6. Replace the entire formula with: S3 This deletes the formula but retains the data link between BUSINESS_NAME in the source and ACCOUNTNAME in the target. 7. Click OK, then save this DTS file.
  • 79. Page 75 Tutorial 4: More Console Techniques Two: Create The Rejected Rows Table Data rows flagged as having errors are called Rejected Rows, because the row contains data that causes Insight to reject an insert or update. If you want to track these rows, you need a place to record them. To see which rows have been rejected, you create a Rejected Rows table for them. In this tutorial, you create a table, RR_ACCOUNTS, in the Scribe Internal Database using the DTS Settings dialog box. The DTS Settings button displays the DTS Settings dialog box. Figure 89: DTS Settings Button Create A Rejected Rows Table 1. In the Scribe Workbench, click the DTS Settings button. The DTS Settings dialog box appears. 2. Click the Rejected Source Rows tab. 3. Select the Output Rejected Source Rows checkbox. The Scribe Internal Database is the default connection. For most purposes, you can use this database to store rejected row records. 4. Select Always append rejected rows to the same table. 5. In the Table field, enter RR_ACCOUNTS for the table name, then click Create Table Now. Figure 90: Rejected Source Rows Tab Two: Create The Rejected Rows Table
  • 80. Two: Create The Rejected Rows Table Tutorial 4: More Console Techniques Page 76 6. Click Yes to save your changes before creating the table. The Rejected row table created message appears. 7. Click OK in the message. 8. Click OK again to close the DTS Settings dialog box. To see the table you just created, you can use SQL Server Management Studio. a. Open SQL Server Management Studio and browse to Databases > SCRIBEINTERNAL > Tables > SCRIBE.RR_ACCOUNTS. b. Expand Columns. Verify the RR_ACCOUNTS table contains the same fields as your source data, as well as some extra fields for error handling. Figure 91: Scribe Sample Text And RR_ACCOUNTS Tables Before you continue, refresh the sample database so the next DTS run can insert data. 9. Navigate to the C:\Program Files (x86)\Scribe folder and double-click InternalDB.exe. 10. Click the Sample Database tab, then click Refresh Sample Data.
  • 81. Page 77 Tutorial 4: More Console Techniques Three: Create A Data View You have created a rejected rows table and have viewed it using SQL Server Management Studio. However, it might be easier to view and repair the rejected rows within Scribe Console. To do this, you need to create a data view. Create A Data View 1. In the Scribe Workbench, save the DTS, then close the Workbench. 2. In the Scribe Console, expand Collaborations, expand Account Integration, and select Data Views. The Data Views page appears. You created the Account Integration Collaboration in the previous tutorial. Now, you are going to create a data view named Rejected Accounts in this collaboration. 3. Click Add. Figure 92: Add New Data View – Step 1 4. In the View Name and View Title fields, enter Rejected Accounts. If you were creating multiple data views for this collaboration, you would enter the name of a folder in the Folder field. Insight creates a folder and stores your data view in the specified folder. Because you are only creating one data view, you can move on to the next step.
  • 82. Three: Create A Data View Tutorial 4: More Console Techniques Page 78 5. Click Step 2 — Connect to Source to select a source connection. As in the previous step, the rejected rows table is in the Scribe Internal database. Scribe Internal needs to be the source. 6. Click Source Connect, expand ODBC Data Sources, then select ScribeInternal_MS. Figure 93: Add New Data View – Step 2 7. Click OK. The Connect to DSN dialog box appears. 8. Enter the connection information, then click OK. 9. Verify User ID is set to SCRIBE. 10. In Password, enter integr8!. 11. Click OK. 12. Click Step 3 — Configure Source to configure the source. You want to use the RR_ACCOUNTS table you created earlier. 13. Click Source Configure. The Configure Source dialog box appears. 14. Expand All Data Objects (by Type), expand Tables, and select RR_ACCOUNTS. 15. Click OK.
  • 83. Page 79 Tutorial 4: More Console Techniques 16. Verify the default View Presentation is set to Table. Figure 94: Add New Data View — Step 3 17. Click Step 4 — Set Field Properties to configure the field properties. The only change you need to make here is to allow updates and deletes. 18. In the Allowable Operations box, select Updates and Deletes: Figure 95: Add New Data View – Step 4 Three: Create A Data View
  • 84. Three: Create A Data View Tutorial 4: More Console Techniques Page 80 19. Click Finish to save your new data view. Now, run the view and see what happens. 20. In the Console tree, browse down to the Rejected Accounts data view and click the data view name. Figure 96: Running A Data View Because you have not run the DTS file, the Rejected Accounts Data View is currently empty. Notice the field names are the same as the fields you saw using SQL Server Management Studio. Figure 97: Rejected Accounts Data View In the next step, you will create a monitor, then rerun the collaboration and check the data view again.
  • 85. Page 81 Tutorial 4: More Console Techniques Four: Create A Monitor A monitor enables you to oversee system issues, business activities, and Integration Processes. You can set a monitor to raise an alert. Create A Monitor 1. In Scribe Console, expand Collaborations, expand Account Integration, and select Monitoring. The Monitoring pane appears. 2. Click Add to start the Add New Monitor Wizard. Figure 98: Adding A Monitor – Step 1 3. Verify the Monitor type field is set to Query. 4. Name the monitor New Rejected Accounts and add a comment in the Comment field. Figure 99: Monitor Name And Comment Four: Create A Monitor
  • 86. Four: Create A Monitor Tutorial 4: More Console Techniques Page 82 5. Click Step 2 — Source Connection to set the source. 6. Click Source Connect. Select the ScribeInternal_MS ODBC Data Source, click OK, and enter integr8! if you are prompted to enter a password. Figure 100: Monitor Connection Properties For the data view, you included the entire RR_ACCOUNTS table. When you created the RR_ACCOUNTS table, Scribe added a timestamp field called KSSTARTTIME. For this monitor, you only want the rows added since the monitor last ran. You can create a custom query to compare KSSTARTTIME to the values of the LastRunDateTime and ThisRunDateTime system variables. If rows have been added to the RR_ACCOUNTS table since the last run and before the current run, this custom query causes the monitor to raise an alert. In Step 3, you are going to create the query this monitor uses. 7. Click Step 3 — Alert Criteria to define the alert criteria. 8. Click Source Configure. The Configure Source dialog box appears. This is where you define a custom query. 9. Select Custom Query from the dialog box.
  • 87. Page 83 Tutorial 4: More Console Techniques 10. In the SQL Query window, enter the following query: SELECT * FROM SCRIBE.RR_ACCOUNTS WHERE KSSTARTTIME >= :LastRunDateTime AND KSSTARTTIME < :ThisRunDateTime In this query, the ">=" comparison selects rows whose KSSTARTTIME timestamp is later than or equal to the last run date, so new rejected rows cause the monitor to raise an alert. Figure 101: Monitor SQL Custom Query 11. Click Test/Requery to test your query. When the query executes without any errors, an execution time appears next to the Test/Requery button. Figure 102: Monitor SQL Test Time 12. Click OK to close the Configure Source dialog box. If there are errors, correct them, then close the dialog box. As part of Step 3, you also need to raise an alert when one or more new rows are found in the RR_ACCOUNTS table.
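To see what the monitor evaluates on each interval, here is a small Python sketch of the check; the row data and dates are made up for the example. The sketch simply counts RR_ACCOUNTS rows whose KSSTARTTIME falls between the last run and the current run, then applies the row-count condition you set in the next step.
from datetime import datetime

def should_raise_alert(rejected_rows, last_run, this_run, threshold=1):
    # Count rows added in the [last_run, this_run) window, like the custom query above.
    matches = [r for r in rejected_rows if last_run <= r["KSSTARTTIME"] < this_run]
    # Alert condition: Row count Greater Than or Equal to 1.
    return len(matches) >= threshold

rows = [{"KSSTARTTIME": datetime(2015, 12, 30, 22, 5)}]
print(should_raise_alert(rows, datetime(2015, 12, 30, 22, 0), datetime(2015, 12, 30, 22, 15)))  # True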
  • 88. Four: Create A Monitor Tutorial 4: More Console Techniques Page 84 13. In the Alert Conditions box, select: l Row count l Operator = Greater Than or Equal l Row(s) = 1 Leave the Create an Alert for each matching result row checkbox clear. You do not need to create an alert for each matching row. 14. In the Alert Recipients box, select Fixed. This option always sends alerts to the same Fixed list. At this point, you do not need to worry about the recipients. Figure 103: Monitor Alert Conditions When you create a real collaboration, add recipients and recipient groups under Alert Recipients in the Administration node before adding monitors. If you add the recipients or groups in this pane, an email is sent to the recipients whenever an alert is raised. For information, see Managing Alert Recipients in the Scribe Insight Help. 15. Click Step 4 — Monitor Interval. You can use the default Monitoring Interval Settings, which monitor every 15 Minutes. 16. Click Step 5 — Activation. Verify the Status is set to Active. You use the default values for other settings for this step. 17. Click Step 6 — Alerting. On this step, you want to create a Warning alert. 18. In Alert Type, select Warning. 19. In Alert description, enter a meaningful description, such as New rejected Accounts. 20. In Alert number, enter 3001. The alert number is for your own purposes. Insight displays the Alert number but does not use this number to perform any error handling.
  • 89. Page 85 Tutorial 4: More Console Techniques 21. In Alert message, enter a meaningful message. For example: There are new rejected accounts from the Account Import process. 22. In Alert Message Options, verify the following are selected: l Include results l Include row count/value l Attach results as XML. Having an XML file makes it easier to figure out what went wrong. Figure 104: Monitor Activation 23. Click Finish to save and close your new monitor. The new monitor appears in the Monitoring pane. Figure 105: New Monitor On Monitoring Pane Four: Create A Monitor
  • 90. Five: Run The Integration Process Tutorial 4: More Console Techniques Page 86 Five: Run The Integration Process At this point, you have: l Created an Integration Process (IP) l Defined a DTS file that you know creates a rejected row l Defined a data view for your Rejected Rows table, RR_ACCOUNTS l Defined a monitor that monitors the IP and raises an alert if rows are added to the RR_ACCOUNTS table Now, you can run the IP and see what happens. Run The Integration Process 1. In the Console tree, expand Collaborations, expand Account Integration, and select Integration Processes. 2. Select the Account Import integration process. Figure 106: Account Integration IP On Integration Processes Pane 3. Click Run Process. This forces the integration process to run. 4. Expand Administration and select Execution Log.
  • 91. Page 87 Tutorial 4: More Console Techniques 5. In the Execution Log Viewer, click Refresh to update the results. Figure 107: Execution Log Viewer Showing Failed Row The Execution Log Viewer includes two grids: l The Execution grid, on the top, displays information about the execution. l The Transaction grid, on the bottom, displays information about the transaction selected in the Execution grid. Five: Run The Integration Process
  • 92. Five: Run The Integration Process Tutorial 4: More Console Techniques Page 88 6. Select the top row in the Execution grid, which indicates a failed transaction. The Transaction grid displays more information about this transaction. For example, you can see source row #14 failed with Error Code 1005. 7. Double-click the message in the Transaction grid. The Transaction Detail report displays, providing more information about the error. Figure 108: Transaction Detail Report According to the error message, the ACCOUNTNAME column does not allow nulls.
  • 93. Page 89 Tutorial 4: More Console Techniques Six: Check The RR_ACCOUNTS Table After checking the Execution Log, you know there is an error. The next step is to figure out exactly which row you need to repair. There are a couple of ways to do this — you can open the RR_ACCOUNTS table in SQL Server Management Studio, or you can use the data view you created earlier in this tutorial. You are going to take a look at the table using the data view. Check The RR_ACCOUNTS Table 1. Expand Collaborations, expand Account Integration, expand Data Views, and open the Rejected Accounts Data View. This time, you see there is a row in this data view. Figure 109: Rejected Accounts Data View The row shows that for the company with the unique ID of 114, the Business Name is blank. Although doing so is beyond the scope of this tutorial, you have two options: l Update the source to correct the error and rerun the collaboration — Since the DTS file has lookup links and an Update/Insert step, when you rerun the collaboration, the new data is inserted without creating duplicate records. l Replace the error-processing formula you removed — Edit the DTS to use the error processing formula that you removed at the beginning of this tutorial and that inserts an error message into the target when an error is found: IF(ISERROR(S3),"Unknown"&S1,PROPER(S3))
  • 94. Seven: Check The Monitor Tutorial 4: More Console Techniques Page 90 Seven: Check The Monitor Finally, you want to check the monitor. As discussed in Step Four of this tutorial, you want to add Alert recipients to your real monitors so an email is sent whenever an alert is raised. For the purposes of the tutorial, however, you are only checking the monitor manually. Check The Monitor 1. Expand Collaborations, expand Account Integration, and select Monitoring to open this monitor. You can also view all monitors on your system by selecting Monitoring from the Integration Server node. 2. Click Resume, if needed, and then click Run Monitor. 3. Click Refresh. You see an alert has been created. Figure 110: Checking The Monitor 4. To view the alert, select Alert Log under Account Integration: Figure 111: Viewing The Alert Log Note there are two alerts: a system monitor raised one alert on the process ID, and the monitor you created raised another alert. 5. Find the Monitor entry in the Category column and select the alert generated by the monitor.
  • 95. Page 91 Tutorial 4: More Console Techniques 6. Under Detail for Alert, click the Message tab for more information: Figure 112: Alert Log Message Details — Top Here, you see all of the information you need to correct the data and rerun the collaboration. The data in this message is generated from the RR_ACCOUNTS table. At the top, as shown above, you see the UNIQUE_ID field, which allows you to find the record. 7. Scroll to the bottom of this information, where you can see the exact error message along with the timestamp and other useful data. Figure 113: Alert Log Message Details — Bottom Congratulations! You have finished this tutorial. Seven: Check The Monitor
  • 96. What's Next?  Tutorial 4: More Console Techniques Page 92 What's Next?  With these two tutorials, you have seen some basic tasks you need to perform using Scribe Console. There are other tools in the Console that you may need to use, but these tutorials should help get you started. For more information, there are many resources available, including the Scribe Insight Help, Scribe Installation Guide, and Scribe's Forums and Knowledgebase at https://openmind.scribesoft.com/forums.