William Berth BI Portfolio


  1. Business Intelligence Portfolio. SetFocus BI Masters Program. William Berth, wberth@gmail.com, 802-375-9490. Design and Development of a Business Solution using Microsoft Business Intelligence tools.
  2. This portfolio describes the training and contains examples of my project development skills in the Business Intelligence arena, using Microsoft SQL Server 2008 and Microsoft Visual Studio 2008 while participating in the SetFocus Business Intelligence Master's Program.
     • About The Masters Program
     • Advanced Transact-SQL Project
     • SQL Server Integration Services Project
     • SQL Server Analysis Services Project
     • SQL Server Reporting Services Project (including PerformancePoint Server (PPS), Excel, and SharePoint (SP))
     • Full Business Intelligence Implementation - Final Team Project
  3. • The SetFocus BI Master's Program is an intensive, hands-on, project-oriented program that provides the knowledge and practical experience to design and develop business intelligence systems using the Microsoft Business Intelligence tools, part of Microsoft's SQL Server applications, putting the BI skill set to use in a simulated work environment with an emphasis on business application context.
     • Students receive 400 hours of in-depth, hands-on experience, including 7 tests, 5 projects, exercises, workshops, and labs focused on the Microsoft Business Intelligence stack.
     • Course topics include (but are not limited to):
       – Creating queries using advanced Transact-SQL functions in SQL Server Management Studio.
       – Designing, implementing, and maintaining an ETL solution architecture using SQL Server Integration Services.
       – Data warehousing (relational and multidimensional) and dimensional modeling concepts.
       – Designing, implementing, and maintaining an analysis solution architecture using SQL Server Analysis Services, expanding your knowledge of KPIs, data mining, and MDX queries.
       – Creating complex reports employing expressions, global collections, and conditional formatting using SQL Server Reporting Services.
       – Planning, building, and rolling out an enterprise reporting dashboard utilizing SharePoint 2007, leveraging PerformancePoint Server 2007 for monitoring and analysis purposes.
     • Projects are completed individually; a full Business Intelligence project is completed as part of a team. Project specs include deliverables, documentation, and a due date. 1/3/2013 SetFocus: Business Intelligence Masters Program: 2010I3 Back to TOC
  4. • Introduction: A thorough understanding of T-SQL programming is essential for BI applications and for creating a data warehouse. The T-SQL session of the course focused on intermediate and advanced T-SQL 2008 database programming concepts, database patterns, and reusable approaches. Some of the topics covered included: SQL Server 2008 data types; partial text searches with LIKE and %; using GROUP BY to aggregate data; creating stored procedures, database views, and database triggers; implementing database transactions; implementing error handling with TRY/CATCH; advanced uses of the PIVOT statement, APPLY operator, GROUPING SETS, and RANK functions for reporting; change data capture and database update triggers for audit-trail logging; querying hierarchical data with recursive queries and the hierarchyid data type; using subqueries and common table expressions; date handling and temporal-based queries; transactions and transaction isolation levels; dynamic SQL and its alternatives; interpreting and analyzing execution plans; geospatial queries; and XML handling as parameters. • Overview: The T-SQL project put many of the concepts presented above to use. This project was completed by each student individually. The project consisted of writing 10 queries in SSMS, each query having its own specification, including filters. Solutions for the project are contained on the following slides. Additional SQL examples are provided in the SSIS and SSAS projects for building the data warehouse. Back to TOC
  5. • Query 1 Specification: Summarize LineTotal in Purchase Order Details as TotDollars by Product. Show only those products with the word ‘Washer’ anywhere in the name. Retrieve based on the Order Date Year of 2003. Back to TOC
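The solution screenshot for this query did not survive the export. A sketch of a query satisfying the spec against the AdventureWorks 2008 sample database (table and column names assumed, not taken from the original solution) might look like:

```sql
-- Hypothetical sketch: total purchase dollars by 'Washer' product for 2003.
SELECT p.[Name] AS ProductName,
       SUM(pod.LineTotal) AS TotDollars
FROM Purchasing.PurchaseOrderDetail pod
JOIN Purchasing.PurchaseOrderHeader poh ON pod.PurchaseOrderID = poh.PurchaseOrderID
JOIN Production.Product p ON pod.ProductID = p.ProductID
WHERE p.[Name] LIKE '%Washer%'     -- partial text search with LIKE and %
  AND YEAR(poh.OrderDate) = 2003
GROUP BY p.[Name];
```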
  6. • Query 2 Specification: Count orders by Product Subcategory. Retrieve based on the Order Date Year of 2004. Only show those product subcategories with at least 10 orders. Back to TOC
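The original solution slide is an image; a hedged sketch of the HAVING-based approach this spec calls for, again assuming AdventureWorks names:

```sql
-- Hypothetical sketch: order counts per subcategory for 2004, filtered with HAVING.
SELECT ps.[Name] AS Subcategory,
       COUNT(*)  AS OrderCount
FROM Purchasing.PurchaseOrderDetail pod
JOIN Purchasing.PurchaseOrderHeader poh ON pod.PurchaseOrderID = poh.PurchaseOrderID
JOIN Production.Product p ON pod.ProductID = p.ProductID
JOIN Production.ProductSubcategory ps ON p.ProductSubcategoryID = ps.ProductSubcategoryID
WHERE YEAR(poh.OrderDate) = 2004
GROUP BY ps.[Name]
HAVING COUNT(*) >= 10;   -- "at least 10 orders"
```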
  7. • Query 3 Specification: Show a list of vendors who did NOT have an order in 2003 (based on the year of OrderDate). Sort order doesn't matter. Back to TOC
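The solution image is missing; an anti-join with NOT EXISTS is one idiomatic way to satisfy this spec (AdventureWorks names assumed; depending on the AdventureWorks version the vendor key may be VendorID or BusinessEntityID):

```sql
-- Hypothetical sketch: vendors with no purchase orders in 2003.
SELECT v.[Name]
FROM Purchasing.Vendor v
WHERE NOT EXISTS (
    SELECT 1
    FROM Purchasing.PurchaseOrderHeader poh
    WHERE poh.VendorID = v.VendorID          -- key name varies by AdventureWorks version
      AND YEAR(poh.OrderDate) = 2003);
```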
  8. • Query 4 Specification: Summarize Freight by Ship Method Name. Retrieve based on the first quarter of 2003. Show all Ship Methods, even if no data exists. Back to TOC
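Because the spec requires all ship methods even when no data exists, a LEFT JOIN with the date filter in the ON clause (not the WHERE clause) is the key idea. A hedged sketch with assumed AdventureWorks names:

```sql
-- Hypothetical sketch: Q1-2003 freight per ship method, preserving empty methods.
SELECT sm.[Name] AS ShipMethod,
       ISNULL(SUM(poh.Freight), 0) AS TotalFreight
FROM Purchasing.ShipMethod sm
LEFT JOIN Purchasing.PurchaseOrderHeader poh
       ON poh.ShipMethodID = sm.ShipMethodID
      -- filter in the ON clause so unmatched ship methods still appear
      AND poh.OrderDate >= '20030101' AND poh.OrderDate < '20030401'
GROUP BY sm.[Name];
```

Putting the date filter in the WHERE clause instead would silently turn the LEFT JOIN into an inner join and drop ship methods with no orders.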
  9. • Query 5 Specification: Summarize TotalDue from Purchase Orders for the months in 2003. Retrieve for the following National ID Numbers: (792847334, 407505660, 482810518, 466142721, 367453993). Use DateName for month, but make sure to sort chronologically (match the sort in the output). Back to TOC
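The spec's catch is that DATENAME sorts alphabetically, so a numeric month must drive the ORDER BY. A hedged sketch (AdventureWorks names assumed; the employee key column differs between AdventureWorks versions):

```sql
-- Hypothetical sketch: monthly TotalDue for the listed employees in 2003.
SELECT DATENAME(month, poh.OrderDate) AS OrderMonth,
       SUM(poh.TotalDue)              AS TotalDue
FROM Purchasing.PurchaseOrderHeader poh
JOIN HumanResources.Employee e
     ON poh.EmployeeID = e.EmployeeID   -- may be BusinessEntityID in newer versions
WHERE YEAR(poh.OrderDate) = 2003
  AND e.NationalIDNumber IN ('792847334','407505660','482810518','466142721','367453993')
GROUP BY DATENAME(month, poh.OrderDate), MONTH(poh.OrderDate)
ORDER BY MONTH(poh.OrderDate);          -- chronological, not alphabetical
```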
  10. • Query 6 Specification: Retrieve Names and Addresses from the vVendorWithAddresses view (and define a record type of ‘Vendor’). Retrieve Names and Addresses from the vEmployee view (concatenate LastName, FirstName, Middle Initial). Convert any null values (Middle Initial, AddressLine2) to an empty string. Sort the final output on PostalCode, and Name within PostalCode. Back to TOC
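A UNION of the two views with ISNULL null-handling matches this spec. A hedged sketch against the AdventureWorks views named in the spec (column list abbreviated, names assumed):

```sql
-- Hypothetical sketch: vendors and employees combined, nulls blanked, sorted.
SELECT 'Vendor' AS RecType,
       v.[Name],
       v.AddressLine1,
       ISNULL(v.AddressLine2, '') AS AddressLine2,
       v.City, v.PostalCode
FROM Purchasing.vVendorWithAddresses v
UNION ALL
SELECT 'Employee',
       e.LastName + ', ' + e.FirstName + ' ' + ISNULL(e.MiddleName, ''),
       e.AddressLine1,
       ISNULL(e.AddressLine2, ''),
       e.City, e.PostalCode
FROM HumanResources.vEmployee e
ORDER BY PostalCode, [Name];   -- Name within PostalCode
```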
  11. • Query 7 Specification: Write a STORED PROCEDURE called GetVendorOrders that does the following: accepts a Vendor Account Number and a StartDate/EndDate parameter date range; summarizes Freight, tax amount, and SubTotal by Ship Method, for the vendor and date range. Run the stored procedure for vendor ADVANCED0001 & the date range 1-1-2003 to 12-31-2003. Back to TOC
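The procedure body was shown only as a screenshot; a hedged sketch of what GetVendorOrders could look like (AdventureWorks names assumed; the vendor join key varies by AdventureWorks version):

```sql
-- Hypothetical sketch of the GetVendorOrders procedure.
CREATE PROCEDURE dbo.GetVendorOrders
    @AccountNumber nvarchar(15),
    @StartDate     datetime,
    @EndDate       datetime
AS
BEGIN
    SELECT sm.[Name]          AS ShipMethod,
           SUM(poh.Freight)   AS TotalFreight,
           SUM(poh.TaxAmt)    AS TotalTax,
           SUM(poh.SubTotal)  AS TotalSubTotal
    FROM Purchasing.PurchaseOrderHeader poh
    JOIN Purchasing.Vendor v      ON poh.VendorID = v.VendorID
    JOIN Purchasing.ShipMethod sm ON poh.ShipMethodID = sm.ShipMethodID
    WHERE v.AccountNumber = @AccountNumber
      AND poh.OrderDate BETWEEN @StartDate AND @EndDate
    GROUP BY sm.[Name];
END
GO
EXEC dbo.GetVendorOrders 'ADVANCED0001', '2003-01-01', '2003-12-31';
```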
  12. • Query 8 Specification: Write a query that retrieves the top 5 week-ending dates for 2003, based on overall TotalDue in PurchaseOrderHeader (and rank them; the grand total is the sum across all the Ship Methods). Summarize all dates to the week-ending (Saturday) date. Break apart sales by the Ship Methods in the ShipMethod table.
      -- Check if the Saturday date function exists. If so, drop it.
      IF EXISTS (SELECT * FROM sys.objects
                 WHERE object_id = OBJECT_ID(N'[dbo].[SaturdayDate]')
                   AND type IN (N'FN', N'IF', N'TF', N'FS', N'FT'))
          DROP FUNCTION [dbo].[SaturdayDate]
      GO
      -- Create the Saturday date function.
      CREATE FUNCTION [dbo].[SaturdayDate] (@CurrentDate datetime)
      RETURNS datetime
      AS BEGIN
          DECLARE @SaturdayDate datetime
          SET @SaturdayDate = DATEADD(d, (7 - DATEPART(dw, @CurrentDate)), @CurrentDate)
          RETURN @SaturdayDate
      END
      GO
      -- This CTE selects the Saturday week-ending date, amount, & ship method for each 2003 order.
      ;WITH ShipMethodCTE AS (
          SELECT dbo.SaturdayDate(OrderDate) AS WeekEnding, poh.ShipMethodID, poh.TotalDue
          FROM Purchasing.PurchaseOrderHeader poh
          WHERE YEAR(poh.OrderDate) = 2003),
      -- This CTE pivots (transposes) the ship methods from the first CTE.
      PivotCTE AS (
          SELECT WeekEnding, [1] AS XRQ, [2] AS ZY, [3] AS OVERSEAS, [4] AS OVERNIGHT, [5] AS CARGO
          FROM ShipMethodCTE
          PIVOT (SUM(TotalDue) FOR ShipMethodID IN ([1], [2], [3], [4], [5])) AS temp)
      -- This query selects, sums, & ranks the CTE.
      SELECT TOP 5 WeekEnding,
             SUM(XRQ + ZY + OVERSEAS + OVERNIGHT + CARGO) AS GrandTot,
             RANK() OVER (ORDER BY SUM(XRQ + ZY + OVERSEAS + OVERNIGHT + CARGO) DESC) AS WeekRank,
             XRQ, ZY, OVERSEAS, OVERNIGHT, CARGO
      FROM PivotCTE
      GROUP BY WeekEnding, XRQ, ZY, OVERSEAS, OVERNIGHT, CARGO
      Back to TOC
  13. • Query 9 Specification: Write a stored procedure that will receive four values: Top N Vendors, Top Y Products within each Vendor, a Start Date, and an End Date. Get the top N vendors and rank them by POHeader.TotalDue descending so that no gaps occur in the ranking. For each of those vendors, run a subsequent TOP query to get the top Y products FOR THAT vendor. Note: some vendors only deal with one product. Run the procedure with parameters of 5, 5, and the date range of Jan 1, 2003 to June 30, 2004:
      -- Execute procedure
      EXEC TopVendorProducts 5, 5, '1-1-2003', '6-30-2004'
      Back to TOC
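The linked query code did not survive the export. A hedged sketch of how TopVendorProducts could be written, using DENSE_RANK (gap-free, as the spec requires) and CROSS APPLY for the correlated per-vendor TOP query (AdventureWorks names assumed):

```sql
-- Hypothetical sketch of the TopVendorProducts procedure.
CREATE PROCEDURE dbo.TopVendorProducts
    @TopN int, @TopY int, @StartDate datetime, @EndDate datetime
AS
BEGIN
    ;WITH VendorTotals AS (
        -- Top N vendors by overall TotalDue in the date range.
        SELECT TOP (@TopN) poh.VendorID, SUM(poh.TotalDue) AS VendorTotal
        FROM Purchasing.PurchaseOrderHeader poh
        WHERE poh.OrderDate BETWEEN @StartDate AND @EndDate
        GROUP BY poh.VendorID
        ORDER BY SUM(poh.TotalDue) DESC)
    SELECT DENSE_RANK() OVER (ORDER BY vt.VendorTotal DESC) AS VendorRank, -- no gaps
           vt.VendorID, vt.VendorTotal,
           tp.ProductID, tp.ProductTotal
    FROM VendorTotals vt
    CROSS APPLY (
        -- Top Y products for THIS vendor (may return fewer than Y rows).
        SELECT TOP (@TopY) pod.ProductID, SUM(pod.LineTotal) AS ProductTotal
        FROM Purchasing.PurchaseOrderHeader poh
        JOIN Purchasing.PurchaseOrderDetail pod ON poh.PurchaseOrderID = pod.PurchaseOrderID
        WHERE poh.VendorID = vt.VendorID
          AND poh.OrderDate BETWEEN @StartDate AND @EndDate
        GROUP BY pod.ProductID
        ORDER BY SUM(pod.LineTotal) DESC) tp;
END
```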
  14. • Query 10 Specification: Retrieve Products and Standard Cost, for those products with a standard cost of $700 or more as of 1/1/2002.
      IF EXISTS (SELECT * FROM sys.objects
                 WHERE object_id = OBJECT_ID(N'[dbo].[StdCostDate]')
                   AND type IN (N'FN', N'IF', N'TF', N'FS', N'FT'))
          DROP FUNCTION [dbo].[StdCostDate]
      GO
      -- This function determines the Standard Cost for a
      -- product as of a specific point in time.
      CREATE FUNCTION [dbo].[StdCostDate] (@AsOfDate AS datetime, @CostAmt AS int)
      RETURNS datetime
      AS BEGIN
          DECLARE @StdCostDate datetime
          SET @StdCostDate = (SELECT MAX(StartDate)
                              FROM Production.ProductCostHistory
                              WHERE StartDate <= @AsOfDate AND StandardCost > @CostAmt)
          RETURN @StdCostDate
      END
      GO
      SELECT pp.ProductNumber, pp.[Name], ROUND(pc.StandardCost, 2) AS StandardCost
      FROM Production.Product pp
      JOIN Production.ProductCostHistory pc ON pp.ProductID = pc.ProductID
      WHERE StartDate = dbo.StdCostDate('1-1-2002', 700)
        AND pc.StandardCost > 700
      ORDER BY pc.StandardCost DESC
      Back to TOC
  15. • Introduction: A key component of building a robust data warehouse is the ability to extract, transform, and load (ETL) operational source system data (often disparate) and consolidate that data into a data staging area. The SQL Server Integration Services (SSIS) BI tool provides the capability to read data from, and write data to, many different formats (hence the name "integration services": it integrates with different file systems). The SSIS session of the course focused on the fundamentals of SSIS, best and recommended practices in SSIS, and common SSIS design patterns. Some of the topics covered included: data flow and control flow task operations, variables, configurations, event handling and logging, transformations, package management and deployment, SSIS scripts, running scheduled SSIS jobs using SQL Server Agent, and integration of T-SQL stored procedures with SSIS. Additionally, the course covered the new features in SSIS 2008 and the new MS BI enhancements in SQL Server 2008. • Overview: The SSIS project is for a fictitious construction company called AllWorks. Students design and build a SQL Server 2008 database to track employee and customer information, timesheet and labor rates data, as well as job order information, job materials, and customer invoices. In the client project scenario, AllWorks currently stores this information in Excel spreadsheets, XML files, and CSV files. The first task is to review the source data from the three different sources. Students are asked to make some enhancements to the existing data sources to support more flexible business practices for customer invoicing. Next, students use SQL Server 2008 Integration Services to integrate these external data sources into the SQL Server database. The goal is to create a 3NF database to hold all the data in the source files. A T-SQL script that creates the AllWorks database, tables, database diagram, and PK/FK constraints is included. This project was completed by each student individually. Specifications and solutions are provided on the following slides. Back to TOC
  16. • Specifications: AllWorks currently uses spreadsheets and Oracle data (exported as XML) as part of their systems. They store employee and client geography data, along with overhead and job order master data, and invoices, in spreadsheets. The feed of material purchases comes from an XML file. Finally, the timesheet data comes from CSV files. Each individual file specification is addressed in the following slides, along with some solution examples. Generally, each package has similar processing, except for the timesheet file (see the technical project specs document). The next step is to create a SQL Server OLTP database and transfer the data from the different raw data sources (XLS, CSV, and XML); a data model was produced and a SQL script was built to create the database and tables. Initially a full load of the data into the SQL database is executed, and then scheduled packages run nightly to import/update any additional information. Back to TOC
  17. This is the relational database built for AllWorks from the DDL scripts. Back to TOC
  18. • Employees Input File: This employees spreadsheet contains 2 worksheets. The first sheet (Employees) contains the roster of employees and a flag for whether the employee is a contractor or a regular employee (some overhead rates only apply to employees). The second sheet (Employee Rates) contains the hourly rate for each employee, along with an effective date. • Client Geographies Input File: This client/customer spreadsheet contains 2 worksheets. The first sheet (Client Listing) contains each client for AllWorks, along with a CountyKey to associate the client with a county. The second sheet (County Definitions) contains the list of counties. • Overhead Input File: This overhead spreadsheet contains 2 worksheets. The first sheet (Overhead Master) lists each labor overhead category. The second sheet (Overhead Rates) lists the hourly rates for each overhead category, the rate effective date, and whether the rate applies to employees and/or contractors. This relates to the employee flag in the employee table. Remember that a rate may change from applying to only employees, to applying to both employees and contractors (or vice-versa). • Project Master Input File: The job spreadsheet contains 1 worksheet, which contains one row for each job work order. It contains a reference to the client, material markup %, labor overhead markup %, whether the job is closed, and the creation date. The markup % is optional; it works as follows: if $1,000 of material is purchased and the job markup % is 10 (10%; they are stored as whole numbers), then the total material cost is $1,100. The same concept applies for labor markup, based on the # of hours of labor for the job. • Invoices Input File: An Excel spreadsheet that contains client invoices, a link to the job number, the invoice amount and invoice number, the paid amount, and any additional labor. (Sometimes AllWorks will append a lump sum amount.) IMPORTANT NOTE! The AllWorks invoicing spreadsheet only allowed for one work order job per invoice (a 1-1 relationship between invoice and job). AllWorks wants to expand that, so that a single invoice can cover multiple jobs. Additionally, when AllWorks receives payment, they want to track how much was received for each job on each invoice. So a 1-many link/cross-reference table is created for the invoice and the jobs associated with it, and then another 1-many table for each invoice receipt and the amount paid for each job on the invoice. • Material Purchases Input File: This XML file contains the material purchase transactions for each job. It contains a link to the job number, the purchase amount, and the purchase date. AllWorks does not have a separate table for the TypeCode; they have been hardwiring it: 1 stands for Material, 2 stands for Petty Cash, and 3 stands for Fuel. • Employee Time Input Files: There are multiple CSV files to be processed that contain all the labor data transactions – the employee ID, the work date, number of work hours, and the job number. Back to TOC
  19. • AllWorksOLTP Project Solution: The project solution contains 13 packages; 7 packages are described in detail on subsequent slides. The OverheadMaster, OverheadRates, Client, Project, InvoiceJob, & InvoiceReceipt packages are simple in design and not described in this portfolio. They simply read different worksheets; validate ID fields; and insert, update, or write out a flat error file based on the validation of the ID fields. Back to TOC
  20. • Employee Table Spec: Take the spreadsheet, Employees.XLSX, from the C: folder, and create a package that will read the contents of the spreadsheet and add them to the SQL Server AllWorks OLTP database. The package must be able to insert missing rows based on the employee ID. The package must be able to update existing rows. A FullName column (FirstName then LastName) must be created in the package. Back to TOC
  21. The employee first name & last name are concatenated together to create a new field called FullName. Back to TOC
  22. The input row is checked against the lookup alias. If the input employee ID does not exist in the lookup alias, the row is inserted into the database. If the employee ID does exist in the lookup alias, input fields are checked against the lookup alias fields. If data has changed, the row is updated in the database. If not, the row is invalid & is written to a file. Back to TOC
  23. • Employee Rate Table Spec: Take the spreadsheet, Employees.XLSX, from the folder C:, and create a package that will read the contents of the spreadsheet (the 2nd sheet, for Employee Rates) and add them to the SQL Server AllWorks OLTP database. The package must be able to insert missing rows based on the employee ID and Rate Effective Date. The package must be able to update existing rows. The student must validate in the package that each employee ID in the employee rates spreadsheet is a valid employee ID. The package should write out a log of any invalid employees (an employee ID that does not exist) to a flat file. NOTE: All packages move processed files to an output area when the input file has completed all processing, and emails are sent out based on a package's success or failure. Back to TOC
  24. Back to TOC
  25. Validate the employee ID in the rate worksheet by matching it against the employee table. Back to TOC
  26. The input row is checked against the lookup alias. If the input employee ID and effective date do not exist in the lookup alias, the row is inserted into the database. If the employee ID & effective date lookup aliases do exist, input fields are checked against the lookup alias fields. If data has changed, the row is updated in the database. If not, the row is invalid & is written to a file. Back to TOC
  27. The OLE DB Command task to update the Employee Rate table. Back to TOC
  28. • County Table Spec: Take the spreadsheet, Clientgeographies.XLS, from the folder C:, and create a package that will read the contents of the spreadsheet (the 2nd sheet, for County Definitions) and add them to the SQL Server AllWorks OLTP database. The package must be able to insert nonexistent rows based on the County ID. If the Description column contains the word “County”, remove it and add “, PA”. The package must be able to update existing rows (just the description, as the table only contains two columns, the county ID and description). Back to TOC
  29. Look for “County” in the description & split descriptions having “County”. Back to TOC
  30. Parse the description field & build the new description. Back to TOC
  31. Union all transformed & existing description rows. Back to TOC
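The slides above show this transformation done with SSIS derived columns; the same logic can be sketched in T-SQL for illustration. This is a self-contained, hypothetical example (the real package operates on the County worksheet, not a table variable):

```sql
-- Hypothetical T-SQL equivalent of the SSIS "County" description rewrite.
DECLARE @County TABLE (CountyID int, [Description] nvarchar(50));
INSERT @County VALUES (1, N'Bucks County'), (2, N'Philadelphia');

SELECT CountyID,
       CASE WHEN [Description] LIKE N'% County'
            -- remove " County" and append ", PA"
            THEN REPLACE([Description], N' County', N'') + N', PA'
            ELSE [Description]
       END AS [Description]
FROM @County;
-- 'Bucks County' becomes 'Bucks, PA'; 'Philadelphia' is unchanged.
```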
  32. • Timesheet Table Spec: Take the several CSV files, EmpTime####.CSV, from the folder C:TIME, and create a package that will read the contents of the CSV files and add them to the SQL Server AllWorks OLTP database. The package must be able to insert rows that don’t exist in the table, based on the combination of the Job Number ID, Employee ID, and Date. An employee can work on multiple jobs in a single day, but there will never be more than one row per job/employee/date. The package must validate that each Job Number ID exists in the project table and that each Employee ID exists in the employee table. If not, the incoming record should be written to a flat file. The package should update existing timesheet records (for changes to hours worked), based on the Job Number ID/Employee ID/Work Date. The package must validate that a timesheet record cannot be posted for a job that has been closed, if the work date exceeds the job close date. If this occurs, the incoming timesheet record should be written to a flat file. A precedence constraint expression handles the case where there are no input files. Back to TOC
  33. There are multiple timesheet input files, each having a different name, so the ForEach Loop container must capture the input file name dynamically using the Connection Manager named ‘Dynamic Input Time Sheet File’. Back to TOC
  34. This package is unlike the other packages, because a variable number of CSV files have to be read through. Therefore, two sets of variables need to be maintained – a count of rows added/changed/invalid for the file, and a running count of rows added/changed/invalid for the folder – using a Script Task.
      public void Main()
      {
          // Accumulate the per-file counts into the folder-level totals.
          Dts.Variables["FilesProcessedCount"].Value = (int)Dts.Variables["FilesProcessedCount"].Value + 1;
          Dts.Variables["FolderAddedCount"].Value   = (int)Dts.Variables["FolderAddedCount"].Value   + (int)Dts.Variables["NewCount"].Value;
          Dts.Variables["FolderChangedCount"].Value = (int)Dts.Variables["FolderChangedCount"].Value + (int)Dts.Variables["ChangedCount"].Value;
          Dts.Variables["FolderInvalidCount"].Value = (int)Dts.Variables["FolderInvalidCount"].Value + (int)Dts.Variables["InvalidCount"].Value;
          Dts.TaskResult = (int)ScriptResults.Success;
      }
      Back to TOC
  35. The Process Timesheet Data Flow task. Back to TOC
  36. • Material Purchase Table Spec: Take the XML file, MaterialPurchases.XML, from the folder C:, and create a package that will read the contents of the XML file and add them to the SQL Server AllWorks OLTP database. The package must be able to insert missing rows based on the ID. The package should update existing material purchase records, based on the ID. The package must validate that each Project ID and Material Type exists in the Project and Material Type tables. If not, the incoming record should be written to a text file. The package must validate that a material purchase record cannot be posted for a project that has been closed, if the purchase date exceeds the project close date. If this occurs, the incoming material purchase record should be written to an error file. Back to TOC
  37. The SalePoint input field is checked for null values & converted to an empty string if nulls are found. Back to TOC
  38. • Invoice Table Spec: Take the Excel spreadsheet, Invoices.XLSX, from the folder C:, and create a package that will read the contents of the file and add them to the SQL Server AllWorks OLTP database. The package must be able to insert missing invoices based on the Invoice ID, which uniquely identifies a row. The package should update existing invoice records, based on the Invoice ID. The package must validate that the Project ID exists. If the Project ID does not exist, the package should write the incoming row to a flat file. Back to TOC
  39. The Terms & Work Date input fields are checked for null values & converted to an empty string if nulls are found. Back to TOC
  40. • Master Package: Create a master package that includes all the ETL packages, using Execute Package tasks to execute the ETL packages from SQL Server. Set up package variables and set the ETL package configurations to parent/child. Deploy all ETL packages to SQL Server; keep these packages grouped by using a sequence container, and set precedence constraints to mirror the PK/FK database constraints, joining the packages according to their dependencies within the sequence container. Use a SQL Agent job in SSMS to schedule the master package only. Outside of the sequence container, use maintenance tasks to do the following: copy data files from the server to a C: folder; back up the database; shrink the database; rebuild indexes; update database statistics; and create one send-mail-on-failure task connected to each of the database tasks and one send-mail-on-success task after the last database task. Run the master package on the SQL Server & deploy it to SQL Server. Create a job in SQL Server using SQL Agent. Back to TOC
  41. The Sequence Container flow, continued. Back to TOC
  42. The SQL Server Agent job properties. Back to TOC
  43. The SQL Server Agent job properties to schedule & execute the master package. Back to TOC
  44. • Introduction: An effective data warehouse is designed to facilitate the analysis and reporting of the stored data. To that end, a fundamental knowledge of data warehousing and database programming concepts, as well as Kimball data modeling concepts, is essential in building a multidimensional database design. Some of the dimensional modeling topics covered included: the Ralph Kimball methodology (compared with the Bill Inmon methodology for data warehousing); the differences between OLTP, data warehouse, and OLAP databases; OLAP storage methodologies/persistence models (MOLAP, HOLAP, ROLAP); and uses of surrogate keys. The SSAS session of the course included analysis of many different dimension and fact table patterns across different industries, including: star and snowflake schemas; slowly changing dimensions; role-playing dimensions; many-to-many relationships; bridge tables; dimension hierarchies, multiple hierarchies, and dimension attributes; common business dimensions (product, date, account/customer); dimension outriggers; multiple fact tables at different levels of dimension granularity; general patterns and practices for loading dimension tables and fact tables; degenerate dimensions; factless fact tables; periodic snapshot fact tables; junk dimensions; and fully-additive and semi-additive facts. • The SQL Server Analysis Services (SSAS) BI tool provides the ability to create online analytical processing (OLAP) data structures (cubes) that allow fast analysis of data. Some of the SSAS topics covered in the course included: the basics of using SSAS to build OLAP cubes; applying the different dimension models from the data warehousing/dimensional modeling exercises to create actual OLAP databases; the SSAS Cube Editor; the SSAS Dimension Editor; MDX calculations and Analysis Services KPIs; drillthrough report actions; creating OLAP perspectives; OLAP storage methodologies (MOLAP, HOLAP, ROLAP); creating OLAP partitions and OLAP aggregations; integrating SSAS with SSIS; XMLA to manage an OLAP environment; OLAP security roles; SSAS data mining; and backing up and restoring OLAP databases. Back to TOC
  45. • Overview/Specifications: Building off of the SSIS project, students create a data warehouse and build analysis solutions using Analysis Services, with the data warehouse as the data source. Students denormalize the OLTP database, create fact and dimension tables (including Drop and Create stored procedure scripts for the tables and PK/FK constraints), and set up an OLAP database containing one cube with multiple dimensions (one of the dimensions is used in the cube twice as a role-playing dimension). Students also write MDX queries (10 calculations and 10 named sets for key performance indicators (KPIs)), create 4 KPIs, and display the results in Excel to analyze and measure AllWorks profitability and costs. All Cost and Profit measures will be formatted as Currency, and Percent measures will be formatted as ‘Percent’. Hierarchies are created in the Date, Project, and Employee dimensions based on the key columns. All dimension attributes and hierarchies will be built so that anyone retrieving data through a cube browser interface will see descriptions, not keys, except for the EmployeeKey (which is renamed to ID). Partitions are created for each of the fact tables, with an aggregation design for a 50% performance increase. Four perspectives will be created. A role will be created granting Authenticated Users Read Definition access to the OLAP database and Read access to the cube. A URL action will be created that pulls up a Google Map for the counties. • A separate MDX workshop solution is provided along with the SSAS project. MDX is the programming/querying language for OLAP databases. Topics for the 19 MDX queries include: general MDX query syntax; MDX context; hierarchical functions; date-based functions (ParallelPeriod, PeriodsToDate, LastPeriods, etc.); permanent calculated members and named sets; ranking and TopN retrieval; aggregation functions; and KPI programming. Calculations and sets are created at the query scope, not the session or cube scope. • This project was completed by each student individually. Specifications and solutions are provided on the following slides. Note that the dimensional model was provided for the students. Back to TOC
  46. These are the table specifications provided from the physical data model. Below are the SQL DDL scripts that contain code to drop, create, & load the fact & dimension tables. The scripts to build & load the dimension tables are run first; the scripts to create the fact tables are run second. The script to create & load the DimDate dimension table must be run before any other script, since some of the other scripts join to the DimDate table. The next step is to create the remaining dimension tables; those scripts can be run in any order. The script to create & load the FactLabor fact table must be run before any other fact table script, since some of the other scripts join to the FactLabor table. The final step is to create & load the remaining fact tables, which can be run in any order. Back to TOC
  47. This is the AllWorks data warehouse. It contains the data staging area relational database built by the SSIS project ETL process from the flat-file operational source system. It also contains the data presentation area built from the scripts in the previous slide. Back to TOC
  48. This is the AllWorks OLAP solution. Also shown here is the data source view for the star schema model. Back to TOC
  49. The AllWorks OLAP cube is built from the data source view using the cube wizard. The Cube Designer shows the fact & dimension tables and provides the ability to refine the cube. Back to TOC
  50. Dimension Designer is used to create a hierarchy for the Dim Date dimension. The Attribute Relationships tab is used to define and refine a natural hierarchy to optimize cube performance. The NameColumn property is changed to display the name column. Back to TOC
  51. Dimension Designer is used to create a hierarchy for the Dim Project dimension. The Attribute Relationships tab is used to define and refine a reporting hierarchy to optimize cube performance. Back to TOC
  52. Dimension Designer is used to create a hierarchy for the Dim Employee dimension. The Attribute Relationships tab is used to define and refine a reporting hierarchy to optimize cube performance. Back to TOC
  53. Dimension Usage shows the relationships (type and granularity) between the fact and dimension tables. As per the specs, a Job Closed Date role-playing dimension is added as a referenced (snowflake) dimension through the Dim Project dimension and the Dim Date dimension. Back to TOC
  54. Several Calculated Members are created to be used in KPIs and MDX queries. This member divides the invoice amount using another calculated member for a percentage when the invoice amount is > 0. The format is displayed as ‘Percent’. Back to TOC
  55. This member uses the ParallelPeriod MDX function to retrieve the count for the previous quarter. Back to TOC
  56. Several Named Sets are created to be used in KPIs and MDX queries. This Named Set is used to retrieve the client children only. Back to TOC
  57. This Named Set is used to filter the overhead costs that are > 0, for the prior quarter children only, using the ParallelPeriod MDX function. Back to TOC
  58. Four KPIs are created to be used in Excel & show the status of important business metrics. Here are the specs. Shown here is the solution for KPI2; the Value Expression is a Calculated Member. Here are the Excel solutions. Back to TOC
  59. This Standard URL Action allows the user to right-click on a measure value in an Excel cell and display this ‘County Google Map’ in the Additional Actions submenu. Clicking on this action in Excel will launch a Google Map web page for the county value associated with the measure. Back to TOC
  60. 60. Physical storagepartitions arecreated for each ofthe four fact tables– one for data on orafter June 16, 2005and one for databefore that date.This will provide anactive partition forthe most recent fullyears of data, andthen store all olderdata in an archivedpartition.Performance willincrease for thisdefault MOLAPstorage on queriesthat use “recent”data. Partitions willbe optimized usingAggregations.Back to TOC
61. The Aggregation Design Wizard is used to optimize the partition's aggregations for a 50% performance increase. Back to TOC
62. Perspectives are created to provide subsets of the cube, which simplifies the view of the cube for improved usability for end users. Particular measures are selected for each perspective. Back to TOC
63. Besides measures, these perspectives contain other objects like KPIs, measure groups, dimensions, hierarchies, and attributes. Back to TOC
64. Perspectives also contain actions, named sets, and calculated members. Back to TOC
65. A security role is created to assign 'read' permission (cannot modify) to database objects and properties for user access. Back to TOC
66. 'Read' security access is granted to the 'Authenticated Users' Windows group for the 'AllWorks' cube in the 'AllWorksDW' OLAP database. Back to TOC
67. 'Read' security access is granted to all dimensions on this subtab, which controls access directly to a dimension from a front-end application. Back to TOC
68. This is the start of the MDX Query Workshop. Here are the specs for all queries: Each query solution and its result set are shown on this and the following slides. The queries use calculated members and named sets created in the cube and the corresponding cube perspective. Back to TOC
69. The query WHERE clause is only for dimension slicing & defining. It gets evaluated first. The query is restricted to using one measure in the WHERE clause. Back to TOC
70. The last 52 periods from the current member are used to calculate a moving average. LastChild is used to display only the last 52 periods. Back to TOC
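A 52-period moving average of this kind can be sketched with the LastPeriods and Avg functions; the hierarchy and measure names below are assumed, not taken from the actual workshop query (Tail is used here as one way to show only the last 52 periods):

```mdx
// Sketch only – hierarchy and measure names are hypothetical.
WITH MEMBER [Measures].[Labor Cost 52-Period Avg] AS
    Avg( LastPeriods( 52, [Dim Date].[Calendar].CurrentMember ),
         [Measures].[Labor Cost] )          // average over the trailing window
SELECT { [Measures].[Labor Cost],
         [Measures].[Labor Cost 52-Period Avg] } ON COLUMNS,
       Tail( [Dim Date].[Calendar].[Date].Members, 52 ) ON ROWS
FROM [AllWorks];
```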
71. The use of an asterisk signifies a cross join. Use of commas is an implied cross join, which is the same thing as an asterisk. Back to TOC
72. The Order function arranges members of the specified set according to their position in the hierarchy, and then in descending purchase amount order. Back to TOC
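An Order call of this shape can be sketched as follows; the dimension and measure names are hypothetical stand-ins for the project's real ones:

```mdx
// Sketch only – dimension and measure names are hypothetical.
SELECT [Measures].[Purchase Amount] ON COLUMNS,
       Order( [Dim Project].[Job Name].Children,
              [Measures].[Purchase Amount],
              DESC )          // DESC (unlike BDESC) preserves hierarchy order first
       ON ROWS
FROM [AllWorks];
```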
73. Using Member returns the set of members in a dimension and returns a total for all members in a set. Using Order sorts jobs in order of purchase amount, and an asterisk shows the breakdown of each job by material type. Back to TOC
74. Break-hierarchy descending (the BDESC flag of the Order function) is used to break the outer left parent in the join to order labor rates from highest to lowest. Back to TOC
75. 'With Member' defines a calculated member, which is helpful when the member is used more than once in a query, as shown here to get the labor rate. Back to TOC
76. Generate is used to apply each job set member to each member of the employee set, then joins the resulting sets by union. The asterisk is used to display the job and employee. Back to TOC
77. ParallelPeriod is used to return the quarter member from 4 quarter periods prior. The positive number '4' means going back 4 quarters. Back to TOC
78. TopCount sorts the jobs labor cost set in descending order and returns the 10 elements with the highest values to get the top 10 projects. Back to TOC
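A TopCount query of this form can be sketched as follows; the dimension and measure names are hypothetical:

```mdx
// Sketch only – names are hypothetical.
SELECT [Measures].[Labor Cost] ON COLUMNS,
       TopCount( [Dim Project].[Job Name].Children,   // candidate jobs
                 10,                                  // keep ten members
                 [Measures].[Labor Cost] )            // ranked by this measure
       ON ROWS
FROM [AllWorks];
```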
79. The Children function returns a naturally ordered set that contains the children of the specified 'Employee Name' member. Using 'Children' with a hierarchy requires a base point. Back to TOC
80. The Format String cell property applies a format expression against the value to return a formatted value. The '0', '.', & '%' symbols are placeholders; the '#' symbol acts as a placeholder only if a value exists. Back to TOC
81. Filter is used to restrict output by evaluating the specified logical expression against the Employee Name set. It requires 2 parameters: the set being filtered and the filter condition expression (here, where Carl Imhoff incurred labor costs). Back to TOC
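The two-parameter shape of Filter can be sketched as follows; the dimension and measure names are hypothetical:

```mdx
// Sketch only – names are hypothetical.
SELECT [Measures].[Labor Cost] ON COLUMNS,
       Filter( [Dim Employee].[Employee Name].Members,   // set to filter
               [Measures].[Labor Cost] > 0 )             // filter condition
       ON ROWS
FROM [AllWorks];
```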
82. Non Empty is used to filter out null rows from a specified set. A row is removed only when every column in it is empty. Back to TOC
83. This searched Case statement evaluates a set of Boolean expressions (total profits = 0) and returns specific values (null) if true. It is used here to avoid division by 0 in the cube calculated member. Back to TOC
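A divide-by-zero guard of this kind can be sketched with a searched CASE; the measure names below are hypothetical, not the cube's actual ones:

```mdx
// Sketch only – measure names are hypothetical.
CREATE MEMBER CURRENTCUBE.[Measures].[Overhead Pct of Profit] AS
    CASE
        WHEN [Measures].[Total Profit] = 0 THEN NULL   // avoid division by zero
        ELSE [Measures].[Overhead Amount] / [Measures].[Total Profit]
    END,
    FORMAT_STRING = 'Percent';
```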
84. MDX requires enclosing sets in curly braces { }. When using multiple measures, they must go in curly braces { } as a set. Since a Named Set is being used here, it is not required to enclose the output of the function in curly braces. Back to TOC
85. The 'Having' clause is a way of filtering the final result set. Here we retrieve only clients with the word 'Inc' in the client name. Back to TOC
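A HAVING filter of this kind can be sketched as follows; the dimension and measure names are hypothetical, and InStr is the VBA string function available to SSAS MDX:

```mdx
// Sketch only – names are hypothetical.
SELECT [Measures].[Invoice Amount] ON COLUMNS,
       [Dim Client].[Client Name].Children
           HAVING InStr( [Dim Client].[Client Name].CurrentMember.Name,
                         "Inc" ) > 0     // keep names containing 'Inc'
       ON ROWS
FROM [AllWorks];
```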
86. The TopPercent function sorts the all-jobs set in descending order of invoice amount and returns the elements with the highest values whose cumulative invoice amount is at least 30% of the total. Back to TOC
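A TopPercent query of this form can be sketched as follows; the dimension and measure names are hypothetical:

```mdx
// Sketch only – names are hypothetical.
SELECT [Measures].[Invoice Amount] ON COLUMNS,
       TopPercent( [Dim Project].[Job Name].Children,   // all jobs
                   30,                                  // top 30% of the total
                   [Measures].[Invoice Amount] )
       ON ROWS
FROM [AllWorks];
```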
87. • Introduction: Many high-performing companies are becoming more analytical by turning to Business Intelligence (BI) capabilities, BI reporting, and data that is integrated and analyzed across the business to gain a better understanding of the key performance indicators that drive their business. BI reporting enables users to make decisions that affect how an organization is run. To be useful, BI reporting must turn dynamic, detailed data into information and make it available in real time, at the point of a decision. It must be accessible on demand, when and where it's needed, and deliver optimal performance for data loading, ad hoc queries, and complex analytics. The Microsoft Business Intelligence stack provides the tools that allow for this analysis while incorporating OLAP capabilities for quick reporting performance.

The SQL Server Reporting Services (SSRS) BI tool is a full-featured, web-enabled report writer and rendering engine (for multiple formats) used to create, execute, and deliver reports, tables, and charts that summarize and illustrate the data stored in a variety of data sources. The PerformancePoint Server 2007 (PPS) BI tool is for performance management using dashboards, scorecards, reports, charts, and KPIs. The venerable Excel 2007 product has been enhanced with cube/analysis capabilities: it delivers data from an SSAS OLAP cube to an end user's spreadsheet for planning, analytical, and reporting purposes, and adds all-important computational, formatting, and graphical features. Most importantly, since the interface is Excel, users enjoy the full familiarity and write-back capabilities of their spreadsheet for planning, budgeting, and forecasting tasks.
SharePoint Server 2007 is an enterprise information portal that can be configured as a web-based intranet to simplify BI and improve an organization's effectiveness by streamlining the management of and access to data, providing data integration, data crawling, and report design to enable business decision making. SharePoint's multi-purpose platform allows for BI tooling by combining SharePoint's collaboration features with reports and integrating with SSRS to provide a complete business intelligence platform. Back to TOC
88. • Overview: The SSRS session of the course focused on SSRS, PPS, Excel, and SharePoint (concentrating on BI with SharePoint and Office Excel Services). There was some coverage of SharePoint administration, but not as heavy as a SharePoint-for-developers class. The course topics for each subject are as follows:

SSRS: Learn the basics of the SSRS report writer; create reports with groups, parameters, subtotals, running totals, calculations, etc.; create matrix (pivot table) reports; leverage the latest charting enhancements in SSRS 2008; learn SSRS expressions; incorporate SQL queries and stored procedures into reports; schedule reports for automatic execution and delivery through report subscriptions; learn the basics of integrating SSRS with .NET; use subreports for more advanced output formats; report against either relational or OLAP databases.

PPS: Creating KPI scorecards and analytic charts/grids; creating dashboard pages and dashboard filters; creating KPI scorecard hotlinks to related charts/grids; incorporating content from Excel and SSRS; publishing dashboards to SharePoint.

SharePoint: Creating Business Intelligence site collections; configuring SharePoint to receive Excel content (trusted site locations); configuring a site collection to store document libraries for PPS, SSRS, and Excel content; scheduling SSRS reports for automatic delivery to SharePoint SSRS document libraries.

Excel Services: Creating Pivot Tables and Pivot Charts against OLAP data; publishing Excel content to a SharePoint document library.

• The SSRS project is combined with an Excel/PPS/SharePoint project using the concepts described above. Students design and build each reporting component using the SSAS project OLAP database solution, then deploy to SharePoint. Here are the general specs: This project was completed by each student individually.
Each report specification and solution are provided on the following slides. Back to TOC
89. This is the entire SharePoint site collection solution. Note the Document Libraries created as per #1 in the specs. Back to TOC
90. The data sources are uploaded for the entire SharePoint site collection solution as per Part One of the specs. Back to TOC
91. This is the entire PerformancePoint Server solution. Details of each component are on the following slides. Back to TOC
92. These are the KPIs used for the entire PPS solution (see the indicators used on the next slide). Back to TOC
93. These are the KPI indicators used for the entire PPS solution. Back to TOC
94. These are the 2 scorecards for page 1 of the dashboard. Back to TOC
95. These are the reports used for the entire PPS solution. Back to TOC
96. First, the Data Sources are set up as per the specs. Back to TOC
97. The 1st page in the AllWorks Dashboard consists of two scorecards, and the page is simply called 'Scorecards'. Here are the specs: Back to TOC
98. The Overhead Scorecard is created for the left side of the dashboard window. As per the specs, the scorecard uses the Overhead perspective and the Overhead Trend KPI. Also per the specs, in the Editor tab the OverheadTotal and AllOverheads named sets from the cube are placed below the KPI to show totals. Back to TOC
99. As per the specs, only the Overhead Scorecard uses the Overhead Quarters filter from the cube. The filter is linked to the Overhead Scorecard. Back to TOC
100. The KPIs are imported from the cube. The value is reformatted to show as a percentage. Back to TOC
101. The Client Job Financials scorecard is created for the right side of the dashboard window. As per the specs, the scorecard uses the Job Summary perspective, two objective KPIs, and multiple cube KPIs. Back to TOC
102. The 1st objective KPI is 'Client Financials'. This KPI is dropped to the left of the other KPIs in the Editor tab of the scorecard. Back to TOC
103. Included below the Client Financials KPI is the 'Open Receivables as a % of Invoiced' KPI. This KPI is imported from the cube and has its value reformatted to show as a percentage. As per the specs, in the scorecard Editor tab the ClientTotal and ClientsOpenReceivables named sets from the cube are placed below the KPI to show totals. Back to TOC
104. Also included below the Client Financials KPI is the 'Profit %' KPI. This KPI is imported from the cube and has its value reformatted to show as a percentage. As per the specs, in the scorecard Editor tab the ClientTotal and AllClients named sets from the cube are placed below the KPI to show totals. Back to TOC
105. The 2nd objective KPI is 'Construction Job Financials'. This KPI is dragged underneath the 'Client Financials' KPI in the Editor tab of the scorecard. Back to TOC
106. Also included below the 'Construction Job Financials' KPI is the 'Overhead as % of Total Cost' KPI imported from the cube. As per the specs, in the scorecard Editor tab the JobTotal and AllJobs named sets from the cube are placed below the KPI to show totals. Back to TOC
107. The 2nd page in the AllWorks Dashboard is a bar chart report, and the page is simply called 'Materials'. It uses the ClientsPlusAll named set as the filter. Here are the specs: Back to TOC
108. This analytical chart report for page 2 uses the material perspective data source with a single client. Back to TOC
109. The 3rd page in the AllWorks Dashboard is a line chart report, and the page is simply called 'Overhead'. It uses the AllOverheads named set as the filter. Here are the specs: Back to TOC
110. This analytical chart report for page 3 uses the overhead perspective data source for multiple overhead categories. Back to TOC
111. The PPS reports are deployed to the AllWorks Dashboard in the PPSDashboards document library in SharePoint. Back to TOC
112. These are all the PPS reports in the SharePoint PPSDashboards document library. Back to TOC
113. Page 1 report (Scorecards) of PPSDashboards in SharePoint. Back to TOC
114. Page 2 report (Materials) of PPSDashboards in SharePoint. Back to TOC
115. Page 3 report (Overhead) of PPSDashboards in SharePoint. Back to TOC
122. Created a team SharePoint site for collaboration. Back to TOC
