This document discusses event handling, logging, and configuration files in SQL Server Integration Services (SSIS). It provides an overview of SSIS and describes how to handle errors in the control flow and data flow. It also discusses different logging options in SSIS and the various event handlers that can be used. The document demonstrates how to set up auditing in an SSIS package by adding tasks to event handlers, capturing row counts, and storing metadata in variables. It notes some benefits of custom auditing over standard logging. Finally, it provides recommendations for optimizing long-running packages and key components to include in a custom auditing package.
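The custom-auditing idea above (capturing row counts and run metadata around each step) can be sketched outside SSIS as well. Below is a minimal Python sketch, with all names hypothetical, of an audit wrapper recording the kind of metadata an SSIS event handler would write to an audit table:

```python
import time

def run_with_audit(step_name, rows, transform, audit_log):
    """Run one ETL step and record audit metadata, similar in spirit to
    capturing row counts and timing in SSIS event handlers."""
    start = time.time()
    entry = {"step": step_name, "rows_in": len(rows), "status": "started"}
    try:
        result = transform(rows)
        entry.update(rows_out=len(result), status="succeeded")
        return result
    except Exception as exc:
        entry.update(rows_out=0, status="failed", error=str(exc))
        raise
    finally:
        # in a real package this entry would be written to an audit table
        entry["duration_s"] = round(time.time() - start, 3)
        audit_log.append(entry)

audit = []
clean = run_with_audit("drop_nulls", [1, None, 2],
                       lambda rs: [r for r in rs if r is not None], audit)
```

One advantage of this style over standard logging, as the summary notes, is that the audit record carries business-level facts (rows in, rows out) rather than only engine events.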
LDM Slides: Data Modeling for XML and JSON (DATAVERSITY)
Data modeling has traditionally focused on relational database systems. But in the age of the internet, technologies such as XML and JSON have evolved to provide structure and definition to “data in motion”. Have data modeling technologies evolved to support these technologies? Can we use traditional approaches to model data in XML and JSON? Or are new tools and methodologies required? Join this webinar to discuss:
- XML & JSON vs. Relational Database Modeling
- Techniques & Tools for Data Modeling for XML
- Techniques & Tools for Data Modeling for JSON
- Use Cases & Opportunities for XML and JSON Data Modeling
The document provides an overview of the SAS Data Governance Framework, which is designed to provide the depth, breadth and flexibility necessary to overcome common data governance failure points. It describes the key components of the framework, including corporate drivers, data governance objectives and principles, data management roles and processes, and technical solutions. The framework is presented as a comprehensive approach for establishing an effective and sustainable enterprise data governance program.
An introduction to the FAIR principles and a discussion of key issues that must be addressed to ensure data is findable, accessible, interoperable and re-usable. The session explored the role of the CDISC and DDI standards for addressing these issues.
Presented by Gareth Knight at the ADMIT Network conference, organised by the Association for Data Management in the Tropics, in Antwerp, Belgium on December 1st 2015.
This document addresses the protection of personal data in schools. It explains key concepts such as personal data, files, data processing, and data controllers and processors. It also describes data protection principles such as legitimacy, data quality, transparency, security, confidentiality, and cancellation. Finally, it details how schools process data, including what data they may collect, how it is collected, and to whom it may be communicated or published.
This document introduces the basic concepts of relational algebra, including the primitive operations of union, intersection, difference, and Cartesian product. It explains that operations are classified as unary or binary, and by how closely they resemble set theory. Finally, it describes the selection, projection, and join operations, indicating how each affects the schema and extension of the relations involved.
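The selection, projection, and (natural) join operations summarized above can be illustrated in a few lines of Python over relations represented as lists of dicts; this is a teaching sketch, not how a database engine implements them:

```python
def select(relation, pred):
    # selection: same schema, fewer tuples
    return [t for t in relation if pred(t)]

def project(relation, attrs):
    # projection: narrower schema; duplicates removed (set semantics)
    seen, out = set(), []
    for t in relation:
        row = tuple((a, t[a]) for a in attrs)
        if row not in seen:
            seen.add(row)
            out.append(dict(row))
    return out

def natural_join(r, s):
    # join: combine tuples that agree on all shared attributes
    common = set(r[0]) & set(s[0]) if r and s else set()
    return [{**t, **u} for t in r for u in s
            if all(t[a] == u[a] for a in common)]

# invented example relations
emp = [{"emp": "Ana", "dept": 1}, {"emp": "Luis", "dept": 2}]
dept = [{"dept": 1, "name": "Sales"}, {"dept": 2, "name": "IT"}]
joined = natural_join(emp, dept)
```

Note how selection keeps the schema but shrinks the extension, while projection changes the schema itself, matching the distinction the document draws.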
This document describes data integrity concepts in a relational database. It explains that data integrity guarantees the quality of stored data through rules such as domain, uniqueness, entity, and referential integrity. It also describes views as virtual relations defined by queries over the database tables.
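The integrity rules and views described above can be demonstrated concretely with SQLite from Python; the schema below is a made-up example, with one constraint per rule:

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("PRAGMA foreign_keys = ON")  # SQLite checks FKs only when enabled
conn.executescript("""
    CREATE TABLE dept (
        dept_id INTEGER PRIMARY KEY                  -- entity integrity
    );
    CREATE TABLE emp (
        emp_id  INTEGER PRIMARY KEY,                 -- entity integrity
        email   TEXT UNIQUE,                         -- uniqueness
        age     INTEGER CHECK (age >= 18),           -- domain integrity
        dept_id INTEGER REFERENCES dept(dept_id)     -- referential integrity
    );
    -- a view: a virtual relation defined by a query over the base tables
    CREATE VIEW emp_emails AS SELECT emp_id, email FROM emp;
    INSERT INTO dept VALUES (1);
    INSERT INTO emp  VALUES (1, 'ana@example.com', 30, 1);
""")

def violates(sql):
    """True if the statement is rejected by an integrity rule."""
    try:
        conn.execute(sql)
        return False
    except sqlite3.IntegrityError:
        return True
```

Each rule can then be exercised by attempting an insert that breaks it: an under-age row (domain), a duplicate email (uniqueness), or a reference to a nonexistent department (referential).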
SQL Server Integration Services (SSIS) is a platform for building extraction, transformation, and loading (ETL) packages and other data integration tasks. SSIS packages contain graphical tasks and workflows that can be developed using Visual Studio tools and debugged. Packages integrate various data sources, handle data flows between sources and destinations with transformations, and include features for logging, error handling, restarting failed executions, and configuring variables and parameters.
SQL Server Integration Services (SSIS) is a platform for building extract, transform, and load (ETL) packages and other data integration and workflow tasks. It includes graphical tools and wizards to design packages, as well as utilities to run, debug, and deploy packages. Key components of SSIS include control flow tasks, data flows, variables, logging, and support for transactions and restarting failed packages.
Retail Analytics, with Oracle Data Integrator 11G.
Points about ODI Objects, Interfaces, Variables, Packages, Scenarios, Load Plans, Scheduling.
Batch Scheduling with RA 14.2, UAF in 14.2, Error Management in RA 14.2
This document summarizes a business intelligence portfolio project for a simulated construction company. It includes details on an ETL solution built in SQL Server Integration Services to load data nightly from various sources into a SQL database. It also covers an OLAP cube with a partial snowflake structure created in SQL Server Analysis Services, including sample MDX queries and KPIs. Finally, it discusses reports deployed to SharePoint using SQL Server Reporting Services and PerformancePoint Services, including gauges, charts and dashboards. The overall goal was to build a BI solution to track, analyze and report on all aspects of the company's business using Microsoft SQL Server and SharePoint technologies.
Microsoft Business Intelligence Training in Mumbai (Unmesh Baile)
Vibrant Technologies is headquartered in Mumbai, India. We are an MSBI training provider in Navi Mumbai that gives students live projects, and we also provide corporate training. According to our students and corporate clients, we offer the best Microsoft Business Intelligence classes in Mumbai.
This document provides an overview and samples of a business intelligence project using SQL Server Integration Services (SSIS), SQL Server Analysis Services (SSAS), and SQL Server Reporting Services (SSRS). It includes descriptions of ETL packages in SSIS to load and transform data, a cube with dimensions and calculations in SSAS, and sample MDX queries and reports. The goals are to track, analyze, and report on facets of a simulated construction company.
SSIS Best Practices, Israel BI User Group (Itay Braun, sqlserver.co.il)
This document provides best practices and recommendations for SQL Server Integration Services (SSIS). It discusses topics such as logging package runtime information, establishing performance baselines, package configuration, lookup optimization, data profiling, resource utilization, and network optimization. The document also provides tips on narrowing data types, sorting data, using SQL for set operations, and change data capture functionality.
To Study ETL (Extract, Transform, Load) Tools, Especially SQL Server I... (Shahzad)
This document discusses SQL Server Integration Services (SSIS), an extract, transform, load (ETL) tool from Microsoft. It provides an overview of what ETL is and the typical steps involved, including extracting data from sources, transforming the data, and loading it into a destination. The document then describes the key components of SSIS, such as the SSIS designer, runtime engine, tasks, data flow engine, API/object model, and packages. It also lists some common uses of SSIS, such as merging data, populating data warehouses, cleaning data, and automating data loads.
The document discusses the use of scripting in SQL Server Integration Services (SSIS). It covers how scripts allow extending package functionality beyond default tasks. Specific topics covered include using scripts to manage packages programmatically via the object model or command line utilities, configuring script tasks, using the log method and handling events from within a script task, and creating a data transformation script component. Demos are provided for various scripting features.
The document provides an overview of SQL Server tracing and profiling tools. It discusses SQL Trace architecture, security and permissions for tracing, using SQL Server Profiler, saving and replaying traces, server-side tracing, and querying trace metadata. Key points covered include the SQL Trace architecture components, available tracing events and columns, permissions required for tracing, options for saving trace data, and system stored procedures for managing server-side traces.
The document discusses the extraction, transformation, and loading (ETL) process used in data warehousing. It describes how ETL tools extract data from operational systems, transform the data through cleansing and formatting, and load it into the data warehouse. Metadata is generated during the ETL process to document the data flow and mappings. The roles of different types of metadata are also outlined. Common ETL tools and their strengths and limitations are reviewed.
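The extract-transform-load flow with generated metadata, as described above, can be sketched in miniature in Python; the step names, mappings, and target table are invented for illustration:

```python
# Each step records metadata describing the mapping it applied,
# mirroring how ETL tools document data flow for the warehouse.
metadata = []

def extract(source_rows):
    metadata.append({"step": "extract", "source": "operational_system",
                     "rows": len(source_rows)})
    return list(source_rows)

def transform(rows):
    # cleanse (drop incomplete rows) and format (trim and uppercase codes)
    cleaned = [{"code": r["code"].strip().upper(), "amount": r["amount"]}
               for r in rows if r.get("code") and r.get("amount") is not None]
    metadata.append({"step": "transform",
                     "mapping": "code -> UPPER(TRIM(code))",
                     "rows": len(cleaned)})
    return cleaned

def load(rows, warehouse):
    warehouse.extend(rows)
    metadata.append({"step": "load", "target": "fact_sales", "rows": len(rows)})
    return warehouse

warehouse = []
load(transform(extract([{"code": " ab ", "amount": 10},
                        {"code": None, "amount": 5}])), warehouse)
```

The `metadata` list plays the role the document assigns to ETL metadata: it documents the data flow and the mappings applied at each stage.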
The document provides an introduction to stored procedures in SQL. Key points include:
- Stored procedures allow code to be executed faster than batches by pre-compiling the code.
- They centralize business logic and error handling routines for consistent implementation across users.
- Parameters can be passed into stored procedures to make them more flexible. Output parameters allow returning values.
- Best practices include adding comments, error handling, and using transactions for consistency across nested stored procedures.
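SQLite has no stored procedures, but the pattern the points above describe (parameterized, centralized business logic with error handling and a transaction) can be sketched in Python; the transfer example and its table are hypothetical:

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE accounts (id INTEGER PRIMARY KEY, balance INTEGER)")
conn.executemany("INSERT INTO accounts VALUES (?, ?)", [(1, 100), (2, 50)])

def transfer(conn, from_id, to_id, amount):
    """Centralized business logic with input parameters, error handling,
    and a transaction, which is the role a stored procedure plays on the
    server: every caller gets the same consistent implementation."""
    try:
        with conn:  # one transaction: both updates commit or neither does
            conn.execute("UPDATE accounts SET balance = balance - ? WHERE id = ?",
                         (amount, from_id))
            conn.execute("UPDATE accounts SET balance = balance + ? WHERE id = ?",
                         (amount, to_id))
        return True   # plays the part of an output parameter / return status
    except sqlite3.Error:
        return False

ok = transfer(conn, 1, 2, 30)
```

Wrapping both updates in one transaction is the "consistency across nested calls" point from the list: a partial transfer can never be committed.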
The document provides an introduction to stored procedures in SQL. Key points include:
- Stored procedures allow code to be executed as a batch after being compiled once, improving performance over executing individual SQL statements.
- Stored procedures can accept input parameters, return output parameters, and be used to enforce consistent implementation of business logic and error handling.
- Best practices for stored procedures include adding documentation, error handling, and using input/output parameters to make procedures more flexible and reusable.
This document summarizes MySQL's monitoring mechanisms and how they have evolved over time. It discusses tools like SHOW statements, INFORMATION_SCHEMA, slow/general query logs, and EXPLAIN that provided limited visibility in past versions. MySQL 5.5 introduced the Performance Schema framework for detailed instrumentation. Subsequent versions have expanded instrumentation to provide more developer-focused statistics on statements, stages, I/O, locks and more. New INFORMATION_SCHEMA tables in 5.6 provide additional InnoDB statistics on data dictionary, buffer pool, transactions and compression. The optimizer trace exposes query transformations. Enhanced EXPLAIN now supports more statement types and future improvements will provide a structured EXPLAIN output.
The 5.5 and 5.6 releases of MySQL introduce several new mechanisms that provide improved monitoring and performance tuning functionality. Examples are performance schemas, InnoDB metrics tables, optimizer trace, and extended explain functionality. This session outlines the vision for monitoring-related functionality in MySQL and presents an overview of the new mechanisms. It shows how these are integrated with MySQL management tools. Furthermore, it discusses how these mechanisms can be utilized by application developers, DBAs, and production engineers for tracking down performance issues and monitoring production systems.
The document describes a Business Intelligence project for AllWorks Inc. to load data from various sources into a SQL Server database using SSIS packages. It involves:
1) Creating a normalized data model to hold data from spreadsheets, XML files, and CSV files.
2) Developing SSIS packages to extract, transform, validate and load the data into tables.
3) Creating a master package to run the individual packages in order and ensure data dependencies are met.
4) Developing a database maintenance package to backup, shrink, and re-index the database after each load.
5) Scheduling the packages to run daily via a SQL Server Agent job.
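The master-package idea in steps 3 through 5 (running child packages in an order that respects data dependencies) can be sketched with Python's graphlib; the package names and dependency graph are invented, and a real master package would invoke the child packages rather than append to a list:

```python
from graphlib import TopologicalSorter

# child package -> packages it depends on
deps = {
    "load_dimensions": set(),
    "load_facts": {"load_dimensions"},
    "db_maintenance": {"load_facts"},   # backup/re-index runs after the load
}

executed = []

def run_package(name):
    # stand-in for invoking the child package (e.g. via dtexec)
    executed.append(name)

# static_order() yields a topological order, so every package runs
# only after everything it depends on has finished
for pkg in TopologicalSorter(deps).static_order():
    run_package(pkg)
```

A scheduler (step 5) then only needs to trigger this one master entry point daily.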
The document discusses various techniques for optimizing SQL Server performance, including handling index fragmentation, optimizing files and partitioning tables, effective use of SQL Profiler and Performance Monitor, a methodology for performance troubleshooting, and a 10-step process for performance optimization. Some key points covered are determining and resolving index fragmentation, partitioning tables across multiple file groups, capturing traces with SQL Profiler and Performance Monitor counters to diagnose issues, and ensuring proper indexing through query execution plans and the SQL Server tuning advisor.
The document provides 5 tips for successfully upgrading SQL Server Integration Services (SSIS) packages to SQL Server 2012:
1. Manually edit package configurations, especially connection strings, after upgrading with the upgrade wizard since configurations are not automatically updated.
2. Use the Project Conversion Wizard to convert packages to the new project deployment model in SQL Server 2012 for improved deployment and management.
3. Update Execute Package tasks to use project references rather than file references for calling child packages within the same project.
4. Parameterize the PackageName property of Execute Package tasks to dynamically configure which child package runs at runtime.
5. Convert package configurations to parameters when possible to take advantage of the improved configuration handling in the project deployment model.
This presentation illustrates DocIndex, InternetMiner and VisioDecompositer - my 3 proprietary test tools - and walks the user through how they are used effectively.
The tools are presented in the context of a Test Strategy, and the emphasis is on HOW the tools are used and the rationale behind the design of the tools.
View this presentation with SPEAKERS NOTES ON.
An overview of Power BI covering Power BI Desktop, Power BI Service, Power BI Report Server, and Power BI Mobile, all of which consume BI data from datasets and data models.
SSIS: Flow Tasks, Containers and Precedence Constraints (Kiki Noviandi)
SSIS components such as flow tasks, containers, and precedence constraints are important parts of conducting ETL processes. An explanation of the SSIS architecture, control flow, and data flow are the main topics of this presentation.
A KPI (key performance indicator) is part of data processing when designing Analysis Services solutions. This slide deck explains what a KPI is, how to create KPIs using SSAS, and how to display reports in SSRS.
Master Data Services (MDS) is a Microsoft platform that supports Master Data Management (MDM). This presentation explains the Master Data service, the deployment and installation of MDS, and the basic MDS model.
The document provides an overview of tasks in SQL Server Integration Services (SSIS), including the FTP task and Script task. It discusses the purposes and configuration of the FTP task for transferring files between local and remote locations. It also covers how the Script task allows custom code to perform functions not available in other SSIS tasks, and how to configure and write scripts for the Script task.
This document discusses cybercrime. In brief, it explains various types of cybercrime, such as unauthorized access, data forgery, and cyber espionage, as well as the laws and cases related to cybercrime in Indonesia.
The document summarizes topics that were covered in an SQL community meeting in December 2018, including tuning queries for performance, understanding execution plans, using performance monitoring tools, and troubleshooting queries. Key areas discussed were the SQL query processing steps, factors that affect performance like the buffer cache hit ratio, and methods for analyzing execution plans and data access operators like table scans and index seeks.
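Execution-plan analysis of table scans versus index access, as discussed above, can be demonstrated with SQLite's EXPLAIN QUERY PLAN (a lightweight analogue of a SQL Server execution plan); the table and index below are made up:

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE orders (id INTEGER, customer TEXT)")
conn.executemany("INSERT INTO orders VALUES (?, ?)",
                 [(i, f"c{i}") for i in range(100)])

def plan(sql):
    # each plan row is (id, parent, notused, detail); the detail
    # string describes the chosen access path
    return conn.execute("EXPLAIN QUERY PLAN " + sql).fetchone()[3]

before = plan("SELECT * FROM orders WHERE customer = 'c42'")  # full scan
conn.execute("CREATE INDEX ix_customer ON orders(customer)")
after = plan("SELECT * FROM orders WHERE customer = 'c42'")   # uses the index
```

The same predicate goes from a scan of every row to an index search once a suitable index exists, which is exactly the distinction a data access operator in a SQL Server plan makes.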
This document discusses in-memory database functionality in SQL Server including architecture, tables and indexes, stored procedures, restrictions, monitoring tools, concurrency control, and data management views. It covers creating in-memory enabled databases, table types, index types, updating statistics, and natively compiled stored procedures. The document also mentions analyzing, migrating, and reporting tools for reviewing in-memory databases.
A presentation of SQL Server performance tools such as Resource Governor and resource pools, and of monitoring SQL Server with Transact-SQL (sp_who, sys.dm_exec_sessions, etc.), delivered in Batam Center.
14th Edition of the International Conference on Computer Vision (ShulagnaSarkar2)
About the event
14th Edition of the International Conference on Computer Vision
These computer conferences are organized by the ScienceFather group. ScienceFather takes the privilege of inviting speakers, participants, students, delegates, and exhibitors from across the globe to its International Conference on Computer Vision, to be held in various beautiful cities of the world. The conferences are a discussion of common invention-related issues, where attendees trade information and share ideas and insight into advanced developments in science and inventions. New technology may create many materials and devices with a vast range of applications, such as in science, medicine, electronics, biomaterials, energy production, and consumer products.
Nominations are open!! Don't miss it.
Visit: computer.scifat.com
Award Nomination: https://x-i.me/ishnom
Conference Submission: https://x-i.me/anicon
For Enquiry: Computer@scifat.com
Malibou Pitch Deck for Its €3M Seed Round (sjcobrien)
French start-up Malibou raised a €3 million seed round to develop its payroll and human resources management platform for VSEs and SMEs. The financing round was led by investors Breega, Y Combinator, and FCVC.
Microservice Teams - How the Cloud Changes the Way We Work (Sven Peters)
A lot of technical challenges and complexity come with building a cloud-native and distributed architecture. The way we develop backend software has fundamentally changed in the last ten years. Managing a microservices architecture demands a lot of us to ensure observability and operational resiliency. But did you also change the way you run your development teams?
Sven will talk about Atlassian’s journey from a monolith to a multi-tenanted architecture and how it affected the way the engineering teams work. You will learn how we shifted to service ownership, moved to more autonomous teams (and its challenges), and established platform and enablement teams.
Everything You Need to Know About X-Sign: The eSign Functionality of XfilesPr...XfilesPro
Wondering how X-Sign gained popularity in a quick time span? This eSign functionality of XfilesPro DocuPrime has many advancements to offer for Salesforce users. Explore them now!
8 Best Automated Android App Testing Tool and Framework in 2024.pdfkalichargn70th171
Regarding mobile operating systems, two major players dominate our thoughts: Android and iPhone. With Android leading the market, software development companies are focused on delivering apps compatible with this OS. Ensuring an app's functionality across various Android devices, OS versions, and hardware specifications is critical, making Android app testing essential.
Unveiling the Advantages of Agile Software Development.pdfbrainerhub1
Learn about Agile Software Development's advantages. Simplify your workflow to spur quicker innovation. Jump right in! We have also discussed the advantages.
Preparing Non - Technical Founders for Engaging a Tech AgencyISH Technologies
Preparing non-technical founders before engaging a tech agency is crucial for the success of their projects. It starts with clearly defining their vision and goals, conducting thorough market research, and gaining a basic understanding of relevant technologies. Setting realistic expectations and preparing a detailed project brief are essential steps. Founders should select a tech agency with a proven track record and establish clear communication channels. Additionally, addressing legal and contractual considerations and planning for post-launch support are vital to ensure a smooth and successful collaboration. This preparation empowers non-technical founders to effectively communicate their needs and work seamlessly with their chosen tech agency.Visit our site to get more details about this. Contact us today www.ishtechnologies.com.au
Flutter is a popular open source, cross-platform framework developed by Google. In this webinar we'll explore Flutter and its architecture, delve into the Flutter Embedder and Flutter’s Dart language, discover how to leverage Flutter for embedded device development, learn about Automotive Grade Linux (AGL) and its consortium and understand the rationale behind AGL's choice of Flutter for next-gen IVI systems. Don’t miss this opportunity to discover whether Flutter is right for your project.
Mobile App Development Company In Noida | Drona InfotechDrona Infotech
Drona Infotech is a premier mobile app development company in Noida, providing cutting-edge solutions for businesses.
Visit Us For : https://www.dronainfotech.com/mobile-application-development/
Top Benefits of Using Salesforce Healthcare CRM for Patient Management.pdfVALiNTRY360
Salesforce Healthcare CRM, implemented by VALiNTRY360, revolutionizes patient management by enhancing patient engagement, streamlining administrative processes, and improving care coordination. Its advanced analytics, robust security, and seamless integration with telehealth services ensure that healthcare providers can deliver personalized, efficient, and secure patient care. By automating routine tasks and providing actionable insights, Salesforce Healthcare CRM enables healthcare providers to focus on delivering high-quality care, leading to better patient outcomes and higher satisfaction. VALiNTRY360's expertise ensures a tailored solution that meets the unique needs of any healthcare practice, from small clinics to large hospital systems.
For more info visit us https://valintry360.com/solutions/health-life-sciences
Liberarsi dai framework con i Web Component.pptxMassimo Artizzu
In Italian
Presentazione sulle feature e l'utilizzo dei Web Component nell sviluppo di pagine e applicazioni web. Racconto delle ragioni storiche dell'avvento dei Web Component. Evidenziazione dei vantaggi e delle sfide poste, indicazione delle best practices, con particolare accento sulla possibilità di usare web component per facilitare la migrazione delle proprie applicazioni verso nuovi stack tecnologici.
UI5con 2024 - Bring Your Own Design SystemPeter Muessig
How do you combine the OpenUI5/SAPUI5 programming model with a design system that makes its controls available as Web Components? Since OpenUI5/SAPUI5 1.120, the framework supports the integration of any Web Components. This makes it possible, for example, to natively embed own Web Components of your design system which are created with Stencil. The integration embeds the Web Components in a way that they can be used naturally in XMLViews, like with standard UI5 controls, and can be bound with data binding. Learn how you can also make use of the Web Components base class in OpenUI5/SAPUI5 to also integrate your Web Components and get inspired by the solution to generate a custom UI5 library providing the Web Components control wrappers for the native ones.
WWDC 2024 Keynote Review: For CocoaCoders AustinPatrick Weigel
Overview of WWDC 2024 Keynote Address.
Covers: Apple Intelligence, iOS18, macOS Sequoia, iPadOS, watchOS, visionOS, and Apple TV+.
Understandable dialogue on Apple TV+
On-device app controlling AI.
Access to ChatGPT with a guest appearance by Chief Data Thief Sam Altman!
App Locking! iPhone Mirroring! And a Calculator!!
Hand Rolled Applicative User ValidationCode KataPhilip Schwarz
Could you use a simple piece of Scala validation code (granted, a very simplistic one too!) that you can rewrite, now and again, to refresh your basic understanding of Applicative operators <*>, <*, *>?
The goal is not to write perfect code showcasing validation, but rather, to provide a small, rough-and ready exercise to reinforce your muscle-memory.
Despite its grandiose-sounding title, this deck consists of just three slides showing the Scala 3 code to be rewritten whenever the details of the operators begin to fade away.
The code is my rough and ready translation of a Haskell user-validation program found in a book called Finding Success (and Failure) in Haskell - Fall in love with applicative functors.
The Key to Digital Success_ A Comprehensive Guide to Continuous Testing Integ...kalichargn70th171
In today's business landscape, digital integration is ubiquitous, demanding swift innovation as a necessity rather than a luxury. In a fiercely competitive market with heightened customer expectations, the timely launch of flawless digital products is crucial for both acquisition and retention—any delay risks ceding market share to competitors.
The Key to Digital Success_ A Comprehensive Guide to Continuous Testing Integ...
SSIS Event Handler
1. SSIS: EVENT HANDLER, LOGGING AND CONFIGURATION FILE
KIKI NOVIANDI – DATA PLATFORM MVP
2. ABOUT ME
My name: Kiki Rizki Noviandi
Microsoft Data Platform MVP since 2006
Founder, SQL Server Indonesia User Group community
Mailing list: sqlserver-indo@yahoogroups.com
https://www.facebook.com/groups/sqlserverindonesia
http://www.kwad5.com
https://mvp.microsoft.com/en-us/PublicProfile/33869?fullName=Kiki%20Rizki%20Noviandi
5. SQL Server Integration Services (SSIS)
ETL = Extract – Transform – Load
- Extract: get the data from the source system as efficiently as possible
- Transform: perform calculations on the data
- Load: load the data into the target storage
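The three ETL stages above can be sketched as plain functions; this is an illustrative Python outline of the pattern, not SSIS code, and the function and field names are invented for the example.

```python
def extract(source):
    """Extract: get the data from the source system."""
    return list(source)

def transform(rows):
    """Transform: perform calculations on the data."""
    return [{"id": r["id"], "total": r["qty"] * r["price"]} for r in rows]

def load(rows, target):
    """Load: write the data into the target storage; return rows loaded."""
    target.extend(rows)
    return len(rows)

source = [{"id": 1, "qty": 2, "price": 5.0}, {"id": 2, "qty": 1, "price": 3.5}]
target = []
loaded = load(transform(extract(source)), target)
print(loaded)               # 2
print(target[0]["total"])   # 10.0
```

In SSIS these stages correspond to source adapters, transformations, and destination adapters inside a data flow task.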
6. Error Handling in SSIS
Package task failed! Common reasons:
- Failure of an ancestor control
- Truncation, or a source/destination connection issue
- Conversion failure
- Issues from migrating files and data
- Package errors caused by OS-level privileges
- Other failure reasons
7. Error Handling in SSIS
Control flow: add a failure precedence constraint that redirects the workflow to an alternate path.
Data flow: send the row out to an error path by configuring the error output of the source/destination/transformation to redirect error rows, and save them to review later.
Event handler: handle the OnError event in a separate window; write a custom script, or simply send an email to notify the team of the error.
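The data-flow idea above (good rows continue, failing rows are redirected to an error output for later review) can be illustrated in plain Python. This is a sketch of the pattern only; the function name and the string-to-int "conversion" step are invented for the example.

```python
def run_data_flow(rows):
    """Route rows that fail conversion to an error output; pass the rest through."""
    good, errors = [], []
    for row in rows:
        try:
            good.append(int(row))            # the "conversion" step
        except ValueError as exc:
            errors.append((row, str(exc)))   # redirect the row to the error output
    return good, errors

good, errors = run_data_flow(["1", "2", "oops", "4"])
print(good)         # [1, 2, 4]
print(len(errors))  # 1
```

The saved error rows play the role of an SSIS error-output destination that you inspect after the run.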
8. Logging in SSIS: Log providers
- Text File log provider: writes log entries to ASCII text files in comma-separated value (CSV) format
- SQL Server Profiler log provider: writes traces that you can view using SQL Server Profiler (.trc)
- SQL Server log provider: writes log entries to the sysssislog table in a SQL Server database
- Windows Event log provider: writes entries to the Application log in the Windows Event log on the local computer
- XML File log provider: writes log files to an XML file (.xml)
9. Event Handling in SSIS
Integration Services packages are event-driven: we can specify routines to execute when a particular event occurs, such as the completion of a task or an error raised during task execution.
- OnError: generated as the result of an error condition
- OnPreValidate: fired before the validation process starts
- OnQueryCancel: fired when the user clicks Cancel, or polled during execution to determine whether an executable should stop running
- OnTaskFailed: signals the failure of a task; typically follows the OnError event
- OnPreExecute: indicates that an executable component is about to be launched
- OnPostExecute: fires after an executable component finishes running
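The event-driven model above can be sketched as a simple dispatch table: handlers are registered per event name and fired when the runtime raises that event. The event names mirror the SSIS ones; everything else (function names, the task name) is invented for the illustration.

```python
handlers = {}

def on(event, func):
    """Register a handler for an event name."""
    handlers.setdefault(event, []).append(func)

def raise_event(event, **info):
    """Fire all handlers registered for an event."""
    for func in handlers.get(event, []):
        func(**info)

log = []
on("OnPreExecute", lambda **i: log.append(f"starting {i['task']}"))
on("OnError", lambda **i: log.append(f"error in {i['task']}: {i['message']}"))
on("OnPostExecute", lambda **i: log.append(f"finished {i['task']}"))

raise_event("OnPreExecute", task="Load Staging")
raise_event("OnError", task="Load Staging", message="conversion failed")
raise_event("OnPostExecute", task="Load Staging")
print(log)
```

In SSIS the runtime raises these events for you; you attach tasks to the event handler tabs instead of registering callbacks.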
10. SSIS Auditing
- Add task(s) to the event handlers of the package.
- Select auditing for the entire package or for a specific task.
- Select events such as OnError, OnPostExecute, OnVariableValueChanged.
- Inside every data flow task, add Row Count components after the source and before the target to track extracted and loaded row counts.
- Add package-scoped variables to store the row count for each data flow.
- Add variables in the OnPostExecute event handler scope to store information about the data flow source/target (e.g. query, table name).
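The row-count steps above can be sketched as counters placed after the source and before the target, writing into package-level "variables" (here, a plain dict). The variable names follow the audit table in the demo later in the deck; the rest is illustrative Python, not SSIS components.

```python
variables = {"ExtractRowCnt": 0, "InsertRowCnt": 0}

def row_count(rows, variable):
    """Store len(rows) in a package variable and pass the rows through unchanged,
    like the SSIS Row Count component."""
    variables[variable] = len(rows)
    return rows

source_rows = ["a", "b", "c", "bad"]
rows = row_count(source_rows, "ExtractRowCnt")   # counter after the source
rows = [r for r in rows if r != "bad"]           # a transform drops one row
rows = row_count(rows, "InsertRowCnt")           # counter before the target
print(variables)   # {'ExtractRowCnt': 4, 'InsertRowCnt': 3}
```

Comparing the two counters after each run is what lets the audit answer "how many records were written or changed?"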
11. SSIS Auditing: helps to answer the following questions
- Which package was run, and for how long?
- Who owns the package? Who modified it?
- When was the package executed?
- What kind of data, and how many records, were written or changed by the ETL?
- What kind of errors, and how many, occurred?
12. Benefits that custom auditing and logging can bring to your ETL process
- Help you provide regulatory compliance
- Provide a deep understanding of database activity and additional insight into data anomalies that may occur
- Can help answer important questions like, "When was that row last updated?"
- Help you identify specific data for targeted rollbacks
13. Logging vs Custom Auditing

Logging:
- Captures metadata: information about the package execution itself
- Errors encountered, package execution time, data bytes, data flow buffer details, machine name, package/task name
- Lets you choose the log providers and their location
- Provides limited information on the package

Custom Auditing:
- Captures information about the data, along with package metadata
- Row counts of extracts, inserts, updates, deletes and errors, plus the status of the changed data and of package execution
- Uses Execute SQL tasks to define variables, bind parameters, and assign values to the parameters
- Designed for DBAs/users who can query and request more information about the data
14. Question
If Change Data Capture (CDC) reads and tracks all historical data and net changes from the SQL Server transaction logs, why not use CDC for auditing?

15. Answer: there are downsides
1. The amount of history data can grow huge, fast.
2. CDC does not return all the information about the changes you might need, e.g. who made the change, when, and how (when a record is deleted or updated).
3. Delays are possible: the history data takes some time to catch up, because it is based on the transaction logs and the operation is asynchronous.
4. It depends on SQL Server Agent: if the Agent is not running or crashes, no history is tracked.
16. Long-running package? How would you optimize package execution?
- Run SSIS tasks in parallel.
- For incremental loads, use an Execute SQL task instead of the OLE DB Command transformation to process updates and new inserts.
- Avoid processing redundant columns in the data flow task.
- Keep an eye on buffers and the execution trees.
- Avoid using checkpoints while auditing an SSIS package: they cannot store variables of type Object and cannot integrate with (and are most often ignored by) event handlers.
- Use Lookup and Conditional Split to customize the SCD workflow.
- Enable error handling and logging on package failure.
17. Key components in SSIS package custom auditing
- ETL: data warehouse tables (staging, dimensions, facts, data marts)
- Slowly Changing Dimensions, Type 1/2
- Extract of metadata and row counts (DML operations)
- Parent-package configuration
- OnError event handler to capture the error message
18. DEMO
An audit table is structured by defining the components with the required information about the metadata and the transactional records.
- AuditKey: a globally unique ID assigned to every execution of an ETL package (or packages) in the target table; usually an auto-identity integer starting from 1
- ParentAuditKey: surrogate ID assigned to the execution of child packages as metadata, inherited from the audit key of the master package; the batch/load ID mapped to each load/update process
- PkgName: the name of every corresponding package executed, including the master package and child packages
- PkgID: internal GUID of every SSIS package
- ExecStartDT: start time of the package execution
19. Demo (contd.)
- ExecStopDT: end time of the package execution
- TableName: the table name when the package executes to define or populate a table
- ExecutionInstanceGUID: the globally unique ID for every process, generated by SSIS
- ExtractRowCnt: count of records extracted from the source file
- InsertRowCnt: count of records inserted into staging and dimension tables by the ETL process
- UpdateRowCnt: count of updated records, especially in dimension tables that use SCD Type II functionality
20. Demo (contd.)
- ErrorRowCnt: count of records that were erroneous or not processed by the ETL
- TableInitialRowCnt: number of records initially present in a staging or dimension table
- TableFinalRowCnt: total number of records in a staging or dimension table after inserts, updates, or deletes in the execution process
- DeleteRowCnt: number of records deleted in the ETL process
- SuccessfullyProcessingInd: status of execution of each ETL run; set to 'Y' on success, default 'N'
21. Demo (contd.): Workflow – Master package (capturing metadata)
- Add an Execute SQL task that checks the number of rows affected and inserts a temporary dummy row at the beginning of the audit table.
- Add AuditKey and ParentAuditKey variables, and an Execute SQL task that stores the highest value of the audit key in the parameter.
- Add an Execute SQL task that populates the metadata in the audit table: in the task editor, connect to the target database and write a T-SQL query to insert the metadata into the defined parameters, including the audit and parent audit keys.
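The master-package steps above can be simulated end to end with a minimal audit table. This sketch uses an in-memory SQLite database in place of the SQL Server target; the column names follow the demo's audit table, but the reduced schema and package name are assumptions for the example.

```python
import sqlite3

# In-memory stand-in for the SQL Server target database.
conn = sqlite3.connect(":memory:")
conn.execute("""
    CREATE TABLE Audit (
        AuditKey INTEGER PRIMARY KEY AUTOINCREMENT,
        PkgName TEXT,
        ExecStartDT TEXT,
        ExecStopDT TEXT,
        SuccessfullyProcessingInd TEXT DEFAULT 'N'
    )""")

# Step 1: insert a temporary dummy row at the start of execution.
conn.execute(
    "INSERT INTO Audit (PkgName, ExecStartDT) VALUES (?, datetime('now'))",
    ("MasterPackage",))

# Step 2: store the highest AuditKey in a package variable.
audit_key = conn.execute("SELECT MAX(AuditKey) FROM Audit").fetchone()[0]

# Step 3 (end of execution): update the metadata - stop time and success flag.
conn.execute(
    """UPDATE Audit
       SET ExecStopDT = datetime('now'), SuccessfullyProcessingInd = 'Y'
       WHERE AuditKey = ?""", (audit_key,))

row = conn.execute(
    "SELECT PkgName, SuccessfullyProcessingInd FROM Audit WHERE AuditKey = ?",
    (audit_key,)).fetchone()
print(row)   # ('MasterPackage', 'Y')
```

In the real package, each of these steps is an Execute SQL task with parameter bindings against the AuditKey and ParentAuditKey variables.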
22. Demo (contd.): Workflow – Master package (calling child packages in the data warehouse)
- Add an Execute Package Task to call a child package (e.g. the audit package of a staging table), which processes the loading and auditing of the first staging table.
- Add an Execute Package Task to call a child package (e.g. the audit package of a dimension table), which processes the loading and auditing of the first dimension table.
- Update the metadata in the audit table, specifically the end time and success status of the execution process.
23. TAKEAWAYS
- SQL Server Integration Services is an exceptionally high-performance integration and transformation tool.
- Customize auditing using Execute SQL tasks, row counts, parameters, and system package variables to capture transactional information and metadata (DML operations).
- If implementing SCD Type 2, consider an alternative to the SCD transform component to preserve historical records and counts in dimension tables.
- Implement error-capturing strategies in data flow and control flow tasks.
- Unless required, limit the use of event handlers to the OnError and/or OnTaskFailed events, as they carry a large I/O overhead and can slow package performance dramatically.
- Use the Script component and Script task to customize error information at the package level and in individual task flows.
Business Intelligence Development Studio
Control Flow Overview
Connection Managers
Using the Execute SQL Task
Using the Script Task
Working with Variables
Working with Precedence Constraints
Using Loop Containers
Logging and Error Handling