New modeling option in SQL 2012 Analysis Services – Tabular Server
•BISM Vision
•Table-like modeling
•Finding Remote of James Bond Car
•ABCD – Anybody Can Dance
The document discusses harnessing the power of SQL Server columnstore indexes and Analysis Services ROLAP. It finds that combining clustered columnstore indexes with ROLAP in Analysis Services provides very fast performance for aggregates and distinct counts on large datasets of over 1 billion records, returning results within seconds. It recommends settings like enabling ROLAP distinct counts at the data source and maintaining statistics to optimize query plans when using this solution.
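The columnstore-plus-ROLAP pattern summarized above can be sketched in T-SQL. This is a minimal illustration with a hypothetical fact table name, not the document's own script:

```sql
-- Hypothetical fact table; all names here are illustrative only.
-- A clustered columnstore index stores the fact data column-wise,
-- which is what makes ROLAP aggregate queries fast at billion-row scale.
CREATE CLUSTERED COLUMNSTORE INDEX cci_FactSales
    ON dbo.FactSales;

-- The document recommends maintaining statistics so the optimizer
-- chooses good plans for the SQL that SSAS generates in ROLAP mode.
UPDATE STATISTICS dbo.FactSales WITH FULLSCAN;

-- A typical aggregate/distinct-count query of the shape SSAS would
-- push down to the relational engine:
SELECT OrderDateKey,
       COUNT(DISTINCT CustomerKey) AS DistinctCustomers
FROM dbo.FactSales
GROUP BY OrderDateKey;
```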
Karan Gulati has over 5 years of experience as a SQL Server Analysis Services expert at Microsoft. In this presentation, he provides an overview of key data warehousing and OLAP concepts, including: defining a data warehouse and why OLAP is used; the components of a cube like measures, dimensions, and schemas; and slowly changing dimension types like Type 1, 2, and 3. He explains these concepts at a high level to help attendees understand the terminology in the SQL and data warehousing fields.
Building a SSAS Tabular Model Database – Code Mastery
This document provides an overview and agenda for a presentation on creating a tabular model using SQL Server 2012 Analysis Services. It discusses the different types of models in SSAS, the differences between multidimensional and tabular models, and demonstrates creating a basic tabular model using the AdventureWorks sample database. The presentation covers basics of SSAS, using Visual Studio for multidimensional and PowerPivot models, key features of tabular models like DirectQuery and xVelocity, and concludes with a Q&A section.
This document discusses optimizing Power BI and Analysis Services (AAS) data models. It outlines the required tools for reviewing and optimizing models, including DAX Studio, Vertipaq Analyzer, Tabular Editor, and Best Practice Analyzer. The document provides steps for initial review of table sizes, column sizes and cardinality, and DAX expressions. It also lists best practices for data model design like star schema, only including needed data, and simplifying reporting and naming. Resources for learning more about optimization and troubleshooting performance are also included.
This document discusses SSAS tabular models and compares them to multidimensional models. Tabular models offer shorter development times than multidimensional models. While tabular models have some limitations compared to multidimensional models, they provide high performance through in-memory column-based data storage and up to 10x data compression. The document provides a detailed comparison of the features and capabilities of tabular and multidimensional models. It also discusses considerations for choosing between the two types of models based on factors like data complexity, user requirements, and hardware.
This document discusses several topics related to SQL Server Analysis Services (SSAS) including:
- Best practices for SSAS design including dimensions, measures, partitioning and security.
- New features in the upcoming "Denali" release including the BI semantic model and PowerPivot integration.
- Performance tuning techniques such as distinct count optimization and scale out queries.
- Tools for analyzing SSAS queries and cube design best practices.
- Design considerations for large enterprise solutions including partitioning, hardware sizing and concurrency management.
This document provides an overview of Microsoft SQL Server Analysis Services (SSAS) tabular models. It discusses the different modes in SSAS including multidimensional and tabular. The key differences between multidimensional and tabular models are described. Tabular models are better suited for tools like Power View, Power BI, and SQL Server Reporting Services. The document demonstrates how to build a sample tabular model in SQL Server Management Studio and Analysis Services including adding data, measures, columns, and other model elements. New features of tabular models in SQL Server 2017 like the user interface and DAX functions are also summarized.
Multidimensional or tabular points to consider – Deepak Kumar
- Tabular cube solutions were created for self-service BI, low-latency real-time analytics, and compatibility with Power BI and competitors' in-memory solutions.
- Tabular uses columnar storage and can be deployed in either an in-memory or a DirectQuery mode. Unlike MOLAP, it does not pre-aggregate data; data is held compressed in memory and persisted to disk for backup.
- Tabular differs from multidimensional models in that it uses only tables rather than explicit dimensions and facts, allows a single model per database, and a model can pull from multiple data sources. Some functionality, such as ragged hierarchies, is also missing.
Azure Data Factory Data Flows Training v005 – Mark Kromer
Mapping Data Flow is a new feature of Azure Data Factory that allows building data transformations in a visual interface without code. It provides a serverless, scale-out transformation engine for processing big data with unstructured requirements. Mapping Data Flows can be authored and designed visually, with transformations, expressions, and results previews, and then operationalized with Data Factory scheduling, monitoring, and control flow.
Real-world BISM in SQL Server 2012 SSAS – Lynn Langit
The document discusses the Business Intelligence Semantic Model (BISM) in SQL Server Analysis Services. It provides an overview of what BISM is, why it should be used, how to get started with it, and how to create and enhance BISM models. It also includes demonstrations of creating a BISM model in SQL Server Data Tools and deploying it to Analysis Services.
Azure Data Factory Data Wrangling with Power Query – Mark Kromer
Azure Data Factory now allows users to perform data wrangling tasks through Power Query activities, translating M scripts into ADF data flow scripts executed on Apache Spark. This enables code-free data exploration, preparation, and operationalization of Power Query workflows within ADF pipelines. Examples of use cases include data engineers building ETL processes or analysts operationalizing existing queries to prepare data for modeling, with the goal of providing a data-first approach to building data flows and pipelines in ADF.
ADF Mapping Data Flows Training Slides V1 – Mark Kromer
Mapping Data Flow is a new feature of Azure Data Factory that allows users to build data transformations in a visual interface without code. It provides a serverless, scale-out transformation engine to transform data at scale in the cloud in a resilient manner for big data scenarios involving unstructured data. Mapping Data Flows can be operationalized with Azure Data Factory's scheduling, control flow, and monitoring capabilities.
This document provides an overview of advanced analytics using R and SQL. It discusses how R is used widely for analytics and is growing in popularity. Microsoft R Open and Microsoft R Server are introduced as tools for scalable enterprise analytics that integrate with SQL Server. Key capabilities covered include running R scripts directly in the database using SQL Server 2016 extensions, calling stored procedures from applications that execute R code, and bringing compute to data with in-database analytics for performance and scale. Demos and sample programs are referenced to illustrate capabilities of Open Source R and Microsoft R tools.
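The in-database R capability mentioned above is exposed in SQL Server 2016+ through the `sp_execute_external_script` procedure. A minimal sketch, assuming a hypothetical dbo.FactSales table and that R Services is installed with external scripts enabled:

```sql
-- Requires SQL Server 2016+ with R Services and
-- 'external scripts enabled' turned on via sp_configure.
EXEC sp_execute_external_script
    @language = N'R',
    @script = N'
        # R runs next to the data; InputDataSet and OutputDataSet are
        # the default data-frame names for input and output.
        OutputDataSet <- data.frame(avg_sales = mean(InputDataSet$SalesAmount))',
    @input_data_1 = N'SELECT SalesAmount FROM dbo.FactSales'  -- hypothetical table
WITH RESULT SETS ((avg_sales FLOAT));
```

This is the "bring compute to the data" pattern the summary refers to: the R script executes inside the database engine, avoiding data movement to a client.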
Azure Data Factory Mapping Data Flow allows users to stage and transform data in Azure during a limited preview period beginning in February 2019. Data can be staged from Azure Data Lake Storage, Blob Storage, or SQL databases/data warehouses, then transformed using visual data flows before being landed to staging areas in Azure like ADLS, Blob Storage, or SQL databases. For information, contact adfdataflowext@microsoft.com or visit http://aka.ms/dataflowpreview.
Mapping Data Flow is a new feature of Azure Data Factory that allows users to build data transformations in a visual interface without code. It provides a serverless, scale-out transformation engine for processing big data with unstructured requirements. Mapping Data Flows can be operationalized with Data Factory's scheduling, control flow, and monitoring capabilities.
Microsoft Data Integration Pipelines: Azure Data Factory and SSIS – Mark Kromer
The document discusses tools for building ETL pipelines to consume hybrid data sources and load data into analytics systems at scale. It describes how Azure Data Factory and SQL Server Integration Services can be used to automate pipelines that extract, transform, and load data from both on-premises and cloud data stores into data warehouses and data lakes for analytics. Specific patterns shown include analyzing blog comments, sentiment analysis with machine learning, and loading a modern data warehouse.
Azure Data Factory Data Flow Preview December 2019 – Mark Kromer
Visual Data Flow in Azure Data Factory provides a limited preview of data flows that allow users to visually design transformations on data. It features implicit staging of data in data lakes, explicit selection of data sources and transformations through a toolbox interface, and setting of properties for transformation steps and destination connectors. The preview is intended to get early feedback to help shape the future of Visual Data Flow.
Microsoft Azure Data Factory Hands-On Lab Overview Slides – Mark Kromer
This document outlines modules for a lab on moving data to Azure using Azure Data Factory. The modules will deploy necessary Azure resources, lift and shift an existing SSIS package to Azure, rebuild ETL processes in ADF, enhance data with cloud services, transform and merge data with ADF and HDInsight, load data into a data warehouse with ADF, schedule ADF pipelines, monitor ADF, and verify loaded data. Technologies used include PowerShell, Azure SQL, Blob Storage, Data Factory, SQL DW, Logic Apps, HDInsight, and Office 365.
Data quality patterns in the cloud with ADF – Mark Kromer
Azure Data Factory can be used to build modern data warehouse patterns with Azure SQL Data Warehouse. It allows extracting and transforming relational data from databases and loading it into Azure SQL Data Warehouse tables optimized for analytics. Data flows in Azure Data Factory can also clean and join disparate data from Azure Storage, Data Lake Store, and other data sources for loading into the data warehouse. This provides simple and productive ETL capabilities in the cloud at any scale.
A short introduction to the different options for ETL and ELT in the cloud with Microsoft Azure. This is a small accompanying set of slides for my presentations and blogs on this topic.
Azure Analysis Services is a cloud service for hosting and consuming tabular analysis models. It allows users to create an Azure Analysis Services server in the cloud, deploy models to that server, and then consume the models. The presenter provided an overview of Azure Analysis Services and demonstrated how to create a server, deploy models, and consume them.
Microsoft Azure Data Factory Data Flow Scenarios – Mark Kromer
Visual Data Flow in Azure Data Factory provides a limited preview of data flows that allow users to visually design transformations on data. It features implicit staging of data in data lakes, explicit selection of data sources and transformations through a toolbox interface, and setting of properties for transformation steps and destination connectors. The preview is intended to get early feedback to help shape the future of visual data flows in Azure Data Factory.
SQL Saturday Redmond 2019: ETL Patterns in the Cloud – Mark Kromer
This document discusses ETL patterns in the cloud using Azure Data Factory. It covers topics like ETL vs ELT, scaling ETL in the cloud, handling flexible schemas, and using ADF for orchestration. Key points include staging data in low-cost storage before processing, using ADF's integration runtime to process data both on-premises and in the cloud, and building resilient data flows that can handle schema drift.
Azure Data Factory Data Flow Limited Preview for January 2019 – Mark Kromer
Azure Data Factory introduces Visual Data Flow, a limited preview feature that allows users to visually design data flows without writing code. It provides a drag-and-drop interface for users to select data sources, place transformations on imported data, and choose destinations for transformed data. The flows are run on Azure and default to using Azure Data Lake Storage for staging transformed data, though users can optionally configure other staging options. The feature supports common data formats and transformations like sorting, merging, joining, and lookups.
Diplomado Técnico SQL Server 2012 – Sesión 5/8 – John Bulla
This document provides an overview of a SQL Server 2012 seminar on the semantic model. It introduces Jesús Gil, the seminar leader and SQL Server MVP. It then discusses the semantic model in SQL Server 2012, how it can be used across various BI tools and scenarios from personal to team to organizational BI. It covers considerations for building a semantic model, exploiting the model across various end user experiences, and resources for further information.
SSAS, MDX, Cube Understanding, Browsing and Tools Information – Vishal Pawar
Why we need an SSAS Cube
What is an SSAS Cube
Ways to access a Cube
What are Dimensions and Attributes
QHP Dimensions and Attributes
Process Flow and QHP Cube Browsing
MDX Basics
MDX Tools
Comparison of Queries Written in T-SQL and MDX, with Constructs
MDX – How to add a WHERE condition
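The T-SQL versus MDX comparison in the outline above can be illustrated with a pair of roughly equivalent queries. These use hypothetical AdventureWorks-style names, not the deck's own examples:

```sql
-- T-SQL: aggregate sales by product category from relational tables.
SELECT p.Category,
       SUM(f.SalesAmount) AS Sales
FROM dbo.FactSales AS f
JOIN dbo.DimProduct AS p ON p.ProductKey = f.ProductKey
GROUP BY p.Category;
```

```mdx
-- MDX: the same question against a cube. Aggregation is implicit in
-- the measure, and the WHERE clause acts as a slicer on the result
-- rather than a row filter.
SELECT [Measures].[Sales Amount] ON COLUMNS,
       [Product].[Category].MEMBERS ON ROWS
FROM [Sales Cube]
WHERE ([Date].[Calendar Year].&[2012]);
```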
Multidimensional or tabular points to considerDeepak Kumar
- Tabular cube solutions were created for self-service BI, low-latency real-time analytics, and compatibility with Power BI and competitors' in-memory solutions.
- Tabular uses columnar storage and can be deployed either with an in-memory or direct query model. Unlike MOLAP, it does not pre-aggregate data and stores data compressed on disk for backup.
- Tabular differs from multidimensional models in that it only uses tables instead of dimensions and facts, has a single model per database, and the model can pull from multiple data sources. Some functionality like ragged hierarchies is also missing.
Azure Data Factory Data Flows Training v005Mark Kromer
Mapping Data Flow is a new feature of Azure Data Factory that allows building data transformations in a visual interface without code. It provides a serverless, scale-out transformation engine for processing big data with unstructured requirements. Mapping Data Flows can be authored and designed visually, with transformations, expressions, and results previews, and then operationalized with Data Factory scheduling, monitoring, and control flow.
Real-world BISM in SQL Server 2012 SSASLynn Langit
The document discusses the Business Intelligence Semantic Model (BISM) in SQL Server Analysis Services. It provides an overview of what BISM is, why it should be used, how to get started with it, and how to create and enhance BISM models. It also includes demonstrations of creating a BISM model in SQL Server Data Tools and deploying it to Analysis Services.
Azure Data Factory Data Wrangling with Power QueryMark Kromer
Azure Data Factory now allows users to perform data wrangling tasks through Power Query activities, translating M scripts into ADF data flow scripts executed on Apache Spark. This enables code-free data exploration, preparation, and operationalization of Power Query workflows within ADF pipelines. Examples of use cases include data engineers building ETL processes or analysts operationalizing existing queries to prepare data for modeling, with the goal of providing a data-first approach to building data flows and pipelines in ADF.
ADF Mapping Data Flows Training Slides V1Mark Kromer
Mapping Data Flow is a new feature of Azure Data Factory that allows users to build data transformations in a visual interface without code. It provides a serverless, scale-out transformation engine to transform data at scale in the cloud in a resilient manner for big data scenarios involving unstructured data. Mapping Data Flows can be operationalized with Azure Data Factory's scheduling, control flow, and monitoring capabilities.
This document provides an overview of advanced analytics using R and SQL. It discusses how R is used widely for analytics and is growing in popularity. Microsoft R Open and Microsoft R Server are introduced as tools for scalable enterprise analytics that integrate with SQL Server. Key capabilities covered include running R scripts directly in the database using SQL Server 2016 extensions, calling stored procedures from applications that execute R code, and bringing compute to data with in-database analytics for performance and scale. Demos and sample programs are referenced to illustrate capabilities of Open Source R and Microsoft R tools.
Azure Data Factory Mapping Data Flow allows users to stage and transform data in Azure during a limited preview period beginning in February 2019. Data can be staged from Azure Data Lake Storage, Blob Storage, or SQL databases/data warehouses, then transformed using visual data flows before being landed to staging areas in Azure like ADLS, Blob Storage, or SQL databases. For information, contact adfdataflowext@microsoft.com or visit http://aka.ms/dataflowpreview.
Mapping Data Flow is a new feature of Azure Data Factory that allows users to build data transformations in a visual interface without code. It provides a serverless, scale-out transformation engine for processing big data with unstructured requirements. Mapping Data Flows can be operationalized with Data Factory's scheduling, control flow, and monitoring capabilities.
Microsoft Data Integration Pipelines: Azure Data Factory and SSISMark Kromer
The document discusses tools for building ETL pipelines to consume hybrid data sources and load data into analytics systems at scale. It describes how Azure Data Factory and SQL Server Integration Services can be used to automate pipelines that extract, transform, and load data from both on-premises and cloud data stores into data warehouses and data lakes for analytics. Specific patterns shown include analyzing blog comments, sentiment analysis with machine learning, and loading a modern data warehouse.
Azure Data Factory Data Flow Preview December 2019Mark Kromer
Visual Data Flow in Azure Data Factory provides a limited preview of data flows that allow users to visually design transformations on data. It features implicit staging of data in data lakes, explicit selection of data sources and transformations through a toolbox interface, and setting of properties for transformation steps and destination connectors. The preview is intended to get early feedback to help shape the future of Visual Data Flow.
Microsoft Azure Data Factory Hands-On Lab Overview SlidesMark Kromer
This document outlines modules for a lab on moving data to Azure using Azure Data Factory. The modules will deploy necessary Azure resources, lift and shift an existing SSIS package to Azure, rebuild ETL processes in ADF, enhance data with cloud services, transform and merge data with ADF and HDInsight, load data into a data warehouse with ADF, schedule ADF pipelines, monitor ADF, and verify loaded data. Technologies used include PowerShell, Azure SQL, Blob Storage, Data Factory, SQL DW, Logic Apps, HDInsight, and Office 365.
Data quality patterns in the cloud with ADFMark Kromer
Azure Data Factory can be used to build modern data warehouse patterns with Azure SQL Data Warehouse. It allows extracting and transforming relational data from databases and loading it into Azure SQL Data Warehouse tables optimized for analytics. Data flows in Azure Data Factory can also clean and join disparate data from Azure Storage, Data Lake Store, and other data sources for loading into the data warehouse. This provides simple and productive ETL capabilities in the cloud at any scale.
Short introduction to different options for ETL & ELT in the Cloud with Microsoft Azure. This is a small accompanying set of slides for my presentations and blogs on this topic
Azure Analysis Services is a tabular multidimensional cloud service for deploying and consuming analysis models. It allows users to create an Azure Analysis Services server to host analysis models in the cloud, deploy models to the server, and then consume the models. The presenter provided an overview of Azure Analysis Services and demonstrated how to create a server, deploy models, and consume them.
Microsoft Azure Data Factory Data Flow ScenariosMark Kromer
Visual Data Flow in Azure Data Factory provides a limited preview of data flows that allow users to visually design transformations on data. It features implicit staging of data in data lakes, explicit selection of data sources and transformations through a toolbox interface, and setting of properties for transformation steps and destination connectors. The preview is intended to get early feedback to help shape the future of visual data flows in Azure Data Factory.
SQL Saturday Redmond 2019 ETL Patterns in the CloudMark Kromer
This document discusses ETL patterns in the cloud using Azure Data Factory. It covers topics like ETL vs ELT, scaling ETL in the cloud, handling flexible schemas, and using ADF for orchestration. Key points include staging data in low-cost storage before processing, using ADF's integration runtime to process data both on-premises and in the cloud, and building resilient data flows that can handle schema drift.
Azure Data Factory Data Flow Limited Preview for January 2019Mark Kromer
Azure Data Factory introduces Visual Data Flow, a limited preview feature that allows users to visually design data flows without writing code. It provides a drag-and-drop interface for users to select data sources, place transformations on imported data, and choose destinations for transformed data. The flows are run on Azure and default to using Azure Data Lake Storage for staging transformed data, though users can optionally configure other staging options. The feature supports common data formats and transformations like sorting, merging, joining, and lookups.
Diplomado Técnico SQL Server 2012 - Sesión 5/8John Bulla
This document provides an overview of a SQL Server 2012 seminar on the semantic model. It introduces Jesús Gil, the seminar leader and SQL Server MVP. It then discusses the semantic model in SQL Server 2012, how it can be used across various BI tools and scenarios from personal to team to organizational BI. It covers considerations for building a semantic model, exploiting the model across various end user experiences, and resources for further information.
SSAS, MDX , Cube understanding, Browsing and Tools information Vishal Pawar
Why we need SSAS Cube
What is SSAS Cube
Way to access Cube
What is Dimension and Attributes
QHP Dimension and Attributes
Process Flow and QHP Cube Browsing
MDX Basics
MDX Tools
Comparison of Queries Written in T-SQL and MDX with Construct
MDX –How to add where condition
Practical Business Intelligence in SharePoint 2013 – Helsinki, Finland – Ivan Sanders
This document provides information about a presentation on practical business intelligence in SharePoint 2013 in Helsinki. It includes contact information for the presenter, Ivan Sanders, who is a SharePoint MVP with over 20 years of experience designing and developing business intelligence dashboards and Microsoft solutions. Requirements and comparisons for SharePoint 2010 and 2013 hardware are listed. Architectures for BI components like Excel Services, PerformancePoint Services, and Visio Services are described. Installation best practices and links to demo content are also provided. The document ends with a list of trusted SharePoint experts and thanks sponsors of the event.
Practical Business Intelligence in SharePoint 2013 – Honolulu – Ivan Sanders
This document provides an overview of a presentation on practical business intelligence in SharePoint 2013 given by Ivan Sanders. Ivan Sanders is introduced as a SharePoint MVP/MCT author with over 20 years of experience designing and developing Microsoft solutions, including business intelligence dashboards. The presentation covers topics such as the hardware requirements for SharePoint 2013, the business intelligence architecture including Excel Services, PerformancePoint Services, and Visio Services. It also discusses best practices for installation and configuration as well as techniques for gathering requirements and designing dimensional models, ETL processes, and analytics solutions. Codeplex links are provided for related demo content and source code.
Microsoft BI reporting capabilities (on-prem solutions) Presentation – jeromedoyen
Microsoft BI Reporting Solutions (on-prem), a presentation by OKTOPUS Consulting – a complete overview of the MS BI reporting solutions available for on-premises architectures.
Balanced BI Approach (Power Pivot & SSAS Tabular) – asammartino
This presentation is based on an actual scenario: I educated a client on Power Pivot and basic DAX, and they took those basics to completion, upgrading to a full-blown corporate BI solution in SSAS Tabular in a few easy steps.
Microsoft Power Stack 2019 [Power BI, Excel, Azure & Friends] – Olivier Travers
Making sense of Microsoft's renewed push in the business intelligence sector with:
- Power BI including Power BI Report Server, Premium, Embedded, APIs...
- Excel
- Flow, PowerApps, SharePoint, Teams
- SQL Server
- Azure IaaS / PaaS / SaaS
[Latest update: mid-2019] I have put an inordinate amount of research time into keeping this presentation up to date since its original publication in February 2017. The latest version is available upon request for a reasonable fee and for my consulting clients; don't hesitate to contact me to discuss your project.
Olivier Travers
olivier@needlestacker.com
Fascinate with SQL SSIS Parallel Processing – Vishal Pawar
What is Parallel Processing?
Where can Parallel Processing be used?
SSIS and BI
SSIS vs SQL Stored Procedures
SSIS and Parallel Processing
How to do SSIS Parallel Processing?
DEMO
Off Topic – How to use Auto Test Script for Review and Testing
Comparison of all with Time
New Innovation – Parallel Processing with Continuous Serial Threading
Questions, Files and Links
As we move from experience- and intuition-based decision making to factual decision making, it is increasingly important to capture data and store it in a way that allows us to make smarter decisions. This is where data warehousing and Business Intelligence come into the picture. There is a huge demand for Business Intelligence professionals, and this course acts as a foundation that opens the door to a variety of opportunities in the Business Intelligence space. Though many vendors provide BI tools, very few offer an end-to-end BI suite with a large customer base. Microsoft stands as a leader with its user-friendly and cost-effective Business Intelligence suite, helping customers get a 360-degree view of their businesses.
Power BI & Advanced Business Intelligence Tools, Excel 2013 / 2016, by Spark Tr... – Ahmed Yasir Khan
This document provides information about a two-day workshop on Power Pivot in Karachi and Lahore in September. The workshop investment is 30,000 Pakistani rupees excluding taxes. The facilitator is Ahmed Yasir Khan, who has 20 years of experience in finance and IT and has conducted many training sessions. The workshop will cover topics such as importing and transforming data, creating calculated fields and KPIs, and building dashboards using Power Pivot and Power View in Excel. Upon completion, participants will be able to import and manipulate data, create reports, use DAX functions, and distribute Power Pivot data.
This document discusses building cubes in SQL Server Analysis Services (SSAS) and PowerPivot. It covers cubes created manually in SSAS, auto-cubes created in PowerPivot, and cubes in the upcoming Denali release. PowerPivot allows users to analyze massive data volumes with Excel. Reporting Services and SharePoint can be used to publish and share PowerPivot reports. SSAS provides an advanced feature set for scalable cube design. Denali will converge cube technologies with its new BI Semantic Model.
This document provides an overview and agenda for a presentation on business intelligence and big data technologies. The presentation covers tools such as Excel, PowerPivot, Power View, Reporting Services, PerformancePoint, and HDInsight for working with data from sources like SQL Server, Oracle, DB2, and Hadoop. It discusses self-service BI capabilities and how these tools work with the Microsoft BI stack and platform.
Practical Business Intelligence with SharePoint 2013Ivan Sanders
This sessions provides an overview of the new features available to business users and the knowledge they need to start building their own Dashboards using the tools they already know Excel to implement Business Intelligence features they may not have used previously like SQL Analysis Service, SQL Reporting Services, PowerView, PowerPivot, and Excel Services
This document provides an overview of using Sybase WorkSpace to develop applications for Sybase IQ. It discusses WorkSpace features for enterprise modeling, database development, and migrating data and schemas from Sybase ASE to IQ. Specific capabilities covered include conceptual and physical data modeling, SQL development and debugging, schema development, and using WorkSpace to model replication environments and stage data migration to IQ. Links are provided to learn more about Sybase IQ, WorkSpace, and related products.
The document provides an introduction to Microsoft Business Intelligence (MSBI). It discusses how MSBI addresses the needs of users by integrating data across networks, providing summarized and historical data to help understand organizational health, and enabling 'what-if' analysis. It describes the MSBI architecture and how it uses SQL Server Integration Services, SQL Server Analysis Services, and SQL Server Reporting Services to move data between sources and destinations, perform online analytical processing to build cubes for analysis, and deliver reports, respectively. The document also compares MSBI to other BI tools and argues it provides the most reliable solution at the lowest total cost.
SQL Server 2012 Analysis Services introduces a new BI Semantic Model that provides a single data model for building BI solutions. This unified model supports both multidimensional and tabular data models, providing flexibility for users and developers. It also includes tools for designing, developing, and deploying sophisticated BI applications and enables fast analytical performance through features like Proactive Caching.
Building Modern Data Platform with Microsoft AzureDmitry Anoshin
This document provides an overview of building a modern cloud analytics solution using Microsoft Azure. It discusses the role of analytics, a history of cloud computing, and a data warehouse modernization project. Key challenges covered include lack of notifications, logging, self-service BI, and integrating streaming data. The document proposes solutions to these challenges using Azure services like Data Factory, Kafka, Databricks, and SQL Data Warehouse. It also discusses alternative implementations using tools like Matillion ETL and Snowflake.
The document provides an overview of the speaker's experience and qualifications. It lists the speaker's 7+ years of experience working with Microsoft's BI stack, current role as a MS BI Solution Architect, consulting experience, certifications including MCSA and MCTS in SQL Server, education including a BE in Computer Science and an MBA. It also provides links to the speaker's blogs and details on various SQL and BI articles authored.
2. About the Speaker (@karangspeaks)
• Support Escalation Engineer at Microsoft for the last 6 years
• Currently focusing more on SQL BI, SQL PDW & Big Data
• Active blogger; contributed to multiple whitepapers published on the MSDN and TechNet sites
• Wrote tools available on CodePlex: ASTrace, Trace Scheduler, etc.
• Achieved the highest certification in the SQL BI world: SQL Server Analysis Services Maestro (MCM)
• http://karanspeaks.com
• http://blogs.msdn.com/karang
• http://twitter.com/karangspeaks
• http://in.linkedin.com/in/karanspeaks
3. Agenda
New modeling option in SQL 2012 Analysis Services – Tabular Server
• BISM Vision
• Table-like modeling
• Finding Remote of James Bond Car
• ABCD – Anybody Can Dance
4. MS Business Intelligence Stack
End-user tools and performance management apps:
• Excel
• PerformancePoint
• PowerPivot
• Power View
BI platform:
• SQL Server Reporting Services
• SQL Server Analysis Services / SQL PDW
• SQL Server DBMS & Master Data Services (MDS)
• SQL Server Integration Services
• SharePoint Server
Delivery:
• Reports
• Dashboards
• Excel Services & Workbooks
• Analytic Views
• Scorecards
6. Why BI Semantic Model?
• One Model: the BI Semantic Model is one model for all end-user experiences: reporting, analytics, scorecards, dashboards, and custom applications
• Relational Data Model: embracing the relational data model and bringing it together with the multidimensional model under a single unified BI platform
7. Three Ways to Model
• Personal BI: PowerPivot for Excel
• Team BI: PowerPivot for SharePoint
• Organizational BI: Analysis Services (Tabular or MOLAP)
8. What’s the Tabular Hype All About?
• In-memory database engine, also known as xVelocity
• Based on the relational methodology
• Column-oriented database
• Data is stored in a compressed format
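The compression win of a column store comes largely from encoding each column on its own. Here is a minimal Python sketch of dictionary encoding, one technique column stores use; this is an illustration of the general idea, not the actual xVelocity implementation, and the column values are made up:

```python
# Sketch: why column-oriented, dictionary-encoded storage compresses well.
# Each distinct value in a column is stored once; rows keep only small integer IDs.

def dictionary_encode(column):
    """Replace each value in a column with a small integer ID into a dictionary."""
    dictionary = {}
    ids = []
    for value in column:
        if value not in dictionary:
            dictionary[value] = len(dictionary)
        ids.append(dictionary[value])
    return dictionary, ids

# A low-cardinality column, as is typical for dimension attributes in fact tables
colors = ["Red", "Blue", "Red", "Red", "Blue", "Green", "Red", "Blue"]

dictionary, encoded = dictionary_encode(colors)
print(dictionary)  # 3 distinct strings stored once
print(encoded)     # 8 rows reduced to 8 small integers
```

Because scans and distinct counts then run over the integer IDs rather than the original strings, aggregate queries touch far less memory, which is the core of the in-memory engine's speed.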
11. Developing a Model
• Use the Visual Studio SQL Server Data Tools (SSDT) to build a BISM
• Open an existing PowerPivot model using SSDT or Management Studio
Deployment:
• Changes are implemented immediately in SSDT
• Use a small database for development
13. Trivia Questions
• What are the modeling options in SQL 2012 SSAS?
• What is the new query language for tabular models?
• What tools can you use for modeling?
• What is xVelocity?
• What tools can you use for reporting on tabular models?
14. Summary
• If you are new to Analysis Services, go with the Tabular Model
• Table-like modeling is easy to learn
• New Excel-like language called DAX
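To give a feel for what DAX expresses: a simple real DAX measure is `Total Sales := SUM(Sales[Amount])`, which is evaluated over whatever rows survive the current filter context (for example, the year selected in a pivot table). The following Python sketch is a rough analogy of that evaluation, not DAX itself; the `sales` table and `total_sales` helper are hypothetical:

```python
# Rough analogy (not DAX): a measure like  Total Sales := SUM(Sales[Amount])
# sums the Amount column over the rows that pass the current filter context.

sales = [  # hypothetical Sales table
    {"Year": 2011, "Amount": 100.0},
    {"Year": 2012, "Amount": 250.0},
    {"Year": 2012, "Amount": 150.0},
]

def total_sales(rows, **filters):
    """Sum Amount over rows matching a simple equality filter context."""
    return sum(r["Amount"] for r in rows
               if all(r[col] == val for col, val in filters.items()))

print(total_sales(sales))             # no filter: grand total, 500.0
print(total_sales(sales, Year=2012))  # filter context Year = 2012: 400.0
```

The same measure definition yields different results under different filter contexts, which is what makes DAX feel Excel-like while still operating over whole tables.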