This document discusses the Tennessee Board of Regents' plans to implement an operational data store and enterprise data warehouse (ODS/EDW) to simplify access to institutional data for various stakeholders. It outlines the information requirements, challenges with the current approach of generating reports from Banner, and the benefits of moving to a maturity model where users can generate their own reports from the ODS/EDW. Key aspects of the planned ODS/EDW implementation include data modeling; extraction, transformation and loading (ETL) processes; administration; and a phased roadmap.
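The ETL flow described above can be sketched in miniature. The example below is purely illustrative: an in-memory SQLite database stands in for both Banner (the source) and the ODS/EDW (the target), and every table and column name is hypothetical.

```python
# Minimal ETL sketch: extract from a source system, standardize codes,
# load into an ODS-style table. All names are invented for illustration.
import sqlite3

def extract(conn):
    # Pull raw enrollment rows from the (hypothetical) source system.
    return conn.execute("SELECT student_id, term, status FROM src_enrollment").fetchall()

def transform(rows):
    # Standardize status codes so every campus reports the same vocabulary.
    mapping = {"A": "ACTIVE", "W": "WITHDRAWN"}
    return [(sid, term, mapping.get(status, "UNKNOWN")) for sid, term, status in rows]

def load(conn, rows):
    conn.executemany("INSERT INTO ods_enrollment VALUES (?, ?, ?)", rows)
    conn.commit()

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE src_enrollment (student_id, term, status)")
conn.execute("CREATE TABLE ods_enrollment (student_id, term, status)")
conn.executemany("INSERT INTO src_enrollment VALUES (?, ?, ?)",
                 [(1, "2013F", "A"), (2, "2013F", "W")])
load(conn, transform(extract(conn)))
print(conn.execute("SELECT status FROM ods_enrollment ORDER BY student_id").fetchall())
```

The point of the sketch is the separation of concerns: once extraction and transformation are centralized like this, downstream users query the ODS directly instead of requesting one-off reports from the source system.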
Innovation Webinar - Using IFS Applications BI to drive business excellence - IFS
Studies show that best-in-class businesses—those with the best operating margins and turnover growth in their industries—have clearly defined objectives supported by a Business Intelligence solution. In this session, we’ll look at specific features in the IFS Applications Business Intelligence solution. See how easily these features can help you support strategic business initiatives and reach improved operational results.
A new, intuitive web-based user interface on top of SAP HCM. It can be used as a replacement for the traditional SAP user interfaces and promotes the use of self-service functionality.
Selecting BI Tool - Proof of Concept - Андрій Музичук - Igor Bronovskyy
A large number of tools and techniques have been developed over the years to support managerial decision making, so selecting the appropriate BI tool becomes an issue in itself. Implementing and deploying a BI initiative can be lengthy, expensive, and failure-prone. The proof-of-concept method can be used by stakeholders to avoid unnecessary losses.
The presentation describes the proof-of-concept method using the example of selecting among the Microsoft stack, MicroStrategy, and Business Objects BI tools. The example includes an overview of the above-mentioned technologies, the report modeling and development processes, report integration in SharePoint, and performance testing, as well as the decision-making model and a summary for the final tool selection.
Razorfish Multi-Channel Marketing: Better Customer Segmentation and Targeting - Teradata Aster
Matt Comstock, Vice President Business Intelligence Office, Razorfish, presents at the Big Analytics 2012 Roadshow.
From search to email to social, customers are interacting with your brand across a variety of channels. But what do people do once they view an advertisement or get an email? What common behaviors do they display once they’re on your site? By combining media exposure/behavior, site-side media, and in-store purchase data, you can better understand the impact media has on driving value to your business. Come to this session to learn how data-driven multi-channel analysis lets you see what consumers do before they become customers, and understand which content influences which segments of users by media audience. Discover new segmentation and targeting strategies to improve engagement with your brand and increase advertising lift. See how a leader in digital marketing uses a combination of technologies including Teradata Aster, Hadoop, and Amazon Web Services to handle big data and provide big analytics that improve business value.
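The pre-purchase path analysis described above can be illustrated with a toy sketch; the users, channels, and event structure below are invented for the example, not Razorfish's actual pipeline.

```python
# Toy sketch of multi-channel path analysis: for each user, collect the
# touchpoints that occur before their first purchase.
events = [
    ("u1", "email"), ("u1", "search"), ("u1", "purchase"), ("u1", "social"),
    ("u2", "social"), ("u2", "purchase"),
]

def pre_purchase_paths(events):
    paths = {}
    for user, channel in events:
        if user not in paths:
            paths[user] = {"path": [], "converted": False}
        if paths[user]["converted"]:
            continue  # ignore activity after the first purchase
        if channel == "purchase":
            paths[user]["converted"] = True
        else:
            paths[user]["path"].append(channel)
    return {u: p["path"] for u, p in paths.items() if p["converted"]}

print(pre_purchase_paths(events))  # {'u1': ['email', 'search'], 'u2': ['social']}
```

Grouping converters by their pre-purchase path like this is the raw material for the segmentation and targeting strategies the session discusses.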
Capturing Business Requirements For Scorecards, Dashboards And Reports - Julian Rains
This paper helps Management Information and Business Intelligence projects build a solid foundation for gathering their reporting business requirements. It defines the scope of the information needed to design and build dashboards, scorecards and other types of reports.
Big Data Analytics in a Heterogeneous World - Joydeep Das of Sybase - BigDataCloud
Big Data Analytics is characterized by analysis of data on three vectors: exploding data volume, proliferating data variety (relational, multi-media), and accelerating data velocity. However, other key vectors such as the costs and skill sets needed for Big Data Analytics are often overlooked. In this session, we will consider all five vectors by exploring various techniques where traditional but progressive technologies such as column store DBMS and Event Stream Processing are combined with open source frameworks such as Hadoop to exploit the full potential of Big Data Analytics.
Agenda:
- Big Data Analytics in the real world
- Commercial and Open Source techniques
- Bringing together Commercial and Open Source techniques
* Architectures
* Programming APIs
(e.g. embedded and federated MapReduce)
- Conclusions
Trending use cases have pointed out the complementary nature of Hadoop and existing data management systems—emphasizing the importance of leveraging SQL, engineering, and operational skills, as well as incorporating novel uses of MapReduce to improve distributed analytic processing. Many vendors have provided interfaces between SQL systems and Hadoop but have not been able to semantically integrate these technologies while Hive, Pig and SQL processing islands proliferate. This session will discuss how Teradata is working with Hortonworks to optimize the use of Hadoop within the Teradata Analytical Ecosystem to ingest, store, and refine new data types, as well as exciting new developments to bridge the gap between Hadoop and SQL to unlock deeper insights from data in Hadoop. The use of Teradata Aster as a tightly integrated SQL-MapReduce® Discovery Platform for Hadoop environments will also be discussed.
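The MapReduce pattern referred to above (a map phase emitting key-value pairs, a shuffle grouping them by key, a reduce phase aggregating each group) can be sketched in a few lines of plain Python. This is a toy word count for illustration only, not any vendor's embedded or federated implementation.

```python
# Toy MapReduce: map emits (word, 1) pairs, shuffle groups by key,
# reduce sums each group's values.
from collections import defaultdict

def map_phase(records):
    for line in records:
        for word in line.split():
            yield word, 1

def shuffle(pairs):
    groups = defaultdict(list)
    for key, value in pairs:
        groups[key].append(value)
    return groups

def reduce_phase(groups):
    return {key: sum(values) for key, values in groups.items()}

counts = reduce_phase(shuffle(map_phase(["big data", "big analytics"])))
print(counts)  # {'big': 2, 'data': 1, 'analytics': 1}
```

In a real cluster the map and reduce functions run in parallel across nodes and the shuffle moves data over the network; the programming model, however, is exactly this small.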
Agile Data Warehouse Design for Big Data Presentation - Vishal Kumar
Synopsis:
[Video link: http://www.youtube.com/watch?v=ZNrTxSU5IQ0 ]
Jim Stagnitto and John DiPietro of consulting firm a2c will discuss Agile Data Warehouse Design, a step-by-step method for data warehousing / business intelligence (DW/BI) professionals to better collect and translate business intelligence requirements into successful dimensional data warehouse designs.
The method utilizes BEAM✲ (Business Event Analysis and Modeling) - an agile approach to dimensional data modeling that can be used throughout analysis and design to improve productivity and communication between DW designers and BI stakeholders. BEAM✲ builds upon the body of mature "best practice" dimensional DW design techniques, and collects "just enough" non-technical business process information from BI stakeholders to allow the modeler to slot their business needs directly and simply into proven DW design patterns.
BEAM✲ encourages DW/BI designers to move away from the keyboard and their entity relationship modeling tools and begin "white board" modeling interactively with BI stakeholders. With the right guidance, BI stakeholders can and should model their own BI data requirements, so that they can fully understand and govern what they will be able to report on and analyze.
The BEAM✲ method is fully described in Agile Data Warehouse Design, a text co-written by Lawrence Corr and Jim Stagnitto.
About the speaker:
Jim Stagnitto, Director of a2c Data Services Practice
Data Warehouse Architect: specializing in powerful designs that extract the maximum business benefit from Intelligence and Insight investments.
Master Data Management (MDM) and Customer Data Integration (CDI) strategist and architect.
Data Warehousing, Data Quality, and Data Integration thought-leader: co-author with Lawrence Corr of "Agile Data Warehouse Design", guest author of Ralph Kimball’s “Data Warehouse Designer” column, and contributing author to Ralph Kimball and Joe Caserta's book “The Data Warehouse ETL Toolkit”.
John DiPietro, Chief Technology Officer at A2C IT Consulting
John DiPietro is the Chief Technology Officer for a2c. Mr. DiPietro is responsible for setting the vision, strategy, delivery, and methodologies for a2c’s Solution Practice Offerings for all national accounts. The a2c CTO brings with him an expansive depth and breadth of specialized skills in his field.
Sponsor Note:
Thanks to:
Microsoft NERD for providing an awesome venue for the event.
http://A2C.com IT Consulting for providing the food/drinks.
http://Cognizeus.com for providing a book to give away in the raffle.
Microsoft Data Warehouse Business Intelligence Lifecycle - The Kimball Approach - Mark Ginnebaugh
Data Warehouse - Business Intelligence Lifecycle Overview by Warren Thornthwaite
This slide deck describes the Kimball approach from the best-selling Data Warehouse Toolkit, 2nd Edition. It was presented to the Bay Area Microsoft Business Intelligence User Group in October 2012.
Starting with business requirements and project definition, the lifecycle branches out into three tracks: Technical, Data and Applications. You will learn:
* The major steps in the Lifecycle and what needs to happen in each one.
* Why business requirements are so important and how they influence all major decisions across the entire DW/BI system.
* Key tools for prioritizing business requirements and creating an enterprise information framework.
* How to break up a DW/BI system into doable increments that add real business value and can be completed in a reasonable time frame.
Best practices and tips on how to design and develop a Data Warehouse using Microsoft SQL Server BI products.
This presentation describes the inception and full lifecycle of the Carl Zeiss Vision corporate enterprise data warehouse.
Technologies covered include:
•Using SQL Server 2008 as your data warehouse DB
•SSIS as your ETL Tool
•SSAS as your data cube Tool
You will Learn:
•How to Architect a data warehouse system from End-to-End
•Components of the data warehouse and functionality
•How to Profile data and understand your source systems
•Whether to ODS or not to ODS (determining whether an Operational Data Store is required)
•The staging area of the data warehouse
•How to Build the data warehouse – Designing Dimensions and Fact tables
•The Importance of using Conformed Dimensions
•ETL – Moving data through your data warehouse system
•Data Cubes - OLAP
•Lessons learned from Zeiss and other projects
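The dimensional design points above (fact tables joined to shared dimensions) can be sketched with an in-memory SQLite database. All table and column names here are illustrative; the point is that two fact tables referencing the same conformed date dimension can be compared directly.

```python
# Star-schema sketch: a conformed date dimension shared by two fact tables.
import sqlite3

conn = sqlite3.connect(":memory:")
conn.executescript("""
CREATE TABLE dim_date (date_key INTEGER PRIMARY KEY, calendar_date TEXT);
CREATE TABLE fact_sales (date_key INTEGER REFERENCES dim_date, amount REAL);
CREATE TABLE fact_returns (date_key INTEGER REFERENCES dim_date, amount REAL);
INSERT INTO dim_date VALUES (20120101, '2012-01-01');
INSERT INTO fact_sales VALUES (20120101, 100.0);
INSERT INTO fact_returns VALUES (20120101, 15.0);
""")
# Because both facts share the same dimension, they join on the same key
# and can be reported side by side without any reconciliation step.
row = conn.execute("""
    SELECT d.calendar_date, s.amount, r.amount
    FROM dim_date d
    JOIN fact_sales s ON s.date_key = d.date_key
    JOIN fact_returns r ON r.date_key = d.date_key
""").fetchone()
print(row)  # ('2012-01-01', 100.0, 15.0)
```

This is why conformed dimensions matter: if sales and returns each carried their own private date table, cross-process reporting would require constant translation between them.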
Building an Effective Data Warehouse Architecture - James Serra
Why use a data warehouse? What is the best methodology to use when creating a data warehouse? Should I use a normalized or dimensional approach? What is the difference between the Kimball and Inmon methodologies? Does the new Tabular model in SQL Server 2012 change things? What is the difference between a data warehouse and a data mart? Is there hardware that is optimized for a data warehouse? What if I have a ton of data? During this session James will help you to answer these questions.
Affordable Analytics & Planning: IBM Cognos Express - Senturus
Solution that delivers reporting, analysis, dashboard, planning, budgeting and forecasting capabilities at an affordable price. View the webinar video recording and download this deck: http://www.senturus.com/resources/affordable-analytics-and-planning/.
Watch this webinar if you are considering switching from spreadsheets to a business analytics system and searching for an affordable, easy-to-implement solution.
Senturus, a business analytics consulting firm, has a resource library with hundreds of free recorded webinars, trainings, demos and unbiased product reviews. Take a look and share them with your colleagues and friends: http://www.senturus.com/resources/.
Building a business intelligence architecture fit for the 21st century by Jon... - Mark Tapley
Objectives of the presentation:
To record some history: what has happened in the past that makes the future quite challenging.
To provide real examples of BI at work, good and bad.
To illustrate the nature of data and why it has become so important in driving the business forward in the 21st century.
To outline a way to align technology with the business so that effort and budget are spent in a way that will enable the future rather than support the past.
To propose a set of principles and ideas that can guide a company in making data available to all who have the penchant to turn it into useful and valuable information.
To describe the new organisation unit that will be needed to realise the dream.
Global Big Data Conference Hyderabad-2Aug2013- Finance/Manufacturing Use Cases - Sanjay Sharma
Financial institutions today are under intense pressure to provide more value to their customers, reduce IT costs, and grow year over year. This challenge has been further complicated by the huge amounts of data being generated as well as the mandatory federal compliance requirements in place.
Similarly, the manufacturing industry today faces the challenge of processing huge amounts of data in real time and predicting failures as early as possible to reduce cost and increase production efficiency.
The session will cover some high-level Big Data use cases applicable to the financial and manufacturing domains, and show how big data technologies are being used successfully to solve these challenges, using examples from the credit card/banking industry in the financial domain and semiconductor production in the manufacturing domain.
DataOps - Big Data and AI World London - March 2020 - Harvinder Atwal
Title
DataOps, the secret weapon for delivering AI, data science, and business intelligence value at speed.
Synopsis
● According to recent research, just 7.3% of organisations say the state of their data and analytics is excellent, and only 22% of companies are currently seeing a significant return from data science expenditure.
● Poor returns on data & analytics investment are often the result of applying 20th-century thinking to 21st-century challenges and opportunities.
● Modern data science and analytics require secure, efficient processes to turn raw data from multiple sources and in numerous formats into useful inputs to a data product.
● Developing, orchestrating and iterating modern data pipelines is an extremely complex process requiring multiple technologies and skills.
● Other domains have successfully overcome the challenge of delivering high-quality products at speed in complex environments. DataOps applies proven agile principles, lean thinking and DevOps practices to the development of data products.
● A DataOps approach aligns data producers, analytical data consumers, processes and technology with the rest of the organisation and its goals.
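The pipeline discipline described in the points above can be sketched minimally: each stage is a plain function, and a data test runs between stages so bad records fail fast instead of reaching the data product. The stages and records below are invented for illustration.

```python
# Minimal DataOps-style pipeline sketch: ingest, validate, publish,
# with a data-quality gate between raw input and the final metric.
def ingest():
    # Raw records from a hypothetical source; one is incomplete.
    return [{"customer": "a", "spend": 120}, {"customer": "b", "spend": None}]

def validate(rows):
    # Data test: drop records that would corrupt downstream metrics.
    return [r for r in rows if r["spend"] is not None]

def publish(rows):
    return {"customers": len(rows), "total_spend": sum(r["spend"] for r in rows)}

result = publish(validate(ingest()))
print(result)  # {'customers': 1, 'total_spend': 120}
```

In a production setting each stage would be orchestrated, versioned, and monitored; the structural idea of testable, composable stages is the same.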
Our mission is: transforming data to reveal business and clinical insights. We accomplish this through our data management, business intelligence and analytics consulting services. We ensure that organizations have the proper tools, technology and processes to improve performance – relative to predefined critical success factors and key performance indicators – based on greater insight and analysis through analytics. We offer a framework for establishing an Analytics Center of Excellence within organizations to define roles and responsibilities and coordinate activities and tasks among key stakeholders. With emphasis on statistical analysis, forecasting, optimization, and simulation, analytics provides results that are predictive and prescriptive, injecting clarity and confidence in decision making and improving performance through situational awareness at all levels of the organization.
Through our past consulting engagements, we observed significant challenges and shortcomings in how these organizations navigate such a data-rich environment in the pursuit of analytical excellence. Based on our assessment and evaluation, we develop a roadmap for establishing an information environment that enables stakeholders to improve clinical decision-making and performance (as related to quality, outcomes, cost and utilization) through data visualizations and advanced analytics. This roadmap accounts for both structured and unstructured data, and it includes provisions for controlled data access based on security and privacy policies. We manage the transition from on-premise to cloud-based data sources and leverage the cloud as an aggregation point for creating a Big Data analytics platform. We then perform an alternatives analysis of feasible solutions based on several factors, including: delivered capabilities, ease of implementation, performance, scalability, interoperability and integration with legacy systems, and functionality, at a cost that maximizes ROI.
Quicker Insights and Sustainable Business Agility Powered By Data Virtualizat... - Denodo
Watch full webinar here: https://bit.ly/3xj6fnm
Presented at Chief Data Officer Live 2021 A/NZ
The world is changing faster than ever, and for companies to compete and succeed they need to be agile in order to respond quickly to market changes and emerging opportunities. Data plays an integral role in achieving this business agility. However, given the complex nature of the enterprise data architecture, finding and analysing data is an increasingly challenging task. Data virtualization is a modern data integration technique that integrates data in real time, without having to physically replicate it.
Watch on-demand this session to understand what data virtualization is and how it:
- Delivers data in real-time, and without replication
- Creates a logical architecture to provide a single view of truth
- Centralises the data governance and security framework
- Democratises data for faster decision making and business agility
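The core idea behind the points above (resolve a query against the underlying sources at request time, rather than replicating their rows into a warehouse first) can be sketched as a toy federated view. The source names and fields below are made up; real data virtualization platforms add query optimization, caching, security, and governance on top of this pattern.

```python
# Toy data-virtualization sketch: a logical view federates two source
# systems on demand; nothing is copied or stored in the view layer.
crm = {"c1": {"name": "Acme"}}          # hypothetical CRM source
billing = {"c1": {"balance": 250.0}}    # hypothetical billing source

def customer_view(customer_id):
    # Assemble a single customer record at query time from both sources.
    record = {"id": customer_id}
    record.update(crm.get(customer_id, {}))
    record.update(billing.get(customer_id, {}))
    return record

print(customer_view("c1"))  # {'id': 'c1', 'name': 'Acme', 'balance': 250.0}
```

Because the view reads the sources live, consumers always see current data, which is the "real-time, without replication" property the session describes.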
Data Lakes are early in the Gartner hype cycle, but companies are getting value from their cloud-based data lake deployments. Break through the confusion between data lakes and data warehouses and seek out the most appropriate use cases for your big data lakes.
SAP Incentive Administration and SAP Paybacks and Chargebacks by Vistex are applications that extend SAP® ERP and SAP CRM software functionality. These applications provide companies with an embedded, fully-integrated solution for handling discount pricing, marketing fund claims, sales commissions & broker fees, in- & outbound royalty payments, and volume- & growth-based sales or purchasing rebates.
With visibility into costs, pricing and incentives, SAP Incentive Administration and SAP Paybacks and Chargebacks can analyze the gross-to-net profitability of sales in complex channels involving any number of intermediary channel partners. The solutions define the eligible products, channels, customers, pricing and incentives in each sales agreement. In many cases, multiple agreements may be involved in selling a product to the end-customer since there is typically an agreement between each intermediary partner in the supply chain.
These applications are able to consolidate a view of the multiple agreements, incentives and transactions, providing insight into revenue and spend. Operationally, the system needs to process the large influx of daily transactions, and analytical reporting consumes a significant portion of system resources.
SAP® HANA™ technology takes these solutions to the next level by enabling real-time insight into revenue and spend. First, the solution lowers data demands on the operational database used for transaction processing. Second, the solution virtually eliminates the data latency that is inherent in replicated data marts using traditional database technologies. Third, the solution accelerates analytical processing to provide insight on larger data sets (including Big Data) at faster speeds than traditional analytical tools.
SAP Co-innovation Lab (COIL) global network is designed for driving open innovation projects and initiatives to extend SAP’s solution coverage and enhance our solution infrastructure efficiency with partners. Vistex partnered with SAP and IBM in the SAP Co-Innovation Lab to develop a solution to provide real-time profitability analytics while reducing the overall impact on transactional processing and other business operations. As part of the project, an operational system was created with millions of lines of transactional data reflecting a large size (multi-billion dollar revenue) company, and reports analyzing revenue and spend were defined. This environment was used to evaluate the speed at which the analytical data set could be updated with new and changed data, as well as the time necessary to analyze and report revenue and spend data using multiple transaction types.
SAP HANA was used to reduce typical analytical report generation from several minutes to less than one second, while maintaining an up-to-date analytics data source that is updated within milliseconds of data changes in the operational system.
The Business Value of Business Intelligence - Senturus
Learn about various BI architectures and approaches, as well as a comparison of different vendors’ BI offerings. See a demonstration of OLAP cube building. View the video recording and download this deck: http://www.senturus.com/resources/the-business-value-of-business-intelligence/
Senturus, a business analytics consulting firm, has a resource library with hundreds of free recorded webinars, trainings, demos and unbiased product reviews. Take a look and share them with your colleagues and friends: http://www.senturus.com/resources/.
While many utilities look forward to the benefits of deploying smart grid technology, they need help on where and how to start. A manageable distribution management system (DMS) pilot implementation is a good kickoff toward the smart grid goal, because it successfully demonstrates the possibilities while it builds support from stakeholders across the enterprise. Schneider Electric helps the utility implement the DMS pilot, using a phased Build — Learn — Plan — Execute approach.
In such a DMS pilot project, the company works with a small team of utility personnel knowledgeable about the organization’s network data stores and analytical functions. They build a fully operational load flow model that represents a subset of the network, reflects circuit data from the GIS database or other sources, and includes two HV/MV substations and four to eight feeders, ideally with the switches and enough load profile data to support some switching and basic optimization functionality.
In the Learn phase of the DMS pilot, team members evaluate functionality of the model and completeness and accuracy of the data used. Tuning the model builds team confidence in its understanding of the data needed and the accuracy of the basic DMS algorithms.
In the Plan phase, the utility identifies current business plan and internal and external drivers toward grid automation, considering throughout the political and regulatory environments. A strategy is designed to remove obstacles and achieve identified goals.
In the final Execute phase, the utility will contract for software and services; train the core team; develop the system configuration and convert data; and finally deploy the system with site acceptance testing and rollout.
This fast-track DMS implementation gets the utility started and moving at the ‘speed of value’ as it builds confidence in DMS technology. It offers a proof of concept of DMS benefits across the organization: more reliable service, reduction of peak demand, utility cost savings and more.
ADV Slides: How to Improve Your Analytic Data Architecture Maturity - DATAVERSITY
Many organizations are immature when it comes to data use. The answer lies in delivering a greater level of insight from data, straight to the point of need. Enter: machine learning.
In this webinar, William will look at categories of organizational response to the challenge across strategy, architecture, modeling, processes, and ethics. Machine learning maturity levels tend to move in harmony across these categories. As a general principle of maturity models, you can’t skip levels in any category, nor can you advance in one category well beyond the others.
Vis-à-vis ML, attaining and retaining momentum up the model is paramount for success. You will ascend the model through concerted efforts delivering business wins utilizing progressive elements of the model, and thereby increasing your machine learning maturity. The model will evolve. No plateaus are comfortable for long.
With ML maturity markers, sequencing, and tactics, this webinar provides a plan for how to build analytic Data Architecture maturity in your organization.
How Data Virtualization Puts Machine Learning into Production (APAC) - Denodo
Watch full webinar here: https://bit.ly/3mJJ4w9
Advanced data science techniques, like machine learning, have proven an extremely useful tool for deriving valuable insights from existing data. Platforms like Spark and complex libraries for R, Python and Scala put advanced techniques at the fingertips of data scientists. However, these data scientists spend most of their time looking for the right data and massaging it into a usable format. Data virtualization offers a new alternative to address these issues in a more efficient and agile way.
Attend this session to learn how companies can use data virtualization to:
- Create a logical architecture to make all enterprise data available for advanced analytics exercises
- Accelerate data acquisition and massaging, providing the data scientist with a powerful tool to complement their practice
- Integrate popular tools from the data science ecosystem: Spark, Python, Zeppelin, Jupyter, etc.
Discussion on strategies for shaping model behaviors and approaches for a modern (contemporary) IT practice in higher education. CampusWorks, Inc. annual meeting 2016.
Matrix of collaborative IT projects referenced in panel discussion “Collaboration by Design, Innovation with Purpose” at the EDUCAUSE annual conference Nov. 2011.
Report proposing the establishment of a cyberinfrastructure for Tennessee to enable collaborative research among Oak Ridge National Laboratory (ORNL), Tennessee Board of Regents (TBR), and the University of Tennessee (UT).
EDUCAUSE Live! presentation given September 8, 2010. Talent management is the process of attracting, selecting, training, developing and promoting employees throughout the institution. A focus on obtaining and developing talent ensures that the staff has the tools/support/resources necessary to perform well, are properly motivated/compensated, and are ready to transition into leadership roles as appropriate. They become valuable assets because over time they develop the necessary core competencies and internalized institutional core values.
Credit Card Computers and Their Application in HE - Thomas Danford
Presented at THEITS 2014: The Raspberry Pi (RPi) and Beaglebone Black (BBB) are small single-board computers that bring relatively new computer concepts to higher education. The idea is to replace traditional expensive equipment with relatively inexpensive equipment that gives the student/user the freedom to experiment through trial and error without the fear/consequences of crashing more expensive systems. This session gave an overview of each board’s hardware, necessary peripherals, optional accessories, OS and development software, and their strengths/weaknesses/limitations. The new learning model these boards offer, the trade-offs, and areas in higher education in which they may play a role in learning and other applications were also discussed.
Providing Metrics for Decision Makers - CoHEsion13 - Thomas Danford
Departments across any institution, from finance to HR, enrollment to alumni, to student services et al., management is constantly looking for ways to improve the performance of their organizations and initiatives. Nevertheless, providing metrics to enable decision makers to align departmental goals with the mission of the institution is difficult. This presentation will chronicle what the Tennessee Board of Regents is doing to lower the barriers of cost, time, and quality in delivering actionable metrics to campus leaders across the system.
10 Determinants and 13 Ground Rules CoHEsion13Thomas Danford
10 Determinants & 13 Ground Rules that Improve Institutional Performance
Improving both the quality of service that your organization delivers along with the value of the employees that deliver the service are two crucial pillars in institutional performance. This discussion will focus on the application of the “10 Determinants of Service Quality” along with the “13 Ground Rules for Success in the Information Age” in managing an organization. The 10 Determinants will focus on the understanding of where the service quality “perception gap” arises and how management can address it. In a similar fashion, the 13 Ground Rules will provide the backdrop for what kind of employees we need to look for and develop.
Keynote for the Tennessee Association for Institutional Researchers (TENNAIR) 2013 conference. The theme of the conference being “big data” the presentation centered around the big data project of the Tennessee Board of Regents.
During the June 2010 quarterly meeting of the Tennessee Board of Regents, board members approved an implementation plan recommended by the National Center for Higher Education Management Systems (NCHEMS) that called for the creation of a data warehouse to be used to enhance decision-making at both the system and campus levels. The strategy now referred to as the “Common Data Repository” (CDR) is to create a single authoritative data warehouse where data from institutions will automatically be fed into the CDR from their Banner administrative systems be they hosted or located at the campus. The presentation provided an overview of the project as to its strategic purpose, how the technology will work, and the role that the functional users will play (including governance).
It was an honor to be asked to participate in the 2013 Ellucian Live's Executive Forum concurrent session on "The ROI of Consolidating ERPs and Services Across Multiple Campuses." These are the slides used during the presentation and a better description of the projects can be found at: http://tdanford.blogspot.com/
Six institutions collectively investigating hosting resulted in a unique cloud collaboration with a third-party provider. Due diligence determined that virtualization and clustering technologies provided real cost savings and Tier 3–4 facility benefits. This poster session chronicled the process and describe the pros and cons, cost factors, tangible and intangible benefits, and lessons learned. Poster URL: http://bit.ly/RgEROJ
These were the poster session slides
TBR Business Process Improvement EDUCAUSE12Thomas Danford
On-line presentation at EDUCAUSE 2012: The Tennessee Board of Regents embarked on a multi-institution business process realignment project for the system's 13 community colleges. The project identified 255 initiatives that defined process improvements in multiple ways, including process optimization, policy, and training. This session chronicled the project from its innovative approach through lessons learned.
In the early fall of 2012 the TBR signed an agreement with SciQuest for an eProcurement and Spend Management solution for the entire system. At the TN-Summit I led a panel discussion on the rationale for the project and steps going forward. These are the slides that were used to stimulate the discussion with the audience.
Presentation given at the TNSCORE 2012 annual conference. Tennessee is a designated EPSCoR state. EPSCoR (Experimental Program to Stimulate Competitive Research) is a program administered by the National Science Foundation to assist states in boosting the level of research funding provided by NSF. This presentation gave an overview of the development of cyberinfrastructure in the state as well as planned future improvements.
An Exploration: Moving Your Enterprise to a Cloud CollaborationThomas Danford
Presentation at Educause Southeast 2012 - The ever-costly hardware refresh cycle for administrative systems, coupled with budget cutbacks and IT audit findings, prompted five community colleges and their system offices to explore hosting or cloud computing as an alternative to independent systems at each of their campuses. Is collaborating in such a move to the cloud truly a viable option for lowering or maintaining current costs, both in real dollars and in staff hours? Can benefits be realized in terms of providing enhanced, more secure services, better redundancy, and increased availability and scalability? What issues arise when institutions collaborate in such a venture? Bring your own experiences and questions to this open dialogue where we'll create a working roadmap that you and others can follow.
Rethinking Disaster Prepardness to Leverage Resources in a Cloud and Mobile World: Presentation given at the 2012 Tennessee Higher Education Symposium (THEITS) - In many respects the disaster recovery plans of today are based upon the environments of old where commodity hardware, cloud resources and mobile devices didn’t exist. In November of 2011 the Tennessee Board of Regents office became the first public higher education organization to move its ERP system to the cloud by having it hosted at the state’s new data center. The following January, state auditors came on site to perform a routine biennial audit. The audit process included an information systems and disaster recovery component which led to a complete rethinking of disaster recovery in the new environment. This presentation chronicled the issues of moving mission critical systems to the cloud and how cloud resources from various sources coupled with mobile devices can be incorporated for cost effective disaster recovery planning.
Elevating Tactical DDD Patterns Through Object CalisthenicsDorra BARTAGUIZ
After immersing yourself in the blue book and its red counterpart, attending DDD-focused conferences, and applying tactical patterns, you're left with a crucial question: How do I ensure my design is effective? Tactical patterns within Domain-Driven Design (DDD) serve as guiding principles for creating clear and manageable domain models. However, achieving success with these patterns requires additional guidance. Interestingly, we've observed that a set of constraints initially designed for training purposes remarkably aligns with effective pattern implementation, offering a more ‘mechanical’ approach. Let's explore together how Object Calisthenics can elevate the design of your tactical DDD patterns, offering concrete help for those venturing into DDD for the first time!
Encryption in Microsoft 365 - ExpertsLive Netherlands 2024Albert Hoitingh
In this session I delve into the encryption technology used in Microsoft 365 and Microsoft Purview. Including the concepts of Customer Key and Double Key Encryption.
Neuro-symbolic is not enough, we need neuro-*semantic*Frank van Harmelen
Neuro-symbolic (NeSy) AI is on the rise. However, simply machine learning on just any symbolic structure is not sufficient to really harvest the gains of NeSy. These will only be gained when the symbolic structures have an actual semantics. I give an operational definition of semantics as “predictable inference”.
All of this illustrated with link prediction over knowledge graphs, but the argument is general.
Software Delivery At the Speed of AI: Inflectra Invests In AI-Powered QualityInflectra
In this insightful webinar, Inflectra explores how artificial intelligence (AI) is transforming software development and testing. Discover how AI-powered tools are revolutionizing every stage of the software development lifecycle (SDLC), from design and prototyping to testing, deployment, and monitoring.
Learn about:
• The Future of Testing: How AI is shifting testing towards verification, analysis, and higher-level skills, while reducing repetitive tasks.
• Test Automation: How AI-powered test case generation, optimization, and self-healing tests are making testing more efficient and effective.
• Visual Testing: Explore the emerging capabilities of AI in visual testing and how it's set to revolutionize UI verification.
• Inflectra's AI Solutions: See demonstrations of Inflectra's cutting-edge AI tools like the ChatGPT plugin and Azure Open AI platform, designed to streamline your testing process.
Whether you're a developer, tester, or QA professional, this webinar will give you valuable insights into how AI is shaping the future of software delivery.
Smart TV Buyer Insights Survey 2024 by 91mobiles.pdf91mobiles
91mobiles recently conducted a Smart TV Buyer Insights Survey in which we asked over 3,000 respondents about the TV they own, aspects they look at on a new TV, and their TV buying preferences.
GraphRAG is All You need? LLM & Knowledge GraphGuy Korland
Guy Korland, CEO and Co-founder of FalkorDB, will review two articles on the integration of language models with knowledge graphs.
1. Unifying Large Language Models and Knowledge Graphs: A Roadmap.
https://arxiv.org/abs/2306.08302
2. Microsoft Research's GraphRAG paper and a review paper on various uses of knowledge graphs:
https://www.microsoft.com/en-us/research/blog/graphrag-unlocking-llm-discovery-on-narrative-private-data/
Builder.ai Founder Sachin Dev Duggal's Strategic Approach to Create an Innova...Ramesh Iyer
In today's fast-changing business world, Companies that adapt and embrace new ideas often need help to keep up with the competition. However, fostering a culture of innovation takes much work. It takes vision, leadership and willingness to take risks in the right proportion. Sachin Dev Duggal, co-founder of Builder.ai, has perfected the art of this balance, creating a company culture where creativity and growth are nurtured at each stage.
UiPath Test Automation using UiPath Test Suite series, part 3DianaGray10
Welcome to UiPath Test Automation using UiPath Test Suite series part 3. In this session, we will cover desktop automation along with UI automation.
Topics covered:
UI automation Introduction,
UI automation Sample
Desktop automation flow
Pradeep Chinnala, Senior Consultant Automation Developer @WonderBotz and UiPath MVP
Deepak Rai, Automation Practice Lead, Boundaryless Group and UiPath MVP
Slack (or Teams) Automation for Bonterra Impact Management (fka Social Soluti...Jeffrey Haguewood
Sidekick Solutions uses Bonterra Impact Management (fka Social Solutions Apricot) and automation solutions to integrate data for business workflows.
We believe integration and automation are essential to user experience and the promise of efficient work through technology. Automation is the critical ingredient to realizing that full vision. We develop integration products and services for Bonterra Case Management software to support the deployment of automations for a variety of use cases.
This video focuses on the notifications, alerts, and approval requests using Slack for Bonterra Impact Management. The solutions covered in this webinar can also be deployed for Microsoft Teams.
Interested in deploying notification automations for Bonterra Impact Management? Contact us at sales@sidekicksolutionsllc.com to discuss next steps.
Transcript: Selling digital books in 2024: Insights from industry leaders - T...BookNet Canada
The publishing industry has been selling digital audiobooks and ebooks for over a decade and has found its groove. What’s changed? What has stayed the same? Where do we go from here? Join a group of leading sales peers from across the industry for a conversation about the lessons learned since the popularization of digital books, best practices, digital book supply chain management, and more.
Link to video recording: https://bnctechforum.ca/sessions/selling-digital-books-in-2024-insights-from-industry-leaders/
Presented by BookNet Canada on May 28, 2024, with support from the Department of Canadian Heritage.
JMeter webinar - integration with InfluxDB and GrafanaRTTS
Watch this recorded webinar about real-time monitoring of application performance. See how to integrate Apache JMeter, the open-source leader in performance testing, with InfluxDB, the open-source time-series database, and Grafana, the open-source analytics and visualization application.
In this webinar, we will review the benefits of leveraging InfluxDB and Grafana when executing load tests and demonstrate how these tools are used to visualize performance metrics.
Length: 30 minutes
Session Overview
-------------------------------------------
During this webinar, we will cover the following topics while demonstrating the integrations of JMeter, InfluxDB and Grafana:
- What out-of-the-box solutions are available for real-time monitoring JMeter tests?
- What are the benefits of integrating InfluxDB and Grafana into the load testing stack?
- Which features are provided by Grafana?
- Demonstration of InfluxDB and Grafana using a practice web application
To view the webinar recording, go to:
https://www.rttsweb.com/jmeter-integration-webinar
Generating a custom Ruby SDK for your web service or Rails API using Smithyg2nightmarescribd
Have you ever wanted a Ruby client API to communicate with your web service? Smithy is a protocol-agnostic language for defining services and SDKs. Smithy Ruby is an implementation of Smithy that generates a Ruby SDK using a Smithy model. In this talk, we will explore Smithy and Smithy Ruby to learn how to generate custom feature-rich SDKs that can communicate with any web service, such as a Rails JSON API.
Generating a custom Ruby SDK for your web service or Rails API using Smithy
TBR ODS EDW Planning 2007
1. Tennessee Board of Regents
ODS/EDW Planning
Presented by Thomas Danford
Fall 2007
2. Information Requirements of the TBR

Audiences: Institutional Research, Administrative and Academic Departments, CIO/IT, Executive Level.

- A single, trusted source of institutional data
- Accurate data based on consistently applied institutional business rules
- Easy-to-understand data layout coupled with industry-based, easy-to-use reporting tools that allow end-users to satisfy their own requests
- Robust Information Access architecture based on commercially supported tools
- An abstracted data model built with and for higher education
- Support for a choice of reporting tools to meet varied needs and budgets
- The same information "visible" to all layers of the institution
- Less time arguing over data; more time using information to make better and timely decisions
3. Reporting out of Banner can be Complex

EXAMPLE: The simple report below for 5 pieces of information requires accessing data fields from 4 Banner tables:

ID          Name            Address                          Phone          Age
_______________________________________________________________
123-43-4564 Buler, Brian    1464 West Street West Chester PA (453)657-5768  43
324-64-3445 Carol, Bob      84 South Street Philadelphia PA  (728)495-4535  34
848-39-4848 Crain, Joan     23 Brandywine Kenneth PA                        23
834-56-3443 Kaachen, Perry  3444 Broadview Allentown PA      (435)309-5445  54
…
4. Banner Normalizes Data (3rd Normal Form) so that it exists only in one location

The tables for the example report:
- SPRIDEN
- SPBPERS
- SPRADDR
- SPRTELE
5. SQL is the programming language used to manage data in Banner

The SQL code for the example report:

SELECT SPRIDEN_ID, SPRIDEN_NAME, A.SPRADDR_STREET_LINE1, A.SPRADDR_CITY,
       A.SPRADDR_STAT_CODE, A.SPRADDR_ZIPC_CODE,
       B.SPRTELE_PHONE_AREA_CODE || '-' || B.SPRTELE_PHONE_NUMBER,
       FUNC(AGE(SPBPERS_BIRTH_DATE))
FROM   SPRIDEN, SPRADDR A, SPRTELE B, SPBPERS
WHERE  SPRIDEN_CHANGE_IND IS NULL
AND    SPRIDEN_PIDM = SPBPERS_PIDM (+)
AND    SPBPERS_PIDM = A.SPRADDR_PIDM (+)
AND    SYSDATE BETWEEN A.SPRADDR_FROM_DATE (+) AND A.SPRADDR_END_DATE (+)
AND    A.SPRADDR_ADDR_CODE (+) = FUNC(ADDRHICR(SPRIDEN_PIDM, 'OF'))
AND    A.SPRADDR_ASTA_CODE (+) = 'A'
AND    A.SPRADDR_PIDM = B.SPRTELE_PIDM (+)
AND    A.SPRADDR_ADDR_CODE = B.SPRTELE_ADDR_CODE (+)
AND    B.SPRTELE_SEQ_NO (+) = (SELECT MAX(C.SPRTELE_SEQ_NO)
                               FROM SPRTELE C
                               WHERE C.SPRTELE_PIDM = B.SPRTELE_PIDM
                               AND   C.SPRTELE_ADDR_CODE = B.SPRTELE_ADDR_CODE) …
… ORDER BY SPRIDEN_NAME;
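The contrast the deck is drawing can be sketched in a few lines: once the data has been denormalized into a single reporting table, the four-table outer join above collapses to a plain single-table SELECT. This is only a sketch; the table name MST_PERSON and its columns are hypothetical stand-ins for whatever names the actual ODS model uses (illustrated here with Python's built-in sqlite3 in place of Oracle).

```python
import sqlite3

# Hypothetical denormalized ODS reporting table; real ODS names will differ.
conn = sqlite3.connect(":memory:")
conn.execute("""
    CREATE TABLE MST_PERSON (
        ID TEXT, NAME TEXT, STREET TEXT, CITY TEXT,
        STATE TEXT, PHONE TEXT, AGE INTEGER
    )
""")
conn.execute(
    "INSERT INTO MST_PERSON VALUES "
    "('123-43-4564', 'Buler, Brian', '1464 West Street', 'West Chester', "
    "'PA', '(453)657-5768', 43)"
)

# The equivalent of the multi-table Banner join: one table, no outer joins,
# no address-hierarchy function, no sequence-number subquery.
rows = conn.execute(
    "SELECT ID, NAME, STREET, CITY, PHONE, AGE "
    "FROM MST_PERSON ORDER BY NAME"
).fetchall()
for row in rows:
    print(row)
```

The end user's query now reads like the report itself, which is the point of "simplified & standardized data structures."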
6. Approach Under Plus - IT Department Made Reports Based on Specifications Provided by Functional Users …

- Developing specifications was frustrating
- Inefficient, iterative process
- Very time consuming
- Multiple functional users competing for limited IT resources
- Functional users developed a dependency on IT
7. Information Access Maturity Model

Under SunGard Plus, institutions could reach only the first two levels of the model (maturity grows over time):

- Level 1: Baseline Reports
- Level 2: Custom Reports (reporting created by IT staff)
- Level 3: Self-Service Reporting (ad-hoc) [not available]
- Level 4: Analytical Reporting (historical trends, forecasting, data mining) [not available]
- Level 5: Institutional Performance Management (process improvement, dashboard/scorecard, analytical applications) [not available]
8. ODS/EDW & Banner Approach

Instead of continually spending IT staff time and budget on developing and programming reports for functional users, we will develop the infrastructure that simplifies how information is accessed.

The infrastructure must support the enterprise and provide both a common framework and methodology for solving information access problems.

Simplifying the access gives both IT and functional users the information they need, when they need it, with less effort.
9. Information Access Maturity Model

Each level of the model maps to a SunGard product, and maturity grows with institution adoption over time:

- Level 1: Baseline Reports (SunGard Banner)
- Level 2: Custom Reports (SunGard Banner)
- Level 3: Self-Service Reporting: ad-hoc (SunGard Operational Data Store)
- Level 4: Analytical Reporting: historical trends, forecasting, data mining (SunGard Enterprise Data Warehouse)
- Level 5: Institutional Performance Management: process improvement, dashboard/scorecard, analytical applications (SunGard Performance Management Solutions)
10. Information Access Maturity Model

Data flows from SunGard Banner (and other systems) through ETL into the Operational Data Store, and from the ODS through ETL into the Enterprise Data Warehouse, which feeds the Performance Management Solutions. Each level of the model is served by a different layer of this architecture:

- Level 1: Banner queries & reports
- Level 2: Custom reports
- Level 3: Self-service reports against the ODS (Oracle Discoverer, Cognos Impromptu, others via any ODBC tool)
- Level 4: OLAP cubes against the EDW (Cognos Powerplay, others)
- Level 5: KPIs, alerts, scorecards, and dashboards from the Performance Management Solutions
12. Data Modeling Strategy

Model specifically designed for end-user reporting:
- Applicable TBR (enterprise) wide
- Simplified & standardized data structures
- User-friendly table and field names
- Designed from a reporting perspective
- Indexed and optimized for reporting
- Flexible enough for individual campus requirements
- Consistency between Banner product releases
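One common way to deliver "user-friendly table and field names" over cryptic Banner structures is a reporting view that renames columns. A minimal sketch, assuming hypothetical names (the view name PERSON and its aliases are illustrative, not the actual ODS model; sqlite3 stands in for Oracle):

```python
import sqlite3

conn = sqlite3.connect(":memory:")
# Miniature of a Banner table with its cryptic column-name convention.
conn.execute(
    "CREATE TABLE SPRIDEN (SPRIDEN_PIDM INTEGER, SPRIDEN_ID TEXT, SPRIDEN_NAME TEXT)"
)
conn.execute("INSERT INTO SPRIDEN VALUES (1001, '123-43-4564', 'Buler, Brian')")

# A reporting view exposes the same data under self-describing names, so
# end users never have to learn SPRIDEN_* identifiers.
conn.execute("""
    CREATE VIEW PERSON AS
    SELECT SPRIDEN_PIDM AS INTERNAL_ID,
           SPRIDEN_ID   AS PERSON_ID,
           SPRIDEN_NAME AS FULL_NAME
    FROM SPRIDEN
""")
row = conn.execute("SELECT PERSON_ID, FULL_NAME FROM PERSON").fetchone()
print(row)  # ('123-43-4564', 'Buler, Brian')
```

Because the view is a layer over the source tables, the same renaming can be kept consistent across Banner product releases without touching end-user reports.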
13. Key Roles in ODS/EDW

(Senior) Data Architect(s) – An IT management role that is responsible for the data modeling, ETL processing, and administration of the ODS/EDW. Experience and responsibilities:
- Designs data dictionaries & data warehouses
- Enterprise application and analytics integration
- Metadata registry
- Oracle RDBMS and Structured Query Language (SQL)
- XML, including schema definitions and transformations
- ODBC/JDBC linkages & reporting tools

Business & Systems Analyst(s) – An internal consultancy role that identifies options for improving business systems and bridges the information needs of the business with the use of the ODS/EDW. Experience and responsibilities:
- Gathers and defines business requirements
- Analyzes, maps, and documents processes (current state/future state)
- Identifies, collects, and analyzes data requirements (forecasting, trend analysis, etc.)
- Contributes to the design of the ODS/EDW from a business needs point of view
- Creates required reports, KPIs, alerts, scorecards, dashboards, etc.
- ODBC/JDBC reporting tools
14. Data Model Approach & Responsibilities

Model specifically designed for On-Line Analytical Processing (OLAP) reporting:
- Enterprise wide
- Star schemas
- Designed from a departmental perspective
- Multidimensional business analysis
- Complex calculations
- Trend analysis
- Sophisticated data modeling
- Documenting the ODS/EDW (ERDs, data dictionaries, flow charts, metadata, etc.)
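A star schema puts measures in a central fact table and descriptive attributes in surrounding dimension tables, so multidimensional analysis becomes a join out to each dimension plus a GROUP BY. A minimal sketch under assumed names (FACT_ENROLLMENT, DIM_TERM, and DIM_COLLEGE are hypothetical, not the actual EDW model; sqlite3 stands in for Oracle):

```python
import sqlite3

conn = sqlite3.connect(":memory:")
# Hypothetical star schema: one fact table surrounded by dimension tables.
conn.executescript("""
    CREATE TABLE DIM_TERM    (TERM_KEY INTEGER PRIMARY KEY, TERM_DESC TEXT);
    CREATE TABLE DIM_COLLEGE (COLLEGE_KEY INTEGER PRIMARY KEY, COLLEGE_DESC TEXT);
    CREATE TABLE FACT_ENROLLMENT (
        TERM_KEY INTEGER, COLLEGE_KEY INTEGER, CREDIT_HOURS REAL
    );
    INSERT INTO DIM_TERM    VALUES (1, 'Fall 2007');
    INSERT INTO DIM_COLLEGE VALUES (1, 'Business'), (2, 'Engineering');
    INSERT INTO FACT_ENROLLMENT VALUES (1, 1, 12), (1, 1, 15), (1, 2, 9);
""")

# Multidimensional analysis: join the fact table to each dimension and
# aggregate the measure by whatever attributes the analyst picks.
rows = conn.execute("""
    SELECT t.TERM_DESC, c.COLLEGE_DESC, SUM(f.CREDIT_HOURS)
    FROM FACT_ENROLLMENT f
    JOIN DIM_TERM t    ON t.TERM_KEY = f.TERM_KEY
    JOIN DIM_COLLEGE c ON c.COLLEGE_KEY = f.COLLEGE_KEY
    GROUP BY t.TERM_DESC, c.COLLEGE_DESC
    ORDER BY c.COLLEGE_DESC
""").fetchall()
print(rows)
```

Adding another slice (say, student demographics) is just another dimension table and another join, which is why this shape suits departmental trend analysis and complex calculations.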
16. ETL Processing Responsibilities

- Commercial ETL tool: Oracle Warehouse Builder (OWB)
  - Improved customization
  - Improved extensibility
  - User-defined time slices
- Incremental refresh: only move what has changed
- Feeds from campuses to the TBR ODS/EDW, and from the ODS/EDW to campuses
- Custom warehouses (MTSU & UoM)
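"Only move what has changed" is typically implemented by filtering the source on a last-modified timestamp recorded at the previous ETL run, then upserting the matching rows into the target. A sketch under assumed names (SRC_PERSON, ODS_PERSON, and the ACTIVITY_DATE column are hypothetical; this is not the OWB mechanism itself, just the idea, shown with sqlite3):

```python
import sqlite3

conn = sqlite3.connect(":memory:")
# Hypothetical source and target; ACTIVITY_DATE stands in for a
# last-modified timestamp on the source table.
conn.executescript("""
    CREATE TABLE SRC_PERSON (PIDM INTEGER PRIMARY KEY, NAME TEXT, ACTIVITY_DATE TEXT);
    CREATE TABLE ODS_PERSON (PIDM INTEGER PRIMARY KEY, NAME TEXT);
    INSERT INTO SRC_PERSON VALUES
        (1, 'Buler, Brian', '2007-09-01'),
        (2, 'Carol, Bob',   '2007-10-15');
""")

def incremental_refresh(conn, last_run):
    """Copy only rows changed since the previous ETL run (an upsert)."""
    changed = conn.execute(
        "SELECT PIDM, NAME FROM SRC_PERSON WHERE ACTIVITY_DATE > ?",
        (last_run,),
    ).fetchall()
    conn.executemany(
        "INSERT OR REPLACE INTO ODS_PERSON VALUES (?, ?)", changed
    )
    return len(changed)

# Only the row modified after the last load date is moved.
moved = incremental_refresh(conn, "2007-09-30")
print(moved)
```

Keeping the extract this small is what makes frequent campus-to-TBR feeds affordable compared with full reloads.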
18. Administration Responsibilities

- Administration tool for execution and monitoring of ETL processes (web-based)
- Job submission/scheduling and logging
- Snapshots
- Transformation setup and maintenance
- Data cleansing
- Security (FGA)
- Metadata repository
19. Data Warehouse Implementation Roadmap

Business Analysis
- Federal, State, THEC requirements & analysis
- Business discovery
- Project scoping
- Requirements gathering
- Project & infrastructure planning

Data Structure Implementation
- DB architecture
- DB design
- Extraction, transformation, load
- Management & testing

Reporting and Analytics Development
- Reporting & analytics architecture
- Reporting & analytics design
- Implementation (KPIs, dashboards, alerts, etc.)
- Deployment & testing
- Support
20. Discussion … Q&A

Special thanks to SunGard HE who provided content for this presentation.