The DATA Act: A Revolution in Federal Government Transparency
1. The DATA Act: A Revolution in Federal Government Transparency
www.pwc.com/publicsector
On May 9, 2014, the Digital Accountability and Transparency Act (DATA Act) was signed into law. The Act, drawing some lessons from both the Federal Funding Accountability and Transparency Act of 2006 (FFATA) and the American Recovery and Reinvestment Act of 2009 (ARRA), is the nation’s first legislative mandate for data transparency. It requires the Department of the Treasury (Treasury) and the White House Office of Management and Budget (OMB) to transform U.S. federal spending from disconnected documents into open, standardized data, and to publish that data online.
2. What will the DATA Act do?
The DATA Act amends and augments FFATA in order to increase accountability, transparency, accessibility, quality, and standardization in federal spending data. Under the law, each federal agency will report financial and payment information in accordance with government-wide financial data standards developed and issued by Treasury and OMB. The wide array of reports and data compilations related to spending that currently exist (financial management, payments, budget actions, procurements, grants, and assistance) can be standardized and streamlined.
The objectives of the DATA Act are multifold. The Act aims to:
• Track direct federal agency expenditures and link federal contract, loan, and grant spending information to programs of federal agencies
• Establish government-wide data standards for financial data and provide searchable government-wide spending data that is accessible and reusable
• Apply data analytics to standardized federal spending data to better illuminate potential waste and fraud
• Simplify reporting for recipients of federal awards by consolidating reporting requirements, eliminating duplicative reporting, and improving transparency and process effectiveness
The DATA Act stipulates a general timeline for Treasury and OMB to implement the new requirements, shown in Figure 1.
A few items in the timeline deserve further discussion. First, upon enactment of the law, the Secretary of the Treasury has the option to establish a data analysis center to examine federal spending data to prevent and reduce improper payments and improve efficiency. If such a center is established, the Recovery Accountability and Transparency Board (RATB) could become its core.
Second, one year following the enactment of the DATA Act, OMB (or another agency it designates) will establish a pilot program to inform recommendations surrounding government-wide data standards, the elimination of duplicative financial reporting, and the reduction of compliance costs for federal award recipients. The program, which will run for two years, will include recipients of federal contracts, grants, and sub-awards across multiple programs and agencies.
Third, Treasury and OMB will issue guidance on government-wide data standards for federal spending within one year of enactment. Two years after the release of this guidance, each agency must report financial and payment information in accordance with the government-wide standards. The following year, all information published on USASpending.gov must be consistent with government-wide data standards.
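As a back-of-the-envelope check on this schedule, the sketch below derives the key deadlines described above from the enactment date. This is a minimal illustration in Python, assuming simple anniversary-date deadlines; the milestone labels are paraphrases of the provisions above, not statutory text.

```python
from datetime import date

ENACTED = date(2014, 5, 9)  # DATA Act signed into law

def years_after(start: date, years: int) -> date:
    """Return the same calendar date `years` later."""
    return start.replace(year=start.year + years)

# Paraphrased milestones keyed to the enactment date (illustrative only).
milestones = {
    "Treasury/OMB issue data-standard guidance": years_after(ENACTED, 1),
    "OMB (or designee) establishes pilot program": years_after(ENACTED, 1),
    "Agencies report per government-wide standards": years_after(ENACTED, 3),
    "USASpending.gov data consistent with standards": years_after(ENACTED, 4),
}

for label, deadline in sorted(milestones.items(), key=lambda kv: kv[1]):
    print(f"{deadline:%B %Y}: {label}")
```

Running this reproduces the May 2015 guidance date and the 2017 agency reporting deadline discussed in the sections that follow.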
3. What are the potential implications for Federal organizations?
Simply put, the DATA Act is a government-wide requirement. As such, by 2017, organizations will report their financial and payment information in the government-wide data standards determined by Treasury and OMB, along with meeting the Act’s other requirements. For example, for all federal spending, posted information must include appropriations, obligations, unobligated balances, and outlays (i.e., disbursements). Currently, details on appropriations normally are not published aside from legislative text and appropriation justification books. Treasury will publish a breakdown of each appropriation as well as detail for each account, showing the amounts received, obligated, and spent, along with a further breakdown by program activity and object class. This means the flow of federal funds from appropriation to account to expenditure can be publicly available and accessible in standardized, structured data formats.
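To make the relationship among those posted figures concrete, here is a purely hypothetical account-level example in Python. The dollar amounts and field names are illustrative assumptions, not official DATA Act element names, and the arithmetic reflects a simplified view of budget execution.

```python
# Hypothetical account-level figures, in dollars.
appropriation = 10_000_000   # budget authority provided by Congress
obligations   = 7_500_000    # binding commitments (contracts, grants, ...)
outlays       = 6_200_000    # actual disbursements to date

# Unobligated balance is what remains of the appropriation after
# obligations have been incurred.
unobligated_balance = appropriation - obligations
assert unobligated_balance == 2_500_000

# In this simplified view, outlays liquidate obligations and so
# cannot exceed them.
assert outlays <= obligations

print(f"Unobligated balance: ${unobligated_balance:,}")
```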
The comprehensive nature of DATA Act reporting will require federal organizations to coordinate across stakeholders, systems, and sources of financial information. Chief Financial Officer (CFO) organizations, given their responsibility for setting standards for financial management data and for developing and managing financial policies, will be best positioned in a majority of departments and agencies to take the lead in the initial DATA Act coordination efforts.
Although Treasury and OMB are not expected to issue guidance on the federal spending data standards until May 2015, and meeting those standards is an additional two years away, organizations can begin preparing now in a proactive manner. Rather than waiting to react to the data standards once they are issued, CFOs can begin to understand and plan for how the DATA Act may affect current responsibilities, stakeholders, and systems. Furthermore, by engaging in the effort early, they could potentially help shape the standards.
Figure 1: DATA Act Implementation Timeline
4. A “Data Centric” Approach
Many have speculated that DATA Act adherence will require updates or upgrades to existing systems, or possibly the development and fielding of new systems, to facilitate compliant data reporting (what could be called a “software centric” approach). While some software will be necessary to facilitate mapping from existing systems to the agreed standard, the focus should instead be on the data rather than the software: a “data centric” approach.
First, unique identifiers for federal awards, and for the entities receiving them, must be among the common data elements developed. These unique identifiers should be applied consistently government-wide. Ultimately, common data elements like unique identifiers allow information from different reports to be joined. A common data standard that is widely accepted, non-proprietary, searchable, platform-independent, and computer-readable can then be used. This will facilitate automation of the federal government’s many separate spending reports; allow them to be checked for quality and analyzed for waste, fraud, and abuse; facilitate reusable access; and allow for aggregation without double counting.
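A minimal sketch of what such a standardized, computer-readable record might look like follows. The element names, identifier formats, and the JSON serialization are assumptions made for illustration; they are not the official government-wide standard, which Treasury and OMB had yet to issue.

```python
import json
from dataclasses import dataclass, asdict

@dataclass
class SpendingRecord:
    """One award transaction in a hypothetical common format."""
    award_id: str        # unique, government-wide award identifier
    recipient_id: str    # unique identifier of the receiving entity
    agency: str
    program_activity: str
    object_class: str
    obligation_usd: float

record = SpendingRecord(
    award_id="AWD-2015-000123",
    recipient_id="ENT-987654",
    agency="Department of Example",
    program_activity="Grants Administration",
    object_class="41.0 Grants, subsidies, and contributions",
    obligation_usd=250_000.0,
)

# A non-proprietary, platform-independent serialization lets reports from
# different agencies be joined on the shared identifiers without double
# counting.
print(json.dumps(asdict(record), indent=2))
```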
Second, agencies and departments could begin mapping all required DATA Act elements down to the transaction level in the appropriate systems of record. Transactional data facilitates real-time searches and queries to the “checkbook level,” in accordance with the stated Treasury goal. A key point here is that there is no limit to the number or types of systems that can be involved. An organization should not worry that it has “too many” systems from which data need to be drawn. As long as they are source systems of record, they can be included regardless of type, age, or platform. (Possible exceptions include the use of an existing reporting or data warehouse system for some or all required data; the exclusion of a source system that is classified or sensitive; or a system that is unable to accommodate a real-time interface. This last point is discussed in greater detail below.)
5. Assuming a common data standard and the mapping of required elements in existing systems, the next step will be for organizations to map from their existing data sources to the agreed standardized data definitions, the common language. There is a broad range of commercially available mapping applications to accomplish this, with price points ranging from several hundred to several thousand dollars. The actual mapping effort is a one-time exercise that should not be time consuming to implement and maintain (the exact methodology and solution used will depend on each type of system needing coverage). This will naturally work more easily for enterprise resource planning (ERP) and other web- and cloud-enabled systems that maintain data. However, older legacy mainframe or compiler-based systems, which are still in use in some agencies, can also be accommodated with additional effort. One method for doing this would be to “replicate” the mainframe data in a database that could then be “wrapped” for data access as discussed above. Solutions of this nature might be somewhat more labor or cost intensive, but should still not rise to the level of a new system implementation or be of such a magnitude as to hinder DATA Act adherence.
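A minimal sketch of the one-time mapping exercise, assuming hypothetical source-system field names; a real implementation would use one of the commercial mapping tools mentioned above rather than hand-written dictionaries.

```python
# Per-system crosswalks from local field names to the hypothetical
# standardized element names used in the earlier sketch.
FIELD_MAPS = {
    "legacy_mainframe": {
        "AWDNO": "award_id",
        "VENDR": "recipient_id",
        "OBAMT": "obligation_usd",
    },
    "modern_erp": {
        "award_number": "award_id",
        "vendor_code": "recipient_id",
        "obligated_amount": "obligation_usd",
    },
}

def to_standard(system: str, row: dict) -> dict:
    """Translate one source-system row into standardized element names."""
    crosswalk = FIELD_MAPS[system]
    return {crosswalk[field]: value
            for field, value in row.items()
            if field in crosswalk}          # unmapped fields are dropped

# A replicated mainframe row, as described in the text.
print(to_standard("legacy_mainframe",
                  {"AWDNO": "AWD-2015-000123", "VENDR": "ENT-987654",
                   "OBAMT": 250000.0, "FILLER": ""}))
```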
Assuming an organization has identified and mapped its data in accordance with DATA Act standards, how might the “data centric” approach work? One possibility is that a semantic query using the common English definitions would pull data elements from the “mapped” systems of record into a consuming report, visualization template, or other analytical application. This arrangement could theoretically accommodate an unlimited number of systems, facilitating simultaneous queries that pull and assemble data from any number of departments and subordinate organizations and render the requested data in a report or visual presentation. The graphic below shows how this might work.
Figure 2: Sample of an illustrative “data-centric” approach
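A toy sketch of the query pattern the figure describes: each mapped system exposes its records in the common language, and a single query assembles results without caring how many systems, or which platforms, sit underneath. The system names, element names, and in-memory representation are illustrative assumptions only.

```python
# Each "system of record" is simulated as a list of already-mapped rows.
systems = {
    "agency_a_erp": [
        {"award_id": "AWD-1", "agency": "Agency A", "obligation_usd": 100.0},
    ],
    "agency_b_mainframe_replica": [
        {"award_id": "AWD-2", "agency": "Agency B", "obligation_usd": 250.0},
        {"award_id": "AWD-3", "agency": "Agency B", "obligation_usd": 50.0},
    ],
}

def query(predicate):
    """Pull matching rows from every mapped system into one result set."""
    return [row for rows in systems.values() for row in rows
            if predicate(row)]

# "Checkbook level" question: all obligations of at least $100.
report = query(lambda row: row["obligation_usd"] >= 100.0)
for row in report:
    print(row)
```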
6. Items for CFO consideration
As in any effort, it is important for organizational CFOs to begin their response to the DATA Act by strategizing. There are several questions that should be considered.
• What stakeholders should be involved? Each CFO and organization should consider who needs to be at the table in order to implement DATA Act requirements. For instance, based on the legislation, there is a strong chance that USASpending.gov will be the platform used to publish financial information under this law. Therefore, if an agency currently provides data to USASpending.gov, the individuals or offices responsible might also be involved in DATA Act efforts, as they oversee the quality of uploaded data. Additionally, standing policy and operations groups, such as an Accounting Policy Working Group, along with representatives from the agency information architecture team, would also merit inclusion in DATA Act implementation and assistance activities.
• What resources will be needed? Early on, organizations and agencies should consider, to the best of their ability, what the level of effort will be over the next four years in order to complete this initiative successfully. Initially, key leaders and personnel would most likely absorb time spent on DATA Act preparation as part of their regular duties and workload. Agencies should assess whether current resource levels can handle ongoing DATA Act responsibilities and duties, and if not, begin activities to plan, forecast, and budget for additional government or contractor personnel support.
• What are the exact DATA Act requirements? Successful execution also hinges on having a clear understanding of the requirements. CFOs initially can focus on the guidelines provided in the DATA Act itself, and then on the guidance issued by Treasury and OMB. When collecting requirements, organizations can use them to identify any gaps that may exist in the selected data system(s) and determine whether supplementary data will need to be gathered from other outside or non-system sources (given some of the considerations discussed above).
• Will governance be required? Finally, CFOs should consider the extensive coordination and collaboration across stakeholder groups that will be required for successful DATA Act implementation. Agencies might therefore look at leveraging program and change management tools to help establish a governance framework. For example, a project plan can be established to propose a baseline schedule along with a work breakdown structure and a timeline for each task. A communication plan could also be developed, including a stakeholder matrix that identifies primary project tasks and the stakeholders required to participate in them. And, once the data has been mapped and automated querying established, organizations can examine how these new capabilities might be leveraged by internal stakeholders (potentially to improve operations, discussed in more detail below) and by external stakeholders (who can analyze the newly transparent data to better assess the impacts of current federal programs and proposed changes).
There is also a major DATA Act benefit for CFOs to consider: how internal programs and processes might be improved. Transparent and accessible data will give many federal organizations a tool to examine and reengineer internal programs and business processes. For example, auditors might be able to leverage newly accessible data to automate required transaction sampling, replacing manual and time-consuming effort. Program analysts should be able to better analyze and benchmark funding and execution data across time and across multiple organizations, allowing for much deeper levels of analytics than had previously been possible. More detailed and standardized “dashboarding” capabilities can be made available to senior leaders across government, allowing for more transparent and timely analysis of key programs and organizations. These potential benefits are only the tip of the iceberg, as many more initiatives and improvements are certain to be identified as organizations and stakeholders take advantage of their data in increasingly creative ways.
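To make the auditing example above concrete, here is a minimal sketch of automated transaction sampling over standardized data. The sample size, selection rule, and record layout are illustrative assumptions; a real audit would follow the auditor’s own sampling methodology.

```python
import random

def sample_transactions(transactions, k, seed=2014):
    """Draw a reproducible simple random sample for audit testing."""
    rng = random.Random(seed)  # fixed seed so the same sample can be re-drawn
    return rng.sample(transactions, min(k, len(transactions)))

# Hypothetical standardized transactions.
transactions = [{"award_id": f"AWD-{i}", "obligation_usd": 1000.0 + i}
                for i in range(1, 501)]

for txn in sample_transactions(transactions, k=5):
    print(txn["award_id"])
```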
7. Conclusion
The DATA Act represents a tremendous opportunity for the federal government to increase citizen and taxpayer transparency and to lay the groundwork for improving its financial reporting and analysis capabilities. Federal CFOs at every level therefore should begin now to engage their organizations’ vital stakeholders and start working towards full DATA Act implementation. Doing so will help ensure the success of this initiative while taking government spending data and analysis to new levels.
References:
• Digital Accountability and Transparency Act (DATA Act), Public Law No. 113-101
• DATA Act, Data Transparency Coalition website, available at http://datacoalition.com/issues/data-act.html
For more information, please contact:
Don McCrory
Principal
(703) 918-1351
donald.g.mccrory@us.pwc.com
Joe Kull
Director
(703) 918-1320
joseph.kull@us.pwc.com
Chris Babcock
Director
(703) 918-4588
christopher.babcock@us.pwc.com
Award-Winning Excellence
PwC’s Public Sector Practice was recently recognized with the Malcolm Baldrige National Quality Award, which was established by Congress for performance excellence through innovation, improvement, and visionary leadership. With a singular focus on government agencies, and the employment of serious thought leaders and seasoned former government officials, including former high-level military leaders, PwC’s Public Sector Practice has an outstanding record of providing exemplary service to government agencies.