The document discusses data quality as a business success factor. It provides two case studies: (1) At automotive supplier ZF Friedrichshafen AG, consistent and accurate master data is required for customer relationship management. (2) At Bayer CropScience, root causes of poor data quality were identified, including a lack of data quality training and heterogeneous data maintenance tools. The document emphasizes that corporate data quality management relates to business strategy and should follow a lifecycle approach. Benefits of improved data quality can include inventory savings and reduced costs of obsolete records.
This presentation illustrates best practices in master data governance through a rich set of case studies. The presentation leverages seven years of in-depth experience in the field from the Competence Center Corporate Data Quality.
The Role of Community-Driven Data Curation for Enterprises - Edward Curry
With the increased use of data in their operational and strategic processes, enterprises need to ensure data quality and accuracy. Data curation is a process that can ensure the quality of data and its fitness for use. Traditional approaches to curation are struggling with increased data volumes and near real-time demands for curated data. In response, curation teams have turned to community crowd-sourcing and semi-automated metadata tools for assistance. This chapter provides an overview of data curation, discusses the business motivations for curating data, and investigates the role of community-based data curation, focusing on internal communities and pre-competitive data collaborations. The chapter is supported by case studies from Wikipedia, The New York Times, Thomson Reuters, the Protein Data Bank, and ChemSpider, from which best practices for both the social and technical aspects of community-driven data curation are derived.
E. Curry, A. Freitas, and S. O’Riáin, “The Role of Community-Driven Data Curation for Enterprises,” in Linking Enterprise Data, D. Wood, Ed. Boston, MA: Springer US, 2010, pp. 25-47.
Developing a Sustainable IT Capability: Lessons From Intel's Journey - Edward Curry
Intel Corporation set itself a goal to reduce its global-warming greenhouse gas footprint by 20% by 2012 from 2007 levels. Through the use of sustainable IT, the Intel IT organization is recognized as a significant contributor to the company’s sustainability strategy by transforming its IT operations and overall Intel operations. This article describes how Intel has achieved IT sustainability benefits thus far by developing four key capabilities. These capabilities have been incorporated into the Sustainable ICT Capability Maturity Framework (SICT-CMF), a model developed by an industry consortium in which the authors were key participants. The article ends with lessons learned from Intel’s experiences that can be applied by business and IT executives in other enterprises.
Modernizing the Enterprise Monolith: EQengineered Consulting Green Paper - Mark Hewitt
Are you an enterprise that recognizes the business liability inherent in the monolithic or otherwise dated enterprise software applications you have built? Does your technology stand in the way of the agility and flexibility required to meet the needs of today’s business environment?
Historically, enterprise software development incorporated all functionality into a single process, replicated across servers as additional capacity was required. Today, these large applications have become bloated and unmanageable as new features and functionality are added. And as small changes are made to existing functionality, the requirement to update and redeploy the entire server-side application becomes intractable.
Forward-thinking organizations like Amazon and Netflix led the way toward agile processes, deconstructed software stacks, and efficient APIs. Both large and small organizations serious about embracing modern practices have followed by decoupling the front and back end of their enterprise applications, employing microservices and cloud technologies, and adopting agile methodologies. These very steps can serve to highlight additional technical deficits in old solutions and codebases, which in turn become stumbling blocks to modern development practices.
As these technology trends continue to evolve, how can your company keep pace and remain viable?
In this green paper, we discuss how CIOs, CTOs, and VPs of Engineering can lead the needed modernization with their counterparts in marketing and the business to ensure that their organizations remain competitive in today’s customer-driven and technology-led economy.
Key questions addressed include:
• Why is technical modernization vital for the business?
• What types of modernization projects are there?
• How does modernization fit into your organization?
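The monolith-to-microservices decoupling the green paper describes can be sketched in a few lines: the same business function first as an in-process call, then extracted behind a small HTTP API so it can be deployed and scaled independently. This is purely an illustrative sketch (the pricing rule and endpoint are hypothetical, not from the paper), using only the Python standard library:

```python
import json
import threading
import urllib.request
from http.server import BaseHTTPRequestHandler, HTTPServer

# Hypothetical business rule standing in for real monolith functionality.
def quote_price(sku: str) -> float:
    return 100.0 if sku.startswith("PREMIUM") else 25.0

# The same function extracted as a microservice: callers now depend on an
# HTTP contract, not on being linked into the same deployable process.
class QuoteHandler(BaseHTTPRequestHandler):
    def do_GET(self):
        sku = self.path.strip("/")
        body = json.dumps({"sku": sku, "price": quote_price(sku)}).encode()
        self.send_response(200)
        self.send_header("Content-Type", "application/json")
        self.end_headers()
        self.wfile.write(body)

    def log_message(self, *args):  # silence per-request logging
        pass

server = HTTPServer(("127.0.0.1", 0), QuoteHandler)  # port 0 = ephemeral port
threading.Thread(target=server.serve_forever, daemon=True).start()

url = f"http://127.0.0.1:{server.server_port}/PREMIUM-1"
with urllib.request.urlopen(url) as resp:
    print(json.loads(resp.read()))  # {'sku': 'PREMIUM-1', 'price': 100.0}
server.shutdown()
```

The point of the sketch is the deployment boundary: changing `quote_price` now means redeploying only this small service, not the whole monolith.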
Data Resource Management: Good Practices to Make the Most out of a Hidden Tre... - Boris Otto
Management of the data resource in the industrial enterprise is becoming a strategic capability in the digital age. The talk motivates data resource management, presents proven practices, and outlines principles of modern data management approaches.
A Reference Process Model for Master Data Management - Boris Otto
The management of master data (MDM) plays an important role for companies in responding to a number of business drivers such as regulatory compliance and efficient reporting. Understanding MDM’s impact on these business drivers, companies today are in the process of organizing MDM at the corporate level. Managing master data is an organizational task that cannot be addressed by simply implementing a software system; business processes are necessary to meet the challenges efficiently. This paper describes the design process of a reference process model for MDM. The model design process spanned several iterations comprising multiple design and evaluation cycles, including the model’s application in three participative case studies. Practitioners may use the reference model as an instrument for the analysis and design of MDM processes. From a scientific perspective, the reference model is a design artifact that represents an abstraction of processes in the field of MDM.
Wikipedia (DBpedia): Crowdsourced Data Curation - Edward Curry
Wikipedia is an openly editable encyclopedia, built collaboratively by a large community of web editors. Its success as one of the most important sources of information available today still challenges existing models of content creation. Although the term ‘curation’ is not commonly used by Wikipedia’s contributors, digital curation is the central activity of Wikipedia editors, who are responsible for information quality standards.
Wikipedia is already widely used as a collaborative environment inside organizations.
The investigation of the collaboration dynamics behind Wikipedia highlights important features and good practices which can be applied to different organizations. Our analysis focuses on the curation perspective and covers two important dimensions: social organization and artifacts, tools & processes for cooperative work coordination. These are key enablers that support the creation of high quality information products in Wikipedia’s decentralized environment.
A business can be made more valuable by making low intellectual content activities effortless and high intellectual content activities more functional and available to knowledge workers at every level ...
If an organization has not eliminated clerical office activities almost entirely from its job descriptions, it will suffer from poor customer service, poor use of its information resources, longer times to market and higher operating costs.
Dealing with Semantic Heterogeneity in Real-Time Information - Edward Curry
Tutorial at the EarthBiAs 2014 Summer School on Dealing with Semantic Heterogeneity in Real-Time Information
Part I: Large Scale Open Environments
Part II: Computational Paradigms
Part III: RDF Event Processing
Part IV: Theory of Event Exchange
Part V: Approaches to Semantic Decoupling
Part VI: Example Application: Linked Energy Intelligence
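To give a flavour of the RDF event processing covered in Part III, RDF represents events as subject-predicate-object triples, and consumers can stay decoupled from producers' schemas by matching triple patterns rather than fixed message formats. The following minimal sketch uses plain Python tuples rather than an RDF library, and all URIs are hypothetical placeholders, not material from the tutorial:

```python
# Events as RDF-style subject-predicate-object triples. The URIs below
# are invented placeholders for illustration only.
EX = "http://example.org/"

event = [  # one sensor event, expressed as a set of triples
    (EX + "event/42", EX + "type",     EX + "EnergyReading"),
    (EX + "event/42", EX + "sensor",   EX + "sensor/7"),
    (EX + "event/42", EX + "kilowatt", "3.2"),
]

def match(triples, pattern):
    """Return triples matching a pattern; None acts as a wildcard."""
    return [t for t in triples
            if all(p is None or p == v for p, v in zip(pattern, t))]

# A subscriber decoupled from the producer's event shape: it asks only
# for triples whose predicate is 'kilowatt', whatever else the event carries.
readings = match(event, (None, EX + "kilowatt", None))
print(readings)  # one matching triple, carrying the "3.2" kW reading
```

Pattern matching over triples, rather than over a fixed schema, is what lets producers add new attributes to events without breaking existing subscribers.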
Sustainable Internet of Things: Alignment approach using enterprise architecture - Anjar Priandoyo
ICoICT 2020 - The 8th International Conference on Information and Communication Technology
Nunung Nurul Qomariyah
Computer Science Department
Faculty of Computing and Media
Bina Nusantara University
Anjar Priandoyo
Environment Dept., University of York
York, United Kingdom
Technology Consulting - PwC Indonesia
The effect of technology-organization-environment on adoption decision of bi... - IJECEIAES
Big data technology (BDT) is being actively adopted by world-leading organizations due to its expected benefits. However, most organizations in Thailand are still at the decision or planning stage of BDT adoption, and many challenges exist in encouraging BDT diffusion in businesses. This study therefore develops a research model that investigates the determinants of BDT adoption in the Thai context, based on the technology-organization-environment (TOE) framework and diffusion of innovation (DOI) theory. Data were collected through an online questionnaire from a sample of three hundred IT employees in different organizations in Thailand. Structural equation modeling (SEM) was conducted to test the hypotheses. The results indicated that the research model fitted the empirical data, with the statistics: Normed Chi-Square=1.651, GFI=0.895, AGFI=0.863, NFI=0.930, TLI=0.964, CFI=0.971, SRMR=0.0392, and RMSEA=0.046. The research model explained 52% of the variance in the decision to adopt BDT. Relative advantage, top management support, competitive pressure, and trading partner pressure show a significant positive relation with BDT adoption, while security negatively influences BDT adoption.
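As a rough consistency check on the reported fit statistics, RMSEA can be approximated from the normed chi-square (chi-square divided by degrees of freedom) and the sample size. This is a hedged back-of-envelope sketch, not the authors' computation; SEM software may use N instead of N-1 or a slightly different formula variant:

```python
import math

# Approximation: RMSEA = sqrt(max(chi2/df - 1, 0) / (N - 1)),
# where chi2/df is the normed chi-square and N is the sample size.
def rmsea_from_normed_chi2(normed_chi2: float, n: int) -> float:
    return math.sqrt(max(normed_chi2 - 1.0, 0.0) / (n - 1))

# Figures reported in the abstract: normed chi-square 1.651, N = 300 respondents.
approx = rmsea_from_normed_chi2(1.651, 300)
print(round(approx, 3))  # close to the reported RMSEA of 0.046
```

The approximation lands within rounding distance of the reported 0.046, which is consistent with the other fit indices indicating an acceptable model fit.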
Data governance at Belgacom - presentation for DAMA Belux, 7 Nov 2013 - Peter Simoens
The drivers, plan, and approach used to deploy data governance at Belgacom. Includes the use of our BEIM enterprise data model, metadata management, data stewardship, ...
Accurate BI & MDM Lead to Successful Project Execution! - Orchestra Networks
McDermott International’s global CIO describes why MDM is vital for accurate reporting, BI and big data analytics.
Presented at Gartner Enterprise Information and Master Data Management Summit, Las Vegas.
McDermott is an engineering and construction company focused on oil and gas field development projects. McDermott PARS (Project Analytics and Reporting System) supports project delivery with reports that integrate information from across the enterprise. At the heart of PARS is an MDM that manages the relationships (between domains, applications, and time) required for accurate reporting and analytics.
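The abstract's point about managing relationships "between domains, applications and time" can be sketched as time-versioned links in a master data store, so that reports can be reconstructed as of any historical date. The record layout below is purely illustrative and not McDermott's actual model:

```python
from dataclasses import dataclass
from datetime import date
from typing import List, Optional

# Illustrative only: a master-data link between two domain entities
# (e.g. a project key in one application, a supplier key in another),
# valid over a time interval.
@dataclass
class MasterLink:
    source_id: str            # e.g. project identifier
    target_id: str            # e.g. supplier identifier
    valid_from: date
    valid_to: Optional[date]  # None = still valid today

def links_as_of(links: List[MasterLink], when: date) -> List[MasterLink]:
    """Links in force on a given date - the basis for point-in-time reporting."""
    return [l for l in links
            if l.valid_from <= when and (l.valid_to is None or when < l.valid_to)]

# Hypothetical history: project PRJ-1 switched suppliers in mid-2013.
history = [
    MasterLink("PRJ-1", "SUP-A", date(2012, 1, 1), date(2013, 6, 1)),
    MasterLink("PRJ-1", "SUP-B", date(2013, 6, 1), None),
]
print([l.target_id for l in links_as_of(history, date(2012, 12, 31))])  # ['SUP-A']
```

Keeping validity intervals on the links, rather than overwriting them, is what allows a report run today to agree with one run two years ago.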
White Paper - The Business Case For Business IntelligenceDavid Walker
This white paper looks at the business case that should lie behind the decision to build a data warehouse and provide a business intelligence solution.
There are three primary drivers for making the investment in a business intelligence solution:
1. Measurement and management of the business process
2. Analysis of why things change in the business in order to react better in the future
3. Providing information for stakeholders
The investment will also yield a number of secondary benefits that help to justify it, and these are also discussed. Finally, there are a number of ‘anti-drivers’: reasons for not embarking on a business intelligence programme.
When it comes to Green IT, businesses have been reactive. Interest in Green IT rises significantly when energy prices increase, and drops just as quickly when prices flatten out. This is typical of the ad-hoc approach taken by most organizations which has led to inconsistent results. This research will help organizations determine:
• Why Green IT is important.
• Examples of Green IT opportunities.
• The state of Green IT today.
• How to implement a successful Green IT program.
In this storyboard, learn how a strategic approach to Green IT and a longer-term commitment to sustainability can positively impact the bottom line.
In today’s globalized, competitive marketplace, being able to leverage technology to deliver faster turnaround times, meet lower pricing goals and provide customizable options can mean the difference between sustainability and irrelevancy. In this ebook, we’ll explore some of the leading solutions transforming the manufacturing industry:
- Automation for cost savings
- 3D printing for improved productivity
- Smart data for quality assurance
- Connectivity for safety and communication
- Security solutions to protect it all
Learn more: http://ms.spr.ly/6006Twegg
Corporate Data Quality: Research and Services Overview - Boris Otto
This presentation gives an overview of the research in the Competence Center Corporate Data Quality (CC CDQ) at the University of St. Gallen in Switzerland and the service portfolio in the field of corporate data quality of the Business Engineering Institute (BEI) St. Gallen.
First San Francisco Partners’ Managing Director, Kelle O’Neal, spoke to a group of 150+ people at Oracle OpenWorld in October 2009 about Data Governance and the imperative use of technology to support data quality in large organizations.
National Patient Safety Foundation 2012 Dashboard Demo - Edgewater
Edgewater attended the NPSF 2012 Patient Safety Congress in order to showcase our proven expertise in developing Patient Safety & Quality systems and processes. This presentation highlights some Edgewater client success stories as well as a demonstration of dashboards developed as part of our projects.
Database Architechs has been a database-focused consulting company for 17 years, bringing you the most skilled and experienced data and database experts with a wide variety of service offerings covering all database- and data-related aspects.
This presentation reports on data governance best practices. Based on a definition of fundamental terms and the business rationale for data governance, a set of case studies from leading companies is presented. The content of this presentation is a result of the Competence Center Corporate Data Quality (CC CDQ) at the University of St. Gallen, Switzerland.
Our mission is: transforming data to reveal business and clinical insights. We accomplish this through our data management, business intelligence and analytics consulting services. We ensure that organizations have the proper tools, technology and processes to improve performance – relative to predefined critical success factors and key performance indicators – based on greater insight and analysis through analytics. We offer a framework for establishing an Analytics Center of Excellence within organizations to define roles and responsibilities and coordinate activities and tasks among key stakeholders. With emphasis on statistical analysis, forecasting, optimization, and simulation, analytics provides results that are predictive and prescriptive, injecting clarity and confidence in decision making and improving performance through situational awareness at all levels of the organization.
Through our past consulting engagements, we observed significant challenges and shortcomings in how these organizations navigate such a data-rich environment in the pursuit of analytical excellence. Based on our assessment and evaluation, we develop a roadmap for establishing an information environment that enables stakeholders to improve clinical decision-making and performance (as related to quality, outcomes, cost and utilization) through data visualizations and advanced analytics. This roadmap accounts for both structured and unstructured data, and it includes provisions for controlled data access based on security and privacy policies. We manage the transition from on-premise to cloud-based data sources and leverage the cloud as an aggregation point for creating a Big Data analytics platform. We then perform an alternatives analysis of feasible solutions based on several factors, including: delivered capabilities, ease of implementation, performance, scalability, interoperability and integration with legacy systems, and functionality -- at a cost that maximizes ROI.
Shared Digital Twins: Collaboration in Ecosystems - Boris Otto
This presentation introduces the concept of shared digital twins from a business perspective and outlines recent technological developments for shared digital twin management.
Deutschland auf dem Weg in die Datenökonomie - Boris Otto
This talk (in German) takes up current threads of discussion between business, academia, and politics, addressing among other things the managerial, economic, information-technology, and ethical dimensions of the data economy.
International Data Spaces: Data Sovereignty for Business Model Innovation - Boris Otto
This presentation, given at the European Big Data Value Forum on November 13, 2018, in Vienna, introduces International Data Spaces (IDS) as a reference architecture and implementation for data sovereignty. The IDS architecture rests on usage control technologies and trusted computing environments and thus forms a strategic enabler for a fair data economy which respects the interests of data owners.
Business mit Daten? Deutschland auf dem Weg in die smarte Datenwirtschaft - Boris Otto
This presentation (in German) given at the "Tage der digitalen Technologien" on May 15, 2019, in Berlin addresses data ecosystems as an innovative institutional format for creating value out of shared data. Furthermore, the talk points to selected challenges in setting up data ecosystems.
International Data Spaces: Data Sovereignty and Interoperability for Business... - Boris Otto
This presentation was held in a workshop session on IoT Business Models and Data Interoperability at the Max Planck Institute for Innovation and Competition in Munich on 8 October 2018. The presentation introduces the concept of business ecosystems and the role of data within them, then outlines the state of the art in terms of interoperability and sovereignty, and finally sketches the IDS contribution.
This presentation was part of the IDS Webinar on Data Governance. It gives a brief overview of the history on Data Governance, describes how governing data has to be further developed in the era of business and data ecosystems, and outlines the contribution of the International Data Spaces Association on the topic.
Smart Data Engineering: Erfolgsfaktor für die digitale Transformation - Boris Otto
This presentation (in German) was given at the Strategieforum IoT at Schloss Hohenkammer on May 30, 2018. It introduces the data management challenges of the Internet of Things and explains principles of smart data engineering.
IDS: Update on Reference Architecture and Ecosystem Design - Boris Otto
This presentation motivates the Industrial Data Space and gives an update on the IDS Reference Architecture Model as well as the related ecosystem. It sets data in the context of business model innovation and points out how the IDS Reference Architecture relates to alternative data architecture styles such as data lakes and blockchain technology, for example. The presentation was given at the IDSA Summit on March 22, 2018.
Datensouveränität in Produktions- und Logistiknetzwerken - Boris Otto
This talk (in German) motivates data sovereignty in production and logistics networks. Data sovereignty is the capability of self-determination over data as an economic asset, including when data is exchanged in business networks. The talk introduces the architecture of the Industrial Data Space, which forms a virtual data space for sovereign data exchange. It closes with application examples and a discussion of the contribution to research and practice.
Digital Business Engineering am Fraunhofer ISSTBoris Otto
This presentation (in German) gives an overview of how Fraunhofer ISST supports digital transformation projects in various industries. It motivates Digital Business Engineering as a methodological framework and showcases typical applications. The presentation was given at the Fraunhofer ISST 25th anniversary event at Zeche Zollern in Dortmund.
Using the automotive industry as an example, this talk (in German) introduces the key developments in the digitization of industrial enterprises and highlights the special role of data and of effective data management. It concludes with recommendations for managing the digital transformation.
Data Sovereignty - Call for an International EffortBoris Otto
This presentation will be given at the Digitising Manufacturing in the G20 Conference on March 16, 2017, in Berlin, in the context of the workshop "Data Sovereignty in Global Value Networks".
This presentation was held at the 2nd Internet of Manufacturing Conference on February 7, 2017, in Munich, Germany. It addresses the need for a new kind of data management to cope with the requirements that digital scenarios pose on the industrial enterprise. Motivated by examples, the talk outlines design principles for smart data management and concludes with two leading examples, namely the Industrial Data Space initiative and the Corporate Data League.
Industrial Data Space: Referenzarchitekturmodell für die DigitalisierungBoris Otto
This presentation (in German), given at the VDI Industrie 4.0 conference on January 25, 2017, in Düsseldorf, provides an update on the development of the Industrial Data Space. Its focal points are data sovereignty, the Industrial Data Space as a link between IoT cloud platforms, and the logistics reference use case.
Industrial Data Space: Digitale Souveränität über DatenBoris Otto
This talk (in German) introduces basic concepts of the data economy and proposes a definition of digital sovereignty. It also shows the important contribution the Industrial Data Space makes to preserving digital sovereignty.
The Industrial Data Space aims at establishing a virtual data space in which partners in business ecosystems can securely exchange and easily link their data assets. The presentation puts the Industrial Data Space in the context of recent developments in the area of Smart Service Welt and Industrie 4.0 and sketches a reference architecture model and functional software components. Furthermore, the presentation introduces the Industrial Data Space Association which institutionalizes the user requirements and drives standardization. The presentation was given at the Industry 4.0 session at MACH 2016 on April 14, 2016, in Birmingham, UK.
Industrial Data Space: Digital Sovereignty for Industry 4.0 and Smart ServicesBoris Otto
The presentation takes a look at the digitization of the industrial enterprise, linking Industry 4.0 and Smart Service activities. It points out the crucial role of data for future business success and positions the Industrial Data Space as a collaborative approach to securely exchange and easily link data within business ecosystems. The presentation was given at the Manufacturing Analytics workshop organized by the Institute for Manufacturing at the University of Cambridge on February 1st, 2016.
Industrial Data Space: Referenzarchitektur für Data Supply ChainsBoris Otto
This talk (in German) presents the Industrial Data Space as a reference architecture for data supply chains, that is, networked, cross-company data flows. Data supply chains are a prerequisite for connecting hybrid offerings (smart services) on the one hand with digitized production (Industrie 4.0) on the other. By managing data supply chains effectively and efficiently, companies increase their competitiveness. The Industrial Data Space provides the blueprint for this, as a reference architecture for the data economy.
Data is the strategic resource of the digital age. The Industrial Data Space aims to enable companies to exchange data securely and combine it easily, making smart services easier to realize. In a project funded by the German Federal Ministry of Education and Research, Fraunhofer is laying the groundwork and developing a reference architecture model for the Industrial Data Space, which is being piloted in selected use cases.
Dr. Sean Tan, Head of Data Science, Changi Airport Group
Discover how Changi Airport Group (CAG) leverages graph technologies and generative AI to revolutionize their search capabilities. This session delves into the unique search needs of CAG’s diverse passengers and customers, showcasing how graph data structures enhance the accuracy and relevance of AI-generated search results, mitigating the risk of “hallucinations” and improving the overall customer journey.
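The idea of using a graph to keep generated answers grounded can be sketched in a few lines. The following is a minimal illustration of the concept, not CAG's system: all entities, relations, and facts below are hypothetical, and the "answer" is constrained to edges that actually exist in the graph, so no relation can be invented.

```python
# Minimal sketch of graph-grounded retrieval: a query is answered only
# from facts stored as edges in a small knowledge graph, so a downstream
# generator cannot "hallucinate" relations absent from the data.
from typing import Dict, List, Tuple

# Hypothetical knowledge graph: entity -> list of (relation, target) edges.
GRAPH: Dict[str, List[Tuple[str, str]]] = {
    "Terminal 3": [("has_facility", "Butterfly Garden"),
                   ("has_facility", "Movie Theatre")],
    "Butterfly Garden": [("located_in", "Terminal 3"),
                         ("opening_hours", "24 hours")],
}

def grounded_facts(entity: str) -> List[str]:
    """Return only facts that exist as edges in the graph."""
    return [f"{entity} {rel} {target}" for rel, target in GRAPH.get(entity, [])]

def answer(query_entity: str) -> str:
    facts = grounded_facts(query_entity)
    if not facts:
        return "No information available."  # refuse rather than invent
    return "; ".join(facts)
```

In a production setting the retrieved facts would be passed to a language model as context; the key design choice illustrated here is refusing to answer when the graph has no supporting edge.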
Why You Should Replace Windows 11 with Nitrux Linux 3.5.0 for enhanced perfor...SOFTTECHHUB
The choice of an operating system plays a pivotal role in shaping our computing experience. For decades, Microsoft's Windows has dominated the market, offering a familiar and widely adopted platform for personal and professional use. However, as technological advancements continue to push the boundaries of innovation, alternative operating systems have emerged, challenging the status quo and offering users a fresh perspective on computing.
One such alternative that has garnered significant attention and acclaim is Nitrux Linux 3.5.0, a sleek, powerful, and user-friendly Linux distribution that promises to redefine the way we interact with our devices. With its focus on performance, security, and customization, Nitrux Linux presents a compelling case for those seeking to break free from the constraints of proprietary software and embrace the freedom and flexibility of open-source computing.
Removing Uninteresting Bytes in Software FuzzingAftab Hussain
Imagine a world where software fuzzing, the process of mutating bytes in test seeds to uncover hidden and erroneous program behaviors, becomes faster and more effective. A lot depends on the initial seeds, which can significantly dictate the trajectory of a fuzzing campaign, particularly in terms of how long it takes to uncover interesting behaviour in your code. We introduce DIAR, a technique designed to speed up fuzzing campaigns by pinpointing and eliminating those uninteresting bytes in the seeds. Picture this: instead of wasting valuable resources on meaningless mutations in large, bloated seeds, DIAR removes the unnecessary bytes, streamlining the entire process.
In this work, we equipped AFL, a popular fuzzer, with DIAR and examined two critical Linux libraries -- Libxml's xmllint, a tool for parsing XML documents, and Binutils' readelf, an essential debugging and security analysis command-line tool used to display detailed information about ELF (Executable and Linkable Format) files. Our preliminary results show that AFL+DIAR not only discovers new paths more quickly but also achieves higher coverage overall. This work thus showcases how starting with lean and optimized seeds can lead to faster, more comprehensive fuzzing campaigns -- and DIAR helps you find such seeds.
- These are slides of the talk given at the IEEE International Conference on Software Testing, Verification and Validation Workshops (ICSTW), 2022.
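The general idea of trimming uninteresting bytes from a seed can be approximated with a greedy deletion loop, in the spirit of seed minimizers such as afl-tmin. This is an illustrative sketch, not the DIAR algorithm itself; `behaviour` is a hypothetical stand-in for a real coverage measurement of the target program.

```python
# Illustrative sketch: greedily drop chunks of a seed and keep each
# deletion only if the target's observed behaviour is unchanged, so
# "uninteresting" bytes are removed before a fuzzing campaign starts.

def behaviour(seed: bytes) -> int:
    # Hypothetical stand-in: a real implementation would run the target
    # (e.g. xmllint or readelf) on the seed and hash its coverage bitmap.
    return sum(b for b in seed if b > 0x7F)  # pretend only high bytes matter

def trim_seed(seed: bytes, chunk: int = 4) -> bytes:
    baseline = behaviour(seed)
    i = 0
    while i < len(seed):
        candidate = seed[:i] + seed[i + chunk:]
        if behaviour(candidate) == baseline:
            seed = candidate   # deletion preserved behaviour: keep it
        else:
            i += chunk         # these bytes are "interesting": skip past them
    return seed
```

The resulting seed exercises the same observed behaviour with fewer bytes, so subsequent mutations are less likely to be wasted on padding.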
Climate Impact of Software Testing at Nordic Testing DaysKari Kakkonen
My slides at Nordic Testing Days 6.6.2024
The talk discusses the climate impact and sustainability of software testing. ICT and testing must carry their part of the global responsibility to help with climate warming. We can minimize our carbon footprint, but we can also have a carbon handprint, a positive impact on the climate. Sustainability can be added to the quality characteristics and then measured continuously. Test environments can be used less, at a smaller scale, and on demand. Test techniques can be used to optimize or minimize the number of tests. Test automation can be used to speed up testing.
Smart TV Buyer Insights Survey 2024 by 91mobiles.pdf91mobiles
91mobiles recently conducted a Smart TV Buyer Insights Survey in which we asked over 3,000 respondents about the TV they own, aspects they look at on a new TV, and their TV buying preferences.
DevOps and Testing slides at DASA ConnectKari Kakkonen
My and Rik Marselis's slides from the DASA Connect conference on 30.5.2024. We discuss what testing is, then what agile testing is, and finally what testing in DevOps means. We also held a lovely workshop with the participants, exploring different ways to think about quality and testing in the different parts of the DevOps infinity loop.
Generative AI Deep Dive: Advancing from Proof of Concept to ProductionAggregage
Join Maher Hanafi, VP of Engineering at Betterworks, in this new session where he'll share a practical framework to transform Gen AI prototypes into impactful products! He'll delve into the complexities of data collection and management, model selection and optimization, and ensuring security, scalability, and responsible use.
Essentials of Automations: The Art of Triggers and Actions in FMESafe Software
In this second installment of our Essentials of Automations webinar series, we’ll explore the landscape of triggers and actions, guiding you through the nuances of authoring and adapting workspaces for seamless automations. Gain an understanding of the full spectrum of triggers and actions available in FME, empowering you to enhance your workspaces for efficient automation.
We’ll kick things off by showcasing the most commonly used event-based triggers, introducing you to various automation workflows like manual triggers, schedules, directory watchers, and more. Plus, see how these elements play out in real scenarios.
Whether you’re tweaking your current setup or building from the ground up, this session will arm you with the tools and insights needed to transform your FME usage into a powerhouse of productivity. Join us to discover effective strategies that simplify complex processes, enhancing your productivity and transforming your data management practices with FME. Let’s turn complexity into clarity and make your workspaces work wonders!
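The trigger-and-action pattern behind such automations can be illustrated generically. The sketch below is not the FME Automations API; it is a minimal, hypothetical directory watcher that fires an action callback whenever a new file appears, the same event-based shape as the directory-watcher triggers mentioned above.

```python
# Generic sketch of an event-based "directory watcher" trigger wired to an
# action callback (illustrating the trigger -> action pattern; not FME code).
import os
from typing import Callable, Set

class DirectoryWatcher:
    def __init__(self, path: str, action: Callable[[str], None]):
        self.path = path
        self.action = action
        self.seen: Set[str] = set(os.listdir(path))  # baseline snapshot

    def poll(self) -> None:
        """One polling cycle: run the action for every newly added file."""
        current = set(os.listdir(self.path))
        for name in sorted(current - self.seen):
            self.action(os.path.join(self.path, name))  # the "action" step
        self.seen = current
```

In practice `poll()` would run on a schedule (a second kind of trigger), and the action would be a workspace run rather than a plain function.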
GDG Cloud Southlake #33: Boule & Rebala: Effective AppSec in SDLC using Deplo...James Anderson
Effective Application Security in Software Delivery lifecycle using Deployment Firewall and DBOM
The modern software delivery process (or the CI/CD process) includes many tools, distributed teams, open-source code, and cloud platforms. A constant focus on speed to release software to market, along with traditionally slow and manual security checks, has caused gaps in continuous security, an important piece of the software supply chain. Today organizations feel more susceptible to external and internal cyber threats due to the vast attack surface of their application supply chain and the lack of end-to-end governance and risk management.
The software team must secure its software delivery process to avoid vulnerabilities and security breaches. This needs to be achieved with existing toolchains and without extensive rework of the delivery processes. This talk will present strategies and techniques for providing visibility into the true risk of the existing vulnerabilities, preventing the introduction of security issues in the software, resolving vulnerabilities in production environments quickly, and capturing the deployment bill of materials (DBOM).
Speakers:
Bob Boule
Robert Boule is a technology enthusiast with a passion for technology and making things work, along with a knack for helping others understand how things work. He has around 20 years of solution engineering experience in application security, software continuous delivery, and SaaS platforms. He is known for his dynamic presentations on CI/CD and application security integrated into the software delivery lifecycle.
Gopinath Rebala
Gopinath Rebala is the CTO of OpsMx, where he has overall responsibility for the machine learning and data processing architectures for Secure Software Delivery. Gopi also has a strong connection with our customers, leading design and architecture for strategic implementations. Gopi is a frequent speaker and well-known leader in continuous delivery and integrating security into software delivery.
Elevating Tactical DDD Patterns Through Object CalisthenicsDorra BARTAGUIZ
After immersing yourself in the blue book and its red counterpart, attending DDD-focused conferences, and applying tactical patterns, you're left with a crucial question: How do I ensure my design is effective? Tactical patterns within Domain-Driven Design (DDD) serve as guiding principles for creating clear and manageable domain models. However, achieving success with these patterns requires additional guidance. Interestingly, we've observed that a set of constraints initially designed for training purposes remarkably aligns with effective pattern implementation, offering a more ‘mechanical’ approach. Let's explore together how Object Calisthenics can elevate the design of your tactical DDD patterns, offering concrete help for those venturing into DDD for the first time!
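One way the calisthenics constraints map onto tactical DDD patterns can be shown with a single rule, "wrap all primitives," applied to a value object. The `Money` class below is a hypothetical example for illustration, not taken from the talk:

```python
# Sketch of the Object Calisthenics rule "wrap all primitives" applied to
# a tactical DDD value object: instead of passing raw ints and strings
# around, the domain concept gets its own immutable type with invariants.
from dataclasses import dataclass

@dataclass(frozen=True)
class Money:
    """Value object: immutable, compared by value, guards its invariants."""
    amount_cents: int
    currency: str

    def __post_init__(self):
        if self.amount_cents < 0:
            raise ValueError("amount must be non-negative")

    def add(self, other: "Money") -> "Money":
        if other.currency != self.currency:
            raise ValueError("cannot add different currencies")
        return Money(self.amount_cents + other.amount_cents, self.currency)
```

The constraint forces the model to state its rules (non-negativity, no cross-currency arithmetic) in one place, which is exactly what a clear domain model asks for.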
Securing your Kubernetes cluster_ a step-by-step guide to success !KatiaHIMEUR1
Today, after several years of existence, an extremely active community and an ultra-dynamic ecosystem, Kubernetes has established itself as the de facto standard in container orchestration. Thanks to a wide range of managed services, it has never been so easy to set up a ready-to-use Kubernetes cluster.
However, this ease of use means that the subject of security in Kubernetes is often left for later, or even neglected. This exposes companies to significant risks.
In this talk, I'll show you step-by-step how to secure your Kubernetes cluster for greater peace of mind and reliability.
Unlocking Productivity: Leveraging the Potential of Copilot in Microsoft 365, a presentation by Christoforos Vlachos, Senior Solutions Manager – Modern Workplace, Uni Systems
In his public lecture, Christian Timmerer provides insights into the fascinating history of video streaming, starting from its humble beginnings before YouTube to the groundbreaking technologies that now dominate platforms like Netflix and ORF ON. Timmerer also presents provocative contributions of his own that have significantly influenced the industry. He concludes by looking at future challenges and invites the audience to join in a discussion.
SAP Sapphire 2024 - ASUG301 building better apps with SAP Fiori.pdfPeter Spielvogel
Building better applications for business users with SAP Fiori.
• What is SAP Fiori and why it matters to you
• How a better user experience drives measurable business benefits
• How to get started with SAP Fiori today
• How SAP Fiori elements accelerates application development
• How SAP Build Code includes SAP Fiori tools and other generative artificial intelligence capabilities
• How SAP Fiori paves the way for using AI in SAP apps
Observability Concepts EVERY Developer Should Know -- DeveloperWeek Europe.pdfPaige Cruz
Monitoring and observability aren’t traditionally found in software curriculums, and many of us cobble this knowledge together from whatever vendor or ecosystem we were first introduced to and whatever is part of our current company’s observability stack.
While the dev and ops silo continues to crumble, many organizations still relegate monitoring and observability to the purview of ops, infra, and SRE teams. This is a mistake: achieving a highly observable system requires collaboration up and down the stack.
I, a former op, would like to extend an invitation to all application developers to join the observability party, and will share these foundational concepts to build on: