See the companion webinar at: http://embt.co/1sc5YZl
According to a recent report from SINTEF, an independent research organization in Scandinavia, 90% of all the data in the world has been generated over the last two years. Where does it all go? And how do we make sense of it all? To get control of the data, you have to know more about that data and how it relates to other data. Data modeling provides the infrastructure needed to capture the right level of information about the data and its associated metadata.
In this webinar, Torquil Harkness will discuss five key considerations for effective data modeling, including:
+ Aspects of model design
+ Attributes such as naming standards
+ The importance of planning for future growth
The document discusses creating an "Educational Futures Evidence Hub" to engage academics, practitioners, enterprises, and policymakers through a dynamic web presence. It describes the Thematic Research Network and its goal of using collective intelligence and open educational resources to better understand the impact of emerging technologies on education. The document raises questions about how to augment systems' ability to sense, respond to, and shape their environment through the lenses of complex adaptive systems, resilience, sensemaking, and human-computer interaction.
This document outlines how to create a course through the FreeU learning platform by finding self-motivated friends to join, planning an online or in-person launch date, asking participants to contribute media like books, music or art to the course library, defining terms, building a timeline, participating in peer review through one-pagers with a central image and 4 short statements, and treating the course as a work in progress.
The document describes a method for assessing the security of a system or organization by simulating an attack by an intruder. The method includes several phases, such as information gathering, asset enumeration, vulnerability detection, and vulnerability exploitation, to test the target's security. The ultimate goal is to help the organization identify weaknesses and improve its security posture.
CSW2016 Gong: Pwn a Nexus Device with a Single Vulnerability (CanSecWest)
The document discusses exploiting an out-of-bounds (OOB) access vulnerability in Chrome's V8 JavaScript engine to install unauthorized apps. It describes triggering OOB memory reads during JSON serialization by modifying the length of an array being serialized. This allows controlling memory and leaking data, enabling arbitrary reads/writes and potentially code execution. It also demonstrates abusing inline hooking and injected scripts to install apps by simulating user interactions despite sandbox protections.
Watch the companion webinar at: http://embt.co/1xNGSWn
Data integrity is a critical feature within every DBMS, and part of the integrity is keeping data in a consistent state and recoverable. It all starts in the DB2 log.
Join Martin Hubel, DB2 Evangelist and Scott Walz, Director of Software Consultants as they share insight and best practices on understanding DB2 processes and log management strategy.
Key Takeaways:
+ Why data logs are important
+ Proper log recovery actions
+ Log management best practices
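As a concrete starting point for the takeaways above, a DB2 LUW administrator can inspect the log-related configuration parameters directly from SQL. This sketch assumes the `SYSIBMADM.DBCFG` administrative view; the parameter set shown is illustrative, not exhaustive:

```sql
-- Hedged sketch: review key DB2 log configuration parameters
-- via the SYSIBMADM.DBCFG administrative view (DB2 LUW).
SELECT name, value
FROM   sysibmadm.dbcfg
WHERE  name IN ('logarchmeth1',   -- primary log archiving method
                'logprimary',     -- number of primary log files
                'logsecond',      -- number of secondary log files
                'logfilsiz');     -- size of each log file (4 KB pages)
```

Checking these values is a common first step before designing a log management and recovery strategy.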
See the companion webinar at: http://embt.co/1ylvkZB
With multiple high-profile data breaches being reported lately, the need to protect your data assets is elevated. What makes a piece of data sensitive or 'Personally Identifiable Information'? Whether it's identifying which information is restricted or who can access it, you need a way to control your data.
In this webinar, Tim Radney will share key concerns and challenges for securing your data and Rob Loranger will show you how to implement data security in your database modeling environment with a live demonstration.
View the companion webinar at: http://embt.co/1xcLFjJ
If you’ve ever wanted to code or understand more about PL/SQL code, this 2-part series is for you.
Join Oracle ACE Director, Dan Hotka and Solutions Consultant Director, Scott Walz as they present and demo the fundamentals of PL/SQL and much more.
Watch the webinar to learn about:
+ PL/SQL variable types and naming convention
+ How to code PL/SQL
+ When to use IF-THEN-ELSE or CASE
+ Profiling and debugging PL/SQL
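The IF-THEN-ELSE versus CASE question from the list above can be sketched in a few lines of PL/SQL. The variable names and cutoff values below are invented for illustration:

```sql
-- Hedged PL/SQL sketch contrasting IF-THEN-ELSE with a CASE expression.
DECLARE
  v_score NUMBER := 87;          -- made-up input value
  v_grade VARCHAR2(1);
BEGIN
  -- IF-THEN-ELSE: flexible when branches test unrelated conditions
  IF v_score >= 90 THEN
    v_grade := 'A';
  ELSIF v_score >= 80 THEN
    v_grade := 'B';
  ELSE
    v_grade := 'C';
  END IF;

  -- CASE expression: more compact when every branch tests the same value
  v_grade := CASE
               WHEN v_score >= 90 THEN 'A'
               WHEN v_score >= 80 THEN 'B'
               ELSE 'C'
             END;

  DBMS_OUTPUT.PUT_LINE('Grade: ' || v_grade);
END;
/
```

Both forms assign `'B'` here; CASE tends to read better when all branches compare the same expression.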
Part 1 of this webinar set discussed the performance challenges caused by plan instability. SQL profiles and SQL patches are mechanisms that can provide targeted relief to individual SQL statements with plan regressions. However, Oracle also provides SQL Plan Management features that are intended to prevent plan regressions from causing problems in the first place.
In this webinar, attendees will learn:
- How SQL Plan Management (SPM) works
- How SQL Plan Baselines are created
- How SQL Plan Baselines are evolved/enabled for use
- Limitations of SPM
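The baseline-creation step above can be sketched with the documented `DBMS_SPM` package. The `sql_id` below is a placeholder, not a real cursor:

```sql
-- Hedged sketch: load a plan from the cursor cache as a SQL plan baseline.
DECLARE
  n PLS_INTEGER;
BEGIN
  n := DBMS_SPM.LOAD_PLANS_FROM_CURSOR_CACHE(sql_id => '0abc123def456');
  DBMS_OUTPUT.PUT_LINE(n || ' plan(s) loaded as baselines');
END;
/

-- Loaded baselines can then be inspected:
SELECT sql_handle, plan_name, enabled, accepted
FROM   dba_sql_plan_baselines;
```

The `ENABLED` and `ACCEPTED` columns govern whether the optimizer will actually use a given baseline plan.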
Featured speaker: Karen Morton, Enkitec
Watch the companion webinar at: http://embt.co/1hjDU8s
Many DBAs may only know enough about data modeling to be dangerous. There are a number of challenges that DBAs face when trying to do data modeling, as well as some preconceived notions of what they think data modeling can (or can’t) do for them, such as generating useful DDL code.
This 90-minute session will provide specific insights and examples to show DBAs how a data modeling tool can help them improve database performance. Data modeling can simplify routine tasks and provide valuable context for a database implementation. Karen Lopez and John Sterrett will debunk seven dangerous myths that DBAs believe about data modeling, and also discuss and demonstrate:
+ Challenges DBAs encounter with data modeling
+ What data modeling really means and how it adds value
+ Why data modeling is key to successful agile projects
+ How data model-driven development saves time and money
+ Why data modeling should be done throughout the development lifecycle
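To make the "generating useful DDL" point concrete, here is the kind of forward-engineered DDL a modeling tool can produce from a logical model. The table and column names are invented for illustration:

```sql
-- Hedged illustration of model-generated DDL (Oracle syntax):
-- a parent entity and a child entity with a foreign key.
CREATE TABLE customer (
  customer_id   NUMBER(10)    NOT NULL,
  customer_name VARCHAR2(100) NOT NULL,
  CONSTRAINT pk_customer PRIMARY KEY (customer_id)
);

CREATE TABLE customer_order (
  order_id    NUMBER(10) NOT NULL,
  customer_id NUMBER(10) NOT NULL,
  order_date  DATE       DEFAULT SYSDATE NOT NULL,
  CONSTRAINT pk_customer_order PRIMARY KEY (order_id),
  CONSTRAINT fk_order_customer FOREIGN KEY (customer_id)
    REFERENCES customer (customer_id)
);
```

Keeping constraints and naming conventions in the model means the generated DDL stays consistent across environments.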
Watch the companion webinar at: http://forms.embarcadero.com/ManagingMulti-Plat
With IT budgets being stretched thin and the extreme challenge of hiring good talent, the ability to easily support multiple database platforms is a fantastic skill to have. In fact, a lot of IT pros become “accidental” managers of their second platform. In this webinar, you will learn about SQL Server and Oracle, their similarities, their differences, and some tools to ease the pain of multi-platform management.
Join Joseph D’Antoni, SQL expert and evangelist, Kyle Edens, Sr. Account Executive and Scott Walz, Director of Embarcadero Software Consultants as they provide insight on techniques and tools to help you thrive in a multi-platform environment.
In this session, you will have the opportunity to:
+ Learn about the complexities and similarities of managing a multi-platform environment
+ Compare and contrast Oracle and SQL Server and see the differences in managing each platform
+ Bridge the knowledge gap using database tools
This session will benefit DBAs, accidental DBAs, system admins, storage admins as well as anyone looking to expand their skill set working with multiple platforms.
Learn more about Embarcadero DBArtisan and try it free: http://www.embarcadero.com/products/dbartisan
Watch the webinar at: http://embt.co/1OMDHK7
Although master data management (MDM) systems have been deployed in numerous industries and organizations, the vision of creating an overall “single source of truth” is beginning to yield to a more pragmatic perspective of providing visibility to shared information about uniquely-identifiable entities within the enterprise. This more mature approach sheds light on some of the potential gaps associated with the typical out-of-the-box data models for customer or product.
In this webinar, David Loshin addresses data modeling for MDM systems and shares insights about:
+ Some of the complexities emerging from reliance on canned master data models
+ Alternatives for revising how master data entities are viewed and consumed within the enterprise
+ How a consumption-oriented engagement process will help the master data modeler devise thoughtful conceptual and logical representations of shared master data
He also discusses how these different ways of looking at master data modeling will help reduce complexity for master data adoption, system interoperability, and legacy migration.
Learn more about ER/Studio at http://www.embarcadero.com/products/er-studio
Watch the companion webinar at: http://embt.co/16cXD4h
Join Oracle ACE Director, Dan Hotka and Solutions Consultant Director, Scott Walz in part two of the series, where they will continue to build on that knowledge and share even more expertise on PL/SQL procedures, functions and packages.
Watch the webinar to learn about:
+ Procedures, functions and packages
+ Tips on PL/SQL compiling options
+ Performance tuning
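A minimal package, one of the constructs covered above, separates a specification from a body. The names here are invented for illustration:

```sql
-- Hedged PL/SQL sketch: a package specification and body.
CREATE OR REPLACE PACKAGE order_util AS
  FUNCTION line_total(p_qty NUMBER, p_unit_price NUMBER) RETURN NUMBER;
END order_util;
/

CREATE OR REPLACE PACKAGE BODY order_util AS
  FUNCTION line_total(p_qty NUMBER, p_unit_price NUMBER) RETURN NUMBER IS
  BEGIN
    RETURN p_qty * p_unit_price;
  END line_total;
END order_util;
/
```

Callers invoke it as `order_util.line_total(3, 9.99)`; only the specification is visible to them, so the body can be recompiled without invalidating dependents.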
Learn more about DB PowerStudio at: http://embt.co/DBPower
See the companion webinar at: http://embt.co/1tqdQxA
Being a DBA can be challenging and frustrating. There is always a lot to learn, many priorities and not enough hours in the day to get it all done.
Join SQL Server MVPs John Sterrett and Mike Walsh, along with Scott Walz, Director of Embarcadero Software Consultants, as they reflect back on the life of a DBA. By sharing real-life personal use cases and scenarios, they will give you a sense of direction, help you prioritize and get you started down the right path.
In this presentation, you will learn:
+ How to put first things first
+ The goals and an action plan to start becoming a proactive DBA
+ Lessons from other DBAs who have been right where you are
Learn more about Embarcadero data modeling solutions at http://embt.co/data-modeling
The ER/Studio Enterprise Team edition provides model and metadata collaboration and enables faster and more effective decision-making using more accurate data. The flagship ER/Studio Data Architect is the foundation for this comprehensive suite that also includes import and export bridges, a model repository, and a data source registry. Additionally, the ER/Studio Team Server Core in this edition allows unlimited web user access to models and metadata, and provides greater meaning, understanding and context to enterprise data with an extensible glossary so users can view, store and centrally manage authoritative business definitions.
The document discusses the challenges of integrating data from multiple sources. It notes that continuously streamed data sources can influence business analytics by impacting customer satisfaction, identifying opportunities, and more. However, integrating these different data sources poses challenges like ensuring entity identifiability across datasets, dealing with limited or no existing data governance, addressing potential editorial biases, and handling missing metadata. The document provides examples and strategies for profiling data elements, determining if they are structurally and semantically conformable, and leveraging metadata to enable successful data integration.
The document summarizes a webinar introducing Embarcadero RAD Studio XE7. It highlights key features such as building shared user interfaces across devices, extending Windows applications to new devices and gadgets, and new enterprise mobility services for secure database access and custom APIs. A demonstration is provided of the new FireUI for multi-device apps, app tethering across devices, and the enterprise mobility services.
Is This Really a SAN Problem? Understanding the Performance of Your IO Subsy... (Embarcadero Technologies)
Learn more about Embarcadero database tools at: http://www.embarcadero.com/products/database-tools
Nearly 80% of performance issues appear to be related to storage performance. In reality, only about half of those are actual storage bottlenecks: frequently, things like missing indexes, bad database design, or misuse of features can either degrade the performance of the storage or make it look like the root cause of the issue.
Join Microsoft MVP, Joseph D’Antoni and Embarcadero Director of Software Consultants, Scott Walz as they shed light on diagnosing your IO subsystem.
In this session, you will learn:
+ Where to look in SQL Server to gather information
+ How to use Windows Performance Monitor to analyze storage performance
+ What a "false positive" storage problem might look like
There are only so many times you can yell at the SAN admin before they get cranky and start giving you 1GB drives, so attend this session and learn when the time is right.
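Before picking that fight, the SQL Server side of the picture can be gathered with a query like the following sketch against `sys.dm_io_virtual_file_stats`; high averages here may still be a "false positive" driven by the workload (missing indexes, scans) rather than the storage itself:

```sql
-- Hedged T-SQL sketch: per-file IO latency since instance startup.
SELECT DB_NAME(vfs.database_id) AS database_name,
       mf.physical_name,
       vfs.num_of_reads,
       vfs.io_stall_read_ms  / NULLIF(vfs.num_of_reads, 0)  AS avg_read_ms,
       vfs.io_stall_write_ms / NULLIF(vfs.num_of_writes, 0) AS avg_write_ms
FROM   sys.dm_io_virtual_file_stats(NULL, NULL) AS vfs
JOIN   sys.master_files AS mf
  ON   mf.database_id = vfs.database_id
 AND   mf.file_id     = vfs.file_id
ORDER BY avg_read_ms DESC;
```

Comparing these numbers with Windows Performance Monitor disk counters helps separate genuine storage latency from database-side causes.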
You’ve been using ER/Studio for a while, and you may be wondering why you should look at the latest release with the new Enterprise Team Edition. How does the new Team Server really compare to Portal, and why should you care? Join this special webinar just for our valued long-time customers to learn the latest information about XE6 and future releases of ER/Studio, and how you can improve your data architecture with visibility across your organization. Enhancements include:
* Data source registry
* Unified glossaries
* Business engagement
We will explain what these features are and why they matter for your data environment, and provide a live demonstration of the Team Server capabilities.
See the companion webinar at: http://forms.embarcadero.com/DataArchitectureSuccessStories
No matter what industry you are in, you know that an effective data architecture will simplify your job as a data modeler and improve the quality and usability of your data assets across the organization. But you may be wondering what others have done to establish and manage their data landscapes, and what you should consider when evaluating solutions.
In this session, two case studies will be presented which describe real-world scenarios of Embarcadero customers who have made the switch from CA ERwin. Roxann Collin of Optum / UnitedHealth Group and Kimberly Hondel of Ohio Department of Job and Family Services will share their stories, including:
+ The data architecture challenges they were facing
+ The reasons they chose ER/Studio
+ The results they’ve seen in their data environments
Rob Loranger will host our guests and share some highlights of the ER/Studio portfolio.
See the companion webinar on demand at: http://forms.embarcadero.com/ERStudioNewRelease
Embarcadero is committed to enabling our users to adapt to emerging technology platforms and challenges, as well as delivering enhanced capabilities to collaborate with other stakeholders. The new release of ER/Studio will include new capabilities targeted to today’s complex enterprise data architectures and demanding workflows, allowing you to:
+ Increase visibility and productivity with Agile change management
+ Add consistency to your data with Team Server glossary enhancements
+ Identify data source impacts resulting from model changes
Watch the companion webinar: http://embt.co/1BIRvPw
Business users and analysts are often trying to solve a very specific data-related problem, and when researching it, may wonder why certain items don’t correlate. Maybe you need to reconcile old data and new data, and eliminate erroneous entries. How do you find what the various terms mean and where the relevant data resides? Business stakeholders need visibility to the organization’s models and metadata, but at the right level of detail for their use. Join this session to learn about business data access challenges, including:
+ What issues exist with current methods
+ What information business users really need
+ How to find that information
Karen Lopez will share tips and insights on working through the data challenges for business analysts and Josh Buckner will share a solution to address those concerns.
Watch the companion webinar at: http://embt.co/1EBGmkK
There are many ways to find SQL that is performing poorly. The hard part is what to do with a bad SQL statement once you have it. In this webinar, several real-world examples will be reviewed to help you learn how to evaluate poorly performing SQL. Each example will demonstrate a commonly occurring SQL performance problem and provide a method to solve it.
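One common first step once a bad statement is in hand is to look at its execution plan. This Oracle sketch uses the documented `DBMS_XPLAN` package; the query itself is a stand-in against the sample HR schema:

```sql
-- Hedged sketch: generate and display an execution plan in Oracle.
EXPLAIN PLAN FOR
  SELECT * FROM employees WHERE department_id = 50;

SELECT * FROM TABLE(DBMS_XPLAN.DISPLAY);
```

The plan output shows the access paths and join methods the optimizer chose, which is where most tuning investigations start.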
Understanding Hardware: The Right Fights for the DBA to Pick with the Server ... (Embarcadero Technologies)
This document summarizes a presentation given by Joey D'Antoni on how database administrators should pick their battles when working with server, storage, and virtualization teams. Some of the key points made include: avoiding fights over issues like RAID levels, virtual vs physical servers, and storage layouts; ensuring databases are on the latest versions of Windows; using features like Group Managed Service Accounts; working with storage administrators on space allocation and backups; and understanding virtualization best practices when databases are virtualized. The overall message is that relationships with other teams are important, and DBAs should focus on collaborating rather than arguing over issues that don't significantly impact databases.
The document discusses challenges with semantic consistency across business applications and the need to harmonize common business terms and reference data. It describes extracting definitions from various sources, identifying anomalies, resolving conflicts, and standardizing terms and values to promote shared understanding. Establishing governance around a common business metadata glossary and conceptual data model can help address issues arising from different interpretations of similar concepts.
The document discusses the Connected Digital Economy Catapult, a UK organization that aims to accelerate digital innovation. It outlines four challenges the Catapult addresses: reducing licensing friction for content reuse, developing open data platforms, ensuring personal data privacy and trust, and creating living labs and demonstrators. The Catapult engages with universities, small companies, and corporations through collaborative R&D projects, fellowships, and innovation spaces like a proposed Connected Products Studio to help innovators develop proof-of-concept connected products.
Yield Improvement Through Data Analysis Using TIBCO Spotfire (TIBCO Spotfire)
Presented by: Andrew Choo, Sr. Yield Engineer, TriQuint Semiconductor
TIBCO Spotfire and Teradata: First to Insight, First to Action; Warehousing, Analytics and Visualizations for the High Tech Industry Conference
July 22, 2013 The Four Seasons Hotel Palo Alto, CA
Mateo Valero: Big Data, from Scientific Research to Business Management (Fundación Ramón Areces)
On July 3, 2014, we organized a conference at Fundación Ramón Areces under the theme 'Big Data: from scientific research to business management'. It examined the challenges and opportunities of Big Data in the social sciences, in economics, and in business management. Speakers included experts from the London School of Economics, BBVA, Deloitte, the Universities of Valencia and Oviedo, the National Supercomputing Center...
STFC funds and operates world-class science infrastructure for industry, government, and academia. It has invested heavily in high performance computing (HPC) since the 1960s. Democratizing access to HPC and reducing its complexity are now key challenges to enable wider use, especially in industry. The Hartree Centre works to address this through new engagement models, visualization tools, and platforms that make HPC resources more accessible and usable for non-experts.
- Dynamic neural networks (DNNs) can adapt to varying resource availability on edge devices through techniques like incremental training and group convolution pruning. This allows meeting requirements for timing, power/energy, and accuracy.
- Experiments on two embedded platforms showed that dynamic DNNs combined with DVFS and task mapping can reduce energy consumption while maintaining classification accuracy compared to static DNNs.
- Runtime power management is needed to coordinate heterogeneous processors, respond to environmental factors, balance power consumption and battery life, and meet requirements for concurrently executing tasks and applications under varying conditions on edge devices.
Part 1 of this webinar set discussed the performance challenges caused by plan instability. Using SQL profiles and SQL patches are mechanisms that can provide targeted relief to individual SQL statements that have plan regressions. However, Oracle provides SQL Plan Management features that are intended to prevent plan regressions from creating challenges in the first place.
In this webinar, attendees will learn:
- How SQL Plan Management (SPM) works
- How SQL Plan Baselines are created
- How SQL Plan Baselines are evolved/enabled for use
- Limitations of SPM
Featured speaker: Karen Morton, Enkitec
Watch the companion webinar at: http://embt.co/1hjDU8s
Many DBAs may only know enough about data modeling to be dangerous. There are a number of challenges that DBAs face when trying to do data modeling, as well as some preconceived notions of what they think data modeling can (or can’t) do for them, such as generating useful DDL code.
This 90-minute session will provide specific insights and examples to show DBAs how a data modeling tool can help them improve database performance. Data modeling can simplify routine tasks and provide valuable context for a database implementation. Karen Lopez and John Sterrett will debunk seven dangerous myths that DBAs believe about data modeling, and also discuss and demonstrate:
+ Challenges DBAs encounter with data modeling
+ What data modeling really means and how it adds value
+ Why data modeling is key to successful agile projects
+ How data model-driven development saves time and money
+ Why data modeling should be done throughout the development lifecycle
Watch the companion webinar at: http://forms.embarcadero.com/ManagingMulti-Plat
With IT budgets being stretched thin and the extreme challenge of hiring good talent, the ability to easily support multiple database platforms is a fantastic skill to have. In fact, a lot of IT pros become “accidental” managers of their second platform. In this webinar, you will learn about SQL Server and Oracle, their similarities, their differences, and some tools to ease the pain of multi-platform management.
Join Joseph D’Antoni, SQL expert and evangelist, Kyle Edens, Sr. Account Executive and Scott Walz, Director of Embarcadero Software Consultants as they provide insight on techniques and tools to help you thrive in a multi-platform environment.
In this session, you will have the opportunity to:
+ Learn about the complexities and similarities of managing a multi-platform environment
+ Compare and contrast Oracle and SQL Server and see the differences in managing each platform
+ Bridge the knowledge gap using database tools
This session will benefit DBAs, accidental DBAs, system admins, storage admins as well as anyone looking to expand their skill set working with multiple platforms.
Learn more about Embarcadero DBArtisan and try it free: http://www.embarcadero.com/products/dbartisan
Watch the webinar at: http://embt.co/1OMDHK7
Although master data management (MDM) systems have been deployed in numerous industries and organizations, the vision of creating an overall “single source of truth” is beginning to yield to a more pragmatic perspective of providing visibility to shared information about uniquely-identifiable entities within the enterprise. This more mature approach sheds light on some of the potential gaps associated with the typical out-of-the-box data models for customer or product.
In this webinar, David Loshin addresses data modeling for MDM systems, and share insights about:
+ Some of the complexities emerging from reliance on canned master data models
+ Alternatives for revising how master data entities are viewed and consumed within the enterprise
+ How a consumption-oriented engagement process will help the master data modeler devise thoughtful conceptual and logical representations of shared master data
He also discusses how these different ways of looking at master data modeling will help reduce complexity for master data adoption, system interoperability, and legacy migration.
Learn more about ER/Studio at http://www.embarcadero.com/products/er-studio
Watch the companion webinar at: http://embt.co/16cXD4h
Join Oracle ACE Director, Dan Hotka and Solutions Consultant Director, Scott Walz in part two of the series, where they will continue to build on that knowledge and share even more expertise on PL/SQL procedures, functions and packages.
Watch the webinar to learn about:
+ Procedures, functions and packages
+ Tips on PL/SQL compiling options
+ Performance tuning
Learn more about DB PowerStudio at: http://embt.co/DBPower
See the companion webinar at: http://embt.co/1tqdQxA
Being a DBA can be challenging and frustrating. There is always a lot to learn, many priorities and not enough hours in the day to get it all done.
Join SQL Server MVPs John Sterrett and Mike Walsh, along with Scott Walz, Director of Embarcadero Software Consultants, as they reflect back on the life of a DBA. By sharing real-life personal use cases and scenarios, they will give you a sense of direction, help you prioritize and get you started down the right path.
In this presentation, you will learn:
+ Make the first things, the first things
+ The goals and an action plan to start becoming a proactive DBA
From other DBAs who have been right where you are
Learn more about Embarcadero data modeling solutions at http://embt.co/data-modeling
The ER/Studio Enterprise Team edition provides model and metadata collaboration and enables faster and more effective decision-making using more accurate data. The flagship ER/Studio Data Architect is the foundation for this comprehensive suite that also includes import and export bridges, a model repository, and a data source registry. Additionally, the ER/Studio Team Server Core in this edition allows unlimited web user access to models and metadata, and provides greater meaning, understanding and context to enterprise data with an extensible glossary so users can view, store and centrally manage authoritative business definitions.
The document discusses the challenges of integrating data from multiple sources. It notes that continuously streamed data sources can influence business analytics by impacting customer satisfaction, identifying opportunities, and more. However, integrating these different data sources poses challenges like ensuring entity identifiability across datasets, dealing with limited or no existing data governance, addressing potential editorial biases, and handling missing metadata. The document provides examples and strategies for profiling data elements, determining if they are structurally and semantically conformable, and leveraging metadata to enable successful data integration.
The document summarizes a webinar introducing Embarcadero RAD Studio XE7. It highlights key features such as building shared user interfaces across devices, extending Windows applications to new devices and gadgets, and new enterprise mobility services for secure database access and custom APIs. A demonstration is provided of the new FireUI for multi-device apps, app tethering across devices, and the enterprise mobility services.
Is This Really a SAN Problem? Understanding the Performance of Your IO Subsy...Embarcadero Technologies
Learn more about Embarcadero database tools at: http://www.embarcadero.com/products/database-tools
Nearly 80% of performance issues appear to be related to the performance of storage. In reality, only about half of those are actual bottlenecks - frequently things like missing indexes, bad database design or misuse of features can either negatively impact the performance of the storage, or make it look like the root cause of the issue.
Join Microsoft MVP, Joseph D’Antoni and Embarcadero Director of Software Consultants, Scott Walz as they shed light on diagnosing your IO subsystem.
In this session, you will learn:
+ Where to look in SQL Server to gather information
+ How to use Windows Performance Monitor to analyze storage performance
+ What a "false positive" storage problem might look like
There are only so many times you can yell at the SAN admin, before they get cranky and start giving you 1GB drives, so attend this session and learn when the time is right.
You’ve been using ER/Studio for a while, and you may be wondering why you should look at the latest release with the new Enterprise Team Edition. How does the new Team Server really compare to Portal, and why should you care? Join this special webinar just for our valued long-time customers, and learn the latest information about XE6 and future releases of ER/Studio, and how you can improve your data architecture with visibility across your organization. Enhancements include:
* Data source registry
* Unified glossaries
* Business engagement
We will explain what these features are and why they matter for your data environment, and provide a live demonstration of the Team Server capabilities.
See the companion webinar at: http://forms.embarcadero.com/DataArchitectureSuccessStories
No matter what industry you are in, you know that an effective data architecture will simplify your job as a data modeler and improve the quality and usability of your data assets across the organization. But you may be wondering what others have done to establish and manage their data landscapes, and what you should consider when evaluating solutions.
In this session, two case studies will be presented which describe real-world scenarios of Embarcadero customers who have made the switch from CA ERwin. Roxann Collin of Optum / UnitedHealth Group and Kimberly Hondel of Ohio Department of Job and Family Services will share their stories, including:
+ The data architecture challenges they were facing
+ The reasons they chose ER/Studio
+ The results they’ve seen in their data environments
Rob Loranger will host our guests and share some highlights of the ER/Studio portfolio.
See the companion webinar on demand at: http://forms.embarcadero.com/ERStudioNewRelease
Embarcadero is committed to enabling our users to adapt to emerging technology platforms and challenges, as well as delivering enhanced capabilities to collaborate with other stakeholders. The new release of ER/Studio will include new capabilities targeted to today’s complex enterprise data architectures and demanding workflows, allowing you to:
+ Increase visibility and productivity with Agile change management
+ Add consistency to your data with Team Server glossary enhancements
+ Identify data source impacts resulting from model changes
Watch the companion webinar: http://embt.co/1BIRvPw
Business users and analysts are often trying to solve a very specific data-related problem, and when researching it, may wonder why certain items don’t correlate. Maybe you need to reconcile old data and new data, and eliminate erroneous entries. How do you find what the various terms mean and where the relevant data resides? Business stakeholders need visibility to the organization’s models and metadata, but at the right level of detail for their use. Join this session to learn about business data access challenges, including:
+ What issues exist with current methods
+ What information business users really need
+ How to find that information
Karen Lopez will share tips and insights on working through the data challenges for business analysts and Josh Buckner will share a solution to address those concerns.
Watch the companion webinar at: http://embt.co/1EBGmkK
There are many ways to find SQL that is performing poorly. The hard part is what to do with a bad SQL statement once you have it. In this webinar, several real-world examples will be reviewed to help you learn how to evaluate poorly performing SQL. Each example will demonstrate a commonly occurring SQL performance problem and provide a method to solve it.
Key points:
1. Why data logs are important
2. Proper log recovery actions
3. Log management best practices
Understanding Hardware: The Right Fights for the DBA to Pick with the Server ... (Embarcadero Technologies)
This document summarizes a presentation given by Joey D'Antoni on how database administrators should pick their battles when working with server, storage, and virtualization teams. Some of the key points made include: avoiding fights over issues like RAID levels, virtual vs physical servers, and storage layouts; ensuring databases are on the latest versions of Windows; using features like Group Managed Service Accounts; working with storage administrators on space allocation and backups; and understanding virtualization best practices when databases are virtualized. The overall message is that relationships with other teams are important, and DBAs should focus on collaborating rather than arguing over issues that don't significantly impact databases.
The document discusses challenges with semantic consistency across business applications and the need to harmonize common business terms and reference data. It describes extracting definitions from various sources, identifying anomalies, resolving conflicts, and standardizing terms and values to promote shared understanding. Establishing governance around a common business metadata glossary and conceptual data model can help address issues arising from different interpretations of similar concepts.
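The extract-compare-resolve workflow described above can be illustrated with a minimal sketch. The source systems, terms, and definitions below are invented for the example; the core idea is that a term becomes a governance candidate as soon as different systems define it differently.

```python
# Toy illustration of glossary harmonization: collect candidate
# definitions per business term from several source systems, then
# flag terms whose definitions conflict.
sources = {
    "CRM":     {"customer": "party that has purchased a product"},
    "Billing": {"customer": "party with an open account"},
    "Support": {"ticket":   "a recorded service request"},
}

def find_conflicts(sources):
    merged = {}
    for system, glossary in sources.items():
        for term, definition in glossary.items():
            merged.setdefault(term, set()).add(definition)
    # A term with more than one distinct definition needs review
    # before it can enter the shared business glossary.
    return {term: defs for term, defs in merged.items() if len(defs) > 1}

print(sorted(find_conflicts(sources)))  # → ['customer']
```

In a real harmonization effort the comparison would be fuzzier than exact string equality, but the shape of the pass is the same: merge, detect anomalies, then resolve them under governance.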
The document discusses the Connected Digital Economy Catapult, a UK organization that aims to accelerate digital innovation. It outlines four challenges the Catapult addresses: reducing licensing friction for content reuse, developing open data platforms, ensuring personal data privacy and trust, and creating living labs and demonstrators. The Catapult engages with universities, small companies, and corporations through collaborative R&D projects, fellowships, and innovation spaces like a proposed Connected Products Studio to help innovators develop proof-of-concept connected products.
Yield Improvement Through Data Analysis using TIBCO Spotfire (TIBCO Spotfire)
Presented by: Andrew Choo, Sr. Yield Engineer, TriQuint Semiconductor
TIBCO Spotfire and Teradata: First to Insight, First to Action; Warehousing, Analytics and Visualizations for the High Tech Industry Conference
July 22, 2013 The Four Seasons Hotel Palo Alto, CA
Mateo Valero - Big Data: From Scientific Research to Business Management (Fundación Ramón Areces)
On July 3, 2014, we organized a one-day event at the Fundación Ramón Areces under the theme 'Big Data: from scientific research to business management'. There we examined the challenges and opportunities of big data in the social sciences, in economics, and in business management. Among the speakers were experts from the London School of Economics, BBVA, Deloitte, the Universities of Valencia and Oviedo, the National Supercomputing Centre...
STFC funds and operates world-class science infrastructure for industry, government, and academia. It has invested heavily in high performance computing (HPC) since the 1960s. Democratizing access to HPC and reducing its complexity are now key challenges to enable wider use, especially in industry. The Hartree Centre works to address this through new engagement models, visualization tools, and platforms that make HPC resources more accessible and usable for non-experts.
- Dynamic neural networks (DNNs) can adapt to varying resource availability on edge devices through techniques like incremental training and group convolution pruning. This allows meeting requirements for timing, power/energy, and accuracy.
- Experiments on two embedded platforms showed that dynamic DNNs combined with DVFS and task mapping can reduce energy consumption while maintaining classification accuracy compared to static DNNs.
- Runtime power management is needed to coordinate heterogeneous processors, respond to environmental factors, balance power consumption and battery life, and meet requirements for concurrently executing tasks and applications under varying conditions on edge devices.
Afterwork big data and data viz - from the lake to your screen (Joseph Glorieux)
This document discusses a data visualization workshop hosted by OCTOSuisse on exploring and visualizing big data from a data lake. It provides an overview of OCTO's big data capabilities and projects. It then uses a case study of Swiss public transportation data to demonstrate data exploration, analysis, and visualization techniques using tools like Tableau. The goal is to understand data, identify insights, and effectively communicate findings to others.
The 2nd Scientific Conference: Information Science in an Age of Change, Warsaw, April 15-16, 2013. Institute of Information and Book Studies, University of Warsaw
The document summarizes Blair Baker's presentation on preparing for the global workplace. The presentation discusses five disruptive forces affecting the workplace: 1) global economics and economic activity, 2) accelerating technological change, 3) global demographics, 4) global interconnectivity, and 5) pace of change. It also describes opportunities in fields like cybersecurity, cloud technologies, and data analytics. The presentation encourages attendees to assess themselves, determine goals, and connect with others in their desired field.
This document discusses strategies for managing emerging technologies. It begins by outlining common technology predictions that turned out to be wrong, and explains the challenges of predicting future technologies. Next, it describes basic research areas and some emerging technologies like bandwidth, databases, GPS, and teledisc systems. It emphasizes the importance of understanding technology for managers and outlines frameworks for evaluating and fostering innovation. Overall, the key message is that organizations must actively monitor technological developments to identify opportunities and manage risks from emerging technologies.
The document provides an introduction and agenda for a course on big data and data science. It defines big data as large, complex data sets that are difficult to process using traditional data processing applications. It notes that 90% of data in the world today was created in the last two years alone. It also defines the four V's of big data: volume, variety, velocity, and veracity. The document defines data science as an interdisciplinary field that uses scientific methods, processes, algorithms and systems to extract knowledge and insights from structured and unstructured data. It notes that data scientists work with hypothesis generation, data analysis, and data visualization to gather insights that inform decisions. The document outlines some of the day-to-day responsibilities of a data scientist.
New Innovative Additive Manufacturing Processes (KTN)
The document discusses several new additive manufacturing processes and projects. It begins with Smartdrop, a non-contact patterned coating technology that jets glue to apply functional fluids without waste. It then discusses the UK EB Additive Manufacturing Platform project to develop complex geometries and advanced materials using electron beam wire additive manufacturing. Finally, it summarizes the RoboWAAM project which involves developing a large scale robotic production platform for industrial wire arc additive manufacturing applications with a build size of meters.
Innovations in Academic-Industry Collaboration in Taiwan and Hong Kong (Lin Haiqiu)
The document discusses innovations in academic-industry collaboration in Taiwan and Hong Kong. It outlines the speaker's presentation on understanding the knowledge-based economy and national innovation systems. Examples of innovation centers are provided, including ITRI in Taiwan which focuses on various technologies. The importance of collaboration between universities, government, businesses and technology institutes is discussed for creating new wealth from knowledge.
The document discusses the forces driving accelerated technological change and exponential growth. It notes that exponential technologies like networks, sensors, robotics, 3D printing, and artificial intelligence are converging in unexpected ways with massive implications. The history of communications technology from smoke signals to the modern 5G world is reviewed. The NEC DX2000 computing platform and shared accelerated flash storage solutions for telephony workloads are presented as enabling more genius, wealth concentration, and a potential world of abundance through exponential communication.
Is it possible to build the Airbus of supercomputing in Europe? (AMETIC)
A presentation by Mateo Valero, Director of the Barcelona Supercomputing Center, as part of the 30th edition of the Encuentros de Telecomunicaciones y Economía Digital.
The document discusses the growth of digital information from the 1940s to present day. It notes that the amount of digital data created annually is growing exponentially and is expected to increase six-fold every four years. However, only a small percentage of total organizational data is structured in a way that is easily usable by computers, while the majority remains unstructured like documents, photos and videos. Ensuring high quality information through accurate tagging, metadata, and reducing errors is important for effectively managing both structured and unstructured data.
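As a rough arithmetic check on the growth claim above, a six-fold increase every four years implies a compound annual growth rate of about 57%:

```python
# Six-fold growth over four years implies an annual factor of 6**(1/4).
annual_factor = 6 ** (1 / 4)
annual_growth_pct = (annual_factor - 1) * 100
print(f"{annual_growth_pct:.0f}% per year")  # roughly 57% per year

# Sanity check: four years of compounding recovers the six-fold figure.
assert abs(annual_factor ** 4 - 6) < 1e-9
```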
This document summarizes Kx Systems, a company that provides a high-performance time-series database called kdb+. Kdb+ can process and analyze large volumes of real-time and historical time-series data extremely fast with low latency. It is widely used in financial services and is now being applied to other industries like manufacturing, utilities, and life sciences. Kx Systems offers software, consulting services, and can help clients integrate kdb+ with their existing technologies and scale their deployments.
Practical IEC 61850 for Substation Automation for Engineers and TechniciansLiving Online
COPY THIS LINK INTO YOUR BROWSER TO FIND OUT MORE: bit.ly/11AM1oL
Older (‘legacy’) substation automation protocols and hardware/software architectures provided basic functionality for power system automation, and were designed to accommodate the technical limitations of the technologies available at the time. However, in recent years there have been vast improvements in technology, especially on the networking side. This has opened the door for dramatic improvements in the approach to power system automation in substations.
The latest developments in networking such as high-speed, deterministic, redundant Ethernet, as well as other technologies including TCP/IP, high-speed Wide Area Networks and high-performance embedded processors, are providing capabilities that could hardly be imagined when most legacy substation automation protocols were designed.
IEC 61850 is part of the International Electrotechnical Commission (IEC) Technical Committee 57 (TC57) architecture for electric power systems. It is an important new international standard for substation automation, and it will have a significant impact on how electric power systems are designed and built in the future. The model-driven approach of IEC 61850 is innovative and requires a new way of thinking about substation automation. This will result in significant improvements in the cost and performance of electric power systems.
This workshop provides comprehensive coverage of IEC 61850 and will provide you with the tools and knowledge to tackle your next substation automation project with confidence.
WHO SHOULD ATTEND?
This workshop is designed for personnel with a need to understand the techniques required to use and apply IEC 61850 to substation automation, hydro power plants, wind turbines and distributed energy resources as productively and economically as possible. This includes engineers and technicians involved with:
Consulting
Control and instrumentation
Control systems
Design
Maintenance supervision
Electrical installations
Process control
Process development
Project management
SCADA and telemetry systems
Replay and more: https://blogs.embarcadero.com/pytorch-for-delphi-with-the-python-data-sciences-libraries/
The next installment of the Embarcadero Open Source Live Stream takes a look at the Delphi side of the Python ecosystem with the new Python Data Sciences Libraries and related projects that make it super easy to write Delphi code against Python libraries and easily deploy on Windows, Linux, macOS, and Android. Specific examples are shown with the Python Natural Language Toolkit and with PyTorch, the library that powers projects like Tesla Autopilot, Uber's Pyro, and Hugging Face's Transformers.
This is part of a series of regular live streams discussing the latest in Embarcadero open source projects. It is hosted by Jim McKeeth, who is joined by members of the community, developers involved in these open source projects, and members of Embarcadero and Idera’s Product Management. It is a great opportunity to see behind the scenes and help shape the future of Embarcadero’s Open Source projects.
Android on Windows 11 - A Developer's Perspective (Windows Subsystem For Andr...) (Embarcadero Technologies)
The Windows Subsystem for Android (WSA) brings native Android applications to the Windows 11 desktop. Learn how to set up and configure Windows Subsystem for Android for use in software development. See what is required to run WSA as well as what is required to target it from your Android development. Windows Subsystem for Android is available for public preview on Windows 11.
Webinar replay and more: https://blogs.embarcadero.com/?p=134192
Windows Subsystem for Linux (WSL2) with full GUI and X Windows support. Join this webinar to better understand WSL2, how it works, proper setup, and configuration options, and learn to target it in your application development. Test your Linux applications on your Windows desktop without the need for a second computer or the overhead of a virtual machine. Learn to leverage additional Linux features and APIs from your applications.
Examples with Delphi 11 Alexandria and FMXLinux
Introduction to Python GUI development with Delphi for Python - Part 1: Del... (Embarcadero Technologies)
Learn how Embarcadero’s newly released free Python modules bring the power and flexibility of Delphi’s GUI frameworks to Python. VCL and FireMonkey (FMX) are mature GUI libraries. VCL is focused on native Windows development, while FireMonkey brings a powerful, flexible GUI framework to Windows, Linux, macOS, and even Android. This webinar will introduce you to these new free Python modules and how you can use them to build graphical user interfaces with Python. Part 2 will show you how to target Android GUI applications with Python!
Join Jim McKeeth as he introduces you to FMXLinux, and shows how you can bring the power of FireMonkey to Linux.
Outline:
Installation via GetIt Package Manager
Linux, PAServer, SDK, & Package Installation
FMXLinux usage and Samples
FireDAC Database Access on Linux
Migrating from Windows VCL to FMXLinux
3rd Party FMXLinux Support
Deploying rich web apps via Broadway
https://embt.co/FMXLinuxIntro
Combining the Strengths of Python and Delphi
Links replay and more
https://blogs.embarcadero.com/combining-the-strengths-of-delphi-and-python/
Python4Delphi repository
https://github.com/pyscripter/python4delphi
Part 1
https://blogs.embarcadero.com/webinar-replay-python-for-delphi-developers-part-1-introduction/
Webinar by Kiriakos Vlahos (aka PyScripter)
and Jim McKeeth (Embarcadero)
Replay https://youtu.be/aCz5h96ObUM
Find out more, and register for part 2
https://embt.co/3hSAKrg
Check out the library
https://github.com/pyscripter/python4delphi
Agenda
Motivation and Synergies
Introduction to Python
Introduction to Python for Delphi
Simple Demo
TPythonModule
TPyDelphiWrapper
Embeddable Databases for Mobile Apps: Stress-Free Solutions with InterBase (Embarcadero Technologies)
When it comes to developing mobile applications, keeping data on your device is a must-have feature, but can still be risky. With embedded InterBase, you can deploy high-performance multi-device applications that maintain 256-bit encryption, have a small footprint and need little, if any, administration.
What can participants expect to learn: Using InterBase in your mobile apps is easier than you may expect. Learn to develop mobile applications using InterBase, and how to take advantage of some of its convenient features, like Change Views and 256-bit security.
Join Mary Kelly, InterBase Engineer & RAD Software Consultant, and Jim McKeeth, Chief Developer Advocate & Engineer, for this webinar replay.
Replay: https://embt.co/2qUPwWY
Rad Server Industry Template - Connected Nurses Station - Setup Document (Embarcadero Technologies)
This document provides instructions for setting up a connected nurses station sample project using RAD Server, InterBase, and EMS. The key steps include:
1. Configuring the InterBase database and EMS server
2. Creating users in EMS Management Console including a "nurseuser"
3. Installing OpenSSL libraries for push notifications
4. Setting up push notification services for Android and iOS
TMS Software's Map Packs make it easy to integrate mapping into your applications, based on the Google Maps and OpenStreetMap sources. Join us for this webinar to learn how to take your mapping to the next level.
Works on VCL, FireMonkey (FMX), Windows, Android, iOS, macOS, Delphi and C++Builder.
Applications built with Delphi and C++Builder for the Windows platform have proven to be indispensable instruments for businesses, but rewriting them for the cloud is often cost-prohibitive. rollApp offers a cloud platform that can run existing desktop applications in the cloud without any need to modify them. In this webinar you will learn how to move your application to the cloud and offer the benefits of a cloud solution to your users in a matter of a few weeks.
Learn about the latest features of C++11 that you can take advantage of today in C++Builder 10.1 Berlin.
David Millington, Embarcadero's new C++Builder Product Manager, shows cool C++11 code in the IDE that can be compiled for Windows, macOS, iOS and Android using the Embarcadero C++Builder Clang-enhanced compiler.
C++11 language features covered include:
Auto typed variables
Variadic templates
Lambda expressions
Atomic operations
Unrestricted unions
and more
Slide deck for the June 2, 2016 Embarcadero Webinar
This webinar will show you how to build mobile applications for iOS and Android using Delphi and C++Builder 10.1 Berlin. We will cover getting started, best practices for mobile UI/UX, building your first app, using FireUI Live Preview, creating custom design views and Live Previews, a real world example of creating, submitting and getting store acceptance for an iOS and Android app, working with databases, what’s new for mobile development and more.
This webinar will also give advice to Windows VCL desktop application developers who want to migrate as much of their existing code as possible to the iOS and Android mobile platforms.
In this webinar we take a deeper dive into:
• How to get started building Mobile Apps if you are a Windows VCL desktop developer
• Building Mobile Apps using the different target platforms configurations
• Best practices and Apple/Google UI/UX guidelines for mobile applications – you’ll need to follow these to get your apps accepted.
• Creating FireUI Designer Custom IDE Views for other Mobile Devices
• FireUI Live Preview – extending the App to support custom component viewing
• Accessing Local and Remote Databases from your mobile apps
• Submitting apps to the Apple App Store, Google Play
Technical demonstrations will be presented by the team. Live Q&A will be held during and at the end of the webinar.
This document discusses RAD Server, a back-end platform from Embarcadero Technologies for building multi-tier applications with Delphi and C++Builder. RAD Server provides automated REST/JSON API publishing of server-side Delphi and C++ code. It also includes integration middleware, built-in application services, and tools for managing APIs, users and analytics. RAD Server allows developers to quickly develop and deploy modern multi-tier applications with Delphi and C++. Pricing options are provided on a per user or unlimited user basis.
ER/Studio is the complete business-driven data architecture solution that combines data modeling, business process, and application modeling and reporting with cross-organizational team collaboration for data architectures and enterprises of all sizes.
“Oh my goodness! What did I do?” Chances are you have heard, or even uttered this expression. This demo-oriented session will show many examples where database professionals were dumbfounded by their own mistakes, and could even bring back memories of your own early DBA days.
Businesses make critical decisions using key data assets, but stakeholders often find it difficult to navigate the complex data landscape to ensure they have the right data and understand it correctly. Companies are dealing with a number of different technologies, multiple data formats, and high data volumes, along with the requirements for data security and governance.
Watch the companion webinar at:
Join John Sterrett, Senior Advisor at Linchpin People, and Scott Walz, Director of Software Consultants, to learn how execution plans get invalidated and why data skew could be the root cause of seeing different execution plans for the same query. We will look at options for forcing a query to use a particular execution plan. Finally, you will learn how this complex problem can be identified and resolved simply using a new feature in SQL Server 2016 called Query Store.
Maruthi Prithivirajan, Head of ASEAN & IN Solution Architecture, Neo4j
Get an inside look at the latest Neo4j innovations that enable relationship-driven intelligence at scale. Learn more about the newest cloud integrations and product enhancements that make Neo4j an essential choice for developers building apps with interconnected data and generative AI.
Enhancing adoption of Open Source Libraries: A case study on Albumentations.AI (Vladimir Iglovikov, Ph.D.)
Presented by Vladimir Iglovikov:
- https://www.linkedin.com/in/iglovikov/
- https://x.com/viglovikov
- https://www.instagram.com/ternaus/
This presentation delves into the journey of Albumentations.ai, a highly successful open-source library for data augmentation.
Created out of a necessity for superior performance in Kaggle competitions, Albumentations has grown to become a widely used tool among data scientists and machine learning practitioners.
This case study covers various aspects, including:
People: The contributors and community that have supported Albumentations.
Metrics: The success indicators such as downloads, daily active users, GitHub stars, and financial contributions.
Challenges: The hurdles in monetizing open-source projects and measuring user engagement.
Development Practices: Best practices for creating, maintaining, and scaling open-source libraries, including code hygiene, CI/CD, and fast iteration.
Community Building: Strategies for making adoption easy, iterating quickly, and fostering a vibrant, engaged community.
Marketing: Both online and offline marketing tactics, focusing on real, impactful interactions and collaborations.
Mental Health: Maintaining balance and not feeling pressured by user demands.
Key insights include the importance of automation, making the adoption process seamless, and leveraging offline interactions for marketing. The presentation also emphasizes the need for continuous small improvements and building a friendly, inclusive community that contributes to the project's growth.
Vladimir Iglovikov brings his extensive experience as a Kaggle Grandmaster, ex-Staff ML Engineer at Lyft, sharing valuable lessons and practical advice for anyone looking to enhance the adoption of their open-source projects.
Explore more about Albumentations and join the community at:
GitHub: https://github.com/albumentations-team/albumentations
Website: https://albumentations.ai/
LinkedIn: https://www.linkedin.com/company/100504475
Twitter: https://x.com/albumentations
20 Comprehensive Checklist of Designing and Developing a Website (Pixlogix Infotech)
Dive into the world of Website Designing and Developing with Pixlogix! Looking to create a stunning online presence? Look no further! Our comprehensive checklist covers everything you need to know to craft a website that stands out. From user-friendly design to seamless functionality, we've got you covered. Don't miss out on this invaluable resource! Check out our checklist now at Pixlogix and start your journey towards a captivating online presence today.
Securing your Kubernetes cluster: a step-by-step guide to success! (KatiaHIMEUR1)
Today, after several years of existence, with an extremely active community and an ultra-dynamic ecosystem, Kubernetes has established itself as the de facto standard in container orchestration. Thanks to a wide range of managed services, it has never been easier to set up a ready-to-use Kubernetes cluster.
However, this ease of use means that the subject of security in Kubernetes is often left for later, or even neglected. This exposes companies to significant risks.
In this talk, I'll show you step-by-step how to secure your Kubernetes cluster for greater peace of mind and reliability.
Encryption in Microsoft 365 - ExpertsLive Netherlands 2024 (Albert Hoitingh)
In this session I delve into the encryption technology used in Microsoft 365 and Microsoft Purview. Including the concepts of Customer Key and Double Key Encryption.
Threats to mobile devices are more prevalent and increasing in scope and complexity. Users of mobile devices want to take full advantage of the features available on those devices, but many of those features trade security for convenience and capability. This best practices guide outlines steps users can take to better protect personal devices and information.
Removing Uninteresting Bytes in Software Fuzzing (Aftab Hussain)
Imagine a world where software fuzzing, the process of mutating bytes in test seeds to uncover hidden and erroneous program behaviors, becomes faster and more effective. A lot depends on the initial seeds, which can significantly dictate the trajectory of a fuzzing campaign, particularly in terms of how long it takes to uncover interesting behaviour in your code. We introduce DIAR, a technique designed to speedup fuzzing campaigns by pinpointing and eliminating those uninteresting bytes in the seeds. Picture this: instead of wasting valuable resources on meaningless mutations in large, bloated seeds, DIAR removes the unnecessary bytes, streamlining the entire process.
In this work, we equipped AFL, a popular fuzzer, with DIAR and examined two critical Linux tools -- Libxml's xmllint, a tool for parsing XML documents, and Binutils' readelf, an essential debugging and security-analysis command-line tool used to display detailed information about ELF (Executable and Linkable Format) files. Our preliminary results show that AFL+DIAR not only discovers new paths more quickly but also achieves higher coverage overall. This work thus showcases how starting with lean and optimized seeds can lead to faster, more comprehensive fuzzing campaigns -- and DIAR helps you find such seeds.
- These are slides of the talk given at IEEE International Conference on Software Testing Verification and Validation Workshop, ICSTW 2022.
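The core DIAR idea -- drop seed bytes whose removal leaves observed coverage unchanged -- can be sketched against a toy target. Everything below is invented for illustration: the coverage function stands in for real instrumentation, and this greedy pass is a simplification, not the paper's actual algorithm.

```python
def coverage(seed: bytes) -> frozenset:
    # Stand-in for real instrumentation: the "paths" a toy tag parser
    # exercises. Only '<' and '>' influence control flow here.
    paths = set()
    depth = 0
    for b in seed:
        if b == ord('<'):
            depth += 1
            paths.add(("open", min(depth, 3)))
        elif b == ord('>'):
            paths.add(("close", min(depth, 3)))
            depth = max(depth - 1, 0)
    return frozenset(paths)

def trim_seed(seed: bytes) -> bytes:
    # Greedy DIAR-style pass: drop any byte whose removal leaves
    # the observed coverage unchanged.
    base = coverage(seed)
    out = bytearray(seed)
    i = 0
    while i < len(out):
        candidate = out[:i] + out[i + 1:]
        if coverage(bytes(candidate)) == base:
            out = candidate          # byte was uninteresting, drop it
        else:
            i += 1                   # byte matters, keep it
    return bytes(out)

print(trim_seed(b"<<padding-bytes>>"))  # → b'<<>>'
```

The padding characters never touch the parser's control flow, so they are stripped, while the nested tag bytes survive because removing any of them changes the covered paths.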
Cosa hanno in comune un mattoncino Lego e la backdoor XZ?Speck&Tech
ABSTRACT: A prima vista, un mattoncino Lego e la backdoor XZ potrebbero avere in comune il fatto di essere entrambi blocchi di costruzione, o dipendenze di progetti creativi e software. La realtà è che un mattoncino Lego e il caso della backdoor XZ hanno molto di più di tutto ciò in comune.
Partecipate alla presentazione per immergervi in una storia di interoperabilità, standard e formati aperti, per poi discutere del ruolo importante che i contributori hanno in una comunità open source sostenibile.
BIO: Sostenitrice del software libero e dei formati standard e aperti. È stata un membro attivo dei progetti Fedora e openSUSE e ha co-fondato l'Associazione LibreItalia dove è stata coinvolta in diversi eventi, migrazioni e formazione relativi a LibreOffice. In precedenza ha lavorato a migrazioni e corsi di formazione su LibreOffice per diverse amministrazioni pubbliche e privati. Da gennaio 2020 lavora in SUSE come Software Release Engineer per Uyuni e SUSE Manager e quando non segue la sua passione per i computer e per Geeko coltiva la sua curiosità per l'astronomia (da cui deriva il suo nickname deneb_alpha).
For the full video of this presentation, please visit: https://www.edge-ai-vision.com/2024/06/building-and-scaling-ai-applications-with-the-nx-ai-manager-a-presentation-from-network-optix/
Robin van Emden, Senior Director of Data Science at Network Optix, presents the “Building and Scaling AI Applications with the Nx AI Manager,” tutorial at the May 2024 Embedded Vision Summit.
In this presentation, van Emden covers the basics of scaling edge AI solutions using the Nx tool kit. He emphasizes the process of developing AI models and deploying them globally. He also showcases the conversion of AI models and the creation of effective edge AI pipelines, with a focus on pre-processing, model conversion, selecting the appropriate inference engine for the target hardware and post-processing.
van Emden shows how Nx can simplify the developer’s life and facilitate a rapid transition from concept to production-ready applications.He provides valuable insights into developing scalable and efficient edge AI solutions, with a strong focus on practical implementation.
“An Outlook of the Ongoing and Future Relationship between Blockchain Technologies and Process-aware Information Systems.” Invited talk at the joint workshop on Blockchain for Information Systems (BC4IS) and Blockchain for Trusted Data Sharing (B4TDS), co-located with with the 36th International Conference on Advanced Information Systems Engineering (CAiSE), 3 June 2024, Limassol, Cyprus.
Alt. GDG Cloud Southlake #33: Boule & Rebala: Effective AppSec in SDLC using ...James Anderson
Effective Application Security in Software Delivery lifecycle using Deployment Firewall and DBOM
The modern software delivery process (or the CI/CD process) includes many tools, distributed teams, open-source code, and cloud platforms. Constant focus on speed to release software to market, along with the traditional slow and manual security checks has caused gaps in continuous security as an important piece in the software supply chain. Today organizations feel more susceptible to external and internal cyber threats due to the vast attack surface in their applications supply chain and the lack of end-to-end governance and risk management.
The software team must secure its software delivery process to avoid vulnerability and security breaches. This needs to be achieved with existing tool chains and without extensive rework of the delivery processes. This talk will present strategies and techniques for providing visibility into the true risk of the existing vulnerabilities, preventing the introduction of security issues in the software, resolving vulnerabilities in production environments quickly, and capturing the deployment bill of materials (DBOM).
Speakers:
Bob Boule
Robert Boule is a technology enthusiast with PASSION for technology and making things work along with a knack for helping others understand how things work. He comes with around 20 years of solution engineering experience in application security, software continuous delivery, and SaaS platforms. He is known for his dynamic presentations in CI/CD and application security integrated in software delivery lifecycle.
Gopinath Rebala
Gopinath Rebala is the CTO of OpsMx, where he has overall responsibility for the machine learning and data processing architectures for Secure Software Delivery. Gopi also has a strong connection with our customers, leading design and architecture for strategic implementations. Gopi is a frequent speaker and well-known leader in continuous delivery and integrating security into software delivery.
Sudheer Mechineni, Head of Application Frameworks, Standard Chartered Bank
Discover how Standard Chartered Bank harnessed the power of Neo4j to transform complex data access challenges into a dynamic, scalable graph database solution. This keynote will cover their journey from initial adoption to deploying a fully automated, enterprise-grade causal cluster, highlighting key strategies for modelling organisational changes and ensuring robust disaster recovery. Learn how these innovations have not only enhanced Standard Chartered Bank’s data infrastructure but also positioned them as pioneers in the banking sector’s adoption of graph technology.
Communications Mining Series - Zero to Hero - Session 1DianaGray10
This session provides introduction to UiPath Communication Mining, importance and platform overview. You will acquire a good understand of the phases in Communication Mining as we go over the platform with you. Topics covered:
• Communication Mining Overview
• Why is it important?
• How can it help today’s business and the benefits
• Phases in Communication Mining
• Demo on Platform overview
• Q/A
2. EMBARCADERO TECHNOLOGIES
Data….
• Every minute of every day we create:
- More than 204 million email messages
- Over 2 million Google search queries
- 48 hours of new YouTube videos
- More than 100,000 tweets
• Megabyte, gigabyte, terabyte, petabyte, exabyte, zettabyte, yottabyte…
The new NSA facility in Utah can hold 5 zettabytes of data.
To store only 1 ZB of data, it would take 62.5 billion iPhones!
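The iPhone figure above can be sanity-checked with quick arithmetic; this sketch assumes decimal units and a 16 GB iPhone, which the slide does not specify:

```python
# Back-of-the-envelope check of the slide's storage figures.
# Assumes decimal units and a 16 GB iPhone (the slide does not name a model).
zettabyte = 10**21            # bytes in one zettabyte
iphone_capacity = 16 * 10**9  # bytes in a 16 GB iPhone

iphones_per_zb = zettabyte / iphone_capacity
print(iphones_per_zb)  # 62500000000.0, i.e. 62.5 billion devices
```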
3. EMBARCADERO TECHNOLOGIES
Island Clinic - Ebola Treatment Centre
Data… used to save lives
WHO /C. Black http://www.who.int/features/2014/liberia-ebola-island-clinic/en/
4. EMBARCADERO TECHNOLOGIES
Data… used to save lives
Mobile phone location data
Integrating data sets from anonymised mobile phone usage and demographic indicators.
Image Credit: PLOS Currents.
7. EMBARCADERO TECHNOLOGIES
Model Design
Visualizing can be useful to see the results
75%    All-purpose flour
25%    Cake flour
25%    Granulated sugar
12.50% Butter
12.50% Eggs
9. EMBARCADERO TECHNOLOGIES
Model Design
• Logical Model
- The organisation of your data. Basically, the Blueprint.
• Physical Model
- The ‘physical structure’ of the data in the database.
10. EMBARCADERO TECHNOLOGIES
Model Design
• Normalisation
- Eliminating redundancy and mitigating corruption.
- 1NF: the key, 2NF: the whole key, 3NF: nothing but the key.
So help me Codd. (Edgar F. Codd)
11. EMBARCADERO TECHNOLOGIES
Model Design
Customer Name | Customer Address                       | Customer Tel No. | Product    | Cost
Holmes, S     | 221B Baker St, London                  | +44 1632 960957  | Hat        | 44.99
Holmes, S     | 221B Baker St, London                  | +44 1632 960957  | Pipe       | 22.99
Fletcher, J   | 698 Candlewood Lane, Cabot Cove, Maine | +001 1632 960428 | Typewriter | 129.99
Fletcher, J   | 698 Candlewood Lane, Cabot Cove, Maine | +001 1632 960428 | Hat        | 44.99
12. EMBARCADERO TECHNOLOGIES
Model Design
Customer
Customer ID | Customer Name | Customer Address                       | Customer Tel No.
20          | Holmes, S     | 221B Baker St, London                  | +44 1632 960957
30          | Fletcher, J   | 698 Candlewood Lane, Cabot Cove, Maine | +001 1632 960428
Orders
Order ID | Customer ID
ORD001   | 20
ORD002   | 30
ORD003   |
Order details
Order ID | Product ID | Quantity
ORD001   | 001        | 1
ORD001   | 002        | 1
ORD002   | 003        | 1
ORD002   | 001        | 1
Products
Product ID | Product    | Cost
001        | Hat        | 44.99
002        | Pipe       | 22.99
003        | Typewriter | 129.99
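The normalised design above can be sketched as DDL. This is an illustrative SQLite version built via Python; the table and column names follow the slide, and ER/Studio would generate platform-specific DDL rather than this exact script:

```python
import sqlite3

# Build the normalised schema from the slide in an in-memory SQLite database.
conn = sqlite3.connect(":memory:")
cur = conn.cursor()
cur.executescript("""
CREATE TABLE Customer (
    CustomerID      INTEGER PRIMARY KEY,
    CustomerName    TEXT NOT NULL,
    CustomerAddress TEXT,
    CustomerTelNo   TEXT
);
CREATE TABLE Product (
    ProductID   TEXT PRIMARY KEY,
    ProductName TEXT NOT NULL,
    Cost        REAL
);
CREATE TABLE Orders (
    OrderID    TEXT PRIMARY KEY,
    CustomerID INTEGER REFERENCES Customer(CustomerID)
);
CREATE TABLE OrderDetail (
    OrderID   TEXT REFERENCES Orders(OrderID),
    ProductID TEXT REFERENCES Product(ProductID),
    Quantity  INTEGER,
    PRIMARY KEY (OrderID, ProductID)
);
""")

# Each customer's details are stored once; orders reference them by key.
cur.execute("INSERT INTO Customer VALUES (20, 'Holmes, S', '221B Baker St, London', '+44 1632 960957')")
cur.execute("INSERT INTO Product VALUES ('001', 'Hat', 44.99)")
cur.execute("INSERT INTO Orders VALUES ('ORD001', 20)")
cur.execute("INSERT INTO OrderDetail VALUES ('ORD001', '001', 1)")

# Reassemble the original flat view with joins.
row = cur.execute("""
    SELECT c.CustomerName, p.ProductName, p.Cost
    FROM Orders o
    JOIN Customer c ON c.CustomerID = o.CustomerID
    JOIN OrderDetail d ON d.OrderID = o.OrderID
    JOIN Product p ON p.ProductID = d.ProductID
""").fetchone()
print(row)  # ('Holmes, S', 'Hat', 44.99)
```

The flat customer/order table from the previous slide is recovered with joins, while each fact is stored exactly once.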
13. EMBARCADERO TECHNOLOGIES
Naming Standards
• An example of a very short naming standard.
tNYEZC - table of NY Employees Zip Code.
14. EMBARCADERO TECHNOLOGIES
Naming Standards
• Be clear and understandable to everyone.
• Add a descriptive prefix – e.g. tbl for a table.
• Use a ‘naming standards template’ to ensure consistency.
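A naming-standards template can also be backed by a simple automated check. The prefix rule below is invented for illustration; it is not an ER/Studio template:

```python
import re

# Hypothetical naming standard: a type prefix plus lower_snake_case.
# (Invented for illustration; your organisation's template will differ.)
NAME_RULE = re.compile(r"^(tbl|vw|idx)_[a-z][a-z0-9_]*$")

def check_name(name):
    """Return True if a database object name follows the standard."""
    return NAME_RULE.fullmatch(name) is not None

print(check_name("tbl_customer_order"))  # True
print(check_name("tNYEZC"))              # False: cryptic short code
```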
15. EMBARCADERO TECHNOLOGIES
Planning for Growth
• Each engine of a jet on a flight from London to New York generates 10 TB of data every 30 minutes.
Source: Pratt and Whitney.
• 90% of the world’s data was generated over the last two years.
Source: Science Daily.
26. EMBARCADERO TECHNOLOGIES
Thank you!
• Product Videos: http://www.embarcadero.com/products/er-studio/product-videos
• Wiki and Documentation: http://docs.embarcadero.com/
• Learn more about the ER/Studio product family:
http://www.embarcadero.com/data-modeling
• Trial Downloads: http://www.embarcadero.com/downloads
• To arrange a demo, please contact Embarcadero Sales:
sales@embarcadero.com, (888) 233-2224
Editor's Notes
90 percent of the world's data has been created in the last two years? Yes, I can believe that; when I look at my iPhoto library, the growth is definitely exponential. In fact, I have read a few times that in total the world's data is doubling every two years, so it looks like there won't be any slowdown.
It has grabbed most business headlines over the last five years, and sometimes it can feel like we are surrounded by an ocean of data.
It is the 1,000 lb elephant in the room, except everyone IS actually talking about it!
So with all of this new data created, how useful is it?
Right now, with the Ebola epidemic affecting so many lives, data is being used to create an effective response.
This is a father and daughter in Liberia, Mr Nyenati Kaffia. They have just been checked out of the Island treatment centre in Monrovia, a new treatment centre that was set up to fight the epidemic.
He said it was a bittersweet day: he tragically lost his son, but he and his daughter survived Ebola. It is an example of how large amounts of fairly simple data can make a crucial difference, especially in countries with limited resources.
The information is gathered from mobile phone location data. Used correctly, it showed that when the epidemic hits certain areas, the movement of the population changes.
They are now using this data to find the most effective areas to set up emergency treatment centres, saving lives like Mr Nyenati Kaffia's and his daughter's, and making those limited resources as effective as possible.
So even simple mobile phone location data, used effectively, can really make a difference. Certainly great to see.
Today we are going to look at some of the benefits of data modelling. I shall be using Embarcadero's Data Architect. There are many aspects of data modelling, and it is a subject that can be very in depth, so I shall be touching on a few important ones; let's take a short look at five of these.
Model Design - We will look at some of the basic aspects of model design and how high-quality, large designs can be achieved in a short period with advanced data modeling.
Planning for Growth - Future-proofing your design and being able to scale up for growth without detrimental effects is key to building a good foundation for your project. In one of my previous positions I saw what can happen when things go wrong and you have not prepared for successful growth. I hasten to add, it was not one of my projects, and mercifully a couple of tenacious and flexible developers were able to put in a lot of extra work and pull things around. Valuable lessons were learned, and what surprised me was how quickly other companies took up the slack and filled the gaps in the niche we had created in the first place. As a company they are still on top and doing great business. However, this is not always the case; they were very lucky.
Naming Standards - Whether you are importing your design using our reverse engineering wizard or building your model from the bottom up, naming standards are useful in mitigating misinterpretation.
Data Lineage - How important is it to know the origins of your data? Very, if you are covered by any sort of data governance: simply knowing where your data comes from will help you prove your results.
Big Data - No webinar would be complete without the obligatory 'tip of the hat' to Big Data. With headlines like 'a single airliner jet engine producing 10 TB of data in just 30 mins' and '90 percent of data on the planet being generated in the last 2 years', it continues to grab attention and appear in our inboxes, so we shall touch on it today too. ER/Studio has recently had some exciting updates that allow you to work with big data platforms, so we shall have a look at those as well.
Organisations and individuals are tight for time. While years ago it was OK for large projects to take many years of planning, the same approach now would mean the technology may have moved on before you have finished, making your work obsolete.
Growth can be exponential and capitalizing on this can be the difference in being a success or not.
It is easy to understand the importance of a data model. We often reach for a scrap of paper, or the back of a napkin, to visualise our ideas. We are creatures in a visual world; that is why we are always told to write down our goals. We accomplish more when we see them written down, and we understand more when we see it.
And unless you are a Terminator, R2-D2 or another kind of robot (or cyborg, for those who are picky), you most likely benefit from breaking down that visual data. Compare staring at a spreadsheet with looking at a chart of the same data: some data would be impossible to understand in a pie chart but is easily recognisable in a bar graph.
Design is key. No amount of smoothly written code can make up for design errors. Normalizing too many times can cause performance problems: it might be possible to normalize down to sixth normal form, but that is not a great idea if your hardware has a seizure. So when we look at model design and future-proofing, it is important to get it right.
Let us just touch on the different areas of the model design.
Logical Model:
So the logical model is normally developed before the physical model. It captures the business and functional requirements of your system and allows you to determine the organisation of the data in the database. You can think of it as your blueprint.
Physical Model:
The physical model shows how the data is actually stored in the database, specifying the data types and how the tables are stored.
Keeping the two separate will keep things from getting too complex, although it is no guarantee that they will not get complex anyway. Starting with a good logical design will ensure that your physical design is strong.
Adding business rules, such as requiring a phone number and address on the customer entity, can also prevent problems later on. For instance, a missed delivery and an angry customer, caused by the courier having no phone number to find the house, can be avoided because the system will not allow an order without a telephone number.
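A business rule like "no order without a telephone number" lands in the physical model as a constraint. A minimal sketch in SQLite (the column names are assumed for illustration):

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("""
CREATE TABLE Customer (
    CustomerID    INTEGER PRIMARY KEY,
    CustomerName  TEXT NOT NULL,
    CustomerTelNo TEXT NOT NULL  -- business rule: every customer needs a phone number
)
""")

# An insert that violates the rule is rejected by the database itself.
error = None
try:
    conn.execute("INSERT INTO Customer (CustomerID, CustomerName) VALUES (1, 'Holmes, S')")
except sqlite3.IntegrityError as exc:
    error = str(exc)

print(error)  # NOT NULL constraint failed: Customer.CustomerTelNo
```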
Domains are very useful when creating entity attributes or columns in your models, as they are basically attribute templates that can be applied repeatedly. You build them like an attribute, specifying names, datatype properties and so on. This is great, as you can then ensure that anything referencing the domain will comply.
Let me show you this in action…
There are so many things we could go into with model design, and these are only a few. While we are talking about model design, I also wanted to touch on normalisation.
Normalisation is the process of eliminating redundancy and streamlining your design: separating out and relating your tables.
There are plenty of videos out there that explain this much better, but for a point of reference, a quick and rough example is the following:
So you want to keep a record of your customer orders on your database.
You have their name, their address and what they ordered.
The records would show the following:
Customer ID, Name, Address, what they ordered.
The table could show the following line: "Customer ID, Name, Address, product 1, product 2, product 3, etc."
Normalisation sounds complicated, but it is common sense: it is the process of removing redundancy and duplicates, mitigating the possibility of updating duplicate data in only one location and thereby corrupting the data.
And of course, removing redundant data cuts down on storage and database size.
That said, splitting the data into many different related tables may impact the I/O of your database, and in some cases denormalisation may be needed for performance.
Once you have built or imported your model, you can use a validation tool to check that all is well before you move forward and, for example, generate a physical database.
Staying with ER/Studio DA, I can show that in action. Here we have a basic logical model that I have made. I could also reverse engineer a model from an existing database. That is a very exciting tool indeed; it allows you to model from any major platform. Here is a Teradata database that I previously imported with the reverse engineering wizard. OK, moving back to my much smaller database, I can go here and validate the model. Let's see if we find any errors.
To touch on naming standards: they are normally in place already in your organisation. However, because the space allowed for names was historically very small, what is often in place is some sort of naming code.
This is great if you are hacked: while the hacker spends time trying to work out that tNYEZC is the table of NY Employees Zip Code, maybe you can cut off his unauthorised access :-)
It used to be that naming conventions were restricted by character length; however, this is generally not the case anymore. Although Oracle allows for 30 bytes and SQL Server allows for 128, we recommend keeping names clear and concise.
What might make sense to you needs to be clear for the people after you leave or get promoted.
Some people get a little upset at redundancy, such as adding rpt before the name if it is a report; however, I would say there is no harm in it, and having everything labelled nicely can certainly help in reverse engineering.
The naming standards template can allow you to enforce naming standards and also apply them later.
We can have a quick look at that.
These headlines were going to be in my Big Data slide... but really we should remember that you cannot have big data without first having little data.
There are a couple of aspects that I would like to touch on here: planning for storage and predicting growth.
Two main issues where database performance is concerned are fragmentation and space. Data fragmentation can occur if not planned for in the creation phase, and we all know what happens when we run out of space.
You can plan for good I/O speeds in the physical design phase: separating tables from their indexes, and separating out columns in tables that are often referenced together. If you are doing this from the ground up, then hopefully you will get a chance to separate the databases onto the drives you have available, going as granular as splitting the contents of tables onto separate physical disks or disk arrays.
Most people have disk arrays, RAID 5 or 10, with RAID 5 having slower write speeds. If you are lucky enough to be using good hardware SSD drive arrays or a big flash memory array, then you have it better than most; however, you may not have database sizes that warrant that kind of expense, so what the left hand gives, the right takes away :-)
Not getting it right can result in extreme slowness or even downtime.
In a previous job we were given separate drives for our logs and various numbered data drives. It was only when serious lag was highlighted in a stress test that we found out the separate drives were actually virtual drives on a single SATA archive array.
That was down to a lack of communication and missed steps from our team: we had originally started building the server as a development demo but had not asked the systems team to move it to pre-production.
In ER/Studio, as a starting point in planning for growth you can use the projected row count. We also have a useful utility called the Capacity Planner, which predicts growth.
The tool can be used from within the table editor, predicting table growth from the capacity planning tab and covering things like growth rate, growth type, maximum size and so on.
Using the Capacity Planner utility outside the table editor, you can forecast your storage requirements, helping you budget and plan your engineering.
We have a nice section in the wiki explaining the facets of the Planner, so I shall not cover it here, but I shall put some links at the end so you can have a look at a later date.
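The arithmetic behind such a growth forecast is simple compound growth. This sketch illustrates the idea only; it is not the Capacity Planner's actual algorithm:

```python
def forecast_rows(initial_rows, growth_rate, periods, max_rows=None):
    """Project a table's row count under compound growth, optionally capped."""
    rows = initial_rows
    for _ in range(periods):
        rows = int(rows * (1 + growth_rate))
        if max_rows is not None and rows >= max_rows:
            return max_rows
    return rows

# 100,000 rows growing 10% per month, projected over a year:
print(forecast_rows(100_000, 0.10, 12))
```

Multiplying the projected row count by an average row size then turns this into a storage estimate for budgeting.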
For databases that let you specify initial extent sizes, such as Oracle, set the initial extent size to the estimated size in order to avoid data fragmentation as the table grows.
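Estimating that initial size is straightforward arithmetic; the row size, row count and overhead factor below are made-up illustration numbers:

```python
def initial_extent_bytes(expected_rows, avg_row_bytes, overhead_pct=20):
    """Rough initial-extent estimate: raw row data plus a percentage of overhead."""
    return expected_rows * avg_row_bytes * (100 + overhead_pct) // 100

# 1 million rows at roughly 200 bytes each, with 20% overhead:
size = initial_extent_bytes(1_000_000, 200)
print(size, "bytes")  # 240000000 bytes
```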
If you cast your mind back to your maths teacher at school, they told you that you should always show your workings. Simply, if you do not show your workings, we do not know what you have done… or more importantly, what you have not done.
How do you get to your facts and figures?
Data lineage can be thought of as a data trail: the Hansel and Gretel trail of breadcrumbs for your data. It is critically important for data integrity and for trust from your employees and customers.
It is used to verify your end results, forecasting results, data governance, or tracing your ETL transformations. After all, it is common for your data to be in many places at once and on differing systems. Acquisitions, mergers, outdated systems and newly implemented systems give most companies a diverse systems topography. Dealing with this diversity can be a challenge, and keeping track of where the data originates is what data lineage through an ETL process is all about.
What-if scenarios are also possible. Picture a medium-sized organisation: something as simple as changing a supplier, or running a marketing drive to increase sales, can utilise your data flow, and in turn your data lineage, to project any number of what-if scenarios.
In the Data Lineage tab within Data Architect, there is the option to add data lineage information. We also have the option to import data lineage information, which can be done from SSIS, Oracle and so on. This can be very useful if you have disparate systems containing data lineage information: you can bring those systems together using the reverse engineering wizard and importing data lineage, allowing you to model your complete system. You can even generate a physical database with the built-in wizards, but that can be for another time. If you would like to see that in action, I am sure we have some great videos on it; I shall add some links at the end.
Big data is so often in the news. There are so many headline-grabbing statistics from the increase in the collection and use of data that it is easy to understand why.
The rapid expansion in the volume of data, the speed at which it is collected and used, and the increasing diversity of that data are affecting all of us in many ways.
There is an old story I heard back in the military about a new military base that was being renovated. I have heard it repeated about a college campus too.
It goes like this: the pedestrian footpaths had not been laid yet, and there were arguments about whether the paths should go around the grass at the edges or link the buildings diagonally.
So the adjutant of the barracks allowed people to walk anywhere they liked for six months. When the grass was finally worn down, they could match the paths to the places where the grass had worn away.
I liked the military version, as walking on the grass was severely punishable.
The point here is that only those who walk the paths know the best place to lay the foundations.
Your customer habits, your employee processes, the supply and demand of your business are all critical factors in how your success can be developed. You know your business better than anyone, you walk across that grass every day.
A talent shortage is also frequently reported: as the growth and use of data skyrockets, there is a lack of people to take up the slack. ER/Studio can turn your users into experts with its simple yet intuitive interface, and it now supports Hadoop Hive, MongoDB and Teradata (show link for supported platforms).
Supporting the big data platforms means you can reverse engineer any disparate systems and handle your data in one conjoined data flow.