Data, and the way we process and store it, is one of the most important aspects of IT. Data is the lifeblood of our organizations, supporting real-time business processes and decision-making. For our DevOps strategy to be truly effective, we must be able to safely and quickly evolve production databases, just as we safely and quickly evolve production code. Yet for many organizations, their data sources prove to be less than trustworthy and their data-oriented development efforts little more than productivity sinkholes. We can, and must, do better.
This presentation begins with a collection of agile principles for data professionals and data principles for agile developers – the first step in working together is to understand and appreciate the priorities and strengths of the people we work with. Our focus is on a collection of practices that enable development teams to easily and safely evolve and deploy databases. These techniques include agile data modeling, database refactoring, database regression testing, continuous database integration, and continuous database deployment.
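As a minimal, hedged sketch of two of the techniques named above (the schema and values here are hypothetical, not from the presentation), a "rename column" database refactoring paired with a database regression test might look like this in Python with SQLite:

```python
import sqlite3

# Sketch of a "rename column" database refactoring using a transition
# period: add the new column, backfill it, and keep the old column
# until all consumers have migrated (hypothetical schema and data).
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE customer (id INTEGER PRIMARY KEY, phone TEXT)")
conn.execute("INSERT INTO customer (id, phone) VALUES (1, '555-0100')")

# Refactoring step: introduce the new column and backfill it.
conn.execute("ALTER TABLE customer ADD COLUMN phone_number TEXT")
conn.execute("UPDATE customer SET phone_number = phone")

# Database regression test: during the transition period, the old and
# new access paths must return the same value.
old = conn.execute("SELECT phone FROM customer WHERE id = 1").fetchone()[0]
new = conn.execute("SELECT phone_number FROM customer WHERE id = 1").fetchone()[0]
assert old == new == "555-0100"
```

Once every consumer reads `phone_number`, a follow-up refactoring would drop the old `phone` column.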
We also work through operational strategies required of production databases to support your DevOps strategy. If data sources aren’t an explicit part of your DevOps strategy then you’re not really doing DevOps, are you?
Data Lakehouse, Data Mesh, and Data Fabric (r1) – James Serra
So many buzzwords of late: Data Lakehouse, Data Mesh, and Data Fabric. What do all these terms mean and how do they compare to a data warehouse? In this session I’ll cover all of them in detail and compare the pros and cons of each. I’ll include use cases so you can see what approach will work best for your big data needs.
It is a fascinating, explosive time for enterprise analytics.
It is from the position of analytics leadership that the mission will be executed and company leadership will emerge. The data professional is absolutely sitting on the performance of the company in this information economy and has an obligation to demonstrate the possibilities and originate the architecture, data, and projects that will deliver analytics. After all, no matter what business you’re in, you’re in the business of analytics.
The coming years will be full of big changes in enterprise analytics and Data Architecture. William will kick off the fourth year of the Advanced Analytics series with a discussion of the trends winning organizations should build into their plans, expectations, vision, and awareness now.
The list of failed big data projects is long. They leave end-users, data analysts, and data scientists frustrated with long lead times for changes. This case study will illustrate how to make changes to big data, models, and visualizations quickly, with high quality, using the tools teams love. We synthesize techniques from DevOps, Deming, and direct experience.
10 Reasons Snowflake Is Great for Analytics – Senturus
Learn why the Snowflake analytic data warehouse makes sense for BI, including: data loading flexibility and scalability; consumption-based storage and compute costs; Time Travel and data sharing features; support across a range of BI tools like Power BI and Tableau; and the ability to allocate compute costs. View this on-demand webinar: https://senturus.com/resources/10-reasons-snowflake-is-great-for-analytics/.
Senturus offers a full spectrum of services in business intelligence and training on Cognos, Tableau and Power BI. Our resource library has hundreds of free live and recorded webinars, blog posts, demos and unbiased product reviews available on our website at: http://www.senturus.com/senturus-resources/.
Hadoop meets Agile! - An Agile Big Data Model – Uwe Printz
Big Data projects are a struggle, not only on the technical side but also on the organizational side. In this talk the author shares his experience and opinions from almost 5 years of Big Data projects and develops an Agile Big Data Model which reflects his ideas on how Big Data projects can be successful, even in large companies.
Talk held at the crossover meetup of the "Agile Stammtisch Rhein-Main" and the "Hadoop & Spark User Group Rhein-Main" at codecentric AG on 31.01.2017.
Building a Turbo-fast Data Warehousing Platform with Databricks – Databricks
Traditionally, data warehouse platforms have been perceived as cost prohibitive, challenging to maintain and complex to scale. The combination of Apache Spark and Spark SQL – running on AWS – provides a fast, simple, and scalable way to build a new generation of data warehouses that revolutionizes how data scientists and engineers analyze their data sets.
In this webinar you will learn how Databricks - a fully managed Spark platform hosted on AWS - integrates with a variety of AWS services, including Amazon S3, Kinesis, and VPC. We’ll also show you how to build your own data warehousing platform in a very short amount of time and how to integrate it with other tools such as Spark’s machine learning library and Spark Streaming for real-time processing of your data.
Data Lakehouse, Data Mesh, and Data Fabric (r2) – James Serra
So many buzzwords of late: Data Lakehouse, Data Mesh, and Data Fabric. What do all these terms mean and how do they compare to a modern data warehouse? In this session I’ll cover all of them in detail and compare the pros and cons of each. They all may sound great in theory, but I'll dig into the concerns you need to be aware of before taking the plunge. I’ll also include use cases so you can see what approach will work best for your big data needs. And I'll discuss Microsoft's version of the data mesh.
Part 3: Models in Production: A Look From Beginning to End – Cloudera, Inc.
3 Things to Learn About:
-How to uplevel your existing analytics stack with a collaborative environment that supports the latest open source languages and libraries.
-How to get better use of your core data management investments while opening up new supported tools for data science.
-How to expand data science outside of siloed environments and enable self-service data science access.
The Shifting Landscape of Data Integration – DATAVERSITY
Enterprises and organizations of every industry and scale are working to leverage data to achieve their strategic objectives — whether those are to be more profitable, effective, risk-tolerant, prepared, sustainable, and/or adaptable in an ever-changing world. Data has exploded in volume during the last decade as humans and machines alike produce data at an exponential pace. Exciting technologies have also emerged around that data, expanding what we can do with it.
Behind this data revolution, there are forces at work, causing enterprises to shift the way they leverage data and accelerate the demand for leverageable data. Organizations (and the climates in which they operate) are becoming more and more complex. They are also becoming increasingly digital and, thus, dependent on how data informs, transforms, and automates their operations and decisions. With increased digitization comes an increased need for both scale and agility at scale.
In this session, we have undertaken an ambitious goal of evaluating the current vendor landscape and assessing which platforms have made, or are in the process of making, the leap to this new generation of Data Management and integration capabilities.
Managing Large Amounts of Data with Salesforce – Sense Corp
Critical "design skew" problems and solutions - Engaging Big Objects, MuleSoft, Snowflake and Tableau at the right time
Salesforce’s ability to handle large workloads and participate in high-consumption, mobile-application-powering technologies continues to evolve. Pub/sub models and investments in adjacent properties like Snowflake, Kafka, and MuleSoft have broadened the development scope of Salesforce. Solutions now range from internal and in-platform applications to fueling world-scale mobile applications and integrations. Unfortunately, guidance on these extended capabilities is not well understood or documented. Knowing when to move your solution to a higher order is an important architect skill.
In this webinar, Paul McCollum, UXMC and Technical Architect at Sense Corp, will present an overview of data and architecture considerations. You’ll learn to identify reasons and guidelines for updating your solutions to larger-scale, modern reference infrastructures, and when to introduce products like Big Objects, Kafka, MuleSoft, and Snowflake.
Horses for Courses: Database Roundtable – Eric Kavanagh
The blessing and curse of today's database market? So many choices! While relational databases still dominate the day-to-day business, a host of alternatives has evolved around very specific use cases: graph, document, NoSQL, hybrid (HTAP), column store, the list goes on. And the database tools market is teeming with activity as well. Register for this special Research Webcast to hear Dr. Robin Bloor share his early findings about the evolving database market. He'll be joined by Steve Sarsfield of HPE Vertica, and Robert Reeves of Datical in a roundtable discussion with Bloor Group CEO Eric Kavanagh. Send any questions to info@insideanalysis.com, or tweet with #DBSurvival.
ADV Slides: Building and Growing Organizational Analytics with Data Lakes – DATAVERSITY
Data lakes are providing immense value to organizations embracing data science.
In this webinar, William will discuss the value of having broad, detailed, and seemingly obscure data available in cloud storage for purposes of expanding Data Science in the organization.
Geek Sync | Is Your Database Environment Ready for DevOps? – IDERA Software
You can watch the replay for this Geek Sync webcast, Is Your Database Environment Ready for DevOps?, in the IDERA Resource Center: http://ow.ly/oqr850A4pKu
Modern software development teams have adopted a continuous delivery approach based upon DevOps and agile development techniques. The small and frequent code changes that result from such an approach can deliver significant benefit in terms of reduced lead time for changes, a lower failure rate, and a reduced mean time to recovery when errors are encountered. So it is no wonder that the DevOps approach is rapidly becoming the de facto standard for application development and deployment.
But until recently, not enough focus has been placed on integrating database development and management into DevOps practices and procedures. Most of the automation and tooling focuses on coding, development, and deployment rather than on the design and administrative requirements of database systems. Failing to include proper database implementation best practices into DevOps will result in problems down the line, including poor performance, data integrity issues, difficult-to-maintain systems, and other problems.
This webinar will examine the core database administration practices that need to be integrated into your DevOps pipeline to achieve success. We will discuss critical tactics that need to be included in your database DevOps approach, and take a look at different ways to successfully integrate database administration into DevOps, without negating the advantages and benefits of DevOps.
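One foundational tactic for integrating database administration into a pipeline is versioned, repeatable schema migrations tracked in the database itself. As a hedged illustration (the migration names and DDL below are hypothetical, not from the webinar), this can be sketched in Python with SQLite:

```python
import sqlite3

# Versioned schema migrations, applied in order and recorded in a
# schema_version table so re-running the pipeline is idempotent.
# Migration names and DDL are hypothetical examples.
MIGRATIONS = [
    ("001_create_orders", "CREATE TABLE orders (id INTEGER PRIMARY KEY, total REAL)"),
    ("002_add_status", "ALTER TABLE orders ADD COLUMN status TEXT DEFAULT 'new'"),
]

def migrate(conn):
    """Apply any pending migrations; return the names of those applied."""
    conn.execute("CREATE TABLE IF NOT EXISTS schema_version (name TEXT PRIMARY KEY)")
    applied = []
    for name, ddl in MIGRATIONS:
        done = conn.execute(
            "SELECT 1 FROM schema_version WHERE name = ?", (name,)
        ).fetchone()
        if done is None:
            conn.execute(ddl)
            conn.execute("INSERT INTO schema_version (name) VALUES (?)", (name,))
            applied.append(name)
    return applied

conn = sqlite3.connect(":memory:")
first = migrate(conn)   # fresh database: both migrations applied
second = migrate(conn)  # idempotent: nothing left to apply
```

A CI/CD stage can then run the same `migrate` step against every environment, which is one way to keep database change as routine as code change.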
Speaker: Craig S. Mullins is president and principal consultant of Mullins Consulting, Inc. where he focuses on data management strategy and consulting. He has been named by IBM as a Gold Consultant and he writes the monthly DBA Corner column for Database Trends & Applications magazine. Craig has over three decades of experience in all facets of database systems development and has written three popular books on database systems and administration. You can follow Craig on Twitter at @craigmullins.
ADV Slides: When and How Data Lakes Fit into a Modern Data Architecture – DATAVERSITY
Whether to take data ingestion cycles off the ETL tool and the data warehouse or to facilitate competitive Data Science and building algorithms in the organization, the data lake – a place for unmodeled and vast data – will be provisioned widely in 2020.
Though it doesn’t have to be complicated, the data lake has a few key design points that are critical, and it does need to follow some principles for success. Avoid building the data swamp, but not the data lake! The tool ecosystem is building up around the data lake and soon many will have a robust lake and data warehouse. We will discuss policy to keep them straight, send data to its best platform, and keep users’ confidence up in their data platforms.
Data lakes will be built in cloud object storage. We’ll discuss the options there as well.
Get this data point for your data lake journey.
The Data Lake and Getting Businesses the Big Data Insights They Need – Dunn Solutions Group
Do terms like "Data Lake" confuse you? You’re not alone. With all of the technology buzzwords flying around today, it can become a task to keep up with and clearly understand each of them. However, a data lake is definitely something worth dedicating the time to understand. Leveraging data lake technology, companies are finally able to keep all of their disparate information and streams of data in one secure location, ready for consumption at any time – this includes structured, unstructured, and semi-structured data. For more information on our Big Data Consulting Services, don’t hesitate to visit us online at: http://bit.ly/2fvV5rR
How Celtra Optimizes its Advertising Platform with Databricks – Grega Kespret
Leading brands such as Pepsi and Macy’s use Celtra’s technology platform for brand advertising. To inform better product design and resolve issues faster, Celtra relies on Databricks to gather insights from large-scale, diverse, and complex raw event data. Learn how Celtra uses Databricks to simplify their Spark deployment, achieve faster project turnaround time, and empower people to make data-driven decisions.
In this webinar, you will learn how Databricks helps Celtra to:
- Utilize Apache Spark to power their production analytics pipeline.
- Build a “Just-in-Time” data warehouse to analyze diverse data sources such as Elastic Load Balancer access logs, raw tracking events, operational data, and reportable metrics.
- Go beyond simple counting and group events into sequences (i.e., sessionization) and perform more complex analysis such as funnel analytics.
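Sessionization of the kind described in the last bullet can be sketched in plain Python (the 30-minute timeout and the timestamps below are illustrative assumptions; Celtra's actual pipeline runs on Spark):

```python
from datetime import datetime, timedelta

# Group a user's raw events into sessions: a new session starts whenever
# the gap between consecutive events exceeds the timeout.
TIMEOUT = timedelta(minutes=30)

def sessionize(timestamps):
    """Split event timestamps (any order) into lists of session timestamps."""
    sessions = []
    for ts in sorted(timestamps):
        if sessions and ts - sessions[-1][-1] <= TIMEOUT:
            sessions[-1].append(ts)  # within timeout: extend current session
        else:
            sessions.append([ts])    # gap too large: start a new session
    return sessions

events = [
    datetime(2017, 1, 1, 9, 0),
    datetime(2017, 1, 1, 9, 10),   # 10-minute gap: same session
    datetime(2017, 1, 1, 11, 0),   # 1h50m gap: new session
]
sessions = sessionize(events)
```

Funnel analytics would then operate on these sessions, checking which ordered steps each one completed.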
ADV Slides: Platforming Your Data for Success – Databases, Hadoop, Managed Ha... – DATAVERSITY
Thirty years is a long time for a technology foundation to be as active as relational databases. Are their replacements here? In this webinar, we say no.
Databases have not sat around while Hadoop emerged. The Hadoop era generated a ton of interest and confusion, but is it still relevant as organizations are deploying cloud storage like a kid in a candy store? We’ll discuss what platforms to use for what data. This is a critical decision that can dictate two to five times additional work effort if it’s a bad fit.
Drop the herd mentality. In reality, there is no “one size fits all” right now. We need to make our platform decisions amidst this backdrop.
This webinar will distinguish these analytic deployment options and help you platform 2020 and beyond for success.
View the companion webinar at: http://embt.co/1L8V6dI
Some claim that, in the age of Big Data, data modeling is less important or even not needed. However, with the increased complexity of the data landscape, it is actually more important to incorporate data modeling in order to understand the nature of the data and how they are interrelated. In order to do this effectively, the way that we do data modeling needs to adapt to this complex environment.
One of the key data modeling issues is how to foster collaboration between new groups, such as data scientists, and traditional data management groups. There are often different paradigms, and yet it is critical to have a common understanding of data and semantics between different parts of an organization. In this presentation, Len Silverston will discuss:
+ How Big Data has changed our landscape and affected data modeling
+ How to conduct data modeling in a more ‘agile’ way for Big Data environments
+ How we can collaborate effectively within an organization, even with differing perspectives
About the Presenter:
Len Silverston is a best-selling author, consultant, and a fun and top rated speaker in the field of data modeling, data governance, as well as human behavior in the data management industry, where he has pioneered new approaches to effectively tackle enterprise data management. He has helped many organizations world-wide to integrate their data, systems and even their people. He is well known for his work on "Universal Data Models", which are described in The Data Model Resource Book series (Volumes 1, 2, and 3).
Challenges of Operationalising Data Science in Production – iguazio
The presentation topic for this meet-up was covered in two sections, without any breaks in between.
Section 1: Business Aspects (20 mins)
Speaker: Rasmi Mohapatra, Product Owner, Experian
https://www.linkedin.com/in/rasmi-m-428b3a46/
Once your data science application is in production, there are many typical operational challenges experienced today, across business domains. We will cover a few of these challenges with example scenarios.
Section 2: Tech Aspects (40 mins, slides & demo, Q&A )
Speaker: Santanu Dey, Solution Architect, Iguazio
https://www.linkedin.com/in/santanu/
In this part of the talk, we will cover how these operational challenges can be overcome – e.g., automating data collection and preparation, making ML models portable and deploying them in production, monitoring and scaling, etc. – with relevant demos.
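One of the monitoring concerns above can be sketched as a simple input-drift check, comparing the mean of a live feature window against its training baseline (the values and tolerance here are hypothetical; production systems would use more robust statistics):

```python
from statistics import mean

# Flag drift when the live mean of a feature deviates from its training
# baseline by more than a relative tolerance (hypothetical threshold).
def drifted(baseline_mean, live_values, tolerance=0.2):
    """Return True if the live window's mean drifts beyond tolerance."""
    live_mean = mean(live_values)
    return abs(live_mean - baseline_mean) / abs(baseline_mean) > tolerance

stable = drifted(10.0, [9.5, 10.2, 10.1])    # small deviation: no alert
shifted = drifted(10.0, [14.0, 15.5, 13.8])  # large deviation: alert
```

In practice such a check would run continuously against the model's feature store and trigger retraining or a rollback when drift is detected.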
What is Big Data, why is it needed, and when should organizations that generate huge amounts of data put it to use?
Data Engineering is the process of collecting, transforming, and loading data into a database or data warehouse for analysis and reporting. It involves designing, building, and maintaining the infrastructure necessary to store, process, and analyze large and complex datasets. This can involve tasks such as data extraction, data cleansing, data transformation, data loading, data management, and data security. The goal of data engineering is to create a reliable and efficient data pipeline that can be used by data scientists, business intelligence teams, and other stakeholders to make informed decisions.
Read more at: https://www.datacademy.ai/what-is-data-engineering-data-engineering-data-e/
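The extract/cleanse/transform/load steps described above can be illustrated with a minimal, hedged Python sketch (the records and table are made up for illustration; a real pipeline would read from source systems and load a warehouse):

```python
import sqlite3

# Extract: raw records as they might arrive from a source system.
raw_rows = [
    {"name": " Alice ", "amount": "10.5"},
    {"name": "Bob", "amount": None},       # dirty record: missing amount
    {"name": "Carol", "amount": "7.25"},
]

# Cleanse: drop incomplete records. Transform: normalize whitespace and types.
clean = [
    {"name": r["name"].strip(), "amount": float(r["amount"])}
    for r in raw_rows
    if r["amount"] is not None
]

# Load: write the cleansed rows into an analytics table for reporting.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE sales (name TEXT, amount REAL)")
conn.executemany("INSERT INTO sales VALUES (:name, :amount)", clean)
total = conn.execute("SELECT SUM(amount) FROM sales").fetchone()[0]
```

Each step is deliberately separate so it can be tested, monitored, and rerun on its own, which is the essence of a reliable data pipeline.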
Artificial Intelligence for Project Managers: Are You Ready? – Scott W. Ambler
Artificial intelligence (AI) is finally coming into its own. Technologies such as ChatGPT, DALL-E, driver-assistance, and autonomous robots are clear signs of an AI-driven market shift. AI technologies, in particular machine learning (ML), are being applied in all sectors of the economy. Your organization is likely to soon be running projects to apply and even develop AI if it isn’t already doing so. Are you ready?
This talk overviews AI and how AI/ML initiatives work. We also explore several critical challenges: the experimental nature of AI initiatives, the critical role of data quality, the high failure rate of AI initiatives, and the ethical considerations surrounding AI. We examine the implications of these challenges and work through strategies to address them.
Agenda:
1. What is(n’t) AI?
2. AI terminology in a nutshell
3. Are you ready for AI?
4. The lifecycle of an AI/ML initiative
5. Overcoming the data quality challenge
6. Ethical considerations with AI
7. Business implications of AI
8. Success and failure factors for AI initiatives
Applying Disciplined Agile: Become a Learning Organization – Scott W. Ambler
Agile and lean ways of thinking (WoT) and ways of working (WoW) are the norm in modern organizations, although adoption of them is neither consistent nor as effective as it could be. PMI’s Disciplined Agile (DA) tool kit is a comprehensive resource that you can leverage both to improve and to learn how to improve. True agility requires new WoT, a new mindset and culture, as well as new WoW. Discover the critical aspects of the DA tool kit that enable you to extend and improve upon agile methods such as Scrum and SAFe to become a truly agile, learning organization.
Agenda:
• Successful “agile transformations”
• Our organizational environment
• Disciplined Agile (DA) overview
• Team-level improvement
• Organizational improvement
• Parting advice
Learning Objectives:
• Discover how to apply PMI’s Disciplined Agile (DA) tool kit
• Learn how to apply DA to improve your way of working (WoW)
• Understand how to support diverse ways of thinking (WoT) within your organization
I am available to deliver this presentation to your organization.
Part 3: Models in Production: A Look From Beginning to EndCloudera, Inc.
3 Things to Learn About:
-How to uplevel your existing analytics stack with a collaborative environment that supports the latest open source languages and libraries.
-How to get better use of your core data management investments while opening up new supported tools for data science.
-How to expand data science outside of silo’d environments and enable self-service data science access.
The Shifting Landscape of Data IntegrationDATAVERSITY
Enterprises and organizations from every industry and scale are working to leverage data to achieve their strategic objectives — whether they are to be more profitable, effective, risk-tolerant, prepared, sustainable, and/or adaptable in an ever-changing world. Data has exploded in volume during the last decade as humans and machines alike produce data at an exponential pace. Also, exciting technologies have emerged around that data to improve our abilities and capabilities around what we can do with data.
Behind this data revolution, there are forces at work, causing enterprises to shift the way they leverage data and accelerate the demand for leverageable data. Organizations (and the climates in which they operate) are becoming more and more complex. They are also becoming increasingly digital and, thus, dependent on how data informs, transforms, and automates their operations and decisions. With increased digitization comes an increased need for both scale and agility at scale.
In this session, we have undertaken an ambitious goal of evaluating the current vendor landscape and assessing which platforms have made, or are in the process of making, the leap to this new generation of Data Management and integration capabilities.
Managing Large Amounts of Data with SalesforceSense Corp
Critical "design skew" problems and solutions - Engaging Big Objects, MuleSoft, Snowflake and Tableau at the right time
Salesforce’s ability to handle large workloads and participate in high-consumption, mobile-application-powering technologies continues to evolve. Pub/sub-models and the investment in adjacent properties like Snowflake, Kafka, and MuleSoft, has broadened the development scope of Salesforce. Solutions now range from internal and in-platform applications to fueling world-scale mobile applications and integrations. Unfortunately, guidance on the extended capabilities is not well understood or documented. Knowing when to move your solution to a higher-order is an important Architect skill.
In this webinar, Paul McCollum, UXMC and Technical Architect at Sense Corp, will present an overview of data and architecture considerations. You’ll learn to identify reasons and guidelines for updating your solutions to larger-scale, modern reference infrastructures, and when to introduce products like Big Objects, Kafka, MuleSoft, and Snowflake.
Horses for Courses: Database RoundtableEric Kavanagh
The blessing and curse of today's database market? So many choices! While relational databases still dominate the day-to-day business, a host of alternatives has evolved around very specific use cases: graph, document, NoSQL, hybrid (HTAP), column store, the list goes on. And the database tools market is teeming with activity as well. Register for this special Research Webcast to hear Dr. Robin Bloor share his early findings about the evolving database market. He'll be joined by Steve Sarsfield of HPE Vertica, and Robert Reeves of Datical in a roundtable discussion with Bloor Group CEO Eric Kavanagh. Send any questions to info@insideanalysis.com, or tweet with #DBSurvival.
ADV Slides: Building and Growing Organizational Analytics with Data LakesDATAVERSITY
Data lakes are providing immense value to organizations embracing data science.
In this webinar, William will discuss the value of having broad, detailed, and seemingly obscure data available in cloud storage for purposes of expanding Data Science in the organization.
Geek Sync | Is Your Database Environment Ready for DevOps?IDERA Software
You can watch the replay for this Geek Sync webcast, Is Your Database Environment Ready for DevOps?, in the IDERA Resource Center: http://ow.ly/oqr850A4pKu
Modern software development teams have adopted a continuous delivery approach based upon DevOps and agile development techniques. The small and frequent code changes that result from such an approach can deliver significant benefit in terms of reduced lead time for changes, a lower failure rate, and a reduced mean time to recovery when errors are encountered. So it is no wonder that the DevOps approach is rapidly becoming the de facto standard for application development and deployment.
But until recently, not enough focus has been placed on integrating database development and management into DevOps practices and procedures. Most of the automation and tooling focuses on coding, development, and deployment rather than on the design and administrative requirements of database systems. Failing to include proper database implementation best practices into DevOps will result in problems down-the-line, including poor performance, data integrity issues, difficult to maintain systems, and other problems.
This webinar will examine the core database administration practices that need to be integrated into your DevOps pipeline to achieve success. We will discuss critical tactics that need to be included in your database DevOps approach, and take a look at different ways to successfully integrate database administration into DevOps, without negating the advantages and benefits of DevOps.
Speaker: Craig S. Mullins is president and principal consultant of Mullins Consulting, Inc. where he focuses on data management strategy and consulting. He has been named by IBM as a Gold Consultant and he writes the monthly DBA Corner column for Database Trends & Applications magazine. Craig has over three decades of experience in all facets of database systems development and has written three popular books on database systems and administration. You can follow Craig on Twitter at @craigmullins.
ADV Slides: When and How Data Lakes Fit into a Modern Data ArchitectureDATAVERSITY
Whether to take data ingestion cycles off the ETL tool and the data warehouse or to facilitate competitive Data Science and building algorithms in the organization, the data lake – a place for unmodeled and vast data – will be provisioned widely in 2020.
Though it doesn’t have to be complicated, the data lake has a few key design points that are critical, and it does need to follow some principles for success. Avoid building the data swamp, but not the data lake! The tool ecosystem is building up around the data lake and soon many will have a robust lake and data warehouse. We will discuss policy to keep them straight, send data to its best platform, and keep users’ confidence up in their data platforms.
Data lakes will be built in cloud object storage. We’ll discuss the options there as well.
Get this data point for your data lake journey.
The Data Lake and Getting Businesses the Big Data Insights They NeedDunn Solutions Group
Do terms like "data lake" confuse you? You’re not alone. With all of the technology buzzwords flying around today, it can be hard to keep up with and clearly understand each of them. However, a data lake is definitely something worth taking the time to understand. Leveraging data lake technology, companies are finally able to keep all of their disparate information and streams of data in one secure location, ready for consumption at any time – this includes structured, unstructured, and semi-structured data. For more information on our Big Data Consulting Services, don’t hesitate to visit us online at: http://bit.ly/2fvV5rR
How Celtra Optimizes its Advertising Platform with DatabricksGrega Kespret
Leading brands such as Pepsi and Macy’s use Celtra’s technology platform for brand advertising. To inform better product design and resolve issues faster, Celtra relies on Databricks to gather insights from large-scale, diverse, and complex raw event data. Learn how Celtra uses Databricks to simplify their Spark deployment, achieve faster project turnaround time, and empower people to make data-driven decisions.
In this webinar, you will learn how Databricks helps Celtra to:
- Utilize Apache Spark to power their production analytics pipeline.
- Build a “Just-in-Time” data warehouse to analyze diverse data sources such as Elastic Load Balancer access logs, raw tracking events, operational data, and reportable metrics.
- Go beyond simple counting and group events into sequences (i.e., sessionization) and perform more complex analysis such as funnel analytics.
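As an illustrative sketch of sessionization (my toy example, not Celtra's actual pipeline or schema), the core idea is to group each user's time-ordered events into sessions, starting a new session whenever the gap between consecutive events exceeds an inactivity timeout:

```python
from datetime import datetime, timedelta

# Hypothetical event stream: (user_id, timestamp) pairs, not a real tracking schema.
events = [
    ("u1", datetime(2024, 1, 1, 10, 0)),
    ("u1", datetime(2024, 1, 1, 10, 5)),
    ("u1", datetime(2024, 1, 1, 11, 30)),  # >30 min gap: starts a new session
    ("u2", datetime(2024, 1, 1, 9, 0)),
]

def sessionize(events, timeout=timedelta(minutes=30)):
    """Group each user's time-ordered events into sessions split on inactivity gaps."""
    sessions = {}
    last_seen = {}
    for user, ts in sorted(events, key=lambda e: (e[0], e[1])):
        prev = last_seen.get(user)
        if prev is None or ts - prev > timeout:
            sessions.setdefault(user, []).append([ts])  # start a new session
        else:
            sessions[user][-1].append(ts)               # extend the current session
        last_seen[user] = ts
    return sessions

result = sessionize(events)
print({u: len(s) for u, s in result.items()})  # u1 has 2 sessions, u2 has 1
```

In a Spark pipeline the same logic would typically be expressed with window functions over partitioned, time-ordered events; the in-memory version above just shows the grouping rule itself.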
ADV Slides: Platforming Your Data for Success – Databases, Hadoop, Managed Ha...DATAVERSITY
Thirty years is a long time for a technology foundation to remain as active as relational databases have. Are their replacements here? In this webinar, we say no.
Databases have not sat around while Hadoop emerged. The Hadoop era generated a ton of interest and confusion, but is it still relevant as organizations are deploying cloud storage like a kid in a candy store? We’ll discuss what platforms to use for what data. This is a critical decision that can dictate two to five times additional work effort if it’s a bad fit.
Drop the herd mentality. In reality, there is no “one size fits all” right now. We need to make our platform decisions amidst this backdrop.
This webinar will distinguish these analytic deployment options and help you platform 2020 and beyond for success.
View the companion webinar at: http://embt.co/1L8V6dI
Some claim that, in the age of Big Data, data modeling is less important or even not needed. However, with the increased complexity of the data landscape, it is actually more important to incorporate data modeling in order to understand the nature of the data and how they are interrelated. In order to do this effectively, the way that we do data modeling needs to adapt to this complex environment.
One of the key data modeling issues is how to foster collaboration between new groups, such as data scientists, and traditional data management groups. There are often different paradigms, and yet it is critical to have a common understanding of data and semantics between different parts of an organization. In this presentation, Len Silverston will discuss:
+ How Big Data has changed our landscape and affected data modeling
+ How to conduct data modeling in a more ‘agile’ way for Big Data environments
+ How we can collaborate effectively within an organization, even with differing perspectives
About the Presenter:
Len Silverston is a best-selling author, consultant, and a fun and top rated speaker in the field of data modeling, data governance, as well as human behavior in the data management industry, where he has pioneered new approaches to effectively tackle enterprise data management. He has helped many organizations world-wide to integrate their data, systems and even their people. He is well known for his work on "Universal Data Models", which are described in The Data Model Resource Book series (Volumes 1, 2, and 3).
Challenges of Operationalising Data Science in Productioniguazio
The presentation topic for this meetup was covered in two sections without any breaks in between.
Section 1: Business Aspects (20 mins)
Speaker: Rasmi Mohapatra, Product Owner, Experian
https://www.linkedin.com/in/rasmi-m-428b3a46/
Once your data science application is in production, there are many typical operational challenges experienced today across business domains. We will cover a few of these challenges with example scenarios.
Section 2: Tech Aspects (40 mins, slides & demo, Q&A )
Speaker: Santanu Dey, Solution Architect, Iguazio
https://www.linkedin.com/in/santanu/
In this part of the talk, we will cover how these operational challenges can be overcome, e.g. automating data collection and preparation, making ML models portable and deploying them in production, and monitoring and scaling, with relevant demos.
What is Big Data, why do organizations that generate huge amounts of data need it, and when should it be used?
Data Engineering is the process of collecting, transforming, and loading data into a database or data warehouse for analysis and reporting. It involves designing, building, and maintaining the infrastructure necessary to store, process, and analyze large and complex datasets. This can involve tasks such as data extraction, data cleansing, data transformation, data loading, data management, and data security. The goal of data engineering is to create a reliable and efficient data pipeline that can be used by data scientists, business intelligence teams, and other stakeholders to make informed decisions.
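As a minimal sketch of that extract-transform-load flow (using an in-memory SQLite database and made-up field names, not any particular production stack):

```python
import sqlite3

# Extract: raw records as they might arrive from a source system (hypothetical shape).
raw = [
    {"name": " Alice ", "amount": "120.50"},
    {"name": "Bob", "amount": "80"},
    {"name": "", "amount": "15"},  # bad record: missing name, dropped during cleansing
]

# Transform: cleanse and normalize (trim whitespace, cast types, drop invalid rows).
clean = [
    (r["name"].strip(), float(r["amount"]))
    for r in raw
    if r["name"].strip()
]

# Load: write into a warehouse table for downstream analysis and reporting.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE sales (name TEXT, amount REAL)")
conn.executemany("INSERT INTO sales VALUES (?, ?)", clean)

total = conn.execute("SELECT SUM(amount) FROM sales").fetchone()[0]
print(total)  # 200.5
```

A production pipeline adds scheduling, incremental loads, monitoring, and access controls on top of this same extract/transform/load skeleton.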
Visit: https://www.datacademy.ai/what-is-data-engineering-data-engineering-data-e/
Artificial Intelligence for Project Managers: Are You Ready?Scott W. Ambler
Artificial intelligence (AI) is finally coming into its own. Technologies such as ChatGPT, DALL-E, driver-assistance, and autonomous robots are clear signs of an AI-driven market shift. AI technologies, in particular machine learning (ML), are being applied in all sectors of the economy. Your organization is likely to soon be running projects to apply and even develop AI if it isn’t already doing so. Are you ready?
This talk overviews AI and how AI/ML initiatives work. We also explore several critical challenges, including the experimental nature of AI initiatives, that data quality is critical to your success, the high failure rate of AI initiatives, and the ethical considerations surrounding AI. We examine the implications of these challenges and work through strategies to address them.
Agenda:
1. What is(n’t) AI?
2. AI terminology in a nutshell
3. Are you ready for AI?
4. The lifecycle of an AI/ML initiative
5. Overcoming the data quality challenge
6. Ethical considerations with AI
7. Business implications of AI
8. Success and failure factors for AI initiatives
Applying Disciplined Agile: Become a Learning OrganizationScott W. Ambler
Agile and lean ways of thinking (WoT) and ways of working (WoW) are the norm in modern organizations, although adoption of them isn’t as consistent nor as effective as it could be. PMI’s Disciplined Agile (DA) tool kit is a comprehensive resource that you can leverage both to improve and to learn how to improve. True agility requires new WoT, a new mindset and culture, as well as new WoW. Discover the critical aspects of the DA tool kit that enable you to extend and improve upon agile methods such as Scrum and SAFe to become a truly agile, learning organization.
Agenda:
• Successful “agile transformations”
• Our organizational environment
• Disciplined Agile (DA) overview
• Team-level improvement
• Organizational improvement
• Parting advice
Learning Objectives:
• Discover how to apply PMI’s Disciplined Agile (DA) tool kit
• Learn how to apply DA to improve your way of working (WoW)
• Understand how to support diverse ways of thinking (WoT) within your organization
I am available to deliver this presentation to your organization.
EDGY captures the intersection of three critical facets: identity, experience, and architecture. When two of these facets intersect we have brand, organization, and product. When all three facets intersect, that’s when it gets interesting. This keynote works through each facet and intersection combination within EDGY and examines each from an enterprise agility point of view. How does EDGY enable enterprise agility? What issues do we face in making each facet successful? Each intersection? What happens if we focus on a single facet at a time? What insights can you take from EDGY to help improve your team and your organization?
Agile Data Warehousing (DW)/Business Intelligence (BI): Addressing the Hard P...Scott W. Ambler
The world moves at a rapid pace, and your organization must be able to respond to changing conditions. Your DW/BI team is being asked to help end users answer new questions to gain new insights at an increasing pace. They need to become agile, but are struggling to do so.
This presentation addresses a series of difficult questions that DW/BI teams must answer if they are to learn how to work in an agile manner:
• How can we proceed without modeling everything up front?
• What can we do when our users can’t tell us what data they need?
• How can we easily respond to changing requirements?
• How do we implement “vertical slices” of value?
• It takes weeks to analyze a legacy data source, how does that fit into a two-week sprint?
• How can we realistically deal with the quality problems of all the data sources that we work with, and usually aren’t responsible for?
• Our end users want updates in hours or days, how do we do that in two-week sprints?
• Is it realistic to evolve production data sources?
• How can we deliver new changes quickly into production?
• And more.
Organizations around the world have successfully adopted agile and lean ways of working (WoW) on the DW/BI teams. You can too.
Scott Ambler is available to present this to your organization.
Technical Debt: A Management Problem That Requires a Management SolutionScott W. Ambler
The primary cause of technical debt in your organization is very likely your managers – not your programmers nor your architects. The management desire to be “on time and on budget” often motivates deployment of poor-quality assets and rarely leaves room for investment in long-term quality. Although technical professionals may readily recognize this problem, managers often do not, and if they do, they don’t view technical debt as a priority. It is time for a change.
This presentation explores the root causes of technical debt within organizations, many of which trace back to the management mindset and the strategies that result from it. Just like the technical challenges of addressing technical debt must be addressed by technical solutions, the management challenges of technical debt must be addressed by management solutions. It works through how to make leadership aware of technical debt and its implications, how to evolve your management practices to avoid and address technical debt, and enterprise-level strategies to embed technical debt thinking and behaviors into your culture. Results from industry research are shared throughout.
2021 marks the 20th anniversary of the Agile Manifesto. Yet many organizations are still struggling to clearly improve value delivery for their customers. In this talk Scott Ambler and Mark Lines explain why agile has struggled in the past and what we can do about it. Go beyond agile rhetoric, agile methods, and frameworks, and learn how to optimize agility for your situation, not someone else’s. We can do better, and it is not difficult. Disciplined Agile can help. The journey starts with an investment in learning, optimizing for your situation, and then removing obstacles to accelerate delivery and delight your customers.
A fundamental philosophy from the early days of Agile, and particularly of XP, is that teams should own their process. Today we would say that they should be allowed, and better yet, enabled, to choose their own way of working (WoW).
This was a powerful vision, but it was quickly abandoned to make way for the Agile certification gold rush. Why do the hard work of learning your craft, of improving your WoW via experimentation and learning, when you can instead become a certified master of an agile method in two days or a program consultant of a scaling framework in four? It sounds great, and certainly is great for anyone collecting the money, but 18 years after the signing of the Agile Manifesto as an industry we’re nowhere near reaching Agile’s promise. Nowhere near it.
We had it right in the very beginning, and the lean community had it right all along – teams need to own their process, they must be enabled to choose their WoW. To do this we need to stop looking for easy answers, we must reject the simplistic solutions that the agile industrial complex wants to sell us, and most importantly recognize that we need #NoFrameworks.
Agile transformations: The good, the bad, and the uglyScott W. Ambler
Are the majority of agile transformations failing? Succeeding? Just sort of stumbling along? It’s really hard to tell. You hear a lot of promises and platitudes from consulting firms specializing in transformations, you read case studies that focus on the good and downplay the bad, and there’s a plethora of agile trainers who will certify that you’re a master, a professional, or an agile coach in just a few short days. Who do you trust to share with you what’s really happening in organizations making these transformations? What’s really working? What isn’t?
How does an agile software development team choose its way of working, and do so in a context sensitive manner?
This was presented at the Toronto Agile Conference on October 30, 2018.
In agile we like to say that teams should own their own process by choosing their way of working, their “WoW.” Not only is this true of agile software development teams, it is also true for DevOps. DevOps in the enterprise is interesting because there is more to it than Dev + Ops: we also have DevSecOps, BizDevOps, and Database DevOps to take into consideration, not to mention the realities of support and release management in an established enterprise. Because every organization is different, one strategy, one “process size,” does not fit all. Worse yet, every organization faces a changing environment within which it operates, so not only does it need a WoW that meets its current needs, it needs to know how to evolve that WoW as its situation evolves.
How does data management fit into agile development? How can data professionals take an agile approach to data management? What mindset do data professionals need to succeed in an agile world?
Measuring Agile: A Disciplined Approach To MetricsScott W. Ambler
This presentation works through important questions that people have about metrics on agile teams, principles around how to be effective with your agile metrics strategy, how to measure agile teams following a lightweight approach to GQM, potential metrics to collect about agile teams, and how to support IT governance through effective metrics rollups.
Enterprise architecture (EA) can potentially promote a common business vision within your organization, provide guidance to improve both business and IT decision making, and improve IT efficiencies. Unfortunately many EA teams struggle to provide these benefits, often because they are perceived as ivory tower or being too difficult to work with.
The adoption of disciplined agile and lean strategies that are based on collaboration, enablement, and streamlining the flow of work is the key to EA success. Disciplined strategies that produce lightweight, yet still sufficient, artifacts are central to that success. This presentation explores both the success factors and failure factors surrounding EA, pragmatic strategies for a lean/agile approach to EA, and how EA is supported and enhanced by the Disciplined Agile framework. This isn’t your grandfather’s EA strategy.
Management is so important on agile delivery teams that we do it every single day, but that doesn't imply that we need team managers. Having said that, there are still some manager roles needed, albeit far fewer than in the past, when we scale agile both tactically and strategically within our IT organizations. So where do the rest of the managers go?
This presentation examines what happens to traditional managers when their organization adopts agile and lean strategies. We work through the implications of several critical forces that enable us to thin out the ranks of middle management. First, agile methods push many technical management tasks into the hands of the team, thereby taking that work away from managers. Second, leadership tasks are assigned to new team roles such as the Product Owner, the Team Lead/Scrum Master, and the Architecture Owner. Third, the move away from a project-based mindset to a product-based one results in stable teams that require far less functional/resource management. Fourth, application of business intelligence technologies to implement automated team and portfolio dashboards reduces the need for manual status reporting.
Some management-oriented work remains. Teams that haven't yet automated reporting will find that someone needs to track and report progress. Large teams, also known as program teams, will likely need a Program Manager or more accurately a Program Coordinator. To support IT-level functions you are likely to need people in roles such as Portfolio Manager, Operations Manager, Help Desk Manager, and Community of Practice (CoP) Lead. Managers are still clearly needed, but in practice there tends to be far fewer management positions within agile organizations than what we find in traditional ones. This implies that many existing managers will need to reskill and transition into one of the new agile roles. The good news is that there is room for everyone within agile if they're willing to learn new skills and change with the times.
Analysis is so important to agile teams they do it every day. Every. Single. Day. In some respects agile teams perform analysis in a very different manner than traditional teams, and in some respects in a very similar manner. Agile analysis is collaborative and evolutionary in nature. Disciplined agile analysis takes it up a notch to address the complexity factors agile teams face at scale.
In this presentation we discuss how disciplined agile teams address analysis activities throughout the lifecycle. The transition to agile requires a mindset, skill set, and very often role change for people who are currently business analysts. On the majority of agile teams the role of business analyst has disappeared, but in some situations at scale the role is of vital importance – this isn’t your father’s software team any more. Lessons learned from several organizations making the transition to agile will be shared.
Key learning points:
• Discover how disciplined agile teams approach analysis, and modeling in general
• Learn agile analysis and modeling strategies
• Discover how business analysts can transition to an agile environment
Video can be found here: https://www.youtube.com/watch?v=-8tR-UbUpvI
“Technical debt” refers to any quality issues within the implementation of an IT solution that hampers your ability to work with or evolve that solution. Technical debt is often thought of as a source code problem, but it also occurs in your user interface design, in your data sources, in your network architecture, and in many other places. This presentation explores disciplined agile strategies to avoid technical debt in the first place, to remove existing technical debt, and how to fund the removal of technical debt. Industry data regarding technical debt will be shared.
Disciplined Agile Outsourcing: Making it work for both the customer and the s...Scott W. Ambler
Outsourcing projects suffer from two significant yet easily addressed problems. First, the customer’s instincts for how to run an outsourced project are more likely to hurt rather than help them. Second, service providers (SPs) prove to be little more than order takers that don’t have the courage to negotiate a winning strategy. The Disciplined Agile Delivery (DAD) process decision framework provides the foundation needed to succeed at “agile offshoring” by addressing the needs of both the customer and the SP. DAD is a goal-driven, hybrid agile, full delivery methodology that is enterprise aware and scalable. DAD provides a foundation from which you can tailor a viable strategy for disciplined agile outsourcing. This presentation explores strategies for effectively initiating and governing an outsourced IT delivery project in an agile manner. Outsourcing introduces a collection of risks that can be uniquely addressed with a disciplined agile strategy.
During this presentation you will learn:
• What the Disciplined Agile Delivery (DAD) framework is.
• The risks associated with outsourcing.
• Disciplined agile outsourcing from the point of view of the customer.
• Disciplined agile outsourcing from the point of view of the service provider.
• What you need to do to succeed at disciplined agile outsourcing.
• Industry statistics regarding agile outsourcing in practice
• Criteria to determine if you’re ready for outsourcing IT delivery projects.
This presentation explores three important questions:
1. How does disciplined agile software development work?
2. How does agile analysis work?
3. How do business analysts fit on agile teams?
Versions of this presentation have been given several times at conferences internationally.
An updated version of this presentation is available at http://www.slideshare.net/ScottWAmbler/disciplined-agile-business-analysis-58401041
Continuous Architecture and Emergent Design: Disciplined Agile StrategiesScott W. Ambler
An overview of how disciplined agile teams address architecture and design. This includes initial agile architecture modeling, proving the architecture early in the project, test-driven development, architecture spikes, architecture handbooks, and many more.
Architecture and design are so important to disciplined agile teams that we consider these issues every day. Your approach to architecture is a key enabler of agility at scale.
UiPath Test Automation using UiPath Test Suite series, part 4DianaGray10
Welcome to UiPath Test Automation using UiPath Test Suite series, part 4. In this session, we will cover an overview of Test Manager along with the SAP heatmap.
The UiPath Test Manager overview with SAP heatmap webinar offers a concise yet comprehensive exploration of the role of a Test Manager within SAP environments, coupled with the utilization of heatmaps for effective testing strategies.
Participants will gain insights into the responsibilities, challenges, and best practices associated with test management in SAP projects. Additionally, the webinar delves into the significance of heatmaps as a visual aid for identifying testing priorities, areas of risk, and resource allocation within SAP landscapes. Through this session, attendees can expect to enhance their understanding of test management principles while learning practical approaches to optimize testing processes in SAP environments using heatmap visualization techniques.
What will you get from this session?
1. Insights into SAP testing best practices
2. Heatmap utilization for testing
3. Optimization of testing processes
4. Demo
Topics covered:
Execution from the test manager
Orchestrator execution result
Defect reporting
SAP heatmap example with demo
Speaker:
Deepak Rai, Automation Practice Lead, Boundaryless Group and UiPath MVP
Software Delivery At the Speed of AI: Inflectra Invests In AI-Powered QualityInflectra
In this insightful webinar, Inflectra explores how artificial intelligence (AI) is transforming software development and testing. Discover how AI-powered tools are revolutionizing every stage of the software development lifecycle (SDLC), from design and prototyping to testing, deployment, and monitoring.
Learn about:
• The Future of Testing: How AI is shifting testing towards verification, analysis, and higher-level skills, while reducing repetitive tasks.
• Test Automation: How AI-powered test case generation, optimization, and self-healing tests are making testing more efficient and effective.
• Visual Testing: Explore the emerging capabilities of AI in visual testing and how it's set to revolutionize UI verification.
• Inflectra's AI Solutions: See demonstrations of Inflectra's cutting-edge AI tools like the ChatGPT plugin and Azure OpenAI platform, designed to streamline your testing process.
Whether you're a developer, tester, or QA professional, this webinar will give you valuable insights into how AI is shaping the future of software delivery.
Transcript: Selling digital books in 2024: Insights from industry leaders - T...BookNet Canada
The publishing industry has been selling digital audiobooks and ebooks for over a decade and has found its groove. What’s changed? What has stayed the same? Where do we go from here? Join a group of leading sales peers from across the industry for a conversation about the lessons learned since the popularization of digital books, best practices, digital book supply chain management, and more.
Link to video recording: https://bnctechforum.ca/sessions/selling-digital-books-in-2024-insights-from-industry-leaders/
Presented by BookNet Canada on May 28, 2024, with support from the Department of Canadian Heritage.
UiPath Test Automation using UiPath Test Suite series, part 3DianaGray10
Welcome to UiPath Test Automation using UiPath Test Suite series part 3. In this session, we will cover desktop automation along with UI automation.
Topics covered:
UI automation introduction
UI automation sample
Desktop automation flow
Pradeep Chinnala, Senior Consultant Automation Developer @WonderBotz and UiPath MVP
Deepak Rai, Automation Practice Lead, Boundaryless Group and UiPath MVP
Neuro-symbolic is not enough, we need neuro-*semantic*Frank van Harmelen
Neuro-symbolic (NeSy) AI is on the rise. However, simply applying machine learning to just any symbolic structure is not sufficient to really harvest the gains of NeSy. Those gains will only be realized when the symbolic structures have an actual semantics. I give an operational definition of semantics as “predictable inference”.
All of this is illustrated with link prediction over knowledge graphs, but the argument is general.
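As a toy illustration of "predictable inference" (my sketch, not the speaker's material): if a knowledge graph's `subclass_of` edges carry their standard transitive semantics, then some missing links are mechanically derivable rather than merely statistically learnable:

```python
# Toy knowledge graph where the subclass_of relation has real semantics:
# it is transitive, so certain "missing" links are predictable by inference.
subclass_of = {
    ("dog", "mammal"),
    ("mammal", "animal"),
    ("cat", "mammal"),
}

def transitive_closure(edges):
    """Derive every link entailed by the transitivity of the relation."""
    closure = set(edges)
    changed = True
    while changed:
        changed = False
        for (a, b) in list(closure):
            for (c, d) in list(closure):
                if b == c and (a, d) not in closure:
                    closure.add((a, d))
                    changed = True
    return closure

inferred = transitive_closure(subclass_of) - subclass_of
print(inferred)  # the two entailed links: dog -> animal and cat -> animal
```

A link predictor trained on such a graph can be held to a predictable standard: entailed links like these should be recovered, which is exactly what a semantics-free symbolic structure cannot guarantee.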
LF Energy Webinar: Electrical Grid Modelling and Simulation Through PowSyBl -...DanBrown980551
Do you want to learn how to model and simulate an electrical network from scratch in under an hour?
Then welcome to this PowSyBl workshop, hosted by Rte, the French Transmission System Operator (TSO)!
During the webinar, you will discover the PowSyBl ecosystem as well as handle and study an electrical network through an interactive Python notebook.
PowSyBl is an open source project hosted by LF Energy, which offers a comprehensive set of features for electrical grid modelling and simulation. Among other advanced features, PowSyBl provides:
- A fully editable and extendable library for grid component modelling;
- Visualization tools to display your network;
- Grid simulation tools, such as power flows, security analyses (with or without remedial actions) and sensitivity analyses;
The framework is mostly written in Java, with a Python binding so that Python developers can access PowSyBl functionalities as well.
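To make the idea of a power-flow computation concrete, here is a tiny DC power flow on a 3-bus network in plain Python. This is a pedagogical sketch only; PowSyBl's actual load-flow engines (and its pypowsybl Python binding) are far more complete, covering AC solutions, security analyses, and real network models:

```python
# Toy DC power flow: lines as (from_bus, to_bus, susceptance in p.u.);
# bus 0 is the slack bus with angle fixed at 0.
lines = [(0, 1, 10.0), (1, 2, 10.0), (0, 2, 10.0)]
injections = {1: -0.5, 2: -1.0}  # loads at buses 1 and 2

# Build the reduced susceptance matrix B' over the non-slack buses 1 and 2.
B = [[0.0, 0.0], [0.0, 0.0]]
idx = {1: 0, 2: 1}
for i, j, b in lines:
    for k in (i, j):
        if k in idx:
            B[idx[k]][idx[k]] += b
    if i in idx and j in idx:
        B[idx[i]][idx[j]] -= b
        B[idx[j]][idx[i]] -= b

# Solve the 2x2 system B' * theta = P by Cramer's rule.
P = [injections[1], injections[2]]
det = B[0][0] * B[1][1] - B[0][1] * B[1][0]
theta1 = (P[0] * B[1][1] - B[0][1] * P[1]) / det
theta2 = (B[0][0] * P[1] - P[0] * B[1][0]) / det

# Line flows follow from the angle differences: flow = b * (theta_i - theta_j).
angles = {0: 0.0, 1: theta1, 2: theta2}
flows = {(i, j): b * (angles[i] - angles[j]) for i, j, b in lines}
print(flows)
```

The flows balance the injections at every bus, with the slack bus supplying the total load; real tools solve the same kind of system at the scale of national grids.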
What you will learn during the webinar:
- For beginners: discover PowSyBl's functionalities through a quick general presentation and the notebook, without needing any expert coding skills;
- For advanced developers: master the skills to efficiently apply PowSyBl functionalities to your real-world scenarios.
GDG Cloud Southlake #33: Boule & Rebala: Effective AppSec in SDLC using Deplo...James Anderson
Effective Application Security in Software Delivery lifecycle using Deployment Firewall and DBOM
The modern software delivery process (or the CI/CD process) includes many tools, distributed teams, open-source code, and cloud platforms. Constant focus on speed to release software to market, along with the traditional slow and manual security checks has caused gaps in continuous security as an important piece in the software supply chain. Today organizations feel more susceptible to external and internal cyber threats due to the vast attack surface in their applications supply chain and the lack of end-to-end governance and risk management.
The software team must secure its software delivery process to avoid vulnerabilities and security breaches. This needs to be achieved with existing tool chains and without extensive rework of the delivery processes. This talk will present strategies and techniques for providing visibility into the true risk of existing vulnerabilities, preventing the introduction of security issues in the software, resolving vulnerabilities in production environments quickly, and capturing the deployment bill of materials (DBOM).
Speakers:
Bob Boule
Robert Boule is a technology enthusiast with a passion for making things work and a knack for helping others understand how they work. He has around 20 years of solution engineering experience in application security, software continuous delivery, and SaaS platforms. He is known for his dynamic presentations on CI/CD and application security integrated into the software delivery lifecycle.
Gopinath Rebala
Gopinath Rebala is the CTO of OpsMx, where he has overall responsibility for the machine learning and data processing architectures for Secure Software Delivery. Gopi also has a strong connection with our customers, leading design and architecture for strategic implementations. Gopi is a frequent speaker and well-known leader in continuous delivery and integrating security into software delivery.
DevOps and Testing slides at DASA ConnectKari Kakkonen
Slides from my and Rik Marselis's talk at the DASA Connect conference on 30 May 2024. We discuss what testing is, then what agile testing is, and finally what testing in DevOps is. We then held a lovely workshop with the participants, exploring different ways to think about quality and testing in different parts of the DevOps infinity loop.
State of ICS and IoT Cyber Threat Landscape Report 2024 previewPrayukth K V
The IoT and OT threat landscape report has been prepared by the Threat Research Team at Sectrio using data from Sectrio’s cyber threat intelligence farming facilities spread across over 85 cities around the world. In addition, Sectrio runs AI-based advanced threat and payload engagement facilities that serve as sinks to attract and engage sophisticated threat actors and newer malware, including new variants and latent threats at an earlier stage of development.
The latest edition of the OT/ICS and IoT security Threat Landscape Report 2024 also covers:
State of global ICS asset and network exposure
Sectoral targets and attacks as well as the cost of ransom
Global APT activity, AI usage, actor and tactic profiles, and implications
Rise in volumes of AI-powered cyberattacks
Major cyber events in 2024
Malware and malicious payload trends
Cyberattack types and targets
Vulnerability exploit attempts on CVEs
Attacks on counties – USA
Expansion of bot farms – how, where, and why
In-depth analysis of the cyber threat landscape across North America, South America, Europe, APAC, and the Middle East
Why are attacks on smart factories rising?
Cyber risk predictions
Axis of attacks – Europe
Systemic attacks in the Middle East
Download the full report from here:
https://sectrio.com/resources/ot-threat-landscape-reports/sectrio-releases-ot-ics-and-iot-security-threat-landscape-report-2024/
Builder.ai Founder Sachin Dev Duggal's Strategic Approach to Create an Innova...Ramesh Iyer
In today's fast-changing business world, companies that fail to adapt and embrace new ideas often struggle to keep up with the competition. However, fostering a culture of innovation takes real work: it takes vision, leadership, and a willingness to take risks in the right proportion. Sachin Dev Duggal, co-founder of Builder.ai, has perfected the art of this balance, creating a company culture where creativity and growth are nurtured at each stage.
Epistemic Interaction - tuning interfaces to provide information for AI supportAlan Dix
Paper presented at SYNERGY workshop at AVI 2024, Genoa, Italy. 3rd June 2024
https://alandix.com/academic/papers/synergy2024-epistemic/
As machine learning integrates deeper into human-computer interactions, the concept of epistemic interaction emerges, aiming to refine these interactions to enhance system adaptability. This approach encourages minor, intentional adjustments in user behaviour to enrich the data available for system learning. This paper introduces epistemic interaction within the context of human-system communication, illustrating how deliberate interaction design can improve system understanding and adaptation. Through concrete examples, we demonstrate the potential of epistemic interaction to significantly advance human-computer interaction by leveraging intuitive human communication strategies to inform system design and functionality, offering a novel pathway for enriching user-system engagements.