DevOps and Testing slides at DASA Connect - Kari Kakkonen
Slides by Rik Marselis and me from the DASA Connect conference on 30 May 2024. We discuss what testing is, then what agile testing is, and finally what testing in DevOps means. We ended with a lovely workshop in which the participants explored different ways to think about quality and testing in the various parts of the DevOps infinity loop.
Key Trends Shaping the Future of Infrastructure - Cheryl Hung
Keynote at DIGIT West Expo, Glasgow on 29 May 2024.
Cheryl Hung, ochery.com
Sr Director, Infrastructure Ecosystem, Arm.
This keynote covers the key trends across hardware, cloud, and open source, explores how these areas are likely to mature and develop over the short and long term, and considers how organisations can position themselves to adapt and thrive.
Builder.ai Founder Sachin Dev Duggal's Strategic Approach to Create an Innova... - Ramesh Iyer
In today's fast-changing business world, companies that fail to adapt and embrace new ideas struggle to keep up with the competition. Fostering a culture of innovation, however, takes real work: it takes vision, leadership, and a willingness to take risks in the right proportion. Sachin Dev Duggal, co-founder of Builder.ai, has perfected the art of this balance, creating a company culture where creativity and growth are nurtured at each stage.
Accelerate your Kubernetes clusters with Varnish Caching - Thijs Feryn
A presentation about the usage and availability of Varnish on Kubernetes. This talk explores the capabilities of Varnish caching and shows how to use the Varnish Helm chart to deploy it to Kubernetes.
This presentation was delivered at K8SUG Singapore. See https://feryn.eu/presentations/accelerate-your-kubernetes-clusters-with-varnish-caching-k8sug-singapore-28-2024 for more details.
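As a taste of the Varnish caching the talk covers, here is a minimal VCL sketch of the kind of policy such a deployment might apply. The backend name, port, and TTL are illustrative assumptions, not values from the talk; in a Kubernetes setup the backend would typically point at a Service, wired up via the Helm chart's values.

```vcl
vcl 4.1;

# Hypothetical backend; in Kubernetes this would resolve to a Service.
backend default {
    .host = "my-app-service";
    .port = "8080";
}

sub vcl_backend_response {
    # Cache successful responses for two minutes (assumed TTL).
    if (beresp.status == 200) {
        set beresp.ttl = 120s;
    }
}
```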
Kubernetes & AI - Beauty and the Beast!?! @ KCD Istanbul 2024 - Tobias Schneck
As AI technology pushes into IT, I asked myself, as an "infrastructure container Kubernetes guy", how this fancy AI technology gets managed from an infrastructure operations point of view. Is it possible to apply our beloved cloud-native principles as well? What benefits could the two technologies bring to each other?
Let me take these questions and guide you on a short journey through existing deployment models and use cases for AI software. Using practical examples, we discuss what cloud/on-premise strategy we may need to apply AI to our own infrastructure and make it work from an enterprise perspective. I will give an overview of the infrastructure requirements and technologies that could benefit or limit your AI use cases in an enterprise environment. An interactive demo will give you some insights into the approaches I have already got working for real.
State of ICS and IoT Cyber Threat Landscape Report 2024 preview - Prayukth K V
The IoT and OT threat landscape report has been prepared by the Threat Research Team at Sectrio using data from Sectrio's cyber threat intelligence gathering facilities spread across over 85 cities around the world. In addition, Sectrio runs AI-based advanced threat and payload engagement facilities that serve as sinks to attract and engage sophisticated threat actors and newer malware, including new variants and latent threats that are at an early stage of development.
The latest edition of the OT/ICS and IoT security Threat Landscape Report 2024 also covers:
State of global ICS asset and network exposure
Sectoral targets and attacks as well as the cost of ransom
Global APT activity, AI usage, actor and tactic profiles, and implications
Rise in volumes of AI-powered cyberattacks
Major cyber events in 2024
Malware and malicious payload trends
Cyberattack types and targets
Vulnerability exploit attempts on CVEs
Attacks on countries – USA
Expansion of bot farms – how, where, and why
In-depth analysis of the cyber threat landscape across North America, South America, Europe, APAC, and the Middle East
Why are attacks on smart factories rising?
Cyber risk predictions
Axis of attacks – Europe
Systemic attacks in the Middle East
Download the full report from here:
https://sectrio.com/resources/ot-threat-landscape-reports/sectrio-releases-ot-ics-and-iot-security-threat-landscape-report-2024/
Connector Corner: Automate dynamic content and events by pushing a button - DianaGray10
Here is something new! In our next Connector Corner webinar, we will demonstrate how you can use a single workflow to:
Create a campaign using Mailchimp with merge tags/fields
Send an interactive Slack channel message (using buttons)
Have the message received by managers and peers along with a test email for review
But there’s more:
In a second workflow supporting the same use case, you’ll see:
Your campaign sent to target colleagues for approval
If the “Approve” button is clicked, a Jira/Zendesk ticket is created for the marketing design team
But—if the “Reject” button is pushed, colleagues will be alerted via Slack message
Join us to learn more about this new, human-in-the-loop capability, brought to you by Integration Service connectors.
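The approve/reject branching described above can be sketched in a few lines of plain Python. This is not UiPath or the Integration Service API; the function name and the follow-up actions are illustrative assumptions that mirror the workflow's logic.

```python
# Sketch of the human-in-the-loop branch: route a campaign for approval,
# then act on the reviewer's decision. Illustrative only, not UiPath code.

def route_approval(campaign: str, decision: str) -> str:
    """Return the follow-up action for a reviewer's decision."""
    if decision == "Approve":
        # In the real workflow this would create a Jira/Zendesk ticket.
        return f"ticket created for marketing design team: {campaign}"
    if decision == "Reject":
        # In the real workflow this would send a Slack alert to colleagues.
        return f"Slack alert sent: {campaign} was rejected"
    raise ValueError(f"unknown decision: {decision}")
```

The point of the sketch is simply that the workflow pauses on a human decision and the automation resumes down one of two branches.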
Speakers:
Akshay Agnihotri, Product Manager
Charlie Greenberg, Host
GDG Cloud Southlake #33: Boule & Rebala: Effective AppSec in SDLC using Deplo... - James Anderson
Effective Application Security in Software Delivery lifecycle using Deployment Firewall and DBOM
The modern software delivery process (the CI/CD process) involves many tools, distributed teams, open-source code, and cloud platforms. A constant focus on speed to market, combined with traditionally slow and manual security checks, has caused gaps in continuous security, an important piece of the software supply chain. Today organizations feel more susceptible to external and internal cyber threats because of the vast attack surface of their application supply chains and the lack of end-to-end governance and risk management.
The software team must secure its delivery process to avoid vulnerabilities and security breaches, and it needs to do so with existing tool chains and without extensive rework of the delivery processes. This talk will present strategies and techniques for providing visibility into the true risk of existing vulnerabilities, preventing the introduction of security issues into the software, resolving vulnerabilities in production environments quickly, and capturing the deployment bill of materials (DBOM).
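To make the last idea concrete, a deployment bill of materials can be thought of as a structured record of what was deployed where and from which components. The field names below are assumptions for the sketch, not a standard DBOM schema or the speakers' format.

```python
import json
from datetime import datetime, timezone

def build_dbom(service: str, version: str, environment: str,
               components: list[dict]) -> str:
    """Assemble a minimal deployment bill of materials as JSON.

    `components` lists {"name": ..., "version": ...} entries for the
    artifacts (images, libraries) that went into this deployment.
    """
    record = {
        "service": service,
        "version": version,
        "environment": environment,
        "deployed_at": datetime.now(timezone.utc).isoformat(),
        "components": components,
    }
    return json.dumps(record, indent=2)
```

Capturing such a record at deploy time is what gives later audits and vulnerability triage something concrete to query.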
Speakers:
Bob Boule
Robert Boule is a technology enthusiast with a passion for making things work and a knack for helping others understand how things work. He has around 20 years of solution engineering experience in application security, software continuous delivery, and SaaS platforms. He is known for his dynamic presentations on CI/CD and on application security integrated into the software delivery lifecycle.
Gopinath Rebala
Gopinath Rebala is the CTO of OpsMx, where he has overall responsibility for the machine learning and data processing architectures for Secure Software Delivery. Gopi also has a strong connection with our customers, leading design and architecture for strategic implementations. Gopi is a frequent speaker and well-known leader in continuous delivery and integrating security into software delivery.
GraphRAG is All You Need? LLM & Knowledge Graph - Guy Korland
Guy Korland, CEO and Co-founder of FalkorDB, will review two articles on the integration of language models with knowledge graphs.
1. Unifying Large Language Models and Knowledge Graphs: A Roadmap.
https://arxiv.org/abs/2306.08302
2. Microsoft Research's GraphRAG paper and a review paper on various uses of knowledge graphs:
https://www.microsoft.com/en-us/research/blog/graphrag-unlocking-llm-discovery-on-narrative-private-data/
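A toy sketch of the GraphRAG idea under discussion: retrieve an entity's neighbourhood from a small knowledge graph and turn it into context for a language-model prompt. The graph contents and prompt format here are invented for illustration; this is neither FalkorDB's nor Microsoft's implementation.

```python
# Toy knowledge graph: entity -> list of (relation, entity) edges.
GRAPH = {
    "FalkorDB": [("is_a", "graph database"), ("supports", "knowledge graphs")],
    "GraphRAG": [("combines", "LLMs"), ("combines", "knowledge graphs")],
}

def graph_context(entity: str) -> str:
    """Flatten an entity's edges into sentences usable as prompt context."""
    facts = [f"{entity} {rel.replace('_', ' ')} {obj}."
             for rel, obj in GRAPH.get(entity, [])]
    return " ".join(facts)

def build_prompt(question: str, entity: str) -> str:
    """Prepend retrieved graph facts to the question (retrieval-augmented)."""
    return f"Context: {graph_context(entity)}\nQuestion: {question}"
```

Real GraphRAG systems build the graph automatically from documents and retrieve whole communities of related nodes, but the retrieve-then-prompt shape is the same.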
Smart TV Buyer Insights Survey 2024 - 91mobiles
91mobiles recently conducted a Smart TV Buyer Insights Survey in which we asked over 3,000 respondents about the TV they own, aspects they look at on a new TV, and their TV buying preferences.
Epistemic Interaction - tuning interfaces to provide information for AI support - Alan Dix
Paper presented at SYNERGY workshop at AVI 2024, Genoa, Italy. 3rd June 2024
https://alandix.com/academic/papers/synergy2024-epistemic/
As machine learning integrates deeper into human-computer interactions, the concept of epistemic interaction emerges, aiming to refine these interactions to enhance system adaptability. This approach encourages minor, intentional adjustments in user behaviour to enrich the data available for system learning. This paper introduces epistemic interaction within the context of human-system communication, illustrating how deliberate interaction design can improve system understanding and adaptation. Through concrete examples, we demonstrate the potential of epistemic interaction to significantly advance human-computer interaction by leveraging intuitive human communication strategies to inform system design and functionality, offering a novel pathway for enriching user-system engagements.
2. Background
In a formative evaluation, evidence of an instructional program's worth is gathered
for use in making decisions about how to revise the program while it is being
developed. It is called "formative" evaluation because the instruction is in its
developmental stages and is not yet "grown up". The idea is to find out whether your newly
developed course succeeds at teaching the objectives you need to teach, to the learners
who need to learn them, before you present it to your target audience. In any given
formative evaluation, you can find out how to make your instruction more:
Effective
Efficient
Interesting/Motivating
Usable
Acceptable
You do this by carrying out procedures that will provide you with evidence
of the effectiveness of your instruction. The emphasis is on collecting data and
revising the instruction.
3. Objective
Describe the purposes for and various stages of formative evaluation of instructor-developed
materials, instructor-selected materials, and instructor-presented instruction.
Describe the instruments used in a formative evaluation.
Develop an appropriate formative evaluation plan and construct instruments for a set of
instructional materials or an instructor presentation.
Collect data according to a formative evaluation plan for a given set of instructional
materials or instructor presentation.
4. Formative Evaluation
Definition
The collection of data and information during the development of
instruction that can be used to improve the effectiveness of the instruction.
Purpose
To obtain data that can be used to revise the instruction to make it more
efficient and effective.
6. 1 TO 1
PURPOSE: IDENTIFY AND REMOVE ERRORS IN INSTRUCTION
CRITERIA
CLARITY
IMPACT
FEASIBILITY
SELECTING LEARNERS
DATA COLLECTION
OUTCOMES
7. CRITERIA
During the development of the instructional strategy and the instruction itself, designers and
developers make a myriad of translations and decisions that link the content, learners,
instructional format, and instructional setting. The one-to-one trials provide designers with their
first glimpse of the viability of these links and translations from the learners' perspective. The
three main criteria and the decisions designers will make during the evaluation are as follows:
1. Clarity: Is the message, or what is being presented, clear to individual target learners?
2. Impact: What is the impact of the instruction on individual learners' attitudes and achievement
of the objectives and goals?
3. Feasibility: How feasible is the instruction given the available resources (time/context)?
8. LEARNER SELECTION
NOT AN EXPERIMENT
NO RANDOM SELECTION
LEARNERS SHOULD REPRESENT A WIDE VARIETY BUT IN A SMALL
GROUP
EVALUATE
9. DATA
1: CLEAR BASIC MESSAGE
VOCAB, SENTENCE COMPLEXITY, STRUCTURE
2: LINKS
WORKS FOR LEARNER, EXAMPLES
3: PROCEDURES
TYPE OF INSTRUCTION, VARIATION, CLARITY MAY CHANGE IF NOT APPROPRIATE
10. STEPS
The first step in a one-to-one evaluation is to explain to the learner that a new set of instructional materials has
been designed and that you would like his or her reaction to them. You should say that any
mistakes that learners might make are probably due to deficiencies in the material and not to them.
Encourage the learners to be relaxed and to talk about the materials.
You should have the learners not only go through the instructional materials but also
take the test(s) provided with the materials.
11. QUESTIONNAIRES
HELPS YOU SPOT MISTAKES
LETS YOU KNOW WHY THEY MADE CERTAIN CHOICES
ALLOWS THE EVALUATION TO BE BASED ON THEIR OPINION AS WELL
12. INTERPRETING DATA
The information on the clarity of instruction, impact on learner, and
feasibility of instruction needs to be summarized and focused.
Particular aspects of the instruction found to be weak can then be
reconsidered in order to plan revisions likely to improve the instruction
for similar learners.
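The summarizing step described above can be sketched in a few lines: aggregate learner ratings per criterion (clarity, impact, feasibility) and flag the weak criteria for revision. The 1-5 rating scale and the 3.5 threshold are assumptions for illustration, not part of the evaluation model itself.

```python
# Summarize one-to-one evaluation ratings by criterion and flag criteria
# that fall below a revision threshold. Scale and threshold are illustrative.

def summarize_ratings(ratings: dict[str, list[int]], threshold: float = 3.5):
    """Return (mean rating per criterion, criteria needing revision)."""
    means = {c: sum(v) / len(v) for c, v in ratings.items() if v}
    weak = [c for c, m in means.items() if m < threshold]
    return means, weak
```

For example, clarity ratings of 4, 5, 4 pass, while impact ratings of 2, 3, 3 would mark "impact" as an aspect to reconsider when planning revisions.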
13. OUTCOMES
The outcomes of one-to-one trials are instruction that
(1) contains appropriate vocabulary, language complexity, examples, and
illustrations for the participating learner;
(2) either yields reasonable learner attitudes and achievement or is
revised with the objective of improving learner attitudes or performance
during subsequent trials; and
(3) appears feasible for use with the available learners, resources, and
setting. The instruction can be refined further using small group trials.
14. SMALL GROUP
Purposes
To determine the effectiveness of changes made following the one-to-one evaluation.
To identify any remaining learning problems that learners may have.
To determine whether learners can use the instruction without interacting
with the instructor.
15. EVALUATION
To determine Weakness(es) in the Instruction
Focusing the design only on the goals and objectives of the instruction would be too
limited.
Data on learners’ achievement of goals and objectives would be insufficient, though
important, because these data will only provide information about where errors occur
rather than why they occur.
17. DESIGN REVIEW
Does the instructional goal match the problem identified in the needs assessment?
Does the learner & environmental analysis match the audience?
Does the task analysis include all the prerequisite skills?
Are the test items reliable and valid, and do they match the objectives?
18. REVIEW
Is the content accurate & up-to-date?
Does it present a consistent perspective?
Are examples, practice exercises, & feedback realistic & accurate?
Is the pedagogy consistent with current instructional theory?
Is the instruction appropriate to the audience?
19. SUMMARY
Formative evaluation of instructional materials is conducted to determine the effectiveness of the
materials and to revise them in areas where they are ineffective. Formative evaluations should be
conducted on newly developed materials as well as existing materials that are selected based on
the instructional strategy. Evaluations are necessary for both mediated and instructor presented
materials.