Robert Barton from Cisco presented on Cisco Kinetic, an IoT analytics platform. Cisco Kinetic consists of three modules: the Gateway Management Module for onboarding and managing IoT gateways at scale, the Edge and Fog Processing Module for analyzing IoT data in real-time at the edge, and the Data Control Module for securely routing IoT data between edge, fog, and cloud according to data policies. Cisco Kinetic aims to enable end-to-end IoT analytics across the entire network from device to cloud.
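The Data Control Module's job, routing data between edge, fog, and cloud according to policy, can be sketched in a few lines of plain Python. Everything below (field names, rules, destinations) is invented for illustration and is not the Cisco Kinetic API:

```python
# Minimal sketch of policy-based routing of IoT readings between edge and
# cloud destinations, in the spirit of a data control module. All names
# and rules are illustrative, not Cisco Kinetic APIs.

def route_reading(reading, policies):
    """Return the destinations for a reading, from the first matching policy."""
    for policy in policies:
        if policy["matches"](reading):
            return policy["destinations"]
    return ["edge"]  # default: keep data local

policies = [
    # Send anomalous temperatures to the cloud for deeper analysis.
    {"matches": lambda r: r["type"] == "temperature" and r["value"] > 90,
     "destinations": ["edge", "cloud"]},
    # Ordinary telemetry stays at the edge.
    {"matches": lambda r: True, "destinations": ["edge"]},
]

print(route_reading({"type": "temperature", "value": 95}, policies))
print(route_reading({"type": "humidity", "value": 40}, policies))
```

The ordering of the policy list matters: the catch-all rule comes last, so only data that a specific policy flags ever leaves the edge.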
TechWiseTV Workshop: Cisco DNA Center Assurance - Robb Boyd
Watch the replay: http://cs.co/9007Dbh39
In this deep dive you’ll learn how this comprehensive solution provides actionable intelligence to help you reach the right IT decision faster and speed you on your way to an intent-based network. Learn how to gain end-to-end network visibility in one easy-to-use dashboard, make more sense of data by eliminating noise and false positives, reduce downtime and troubleshooting time with rapid root-cause analysis and actionable insights, and move beyond reactive monitoring with proactive and predictive analytics.
Resources:
Watch the related TechWiseTV episode: http://cs.co/9008DXCQi
TechWiseTV: http://cs.co/9009DzrjN
Enterprise-Grade Trust: Collaboration Without Compromise - Robb Boyd
In today’s agile work environment, customers need to collaborate in real time with partners, vendors, and customers, and they want the best collaboration tools possible. At the same time, they’re cognisant of potential accidental or intentional misuse of data and malicious attacks – and the ramifications they can have for their company’s finances and reputation.
Cisco provides best-in-class collaboration tools with true end-to-end encryption that enable secure cross-company collaboration. Find out more about the six considerations for collaboration security and the new Cisco Webex Extended Security Pack – which provides a full-functionality Cisco Cloudlock cloud access security broker for Webex Teams with native Webex anti-malware capabilities powered by Cisco Talos ClamAV.
Resources:
TechWiseTV: http://cs.co/9009DzrjN
Incredible Compute Density: Cisco DNA Center Platform: Digging Deeper with APIs - Robb Boyd
Learn how to get hands-on with Cisco DNA Center Platform APIs. Join us as we go over the brand-new DNA Center Platform and show you how to start integrating and developing your own applications on DNA Center. The possibilities are endless!
Learn how you can tailor this technology-agnostic, cloud-based platform to continually integrate, monitor, manage, and optimize the unique and disparate elements in your organization. Discover how to define and manage assets, sensors, and workflows using your own business rules, use third-party tags with sensors you already have in place, export maps from Cisco Prime Infrastructure, connect CMX to OI, and navigate the dashboard, plus set up reports and alerts.
A Connected Data Landscape: Virtualization and the Internet of Things - Inside Analysis
The Briefing Room with Dr. Robin Bloor and Cisco
Live Webcast March 3, 2015
Watch the archive: https://bloorgroup.webex.com/bloorgroup/lsr.php?RCID=a75f0f379405de155800a37b2bf104db
Data at rest, data in motion - regardless of its trajectory, data remains the lifeblood of today's information economy. But finding a way to bridge old systems with new opportunities requires an innovative data strategy, one that takes advantage of multiple processing technologies. With the optimal architecture in place, companies can harness years of work in traditional information systems, while opening the door to the flood of new data sources available.
Register for this episode of The Briefing Room to learn from veteran Analyst Dr. Robin Bloor, as he explains how data virtualization and other data technologies fundamentally change what's possible with data access, movement and analysis. He'll be briefed by David Besemer of Cisco, who will discuss how this new kind of data strategy can enable the integration of legacy systems, Cloud computing and the Internet of Things. He'll also answer questions about how Big Data and the IoT are helping to redefine the practice of data management.
Visit InsideAnalysis.com for more information.
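The core idea of data virtualization described above can be sketched in plain Python: a single query interface answers questions that span a legacy system and a newer cloud/IoT feed at query time, without first copying everything into one store. The sources, field names, and query below are invented for illustration, not any Cisco product API.

```python
# Toy sketch of data virtualization: one query federates two heterogeneous
# "sources" (a legacy table and a cloud/IoT event feed) at query time,
# without physically moving the data first. Purely illustrative.

legacy_orders = [  # e.g. rows from a relational system
    {"customer": "acme", "amount": 120},
    {"customer": "globex", "amount": 80},
]

iot_events = [  # e.g. streaming device events landed in the cloud
    {"customer": "acme", "reading": 3},
    {"customer": "acme", "reading": 5},
]

def virtual_join(customer):
    """Answer a question spanning both sources at query time."""
    orders = [o for o in legacy_orders if o["customer"] == customer]
    events = [e for e in iot_events if e["customer"] == customer]
    return {"customer": customer,
            "total_spend": sum(o["amount"] for o in orders),
            "device_events": len(events)}

print(virtual_join("acme"))
```

A real virtualization layer adds query planning, pushdown, and caching on top of this basic pattern, but the contract is the same: the consumer sees one logical view over many physical sources.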
All Together Now: Connected Analytics for the Internet of Everything - Inside Analysis
The Briefing Room with Mark Madsen and Cisco
Live Webcast August 18, 2015
Watch the archive: https://bloorgroup.webex.com/bloorgroup/lsr.php?RCID=0eff120f8b2879b582b77f4ff207ee54
Today's digital enterprises are seeing an explosion of data at the edge. The Internet of Everything is fast approaching a critical mass that will demand a sea change in how companies process data. This new world of information is widely distributed, streaming, and overall becoming too big to move. Experts predict that within two to three years, the bulk of analytic processing will take place on the fringes of information architectures. As a result, forward-thinking companies are dramatically shifting their analytic strategies.
Register for this episode of The Briefing Room to hear veteran Analyst Mark Madsen of Third Nature explain how a new era of information architectures is now unfolding, paving the way to much more responsive and agile business models. He'll be briefed by Kim Macpherson of the Cisco Data and Analytics Business Unit, who will explain how her company's platform is uniquely suited for this new, federated analytic paradigm. She'll demonstrate how edge analytics can help companies address opportunities quickly and effectively.
Visit InsideAnalysis.com for more information.
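The edge-analytics shift described above rests on a simple mechanic: aggregate where the data is produced and move only a compact summary, since the raw stream is "too big to move". A minimal sketch of that idea, with invented readings:

```python
# Sketch of edge analytics: aggregate locally and ship only a summary,
# instead of moving every raw reading to a central system. Illustrative only.

def summarize_at_edge(readings):
    """Reduce a window of raw sensor readings to a compact summary."""
    return {"count": len(readings),
            "min": min(readings),
            "max": max(readings),
            "mean": sum(readings) / len(readings)}

raw = [21.0, 21.5, 22.0, 35.0, 21.2]   # raw data stays at the edge
summary = summarize_at_edge(raw)        # only this crosses the network
print(summary)
```

Here five readings shrink to four numbers; at IoT scale the same pattern turns terabytes of raw telemetry into kilobytes of summaries per window.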
Cisco Kinetic: Unlocking the Value of Data - Cisco Russia
Webinar recording:
http://ciscoclub.ru/raskryvaya-cennost-dannyh-s-pomoshchyu-cisco-kinetic
This session covers the Cisco Kinetic distributed IoT platform, its architecture, and typical deployment scenarios.
Slides: Why You Need End-to-End Data Quality to Build Trust in Kafka - DATAVERSITY
By adopting streaming architectures like Apache Kafka as a way to ingest and move large amounts of data very quickly, organizations are making major investments to access real-time data – and fundamentally changing how they do business. However, the advantages of Kafka can quickly be outweighed by the threat of poor Data Quality. Without Data Quality, all of the time and resources spent in building a new framework will fail to return the benefits that a Kafka platform offers.
Join Infogix’s Jeff Brown as he shares how data trust in your Kafka streaming framework is achievable when you put the proper validations and Data Quality components in place.
In this webinar, you’ll learn:
• Why organizations are moving to a streaming-based architecture
• What challenges are being faced when adopting Kafka messages as a new system-to-system communication method
• How to build data trust within your organization and its streaming framework
• Key directions on how to reconcile, balance, validate, and apply Data Quality to your streaming Data Architecture
• What customers are saying about their Kafka investment and how they’re working with Infogix to deliver data trust
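The validation layer the talk argues for can be illustrated without a broker: check every in-flight record against declared rules and quarantine failures before they poison downstream consumers. The field names and rules below are invented for the example and are not Infogix APIs.

```python
# Illustrative sketch of per-message Data Quality checks on a stream of
# Kafka-style records (plain dicts here; no broker involved). The rules
# and field names are invented for the example.

RULES = {
    "order_id": lambda v: isinstance(v, str) and v != "",
    "amount":   lambda v: isinstance(v, (int, float)) and v >= 0,
}

def validate(record):
    """Return the list of failed field names (empty list = record is trusted)."""
    failures = []
    for field, check in RULES.items():
        if field not in record or not check(record[field]):
            failures.append(field)
    return failures

stream = [
    {"order_id": "A-1", "amount": 9.5},
    {"order_id": "", "amount": -2},     # fails both checks
]
good = [r for r in stream if not validate(r)]
print(len(good))
```

In a real deployment the same check would sit in a consumer (or stream processor) between the raw topic and a validated topic, with failures routed to a dead-letter topic for reconciliation.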
How Cisco Migrated from MapReduce Jobs to Spark Jobs - StampedeCon 2015
At the StampedeCon 2015 Big Data Conference: The starting point for this project was a MapReduce application that processed log files produced by the support portal. This application was running on Hadoop with Ruby Wukong. At the time of the project start it was underperforming and did not show good scalability. This made the case for redesigning it using Spark with Scala and Java.
Initial review of the Ruby code revealed that it used disk IO excessively to communicate between MapReduce jobs. Each job was implemented as a separate script, passing large data volumes through the filesystem. Spark is more efficient at managing intermediate data passed between jobs – not only does it keep the data in memory whenever possible, it often eliminates the need for intermediate data altogether. However, that alone did not bring much improvement, since there were additional bottlenecks at the data aggregation stages.
The application involved a global data ordering step, followed by several localized aggregation steps. This first global sort required significant data shuffle that was inefficient. Spark allowed us to partition the data and convert a single global sort into many local sorts, each running on a single node and not exchanging any data with other nodes. As a result, several data processing steps started to fit into node memory, which brought about a tenfold performance improvement.
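The optimization described above can be sketched in plain Python, no Spark required: instead of one global sort over everything, partition the records by key and sort each partition independently, which is what Spark's `repartition` followed by `sortWithinPartitions` achieves across cluster nodes. The log records below are invented for illustration.

```python
# Sketch of replacing one global sort with hash partitioning plus
# independent local sorts, the pattern described in the text above.
from collections import defaultdict

def partitioned_sort(records, key_of):
    """Group records by key, then sort each group independently."""
    partitions = defaultdict(list)
    for rec in records:            # cheap hash partitioning, no global shuffle
        partitions[key_of(rec)].append(rec)
    for recs in partitions.values():
        recs.sort()                # each "node" sorts only its own partition
    return dict(partitions)

logs = [("host-b", 3), ("host-a", 2), ("host-b", 1), ("host-a", 9)]
by_host = partitioned_sort(logs, key_of=lambda r: r[0])
print(by_host["host-b"])
```

The payoff is the one named in the talk: each partition is small enough to sort in one node's memory, and no data crosses partition boundaries during the sort.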
UiPath Test Automation using UiPath Test Suite series, part 4 - DianaGray10
Welcome to the UiPath Test Automation using UiPath Test Suite series, part 4. In this session, we will cover a Test Manager overview along with the SAP heatmap.
The UiPath Test Manager overview with SAP heatmap webinar offers a concise yet comprehensive exploration of the role of a Test Manager within SAP environments, coupled with the utilization of heatmaps for effective testing strategies.
Participants will gain insights into the responsibilities, challenges, and best practices associated with test management in SAP projects. Additionally, the webinar delves into the significance of heatmaps as a visual aid for identifying testing priorities, areas of risk, and resource allocation within SAP landscapes. Through this session, attendees can expect to enhance their understanding of test management principles while learning practical approaches to optimize testing processes in SAP environments using heatmap visualization techniques.
What will you get from this session?
1. Insights into SAP testing best practices
2. Heatmap utilization for testing
3. Optimization of testing processes
4. Demo
Topics covered:
Execution from the test manager
Orchestrator execution result
Defect reporting
SAP heatmap example with demo
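A testing heatmap of the kind discussed above boils down to a risk score per application area, so the hottest areas are tested first. A minimal sketch, where the areas, metrics, and weights are all invented for illustration and not taken from UiPath Test Manager:

```python
# Illustrative sketch of deriving a testing heatmap: score each SAP area
# by usage and defect history, then rank hottest-first. Weights invented.

def heat_score(area):
    """Simple risk score: heavier use and more past defects = hotter."""
    return area["usage"] * 0.6 + area["defects"] * 0.4

areas = [
    {"name": "Billing",   "usage": 90, "defects": 12},
    {"name": "Inventory", "usage": 40, "defects": 3},
    {"name": "Shipping",  "usage": 70, "defects": 20},
]

hottest_first = sorted(areas, key=heat_score, reverse=True)
print([a["name"] for a in hottest_first])
```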
Speaker:
Deepak Rai, Automation Practice Lead, Boundaryless Group and UiPath MVP
Essentials of Automations: Optimizing FME Workflows with Parameters - Safe Software
Are you looking to streamline your workflows and boost your projects’ efficiency? Do you find yourself searching for ways to add flexibility and control over your FME workflows? If so, you’re in the right place.
Join us for an insightful dive into the world of FME parameters, a critical element in optimizing workflow efficiency. This webinar marks the beginning of our three-part “Essentials of Automation” series. This first webinar is designed to equip you with the knowledge and skills to utilize parameters effectively: enhancing the flexibility, maintainability, and user control of your FME projects.
Here’s what you’ll gain:
- Essentials of FME Parameters: Understand the pivotal role of parameters, including Reader/Writer, Transformer, User, and FME Flow categories. Discover how they are the key to unlocking automation and optimization within your workflows.
- Practical Applications in FME Form: Delve into key user parameter types including choice, connections, and file URLs. Allow users to control how a workflow runs, making your workflows more reusable. Learn to import values and deliver the best user experience for your workflows while enhancing accuracy.
- Optimization Strategies in FME Flow: Explore the creation and strategic deployment of parameters in FME Flow, including the use of deployment and geometry parameters, to maximize workflow efficiency.
- Pro Tips for Success: Gain insights on parameterizing connections and leveraging new features like Conditional Visibility for clarity and simplicity.
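FME configures these parameters through its GUI rather than code, but the mechanics of a choice-type user parameter can be sketched in plain Python as an analogue: the value is validated against a fixed list before the workflow runs, so one workspace can be reused with different inputs. All names below are illustrative, not the FME API.

```python
# Plain-Python analogue of an FME "choice" user parameter: validate the
# supplied value against the choice list before running the workflow,
# so the same workspace is reusable with different inputs. Illustrative only.

OUTPUT_FORMATS = ["GeoJSON", "Shapefile", "GeoPackage"]  # the choice list

def run_workflow(source_path, output_format):
    """Validate parameters, then 'run' the translation."""
    if output_format not in OUTPUT_FORMATS:
        raise ValueError(f"output_format must be one of {OUTPUT_FORMATS}")
    return f"translated {source_path} -> {output_format}"

print(run_workflow("roads.gml", "GeoJSON"))
```

The design point is the same one the webinar makes: constraining what the user can enter (choice lists, file pickers, connection parameters) is what makes a workflow both flexible and safe to hand to others.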
We’ll wrap up with a glimpse into future webinars, followed by a Q&A session to address your specific questions surrounding this topic.
Don’t miss this opportunity to elevate your FME expertise and drive your projects to new heights of efficiency.
GraphRAG is All You Need? LLM & Knowledge Graph - Guy Korland
Guy Korland, CEO and Co-founder of FalkorDB, will review two articles on the integration of language models with knowledge graphs.
1. Unifying Large Language Models and Knowledge Graphs: A Roadmap.
https://arxiv.org/abs/2306.08302
2. Microsoft Research's GraphRAG paper and a review paper on various uses of knowledge graphs:
https://www.microsoft.com/en-us/research/blog/graphrag-unlocking-llm-discovery-on-narrative-private-data/
Elevating Tactical DDD Patterns Through Object Calisthenics - Dorra BARTAGUIZ
After immersing yourself in the blue book and its red counterpart, attending DDD-focused conferences, and applying tactical patterns, you're left with a crucial question: How do I ensure my design is effective? Tactical patterns within Domain-Driven Design (DDD) serve as guiding principles for creating clear and manageable domain models. However, achieving success with these patterns requires additional guidance. Interestingly, we've observed that a set of constraints initially designed for training purposes remarkably aligns with effective pattern implementation, offering a more ‘mechanical’ approach. Let's explore together how Object Calisthenics can elevate the design of your tactical DDD patterns, offering concrete help for those venturing into DDD for the first time!
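As a taste of this pairing, here is a minimal sketch of two Object Calisthenics rules ("wrap all primitives" and "no getters/setters") applied to a tactical DDD value object. The `Money` example is a common illustration, not taken from the talk:

```python
# Sketch: Object Calisthenics constraints applied to a DDD value object.
# The primitive amount stays wrapped, and behaviour lives on the object
# instead of exposing raw getters. Illustrative example.

class Money:
    def __init__(self, amount_cents, currency):
        if amount_cents < 0:
            raise ValueError("amount cannot be negative")
        self._amount_cents = amount_cents   # primitive stays wrapped
        self._currency = currency

    def add(self, other):
        """Behaviour on the object instead of exposing the raw amount."""
        if self._currency != other._currency:
            raise ValueError("cannot add different currencies")
        return Money(self._amount_cents + other._amount_cents, self._currency)

    def __eq__(self, other):
        return (self._amount_cents, self._currency) == \
               (other._amount_cents, other._currency)

price = Money(1999, "EUR").add(Money(1, "EUR"))
print(price == Money(2000, "EUR"))
```

Because callers can only go through `add`, the invariants (non-negative amounts, single currency) are enforced in exactly one place, which is the "mechanical" quality the constraints aim for.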
Key Trends Shaping the Future of Infrastructure - Cheryl Hung
Keynote at DIGIT West Expo, Glasgow on 29 May 2024.
Cheryl Hung, ochery.com
Sr Director, Infrastructure Ecosystem, Arm.
This keynote explores the key trends across hardware, cloud, and open source: how these areas are likely to mature and develop over the short and long term, and how organisations can position themselves to adapt and thrive.
Epistemic Interaction - tuning interfaces to provide information for AI support - Alan Dix
Paper presented at SYNERGY workshop at AVI 2024, Genoa, Italy. 3rd June 2024
https://alandix.com/academic/papers/synergy2024-epistemic/
As machine learning integrates deeper into human-computer interactions, the concept of epistemic interaction emerges, aiming to refine these interactions to enhance system adaptability. This approach encourages minor, intentional adjustments in user behaviour to enrich the data available for system learning. This paper introduces epistemic interaction within the context of human-system communication, illustrating how deliberate interaction design can improve system understanding and adaptation. Through concrete examples, we demonstrate the potential of epistemic interaction to significantly advance human-computer interaction by leveraging intuitive human communication strategies to inform system design and functionality, offering a novel pathway for enriching user-system engagements.
DevOps and Testing slides at DASA Connect - Kari Kakkonen
Slides by me and Rik Marselis from the DASA Connect conference on 30 May 2024. We discuss what testing is, then what agile testing is, and finally what testing in DevOps looks like. We also ran a lovely workshop in which participants explored different ways to think about quality and testing in different parts of the DevOps infinity loop.
Neuro-symbolic is not enough, we need neuro-*semantic* - Frank van Harmelen
Neuro-symbolic (NeSy) AI is on the rise. However, simply machine learning on just any symbolic structure is not sufficient to really harvest the gains of NeSy. These will only be gained when the symbolic structures have an actual semantics. I give an operational definition of semantics as “predictable inference”.
All of this illustrated with link prediction over knowledge graphs, but the argument is general.
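"Predictable inference" can be made concrete with a toy knowledge graph: once a relation is declared transitive, new links follow mechanically from the existing triples. The graph and rule below are invented for illustration and are not from the talk:

```python
# Tiny sketch of "semantics as predictable inference": given triples and a
# transitivity rule for one relation, new links are predicted mechanically.

def infer_transitive(triples, relation):
    """Close a relation under transitivity: (a,r,b),(b,r,c) => (a,r,c)."""
    facts = set(triples)
    changed = True
    while changed:
        changed = False
        for (a, r1, b) in list(facts):
            for (b2, r2, c) in list(facts):
                if r1 == r2 == relation and b == b2:
                    new = (a, relation, c)
                    if new not in facts:
                        facts.add(new)
                        changed = True
    return facts

kg = {("cat", "subclass_of", "mammal"), ("mammal", "subclass_of", "animal")}
closed = infer_transitive(kg, "subclass_of")
print(("cat", "subclass_of", "animal") in closed)
```

The point of the argument is that a learner trained on such a graph can only be evaluated against, and benefit from, inferences like this when the relation actually carries that semantics; arbitrary symbolic structure gives nothing to predict.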