
Jonathon Wright - Intelligent Performance Cognitive Learning (AIOps)


Since its inception, the Performance Advisory Council has aimed to promote engagement between experts from around the world and to create relevant, value-added content shared between members. For Neotys, it strengthens our position as a thought leader in load and performance testing. During this event, 12 participants convened in Chamonix, France, to explore topics on the minds of today's performance testers, such as DevOps, Shift Left/Right, test automation, blockchain and artificial intelligence.

Published in: Software

  1. Intelligent Performance Cognitive Learning (AIOps) – Jonathon Wright
  2. Cognitive Learning (AIOps)
  3. Cognitive Learning – Digital Evolution over Revolution
  4. Cognitive Intelligence (AIOps) – AI as a Platform
  5. Neotys PAC 2017 – Cognitive Learning (AIOps) • THINK – Design Thinking • CREATE – Insight-Driven • Feedback loop from delivered iterative software development back to ideation (i.e. BIZ > DEV > TEST > OPS) • LEARN – Cognitive Learning (Descriptive, Prescriptive and Predictive Analytics)
  6. Neotys PAC 2018 – Performance as Code (Shift Left) • Performance as Code (PAC) • Capture • Import • Export • Test Data as Code (TDC) • Identify • Subset / Mask • Virtualize
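The "Identify • Subset / Mask" steps of Test Data as Code can be sketched in a few lines. This is an illustrative Python sketch, not Neotys tooling: `mask` and `subset_and_mask` are hypothetical helpers showing one way to take a production-like data set, subset it, and deterministically mask the sensitive fields so rows stay referentially consistent across runs.

```python
import hashlib

def mask(value: str, keep: int = 2) -> str:
    """Deterministically mask a sensitive value, keeping a short prefix
    so subsetted rows remain recognisable but not re-identifiable."""
    digest = hashlib.sha256(value.encode()).hexdigest()[:8]
    return value[:keep] + "***" + digest

def subset_and_mask(rows, predicate, sensitive_fields):
    """Subset production-like rows, then mask the identified sensitive fields."""
    out = []
    for row in rows:
        if predicate(row):
            masked = dict(row)
            for field in sensitive_fields:
                masked[field] = mask(str(masked[field]))
            out.append(masked)
    return out

customers = [
    {"id": 1, "name": "Alice Smith", "country": "UK"},
    {"id": 2, "name": "Bob Jones", "country": "FR"},
]
uk_only = subset_and_mask(customers, lambda r: r["country"] == "UK", ["name"])
```

Because the mask is a hash of the original value, the same input always masks to the same output, which keeps foreign-key-style relationships intact across subsetted tables.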
  7. Neotys PAC 2018 – Digital Performance Lifecycle (Shift Right) jonathon-wright • OpsDev • Observing • Interpreting • Modelling • Digital Experience (DX) • Behaviours • Interactions • Sentiment
  8. Intelligent Performance
  9. Intelligent Performance – Legacy is our Legacy
  10. UK Government – Digital Services – Solution Thinking • Digital Service Standards • Scalable & shared services • Reuse, integrate & adapt • Open standards & common stack • Digital Transformation Journey • Mega/multi-cloud deployment • Microcontainerization (Kubernetes) • Microservices (1,400 nodes/endpoints) • Enterprise Solutions (20+ applications, 44 releases per month in 2018, 6-week cycle time supporting continuous delivery and deployment (CDD))
  11. UK Government – Digital Charter – Lean Thinking • Value-Driven • Prove hypotheses rapidly, fail fast, fail safely and learn rapidly – celebrate failures and successes. • Cognitive Skills Gap • Transfer knowledge from digital advocates for digital technology between project and programme. • Adopt Open Standards • Build technology that uses open standards to ensure it works and communicates with other technology and can be easily decoupled, extended and evolved.
  12. Continuous Performance (CPx) Trends – The Year of AIOps, World Quality Report 2018-19
  13. Continuous Performance (CPx) – WebDriver > Migrate UI / API Tests • Docking (CLI for Testing) • Atata .NET Core (Selenium) UI • NeoLoad Wrapper • API (RestAssured) • Headless UI (Selenium) • W3C Navigation Timings • Intelligent APM • NeoLoad Docker (.NET Core) Wrapper • RPA • Export to Microcontainerization • Actionable insight from Dynatrace • Visual Studio 2019 Enterprise VIP
  14. Continuous Performance (CPx) – NeoLoad Selenium Wrapper • NeoLoad Selenium Wrapper • NeoLoad Data Exchange API • Dynatrace APM Plugin + Time Series Data API
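The idea behind a data-exchange integration is that browser timings measured during a Selenium run are packaged up and pushed to the load-test controller as extra metrics. A minimal Python sketch of the payload-building step, assuming a generic REST-style exchange endpoint (the field names here are illustrative, not the official NeoLoad Data Exchange API schema):

```python
import json

def build_data_exchange_payload(session_id: str, entries: list) -> str:
    """Assemble a JSON payload of browser timing entries in the shape
    one might POST to a data-exchange-style REST endpoint."""
    payload = {
        "sessionId": session_id,
        "entries": [
            # Each entry pairs a metric path with its measured value.
            {"path": e["name"], "value": e["value"], "unit": "ms"}
            for e in entries
        ],
    }
    return json.dumps(payload)

# Timings such as these could come from a WebDriver session, e.g. via
# driver.execute_script("return window.performance.timing").
timings = [
    {"name": "Home/timeToFirstByte", "value": 120},
    {"name": "Home/domComplete", "value": 870},
]
body = build_data_exchange_payload("run-42", timings)
```

The same payload shape works whether the metrics come from Selenium, a headless browser, or an APM plugin, which is what lets the three bullets above feed one correlated data set.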
  15. Continuous Performance (CPx) – Docker Grid • Fiddler (https://www.telerik.com/fiddler) • Atata (https://atata-framework.github.io) • Docker for Windows (Kitematic)
  16. Continuous Performance (CPx) – Intelligent Testing • Intelligent testing automates and optimizes more than just test execution. • Analyze requirements for upstream/downstream changes • Create tests to rigorously test the changes based on time and risk • Find or make the test data needed to execute all of the tests • Spin up an environment and automatically execute the tests • Analyse and aggregate data from numerous sources • Copy and move information between systems, files, and tools • Communicate with teams via chat (Slack) and email • Invoke re-usable tasks in a common tech stack • Maintain data across a range of files and platforms • RPA performs repetitious, rule-based tasks in the background • Intelligent Automation framework connects technologies and keeps data aligned • Closed feedback loop: run results fed back in / models updated • BPMN, xPDL, SOAP/REST, Databases, Mainframe
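"Create tests based on time and risk" implies a selection problem: given a time budget, pick the subset of tests with the most risk coverage. A simple greedy sketch in Python (the suite, risk scores, and budget are made-up illustrations, and real selectors weigh change impact and test history too):

```python
def select_tests(tests, time_budget):
    """Greedy risk-based selection: rank candidate tests by risk density
    (risk score per minute of runtime) and take those that fit the budget."""
    ranked = sorted(tests, key=lambda t: t["risk"] / t["minutes"], reverse=True)
    chosen, used = [], 0
    for t in ranked:
        if used + t["minutes"] <= time_budget:
            chosen.append(t["name"])
            used += t["minutes"]
    return chosen

# Hypothetical suite: risk on a 1-10 scale, runtime in minutes.
suite = [
    {"name": "login_flow", "risk": 9, "minutes": 3},
    {"name": "full_regression", "risk": 10, "minutes": 45},
    {"name": "checkout_api", "risk": 8, "minutes": 2},
    {"name": "report_export", "risk": 2, "minutes": 10},
]
picked = select_tests(suite, time_budget=10)
```

With a 10-minute budget the dense, high-risk smoke tests win over the long regression run, which is the trade-off the slide's "time and risk" phrasing describes.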
  17. Continuous Performance (CPx) – Azure DevOps
  18. Continuous Performance (CPx) – Mega/Multi-Cloud • PortSwigger (Kali distribution) • (m/PerfDriver) • Puppeteer (https://github.com/GoogleChrome/puppeteer) • Windows Server 2019 + Docker .NET Core
  19. Continuous Performance (CPx) – W3C Navigation Timings • First Byte • R/R Pairs • DOM Processing • Interactive • Complete • Rendering • Ready to Use
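The headline metrics above are simple differences between W3C Navigation Timing marks. A Python sketch of the derivation, using illustrative millisecond offsets rather than real browser output:

```python
def derive_metrics(t):
    """Derive headline page-load metrics from W3C Navigation Timing marks
    (values here are ms offsets from navigationStart)."""
    return {
        # Time to first byte: server turnaround after the request went out.
        "first_byte": t["responseStart"] - t["requestStart"],
        # DOM processing: parsing from domLoading to domComplete.
        "dom_processing": t["domComplete"] - t["domLoading"],
        # When the document became interactive / fully loaded.
        "interactive": t["domInteractive"],
        "complete": t["loadEventEnd"],
    }

# Illustrative marks, e.g. offsets computed from a WebDriver call like
# driver.execute_script("return window.performance.timing").
marks = {
    "requestStart": 40,
    "responseStart": 160,
    "domLoading": 180,
    "domInteractive": 520,
    "domComplete": 860,
    "loadEventEnd": 900,
}
metrics = derive_metrics(marks)
```

Because these are standard W3C marks, the same derivation works whether the raw numbers come from Selenium, Puppeteer, or an APM agent.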
  20. Intelligent Operations
  21. Intelligent Operations – Cognitive Adaptive Intelligence
  22. Intelligent Operations – Cognitive Adaptive Intelligence (Dynatrace + NeoLoad)
  23. Intelligent Operations – Pinpoint Failure Analysis • Business Impact / Root Cause Analysis (RCA) • PurePath (Multidimensional Transactional Correlation) • Graph-Based ML
  24. Continuous Digital – Model-Based Performance (MBP) • Multidimensional Transactional Correlation • Test Assets • BPMN diagrams • Data Lake • Platform under Test (mobile & web, thick client, back-end, APIs, messages) • Reports and results • Auto-update JIRA and HP ALM • Auto-update Trello boards, Git, etc. • Slack and email alerts • MBP Modeller • UI (Headless) • API (REST) • Enterprise Architecture • Test Data as Code • Performance as Code • Performance Sessions • Development • Executable Specifications (ATDD / Swagger) • Cognitive Learning (AIOps) • Blueprint / Schematic • Dependency Analysis • Automatic Maintenance • APM
  25. Continuous Digital – Digital Engineering (Transpose / Reverse) Build mathematically precise, BPMN-style flowcharts in-sprint, automatically, using existing designs and test assets. These are made as complete as possible using a Platform Spider, DB Crawlers, message capture, and a UI Scanner. ✓ Provision complete diagrams that go beyond the UI or business keywords; ✓ Already used by Subject Matter Experts – QA and developers can overlay additional logic onto the same models; ✓ Continually discover, learn, and pay off technical debt; ✓ Consolidate disparate files and formats for dependency and impact analysis; ✓ Critically, modelling requirements identifies and eradicates ambiguities and incompleteness.
  26. Continuous Digital – Cause & Effect Modelling The mathematical precision of the flowchart means that test cases can be generated directly from requirement models as they are continually updated. Tests are paths through the model, created the way a GPS creates a route. ✓ Automated coverage algorithms generate the smallest set of cases to cover the model. ✓ Test more in fewer tests – either exhaustively, or based on time, test history, and risk. ✓ Testing is measurable, with expected results defined. ✓ Root cause analysis creates new tests to pinpoint defects. ✓ QA provides logically precise, visual models to development – no wild goose chases searching for bugs in code. ✓ Automated maintenance: test cases are auto-updated as the requirements models change. Changes made to one component ripple across the regression pack for rigorous, in-sprint testing.
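"Tests are paths through the model, created like a GPS route" can be made concrete with a tiny coverage sketch. This Python example (a simple stand-in, not the vendor's actual coverage algorithm) treats the flowchart as a directed graph and repeatedly finds a shortest start-to-end path that still covers an uncovered edge, until every edge is exercised:

```python
from collections import deque

def edge_covering_paths(graph, start, end):
    """Generate a small set of start-to-end paths that together
    cover every edge of the flowchart."""
    uncovered = {(a, b) for a, outs in graph.items() for b in outs}
    paths = []
    while uncovered:
        # BFS for a shortest simple path that uses >=1 uncovered edge.
        queue = deque([(start, [start])])
        best = None
        while queue:
            node, path = queue.popleft()
            if node == end:
                edges = set(zip(path, path[1:]))
                if edges & uncovered:
                    best = path
                    break
                continue
            for nxt in graph.get(node, []):
                if nxt not in path:  # keep paths simple (acyclic)
                    queue.append((nxt, path + [nxt]))
        if best is None:
            break  # remaining edges unreachable on a simple start-end path
        paths.append(best)
        uncovered -= set(zip(best, best[1:]))
    return paths

# Hypothetical checkout flowchart: two branches that rejoin.
checkout = {
    "start": ["login", "guest"],
    "login": ["basket"],
    "guest": ["basket"],
    "basket": ["pay"],
    "pay": [],
}
routes = edge_covering_paths(checkout, "start", "pay")
```

Here two generated paths cover all five edges, which is the "test more in fewer tests" property the slide claims for model-level coverage.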
  27. Continuous Digital – Test Data as Code (TDaC) 500+ synthetic data generation functions define data dynamically at the model level. Data is associated with each node and is resolved “just in time” for every test created. The functions can create realistic synthetic data from scratch and can reference values in existing databases, files, and the mainframe. ✓ Compliant data for every test, provisioned on demand. ✓ No delays from cross-team constraints, waiting for Ops teams, or manually finding and making data. ✓ Resolved “just in time” for naturally up-to-date data. ✓ All data needed to satisfy the chosen coverage level is created at the same time as the tests, including negative scenarios and outliers. ✓ Test and data models are linked and kept up-to-date together.
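The mechanism described above – generation functions bound to model nodes and resolved "just in time" – can be sketched in Python. The function registry and node bindings here are hypothetical miniatures of the "500+ functions" the slide refers to, including a negative-scenario generator:

```python
import random

# A tiny registry of synthetic generation functions, referenced by
# name so a model node can declare its data binding declaratively.
FUNCTIONS = {
    "uk_postcode": lambda rng: (
        f"{rng.choice('ABCDEFGH')}{rng.randint(1, 9)} "
        f"{rng.randint(1, 9)}{rng.choice('XYZ')}{rng.choice('XYZ')}"
    ),
    "amount": lambda rng: round(rng.uniform(0.01, 999.99), 2),
    # Negative scenario / outlier generator, per the slide's last bullet.
    "negative_amount": lambda rng: -round(rng.uniform(0.01, 99.99), 2),
}

def resolve_test_data(node_bindings, seed=0):
    """Resolve each node's data binding 'just in time', reproducibly
    via a seed so a regenerated test gets the same data."""
    rng = random.Random(seed)
    return {node: FUNCTIONS[fn](rng) for node, fn in node_bindings.items()}

# Model nodes bound to generation functions, not to hard-coded values.
model = {
    "enter_address": "uk_postcode",
    "enter_payment": "amount",
    "invalid_payment": "negative_amount",
}
data = resolve_test_data(model, seed=1)
```

Because the binding lives in the model rather than in a static data file, regenerating tests after a model change automatically regenerates matching data, which is the "test and data models are linked" point.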
  28. Continuous Digital – Diagnosis Engine
  29. Continuous Digital – Failing Forward / Self-Healing • Model-Based Performance (MBT)
  30. Continuous Digital – Intelligent Operations
  31. Continuous Digital – Digital Performance Lifecycle (DPL) • ACT-SENSE-RESPOND • Execute existing test artefacts (functional or non-functional) • Node test (reverse proxy endpoint) • Cause & Effect modelling (codename amber) • PROBE-SENSE-RESPOND • Endpoint / node discovery (APM / OneAgent) • Node probe (APM / request attributes) • Schematic (Multidimensional Transactional Correlation / Graph Machine Learning) • SENSE-ANALYZE-RESPOND • Root Cause Analysis (Investigate Alerts) • Blueprint (APM / MTC / Time Series Data) • Pinpoint Failure Analysis (MBT / CPx / CDD)
  32. Augmented Performance – Lifecycle Virtualization • SENSE-CATEGORIZE-RESPOND • Establish multidimensional baseline • Node Learn (unstructured > structured) • Lifecycle virtualization (virtualize endpoint within microcontainer > NFV > NV > SV > AV) • Node.Test > Node.Probe > Node.Learn > Node.Data
  33. Augmented Performance (AP) – Microcontainerization
  34. Augmented Performance (AP) – Adaptive Data Store
  35. Continuous Digital (AIOps) – Intelligent Performance
  36. Thank you