We are using data at a record pace. This directly impacts data centers and how they manage the increase in demand. Check out the data center trends for 2014.
The Evolving Data Center – Past, Present and Future (Cisco Canada)
The journey to cloud is not linear. Realistically, most environments will have workloads that continue to run on both physical and virtualized infrastructures for some time. Join Cisco’s Data Centre experts as they outline the key technologies transforming the data centre, enabling an intelligent infrastructure that will support physical, virtualized and cloud applications as part of Cisco’s Unified Data Centre Architecture.
Re-architecting the Datacenter to Deliver Better Experiences (Intel) – COMPUTEX TAIPEI
COMPUTEX TAIPEI 2013 - Cloud Industry Forum
Topic: Re-architecting the Datacenter to Deliver Better Experiences
Speaker: Lisa H. Graff
Vice President and General Manager of Datacenter Marketing Group, Intel
Hyperscale Nightmare – Potential Consequences of Using Consumer SSDs in the Datacenter (Storage Switzerland)
Hyper-scale data centers (and many enterprises as well) face relentless pressure to contain costs. One way to lower those costs is to use consumer-grade solid-state drives in their scale-out server architectures, instead of the enterprise-class SSDs that are routinely used in these environments. This strategy can lower the initial cost for flash drives, but can have other undesirable consequences that cause real, long-term problems. In these high drive-count infrastructures, deploying the wrong flash can cause a nightmare scenario of:
• Inconsistent performance of non-enterprise drives
• Poor reliability of non-enterprise drives
• Replacement of failed drives - adds cost and disruption
• Management of a mixed drive environment - can be complex
• High cost of abandoning alternative SSDs altogether
Protect yourself and register for our live webinar “Hyper-scale Nightmare: 5 Ways Flash can Cripple your Data Center” featuring Storage Switzerland founder George Crump and Shawn Worsell, Director of Product Management at OCZ/Toshiba.
Every business has a data center, regardless of size; even the smallest business relies on one. It is an ever-growing part of modern business and a key operational parameter, since the data center influences how the entire enterprise functions. Imagine what happens to business operations when the data center is interrupted: any interruption can lead to a serious breakdown. That is why an efficient backup strategy is essential.
Determining your data center strategy is critical in this expanding world of big data, cloud and mobility. Should you build your own data center, consider a wholesale arrangement, colocate with another carrier or transfer your critical information to the cloud? Or, does some combination of these options best suit your needs? Where do you even begin when planning these large enterprise decisions?
Join Randy Ortiz, VP of Data Center Design and Engineering, from Internap as he breaks down the steps you need to take to achieve a successful outcome for your data center initiatives.
Key topics include:
*Important decision-making considerations
*Why flexibility matters
*Top trends to watch today
What Does It Cost to Build a Data Center? (SlideShare) – SP Home Run Inc.
http://DataCenterLeadGen.com
The “build a data center” decision is not to be taken lightly. Consider these cost factors to decide whether building or leasing is the better option.
Copyright (C) SP Home Run Inc. All worldwide rights reserved.
As DCIM becomes a familiar term, an established discipline and a new market of solutions that by definition 'integrates IT and facilities management', what does 'bridging the departmental gap' really mean? Where are the gaps, where will the synergy be, and what will be done differently with DCIM in the mix? Join Michael Tresh, Director of Product Management and Marketing, as he discusses legacy, current, and future data center infrastructure management.
Next Generation Data Center – IT Transformation (Damian Hamilton)
Computerworld CIO Event in Hong Kong sponsored by Dimension Data, EMC & Cisco.
Insights into Dimension Data's DC strategy and recent client engagements.
Data Center Infrastructure Management (DCIM) solutions are combining best practices from facilities & IT, and are simplifying capacity management, increasing availability, and extending the life of existing data centers. But to experience these benefits, you need to choose the solution that best fits your data center. Join Viridity Software co-founder and CTO, Mike Rowan, as he discusses how DCIM software helps you manage your data center more efficiently. This webinar will include a quick demonstration of Viridity EnergyCenter and will be open to your questions.
Panduit Physical Infrastructure Manager™ (PIM™) Software Platform and PViQ Intelligent Hardware combine for a comprehensive data center infrastructure management (DCIM) solution. This intelligent software and hardware provides data center professionals with greater staff productivity and visibility of all data center assets, along with their connectivity, locations, and relationships. PIM™ solutions allow you to discover, visualize, model, control, report on, predict and manage all physical data center assets, including the ability to simply deploy new assets and plan capacity for future growth. PIM™ solutions can also help control energy costs, reduce risks and increase operational efficiency.
The Trellis DCIM (Data Center Infrastructure Management) solution is a dynamic infrastructure optimization platform. It is the first holistic DCIM platform of hardware, software and services to bridge the critical gap between IT equipment and data center physical infrastructure.
Trellis ensures availability, efficiency and agility for modern data center infrastructure.
ScottMadden has developed an approach for analyzing data center requirements and driving improvements in existing data center retrofits. Our approach takes into account the technological requirements, the physical attributes of a data center, and the requirements for a rigorous measurement and verification program needed to ensure improvements actually capture the energy efficiency gains and the resultant greenhouse gas reductions.
Our approach addresses the latest trends in data center management, such as virtualization and cloud computing, and provides a framework for developing the metrics needed to drive changes in data center performance.
This presentation shows that Data Center Infrastructure Management (DCIM) software is to a data center manager what ERP software is to a VP of Manufacturing. This is the second presentation in a three-part series from GreenField Software on the subject: DCIM for High Availability.
DCIM software charts out relationship maps for assets by identifying the various dependencies among them. Threshold-based alerts on critical parameters, combined with impact analysis of Move-Add-Change operations, mitigate the risks of DC failures.
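To make that mechanism concrete, here is a minimal sketch (Python, with entirely hypothetical asset names, thresholds and readings; it is not any vendor's actual data model) of how threshold-based alerts can be combined with an asset dependency map to flag the downstream impact of a failing component:

```python
# Minimal illustrative sketch only -- asset names, thresholds and readings are
# hypothetical and do not reflect any vendor's actual DCIM data model.

# Dependency map: each asset lists the assets that depend on it.
dependencies = {
    "UPS-1":   ["PDU-1", "PDU-2"],
    "PDU-1":   ["Rack-07"],
    "PDU-2":   ["Rack-08"],
    "Rack-07": ["db-server-01", "app-server-03"],
    "Rack-08": ["web-server-02"],
}

# Threshold-based alert rules on critical parameters.
thresholds = {"UPS-1": {"load_pct": 90.0}, "PDU-1": {"temp_c": 45.0}}

# Latest sensor readings (in a real system these stream from monitoring hardware).
readings = {"UPS-1": {"load_pct": 93.5}, "PDU-1": {"temp_c": 41.0}}

def impacted_assets(asset, dep_map):
    """Walk the dependency map to find everything downstream of an asset."""
    impacted, stack = set(), [asset]
    while stack:
        for child in dep_map.get(stack.pop(), []):
            if child not in impacted:
                impacted.add(child)
                stack.append(child)
    return impacted

def check_alerts():
    for asset, rules in thresholds.items():
        for param, limit in rules.items():
            value = readings.get(asset, {}).get(param)
            if value is not None and value > limit:
                downstream = sorted(impacted_assets(asset, dependencies))
                print(f"ALERT: {asset} {param}={value} exceeds {limit}; "
                      f"potentially affects: {', '.join(downstream)}")

check_alerts()
# Example output:
# ALERT: UPS-1 load_pct=93.5 exceeds 90.0; potentially affects: PDU-1, PDU-2,
# Rack-07, Rack-08, app-server-03, db-server-01, web-server-02
```

In a real DCIM deployment the readings would stream from sensors and the dependency map would be discovered and maintained by the tool, but the alert-plus-impact pattern is the same.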
GreenField Software’s mission is to help data centers control capital expenditures, reduce operating expenses and mitigate the risks of data center failures. Besides DCIM software, GFS offers Data Center Advisory Services in the areas of best practices, capacity planning, energy efficiency and business continuity of data centers.
This presentation provides an overview of business trends and perspectives on the deployment and implementation of network virtualization technologies.
Five Power Trends on Their Way to Your Data Center (digitallibrary)
The forces shaping data-center change are clear: 96 percent of data centers will run out of capacity by 2011. Data-center energy use is at an all-time high and so are availability demands. This presentation will share the top power protection, distribution and management trends surrounding those issues and share benchmarks from some of the most critical data centers in the world. You'll learn the latest thinking on power-system energy efficiency, an emerging approach to high-density power distribution, and new ways to grow your power-protection capacity without sacrificing availability.
Presentation by Bo Parker, Managing Director of Center for Technology and Innovation at PricewaterhouseCoopers. Presentation was shown during the lecture at Digital October technology entrepreneurship center in Moscow, on 26 October.
MANAGE DEVICES AND APPS FROM THE CLOUD
With the proliferation of mobile devices in the workplace, employees can, and do, work from just about anywhere. To stay productive, this mobile workforce demands consistent access to corporate resources and data from any location on any device. This trend has introduced significant challenges for IT administrators who want to enable enterprise mobility while ensuring that corporate resources are protected from unauthorized access.
Leveraging Microsoft Intune, you can deliver application and device management completely from the cloud, or on-premises through integration with System Center 2012 Configuration Manager, all via a single management console.
Microsoft has also incorporated manageability and data protection directly into the Intune-managed Office mobile apps to help maximize productivity while providing the flexibility to extend these same management capabilities to your existing line-of-business apps through the Intune App Wrapping Tool.
Intune is included as part of Microsoft’s Enterprise Mobility Suite, the most cost-effective way to leverage Microsoft’s enterprise mobility cloud services for all of your employees.
SIM AZ: Emerging Information Technology Innovations & Trends 11/15/17 (Mark Goldstein)
Mark Goldstein, International Research Center presented a big overview of Emerging Information Technology Innovations & Trends to the Society for Information Management Arizona Chapter (SIM AZ) on 11/15/17 showcasing the latest and greatest emerging technologies and novel tech innovations, highlighting the market and societal transformations underway or anticipated. It covered Advances in Computer Power and Pervasiveness; Internet of Things (IoT) Overview and Ecosystem; Mobility, Augmented Reality and Virtual Reality (AR/VR); Medical Advances Through Informatics; Artificial Intelligence (AI) and Robotics; Big Data, Its Applications and Implications; and Onward into the Future…
Cloudera + Syncsort: Fuel Business Insights, Analytics, and Next Generation T... (Precisely)
Effective AI and ML projects require a perfect blend of scalable, clean data funneled from a variety of sources across the business. The only problem? Uncleaned data often lives in hard-to-access legacy systems, and it costs time and money to build the right foundation to deliver that data to answer ever-changing questions from business users. Together, Cloudera and Syncsort enable you to build a scalable foundation of data connections to reinvent the data lifecycle of all your projects in the most efficient way possible.
View this webinar on-demand to learn how innovative solutions from Cloudera and Syncsort enable AI and ML success. You will learn:
• Best practices for transforming complex data into clear, actionable insights for AI and ML projects
• How to visually assess the quality of the sources in your data lake and their completeness, consistency, and accuracy
• The value of an Enterprise Data Cloud and the newly unveiled Cloudera Data Platform
• How Syncsort Connect integrates natively with the Cloudera Data Platform
How manufacturers are evolving toward Industry 4.0 with virt... (Denodo)
Watch full webinar here: https://bit.ly/3cbpipB
Manufacturing is one of the sectors where digital transformation is having the most disruptive effect. Leaders in the manufacturing sector are betting on Big Data, cloud computing, artificial intelligence and the Internet of Things (IoT), among other technologies, as well as preparing for the arrival of 5G, in order to:
- Automate processes efficiently, enabling greater output in less time
- Create added value in manufactured products
- Connect the industrial plant with the point of sale
- Drive real-time analysis of data coming from different production lines
However, to achieve these goals and carry out this technological revolution, also known as Industry 4.0, manufacturers must face a series of non-trivial challenges. The industrial sector generates more data than any other, and in the digital era the speed, diversity and exponential volume of data can overwhelm traditional IT architectures. Moreover, most manufacturers face data silos, which make processing data slow and expensive. They therefore need a reliable IT platform that can integrate, centralize and analyze data from different sources and in different formats in an agile and secure way, putting information at the service of the business.
Experts from Enki and Denodo offer this online seminar to explain what data virtualization is and why industry leaders are betting on this innovative technology to optimize their IT strategy and achieve a significant ROI through faster, simpler and unified access to industrial data.
Analyst Webinar: Best Practices In Enabling Data-Driven Decision Making (Denodo)
Watch full webinar here: https://bit.ly/37YkgN4
This presentation looks at the trends that are emerging from companies on their journeys to becoming data-driven enterprises.
These trends are taken from a survey of 500 companies and highlight critical success factors, what companies are doing, their progress so far and their plans going forward. It also looks at the role that data virtualization has within the data driven enterprise.
During the session we'll address:
- What is a data-driven enterprise?
- What are the critical success factors?
- What are companies doing to create a data-driven enterprise and why?
- What progress are they making?
- What are the plans on people, process and technologies?
- Why is data virtualization central to provisioning and accessing data in a data-driven enterprise?
- How should you get started?
Looking to the Future: Embracing the Cloud for a More Modern Data Quality App... (Precisely)
Data quality: it’s what we all strive for, and yet we don’t always have what we need to achieve it.
Embracing the cloud with a more holistic, yet simplified user experience will help you find exponential value in your data today – and plan for tomorrow. Join us to learn about a more modern approach that will empower your teams to more deeply understand, trust, and pro-actively address anomalies in your critical data.
Learn more about the value of next-generation cloud solutions that will power your organization into the future by joining us on September 22 where you will hear from Precisely’s Emily Washington, SVP of Product Management, Chuck Kane, VP of Product Management, and David Woods, SVP of Strategic Services. Be sure to bring your questions for our team of experts to the live Q&A session following their presentations and demos.
MT01 The business imperatives driving cloud adoption (Dell EMC World)
Cloud adoption has reached an inflection point, pushing organizations into an "adapt or die" state, forcing new operating models, effective management of internal and external resources, and transformation towards an application-centric mentality. Cloud approaches are maturing past the point of public cloud domination, shifting focus to private and hybrid cloud and effective management of a multi-cloud environment. Attend this session to learn how to realize true business value when the friction of the business dynamic is supported by flexible cloud services delivered with predictability and speed.
Data Architecture Strategies: Data Architecture for Digital Transformation (DATAVERSITY)
Digital transformation builds on foundational data management approaches: MDM, data quality, data architecture, and more. At the same time, combining these foundational approaches with other innovative techniques can help drive organizational change as well as technological transformation. This webinar will provide practical steps for creating a data foundation for effective digital transformation.
In today’s environment it can be difficult for organizations to keep up with customer expectations. So often they try to innovate and deliver the best products while also modernizing legacy applications. Executives are pushing to bring all of these legacy applications to the cloud, but without the proper planning and execution this can become an expensive, time-consuming project.
Join this session to learn how to effectively move your data to the cloud with data that you can trust.
By Thoughtworks | Building data as a product: The key to unlocking Data Mesh'... (IngridBuenaventura)
Building data as a product: The key to unlocking Data Mesh's potential
Data as a product is an exciting concept. It brings product thinking to datasets and facilitates a data-driven culture by encouraging teams to share data rather than keeping it in silos. Once we are convinced of the philosophy of data as a product, the immediate question is how to build one.
In this talk we will uncover ways to design and architect data as a product that meets the needs of a business use case. We will also discuss creating a blueprint for better resiliency via contracts and service level objectives.
Speakers: Harmeet Sokhi, Lead Data Consultant, Thoughtworks and Vishal Srivastava, Senior Data Engineer, Thoughtworks
Harmeet has extensive experience in Cloud, data engineering and machine learning operations. She has worked on designing large enterprise-scale data applications and has also implemented mature machine learning engineering solutions for clients in several industries. She is always in the pursuit of learning and keeps herself current in the ever-changing technology landscape. She is an experienced team leader who helps address challenges, both technical and non-technical, to deliver highly credible results.
Vishal is a Senior Data Engineer with DevOps skills who has worked across a range of industries. He has experience in establishing cloud infrastructure foundations, event-driven data lake, data visualisation, master data management, data quality and data governance frameworks. He is passionate about real time event driven distributed systems. Vishal has used these experiences to enable use cases which help businesses realise real value from data.
Increase Operational Efficiency with District Heating and Cooling (DHC) Management System Powered by the CyberVille® IoE / Industrial Internet Application Platform
On the video below you can see Fortum's Suomenoja CHP power plant installation in action: https://www.youtube.com/watch?v=e6upXL-qcG4
Please see also Industrial Internet Consortium (IIC) case study on same topic: http://www.iiconsortium.org/case-studies/Cyberlightning_Fortum_Case_Study.pdf
Building a hybrid, dynamic cloud on an open architecture (Daniel Krook)
Daniel Krook's version of the IBM open cloud overview, focusing on the business and technological imperatives driving the IBM strategy for customers.
Presented 9/30 and 10/1 at Boston TechFest, Cambridge, MA.
Consumption-based analytics enabled by Data Virtualization (Denodo)
Watch full webinar here: https://buff.ly/2NM5Jtf
An eclectic mix of old and new data drives every decision and every interaction, but too many organisations are attempting, unsuccessfully, to consolidate this data into a single repository, an approach that is time-consuming, resource-intensive, expensive, and risky.
Join this Denodo and HCL Webinar to discover how data virtualization provides an effective modern day architecture and an alternative to data consolidation and the challenges of fragmented data ecosystems and traditional integration approaches. We will share stories and provide multiple perspectives on best practices and solutions.
Content will include:
- Business use cases that highlight challenges and solutions that result in faster time-to-market and greater ROI.
- Suggested approaches to achieve extreme agility for competitive advantage.
OpenWorld: 4 Real-world Cloud Migration Case Studies (Datavail)
In this presentation, get answers to these questions and more by exploring four different successful real-world Oracle EPM Cloud migration and implementation case studies for Oracle Enterprise Planning and Budgeting Cloud Service, Oracle Financial Consolidation and Close Cloud Service, and Oracle Account Reconciliation Cloud Service. Attendees get a bird's-eye view into the practicalities of moving to the cloud and making the business case for their own company.
IBM NYSE event - 1-16 IBM's Stephen Leonard - Remake Enterprise IT for the Ne... (Cliff Kinard)
Stephen Leonard GM, Sales, IBM Systems and Technology Group -
Why Infrastructure Matters
• Remake Enterprise IT for the New Era of Cloud
• Drive Business Transformation with Big Data & Analytics
Keynote presentation from the Jan 16th IBM Infrastructure Matters event NYSE.
Learn how Stephen Leonard, General Manager, Global Markets, IBM Systems, explains the impact of the IBM Flex System on businesses and how it can be instrumental in helping firms cut their operational costs. In simple terms, “IBM Flex System represents an entirely new generation of technology, with more performance and bandwidth, and true integrated enterprise SAN storage.”
For more information on Pure Systems, visit http://ibm.co/J7Zb1v.
http://www.scribd.com/doc/210711973/stephen-leonard-ibm-big-data-and-cloud
Moving IBM i Applications to the Cloud with AWS and Precisely (Precisely)
Core transactional systems like IBM i represent an essential element of the global economy and run mission-critical business processes today.
However, to remain competitive in today’s constantly evolving IT landscape, organizations must integrate cloud-based technologies, such as those from the AWS Cloud, into their architecture to unlock business value, especially in advanced analytics and AI applications.
Organizations that successfully integrate and operate new cloud-based technologies alongside their core legacy systems pave the way to deliver solutions that drive operations efficiently while creating space for innovation and forward-thinking projects. This combination will serve as a competitive advantage as business and customer needs continue to grow.
AWS Mainframe Modernization Data Replication with Precisely unleashes mainframe data for innovation with the AWS Cloud by enabling near real-time replication of heterogeneous data from mainframe data sources like Db2, IMS, and VSAM to a wide range of AWS Cloud database destinations.
Join the session to see how Precisely and AWS together provide modernization capabilities to users looking to drive innovation from IBM i data through near real-time replication to the AWS Cloud, providing the foundation for new business channels.
During this session, we will discuss:
- How successful organizations manage both cloud and legacy systems
- The combined AWS and Precisely offering
- Real-world use cases of IBM i users leveraging AWS and Precisely
State of ICS and IoT Cyber Threat Landscape Report 2024 preview (Prayukth K V)
The IoT and OT threat landscape report has been prepared by the Threat Research Team at Sectrio using data from Sectrio's cyber threat intelligence farming facilities spread across over 85 cities around the world. In addition, Sectrio also runs AI-based advanced threat and payload engagement facilities that serve as sinks to attract and engage sophisticated threat actors and newer malware, including new variants and latent threats that are at an earlier stage of development.
The latest edition of the OT/ICS and IoT security Threat Landscape Report 2024 also covers:
State of global ICS asset and network exposure
Sectoral targets and attacks as well as the cost of ransom
Global APT activity, AI usage, actor and tactic profiles, and implications
Rise in volumes of AI-powered cyberattacks
Major cyber events in 2024
Malware and malicious payload trends
Cyberattack types and targets
Vulnerability exploit attempts on CVEs
Attacks on counties – USA
Expansion of bot farms – how, where, and why
In-depth analysis of the cyber threat landscape across North America, South America, Europe, APAC, and the Middle East
Why are attacks on smart factories rising?
Cyber risk predictions
Axis of attacks – Europe
Systemic attacks in the Middle East
Download the full report from here:
https://sectrio.com/resources/ot-threat-landscape-reports/sectrio-releases-ot-ics-and-iot-security-threat-landscape-report-2024/
Let's dive deeper into the world of ODC! Ricardo Alves (OutSystems) will join us to tell all about the new Data Fabric. After that, Sezen de Bruijn (OutSystems) will get into the details on how to best design a sturdy architecture within ODC.
Search and Society: Reimagining Information Access for Radical Futures (Bhaskar Mitra)
The field of Information retrieval (IR) is currently undergoing a transformative shift, at least partly due to the emerging applications of generative AI to information access. In this talk, we will deliberate on the sociotechnical implications of generative AI for information access. We will argue that there is both a critical necessity and an exciting opportunity for the IR community to re-center our research agendas on societal needs while dismantling the artificial separation between the work on fairness, accountability, transparency, and ethics in IR and the rest of IR research. Instead of adopting a reactionary strategy of trying to mitigate potential social harms from emerging technologies, the community should aim to proactively set the research agenda for the kinds of systems we should build, inspired by diverse explicitly stated sociotechnical imaginaries. The sociotechnical imaginaries that underpin the design and development of information access technologies need to be explicitly articulated, and we need to develop theories of change in the context of these diverse perspectives. Our guiding future imaginaries must be informed by other academic fields, such as democratic theory and critical theory, and should be co-developed with social science scholars, legal scholars, civil rights and social justice activists, and artists, among others.
Accelerate your Kubernetes clusters with Varnish Caching (Thijs Feryn)
A presentation about the usage and availability of Varnish on Kubernetes. This talk explores the capabilities of Varnish caching and shows how to use the Varnish Helm chart to deploy it to Kubernetes.
This presentation was delivered at K8SUG Singapore. See https://feryn.eu/presentations/accelerate-your-kubernetes-clusters-with-varnish-caching-k8sug-singapore-28-2024 for more details.
Connector Corner: Automate dynamic content and events by pushing a button (DianaGray10)
Here is something new! In our next Connector Corner webinar, we will demonstrate how you can use a single workflow to:
Create a campaign using Mailchimp with merge tags/fields
Send an interactive Slack channel message (using buttons)
Have the message received by managers and peers along with a test email for review
But there’s more:
In a second workflow supporting the same use case, you’ll see:
Your campaign sent to target colleagues for approval
If the “Approve” button is clicked, a Jira/Zendesk ticket is created for the marketing design team
But—if the “Reject” button is pushed, colleagues will be alerted via Slack message
Join us to learn more about this new, human-in-the-loop capability, brought to you by Integration Service connectors.
Speakers:
Akshay Agnihotri, Product Manager
Charlie Greenberg, Host
Neuro-symbolic is not enough, we need neuro-*semantic* (Frank van Harmelen)
Neuro-symbolic (NeSy) AI is on the rise. However, simply machine learning on just any symbolic structure is not sufficient to really harvest the gains of NeSy. These will only be gained when the symbolic structures have an actual semantics. I give an operational definition of semantics as “predictable inference”.
All of this illustrated with link prediction over knowledge graphs, but the argument is general.
JMeter webinar - integration with InfluxDB and Grafana (RTTS)
Watch this recorded webinar about real-time monitoring of application performance. See how to integrate Apache JMeter, the open-source leader in performance testing, with InfluxDB, the open-source time-series database, and Grafana, the open-source analytics and visualization application.
In this webinar, we will review the benefits of leveraging InfluxDB and Grafana when executing load tests and demonstrate how these tools are used to visualize performance metrics.
Length: 30 minutes
Session Overview
-------------------------------------------
During this webinar, we will cover the following topics while demonstrating the integrations of JMeter, InfluxDB and Grafana:
- What out-of-the-box solutions are available for real-time monitoring JMeter tests?
- What are the benefits of integrating InfluxDB and Grafana into the load testing stack?
- Which features are provided by Grafana?
- Demonstration of InfluxDB and Grafana using a practice web application
To view the webinar recording, go to:
https://www.rttsweb.com/jmeter-integration-webinar
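As a rough illustration of the pattern the webinar demonstrates (JMeter's Backend Listener writes results to InfluxDB, and Grafana charts them), here is a minimal sketch in Python that queries those metrics directly. The database and measurement names ('jmeter') are assumptions based on the listener's common defaults, not details confirmed by the webinar:

```python
# Illustrative sketch only: assumes JMeter's InfluxDB Backend Listener has been
# writing results to a local InfluxDB instance, database "jmeter",
# measurement "jmeter" (the listener's usual defaults -- adjust to your setup).
from influxdb import InfluxDBClient  # pip install influxdb

client = InfluxDBClient(host="localhost", port=8086, database="jmeter")

# Average response time per transaction over the last 15 minutes --
# the same kind of query a Grafana panel would issue for its graphs.
query = (
    'SELECT MEAN("avg") AS avg_ms '
    'FROM "jmeter" '
    "WHERE time > now() - 15m "
    'GROUP BY "transaction"'
)

for (_, tags), points in client.query(query).items():
    for point in points:
        print(f"{tags['transaction']}: {point['avg_ms']:.1f} ms")
```

Grafana issues essentially the same InfluxQL queries under the hood; the dashboard shown in the demo is a visual layer over this data.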
Key Trends Shaping the Future of Infrastructure.pdf (Cheryl Hung)
Keynote at DIGIT West Expo, Glasgow on 29 May 2024.
Cheryl Hung, ochery.com
Sr Director, Infrastructure Ecosystem, Arm.
The key trends across hardware, cloud and open-source; exploring how these areas are likely to mature and develop over the short and long-term, and then considering how organisations can position themselves to adapt and thrive.
DevOps and Testing slides at DASA Connect (Kari Kakkonen)
Slides by me and Rik Marselis from the DASA Connect conference on 30.5.2024. We discuss what testing is, then what agile testing is, and finally what testing in DevOps looks like. We also ran a lovely workshop with the participants, exploring different ways to think about quality and testing in the different parts of the DevOps infinity loop.
Essentials of Automations: Optimizing FME Workflows with Parameters (Safe Software)
Are you looking to streamline your workflows and boost your projects’ efficiency? Do you find yourself searching for ways to add flexibility and control over your FME workflows? If so, you’re in the right place.
Join us for an insightful dive into the world of FME parameters, a critical element in optimizing workflow efficiency. This webinar marks the beginning of our three-part “Essentials of Automation” series. This first webinar is designed to equip you with the knowledge and skills to utilize parameters effectively: enhancing the flexibility, maintainability, and user control of your FME projects.
Here’s what you’ll gain:
- Essentials of FME Parameters: Understand the pivotal role of parameters, including Reader/Writer, Transformer, User, and FME Flow categories. Discover how they are the key to unlocking automation and optimization within your workflows.
- Practical Applications in FME Form: Delve into key user parameter types including choice, connections, and file URLs. Allow users to control how a workflow runs, making your workflows more reusable. Learn to import values and deliver the best user experience for your workflows while enhancing accuracy.
- Optimization Strategies in FME Flow: Explore the creation and strategic deployment of parameters in FME Flow, including the use of deployment and geometry parameters, to maximize workflow efficiency.
- Pro Tips for Success: Gain insights on parameterizing connections and leveraging new features like Conditional Visibility for clarity and simplicity.
We’ll wrap up with a glimpse into future webinars, followed by a Q&A session to address your specific questions surrounding this topic.
Don’t miss this opportunity to elevate your FME expertise and drive your projects to new heights of efficiency.
According to DCD’s census report of global data center investment in 2013, which covers all expenditures related to DC construction, infrastructure, and IT, there was a worldwide increase of 8% in spending over 2012. The global investment spend was US$151.3 billion, with Latin America leading the way with a 12.2% increase. This is due to many factors, the biggest being the growing number of devices connected to the internet, or ‘The Internet of Things’, driving and enabling social media, entertainment, and instant data any time, anywhere and on any device. More businesses are relying on the internet to enable customers to shop, research, and perform tasks. This has led to new data center models being developed and the rise of service and cloud provider giants such as Amazon, Baidu, Alibaba, and Google.
The nexus of forces describes the convergence and mutual reinforcement of four interdependent trends: social interaction, mobility, cloud, and information. The forces combine to empower individuals as they interact with each other and their information through well-designed ubiquitous technology.
What Is the Nexus, and Where Does It Come From?
The nexus is the point at which two or more of four major IT forces converge to create new patterns of outcomes in technology use, business reality, market dynamics and changes to the lives of people.
The main drivers for the trends in the industry are what Gartner calls the Nexus of Forces. Social media, mobile devices, information (big data) and cloud technologies are reshaping the way businesses rely on their data centers to generate new opportunities, create new methods of advertising, optimize the customer experience, and collect valuable marketing research inexpensively. This nexus of forces has made it easier for the consumer to demand more of the business, raising the bar for instant gratification.
Four independent forces — social, mobile, cloud and information — have converged as a result of human behavior, creating a technology-immersed environment. The forces interact and reinforce one another and are associated through complex dependencies. New business opportunities emerge from this Nexus of Forces, especially scenarios that extend reach and relationship to customers, citizens, patients, employees or any other participant in an ecosystem of humans and machines. The combination of pervasive mobility, near-ubiquitous connectivity, industrial compute services and information access decreases the gap between idea and action. To take advantage of the Nexus of Forces and respond effectively, organizations must face the challenges of modernizing their systems, skills and mindsets. Organizations that ignore the Nexus of Forces will be displaced by those that can move into the opportunity space more quickly — and the pace is accelerating.
Cisco estimates that there are 1.4 Trillion devices that can be connected but only 10 Billion are connected today. Cisco predicts by 2020 that there will be 500 Billion devices attached to the internet.
On-demand business: anytime/anywhere/any-device data creates the consumerization of enterprise IT, powered by the internet, which empowers the user more than ever. The enterprise needs to be intelligent and interactive to conduct business in an on-demand manner. To do so, enterprises need to shift from functional to integrated enterprise platforms; shifting from silos to integrated enterprise platforms is key to competing. Quick fact: 50% of all internet traffic is from a mobile device (smartphone, tablet, notebook, etc.).
Example: UK retailers John Lewis and Waitrose have seen ‘omni-channel’ (the combined use of online and mobile platforms, in-store, and social media) drive sales upward (19.2% and 41.4% year over year). Web channel business bandwidth grew 50%. Retailers are using the data collected to provide a more personalized experience for customers, strengthening a personal bond that is sometimes missing from a pure online purchase. They are seeing people come to the store for advice and then purchase online.
Show-rooming is increasing, which gives more power to the consumer. Because of this we are seeing a shift in consumer/retailer relationships that also shifts the way businesses use data centers, and two major trends follow.
Due to the increased demand from consumers for services and access to data via the internet, businesses have been relying more and more on the data center to drive the business. The data center is becoming what the factory was in the late 1800s and throughout most of the 1900s: the delivery mechanism for products, services, advertising, news, information, and communication in the 21st century. Whose household today doesn’t have at least 5-10 devices connected to the internet? The future promises more connected devices, a variety of time-saving services, and new delivery methods for products such as entertainment, education, banking, etc. Businesses are using their data centers to generate more revenue in the form of these services and product delivery, and to collect ‘Big Data’ to use in developing new products and services for the consumer.
We’ve been experiencing this decrease or delay in spending from the enterprise DC due to the adoption of virtualization. Virtualization is increasing the power density of the cabinet and decreasing the need for additional cabinets and space to support the data center. What used to be thousands of servers, operating at 20% or less of capacity, in dozens of cabinets, is now reduced to roughly a tenth of the servers operating at 75-90% or more of capacity. This delays the need for more DC build-out. Network connectivity is decreased due to the ability of servers and software to use network bandwidth at a higher capacity level. At the same time, energy savings are amplified by the decrease in servers in use and by new DC trends such as containment, free cooling, and higher DC operating temperatures.
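To see where a roughly ten-to-one reduction can come from, here is a small back-of-the-envelope sketch (Python, with illustrative utilization figures consistent with the "20% or less" before and "75-90% or more" after mentioned above; the exact numbers are assumptions, not data from the slide):

```python
# Back-of-the-envelope consolidation math with illustrative numbers only.
physical_servers_before = 1000   # legacy estate
avg_util_before = 0.10           # "20% or less" -- assume ~10% average
avg_util_after = 0.90            # "75-90% or more" -- assume ~90% on virtualized hosts

# Total useful work, expressed in fully-utilized-server equivalents.
work = physical_servers_before * avg_util_before   # 100 server-equivalents

# Hosts needed to deliver the same work at the higher utilization.
hosts_after = work / avg_util_after                # ~111 hosts

print(f"Hosts after consolidation: ~{hosts_after:.0f} "
      f"({hosts_after / physical_servers_before:.0%} of the original estate)")
# -> roughly a tenth of the original server count
```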
3) Outsourced Services – I’ve divided these into two types. MTDC refers to the trend of not owning and operating the data center as an asset, but rather leasing the space and the services to support the ‘physical environment’, aka the power delivery chain, cooling, and all the maintenance required to operate a data center. The business still owns and operates the IT infrastructure. In some models the MTDC provides cables and cabinets; others provide just space, power and cooling. Hyperscale data center services come in a variety of shapes, sizes, and flavors. A business can purchase ‘cloud services’, which are an amount of compute capacity in the form of servers, storage, network security and/or bandwidth that the business leases for a period of time to run its applications and services. This eliminates the need for space, equipment and support. MTDC operators are also looking to provide other services to supplement their offerings and income, in the form of network connectivity or bandwidth between their properties as well as other cloud or XaaS offerings. Other models include a variety of XaaS, or ‘something’ as a service, such as storage, infrastructure, software, etc., that a business can leverage to reduce its capital investment in IT and the data center. We are seeing more of a blurred line between cloud and XaaS and predict that these will merge in the future. Today we’re seeing an 80/20 split, according to Gartner, between MTDC services and Hyperscale services. MTDC is beginning to decline as the Hyperscale services prove their reliability and cost savings. Gartner envisions that in the next 10-15 years 80% of US data centers will be using Hyperscale technology and services in some form, or a hybrid design.
Rarely interactive with the consumer: the website was a place to get information, a marketing tool / display, only one way to get information on the net.
New types of end points… This means no longer person to person, but machine to machine. E.g., newer household appliances connected to the web send operational status back to the manufacturer for proactive repair.
As noted above, virtualization is delaying enterprise data center build-outs and shrinking server counts; the examples below show that effect.
Example: Data Center.com reports Barclays Shuts Six Data Centers (2/13/2014): “… Barclays has revealed that it has shut six datacentres and reduced thousands of servers since mid-2013 as part of its cost-cutting efforts. The bank is implementing a series of initiatives – including restructuring the business, redundancies and investment in technology – to help reduce operating expenditures. As part of this programme titled ‘Transform’, the bank – as well as closing data centres – has reduced its number of servers by 6,000 since H1 2013.”
Data Centres News (2/13/2014) reports that the US General Services Administration (GSA) is to close 24 data centres in 2014: “…The Office of Management and Budget has touted datacentre consolidation as a major cost-cutting initiative, estimated to save US$3 billion by 2015. The initial goal was to close 800 of the government’s estimated 2,094 datacentres. Subsequent revised estimates suggest that this was a substantial under-estimation and that the total number of datacentres is above 6,000.”
This chart shows how enterprise DC operators are delaying their next DC build-out and the effects of virtualization on the server installed base. Spending on actual server hardware is decreasing or staying relatively flat (purple bars). The logical (virtualized) server installed base is increasing (red line), while the physical server installed base is declining or staying flat (purple line). This is decreasing the need for more data centers, but making the existing data centers more dense. Costs of power and cooling are again trending upward, and managing this increase in logical servers is becoming more expensive. Outsourcing some of these services is trending upward to decrease the cost of IT.
Gartner forecasts that the industry is about 68% virtualized and will continue to increase moving forward. There will always be applications that require a single server, but the trend is to continue down the road of virtualization, which is key for moving to the cloud. Virtualized applications can be moved into a cloud architecture and take advantage of Hyperscale technology (private, hybrid, or public).
Data Centre News reports (2/13/2014) that Telecity Group results confirm a positive outlook: “…On an organic basis we opened new capacity across Europe, typically around existing sites, where connectivity is high and we have growing demand from ecosystems of interconnected customers. We have also enhanced our offerings, in particular with our new Cloud-IX platform which ensures we extend our position at the core of the digital economy in Europe.”
451 Reports: $23 Billion in revenue for managed hosting.
Industrially designed infrastructure: back to basics… Disaggregated for cost (separated into component parts or modules), designed for efficiency, engineered for serviceability, architected for agility.
Web-Oriented Architecture – design for failure and scale. Typical failures in the 1st year for a new Google cluster: one PDU, 20 racks, 1000 machines, 1000s of disks (source: “Designs, Lessons and Advice from Building Large Distributed Systems”, Jeff Dean, Google).
Velocity-oriented processes: DevOps – continuous development, monitoring and delivery.
API-based: development, QA, operations.
Collaboratively aligned organization: shared metrics between Dev & Ops.
Risk-embracing culture: the opposite of the IT comfort zone. Take risks with a ‘good enough’ attitude. Break things, learn from them and move on. Challenge traditional norms.
Open source: plenty of free information and designs out there for all of these. You don’t need to re-invent the wheel.
Hand-crafted by guild-level artisans versus standardized and mass-produced.
Large shops must adopt today. SMBs cannot avoid this forever.
Scaling both in volume and geographically is a key factor.
Another driver toward using outsourced services or private cloud architecture is the value of ‘faster’. Businesses state that agility and speed are the biggest reasons for moving to a cloud architecture; ultimately, the primary business case for cloud computing will often be speed for the business, which lets business units make a business case based on speed. The value of ‘faster’ is a new trend in the data center, reflecting the connected business world we are quickly evolving into: getting the product out to market as quickly as possible. In a connected world, opportunities come and go fast, and consumers expect immediate gratification because of the internet. Cloud also allows the business to experiment with low cost and low investment, eliminating the barriers to experimentation. Products that aren’t favored by consumers can be quickly discarded without major losses in profits; in other words, fail faster to win big.
Education: No more certification theater; vendor and expert programs lose their value (Carl Claunch, Gartner).
Virtualizing the DC: from static to dynamic. Monolithic servers, storage and networks, to fabric-enabled networks, storage and servers, to future fabric-based systems: pooled and globally shared resources, boundaryless, unified fabric, disaggregated components combined dynamically.
Fabric: the ability to combine components at will (server, network, storage, I/O, specialty engines). Benefits: increases useful life, improves density and energy efficiency, evolving rapidly, SiPho accelerating use, leverages technology disruptions; the trend is supported by major systems vendors and start-ups.
Servers: inexpensive rack-mounted, 2- or 4-socket, internal SAS drives, simplified, no frills, reduced complexity, ultra-efficient physical infrastructure, industrial engineering, open source and almost in-house everything, ODM/self-build.
ELE: use lower-power processor types (ARM, Atom, Power, MIPS, Xeon), thousands of servers per rack, 10s to 100s of thousands of cores per rack.
ODM: OEM servers built by ODMs; ODMs produce 76% of all servers; ODMs now selling direct to end users; self-build + ODM direct is about 7.27% of the market; Hyperscale will consume 17% of the x86 market by 2016 (Gartner); ODM direct share will be 4.39% of the x86 market by 2016; coalitions of buyers increasing influence (Open Compute, open networking).
SiPho: direct chip fabrication of the optical link, larger data rates, longer-distance connections, low latency, should lead to volume economics… PoC with Intel and Corning.
Consumers are reshaping the hardware industry. The white box effect: Hyperscalers are adopting Open Compute techniques by utilizing white box servers, sometimes known as skinless servers, specifically designed by the Hyperscale provider to have only the bare components required to run their software, creating a very low-power server. All energy is used to operate the server; no unnecessary components are installed, concentrating all of the power supplied on producing compute. The losers in this market are the big hardware manufacturers who produce 76% of all servers today: Dell, HP, and Lenovo (IBM sold its server unit to Lenovo in January). They have been reporting lower sales and eroding margins. The winners are the ODMs (Original Design Manufacturers), who are building purpose-built servers for these shops, and larger end users adopting Hyperscale principles. Today ODMs have 7.27% of the market share. Gartner forecasts Hyperscale will consume 17% of the x86 market by 2015 and continue to grow. ODM direct market share will be 4.39% of the x86 market by 2016.
ELE or Extremely Energy Efficient servers are extremely dense servers utilizing ARM and Atom processors (which are mainly used in smartphones and tablets for their low power draw). These servers provide hundreds of processors in a 4-8 RU box and use a fraction of the power that the same number of conventional servers would use. HP’s Moonshot has 480 processors and uses a little over 1500 watts of power. The speed and memory capabilities of these servers have increased 10-fold in the past 12 months. These servers use between 8 and 16 high-speed assemblies for 40 Gbps network connections.
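For a sense of the density those Moonshot figures imply, a one-line calculation (using the rounded numbers quoted above) gives the approximate per-node power draw:

```python
# Rough per-node power draw for the HP Moonshot figures quoted above.
total_watts = 1500      # "a little over 1500 watts" for the chassis
server_nodes = 480      # "480 processors" per chassis, as quoted
print(f"~{total_watts / server_nodes:.1f} W per node")  # ~3.1 W per node
```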
Other manufacturers feeling this effect are the networking giants. Part of the Open Compute project is to design and build a network switch that will be much like a server: generic hardware with the ability to run a variety of networking software operating systems. This would eliminate the current hold that network manufacturers have on the hardware, software and support of their products. The reality of this is still several years off, but gaining momentum. The networking giants are combating this at every turn and designing products to provide more flexibility and functionality in simpler, low-latency, two-tier networks, with appliances making connections and directing traffic in place of today’s switches and routers.
Traditional manufacturers have been creating a variety of products that integrate services, hardware, software, and modular designs to provide ‘cloud in a box’, to keep their market share and show the consumer that a private cloud option is available, at a low cost, with service and support provided by the manufacturer. If adopted, traditional manufacturers will keep a larger portion of their market share, especially with risk-averse customers.
The next two slides show the growth in the Hyperscale data center market. The increased use of ODM or purpose-built servers over traditional servers shows how Hyperscale concepts are driving the market.
At Belden, we understand a data center manager’s goals and pain points…
1. Do you believe that all data centers will utilize Cloud technology and construct data centers like the large Cloud providers like Amazon?
A. Maybe, someday. I believe there will be private clouds in enterprise data centers, with a hybrid cloud that bridges to services offered by the public cloud providers at a much lower cost than an enterprise can support itself. I see disaster recovery systems and storage being big players in this ‘hybrid’ cloud world. Of course, being technology, every day someone attempts to build a better mousetrap, or should I say service. The concept of failing fast will test the viability of many more applications at a much lower cost, by startups and well-established companies alike.
2. Are you telling us that we need to adopt Cloud technology even if we’re a small business?
A. You shouldn’t take risks for the sake of risk-taking. A business needs to take calculated risks at first and assure the staff that it’s OK to fail, but fail quickly and learn from your mistakes. Risk and fail-fast go hand in hand. Extreme amounts of vetting/QA in the cloud-based enterprise is a dying concept. Implement in a calculated process, like Facebook’s method of rolling new code out slowly across the globe and seeing what arises.
3. You said that being a generalist in business and technology will be more valuable to a business than an expert. If there are no experts, who will get the work accomplished?
A. Eventually you will need to get with the modern world and use this technology to stay competitive. We tossed out typewriters (the workhorse of the pre-1980 office) for desktop computers. Today, a company could not remain competitive if it gave its employees a typewriter instead of a PC, laptop, or tablet to perform their jobs. The same goes for the concepts discussed here today. Smaller companies may find it easier to outsource in one of the ways discussed until the business grows to a level where it makes sense to support this themselves.
4. You said that being a generalist in business and technology will be more valuable to a business than an expert. If there are no experts, who will get the work accomplished?
A. By this, I mean being open-minded to different technologies and new concepts: having a well-rounded skill set while understanding the needs of your company and putting the success of the business first, in order to be successful in this highly competitive business climate.