Self-tuning data centers aim to minimize human intervention through machine learning techniques. Current challenges include meeting service level agreements for performance and uptime while maximizing efficiency of resources and minimizing costs. A self-tuning architecture uses monitoring data to detect issues and make recommendations for scaling, migration, or tuning of resources without human input. This approach aims to optimize data centers so they can scale efficiently to support growing workloads and applications.
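Concretely, such a loop can be sketched as a simple policy over recent monitoring samples; the metric, thresholds, and action labels below are illustrative assumptions, not a reference to any particular product:

```python
# Minimal sketch of a self-tuning loop: monitoring data in,
# scaling recommendation out. Thresholds are assumed values.

def recommend(cpu_samples, high=0.80, low=0.20):
    """Return a scaling recommendation from recent CPU-utilization
    samples (each in [0, 1]) for one resource pool."""
    if not cpu_samples:
        return "no-data"
    avg = sum(cpu_samples) / len(cpu_samples)
    if avg > high:
        return "scale-out"   # sustained overload: add capacity
    if avg < low:
        return "scale-in"    # sustained idleness: reclaim capacity
    return "steady"          # within the SLA band: no action

print(recommend([0.91, 0.88, 0.95]))  # scale-out
print(recommend([0.05, 0.10, 0.08]))  # scale-in
print(recommend([0.50, 0.55]))        # steady
```

A production system would replace the fixed thresholds with learned models, but the control structure (observe, decide, recommend) stays the same.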
The NGN statement on e2e network performance was frozen in 2008 with the Y.2173 (former Y.mnm) standard, which specified a method for coordination across domains involving active probing, parsing and sharing of results, etc. Today it is widely accepted that future network functions are to be virtualized and controlled by software. However, virtual switches -- the best models of which to date are based on MiniOS/ClickOS/ClickRouter designs -- mimic the blackboxing policy found in clouds. This paper shows that if this policy is revoked in favor of declarative coordination, the complexity of optimizing overlaid service networks can be substantially reduced. In the proposed case, domains declare their resources/capabilities (e.g. e2e throughput and delay) openly, allowing service networks to coordinate with each other. The traditional approach is represented by the VNE optimization problem.
M2C2: A Mobility Management System For Mobile Cloud Computing (Karan Mitra)
Mobile devices have become an integral part of our daily lives. Applications running on these devices may avail themselves of storage and compute resources from the cloud(s). Further, a mobile device may also connect to heterogeneous access networks (HANs) such as WiFi and LTE to provide ubiquitous network connectivity to mobile applications. These devices have limited resources (compute, storage and battery) that may lead to service disruptions. In this context, mobile cloud computing enables offloading of computing and storage to the cloud. However, applications running on mobile devices using clouds and HANs are prone to unpredictable cloud workloads, network congestion and handoffs. To run these applications efficiently, the mobile device requires the best possible cloud and network resources while roaming in HANs. This paper proposes, develops and validates a novel system called M2C2 which supports mechanisms for: (i) multihoming, (ii) cloud and network probing, and (iii) cloud and network selection. We built a prototype system and performed extensive experimentation to validate the proposed M2C2. Our results show that the proposed system supports mobility efficiently in mobile cloud computing.
Paper can be downloaded from: http://karanmitra.me/wp-content/uploads/2015/02/MitraetalLTUWCNC_Preprint2015.pdf
The document starts with an introduction to mobile cloud computing, including a definition, architecture, and advantages/disadvantages. The next sections continue with applications of MCC and detailed challenges and solutions in the mobile environment. Lastly, the conclusion summarizes the main issues of mobile cloud computing.
This is a small and simple presentation on the topic of Mobile Cloud Computing, made for a symposium. The content inside the slides is taken from Google and various research papers; this slide deck is purely for educational purposes and not meant for commercial publication.
With the rapid growth of mobile applications and the development of the cloud computing concept, mobile cloud computing (MCC) has been introduced as a potential technology for mobile services. MCC integrates cloud computing into the mobile environment and overcomes obstacles related to performance, security, etc. discussed in mobile computing. This paper gives an overview of MCC, including its definition, architecture, and applications. The issues, existing solutions, and approaches are presented.
Energy Efficient Mobile Applications with Mobile Cloud Computing (MCC) (Anand Bhojan)
http://www.icghia2014.in/keynotespeakers.htm
Conference Keynote Talk: International Conference on Graph Algorithms, High Performance Implementations and Applications Coimbatore (ICGHIA'14)
The presentation is based on Rajkumar Buyya's talk on the Cloudbus Toolkit. Amit Kumar Nath (CSE, DU) and I made this presentation to provide a brief description of some useful Cloudbus toolkits, such as Aneka, CloudSim, Broker, Cloud Maker, and Workflow.
- Introduction to Cloud Computing
- Issue analysis on Cloud Computing
- Related standardization activities
- Standardization issue from ISO/IEC JTC 1 Perspectives
- Recommendation to JTC 1 for standardization
Research Seminar Presentation - A framework for partitioning and execution of... (malinga2009)
This is a presentation slide-set presented at the Research Seminar Series at UCSC on 12th August 2013. Two new research papers are presented and discussed each week, and the audience is encouraged to ask questions regarding those papers. Altogether 40 papers will be presented within an academic year.
Abstract : This paper addresses the problem of automatic temporal annotation of realistic human actions in video using minimal manual supervision. To this end we consider two associated problems: (a) weakly-supervised learning of action models from readily available annotations, and (b) temporal localization of human actions in test videos. To avoid the prohibitive cost of manual annotation for training, we use movie scripts as a means of weak supervision. Scripts, however, provide only implicit, noisy, and imprecise information about the type and location of actions in video. We address this problem with a kernel-based discriminative clustering algorithm that locates actions in the weakly-labeled training data. Using the obtained action samples, we train temporal action detectors and apply them to locate actions in the raw video data. Our experiments demonstrate that the proposed method for weakly-supervised learning of action models leads to significant improvement in action detection. We present detection results for three action classes in four feature length movies with challenging and realistic video data.
Link to paper :
http://dl.acm.org/citation.cfm?id=2479946
Machine Learning: The Key Ingredient to the Self-Driving Data Center (Sergey A. Razin)
This deck was presented at the Great Wide Open event in Atlanta and describes the vision, approach and ML toolset that can ultimately deliver the self-driving data center.
Self-Driving Data Center (Apply Machine Learning to the Cloud) (Sergey A. Razin)
The traditional datacenter is broken up into a number of silos: network, storage, virtualization, and application. The emerging Software Defined Datacenter movement breaks down those silos and creates a playground for innovation, convergence, and new opportunities to reveal the hidden and unknown. During this session I will describe what the Software Defined Data Center hype is all about, how it breaks the traditionally established silos, and how it creates opportunities for data-driven orchestration powered by machine learning principles that will make cloud providers and enterprises finally realize the value of virtualization and ultimately deliver the self-driving data center, with open initiatives front and center.
The Self Healing Cloud: Protecting Applications and Infrastructure with Autom... (Denim Group)
Organizations often have to deploy arbitrary applications on their infrastructure without thorough security testing. These applications can contain serious security vulnerabilities that can be detected and exploited remotely and in an automated manner. The applications themselves and the infrastructure they are deployed on are then at risk of exploitation. Configuration changes or vendor-provided software updates and patches are typically used to address infrastructure vulnerabilities. However, application-level vulnerabilities often require coding changes to be fully addressed.
Virtual patching is a technique where targeted rules are created for web application firewalls (WAFs) or other IDS/IPS technologies to help mitigate specific known application vulnerabilities. This allows applications to be “virtually” patched prior to actual code-level patches being applied. These virtual patches are most often applicable to vulnerabilities that have a strong detection signature such as SQL injection and cross-site scripting (XSS) because the detection rules can be targeted to detect these signatures, but limited only to specific parts of the application attack surface where the application is known to be vulnerable.
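As an illustrative sketch (not taken from the presentation), a virtual patch of this kind in ModSecurity v2 rule syntax might look like the following; the parameter name `item_id` and the rule id are hypothetical:

```apache
# Hypothetical virtual patch: block SQL injection attempts against the
# one parameter known to be vulnerable, rather than inspecting the
# application's entire attack surface.
SecRule ARGS:item_id "@detectSQLi" \
    "id:100001,phase:2,t:urlDecodeUni,deny,status:403,log,msg:'Virtual patch: SQLi in item_id'"
```

Scoping the rule to a single parameter is what keeps the false-positive rate low compared with a blanket signature applied to all inputs.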
This presentation examines the automatic creation of virtual patches from automated web application security scanner results and explores scenarios where this approach might be successfully employed. It discusses theoretical approaches to the problem and provides specific demonstrations using Open Source tools such as the skipfish and w3af scanners and Snort and mod_security protection technologies. Finally, it looks at opportunities to apply these techniques to protect arbitrary applications deployed into arbitrary infrastructures so that short-term protection against common web application attacks can be consistently applied while minimizing false blocking of legitimate traffic.
Dynatrace: New Approach to Digital Performance Management - Gartner Symposium... (Michael Allen)
New cloud stacks, containers, micro-services, automation and DevOps are driving an explosion of application code and infrastructure complexity. It is now nearly impossible to solve digital application performance management challenges with traditional tools and approaches. Hear how we are delivering on our vision for digital performance management, and how digital virtual assistants might find their way into your enterprise. Meet D.A.V.I.S.
Ensuring the security of a company's data and infrastructure has largely become a data analytics challenge. It is about finding and understanding patterns and behaviors that are indicative of malicious activities or deviations from the norm. Data, Analytics, and Visualization are used to gain insights and discover those malicious activities. These three components play off of each other, but also have their inherent challenges. A few examples will be given to explore and illustrate some of these challenges.
Video (at YouTube) - http://bit.ly/19TNSTF
Big Data Security Analytics, Data Science and Machine Learning are a few of the new buzzwords that have invaded our industry of late. Most of what we hear are promises of a unicorn-laden, silver-bullet panacea from heavy-handed marketing folks, evoking an expected pushback from the most enlightened members of our community.
This talk will help parse what we as a community need to know and understand about these concepts, cover the technical details and actual capabilities behind them, and show where they fail and how they can be exploited and fooled by an attacker.
The talk will also share results of the author's ongoing research (on MLSec Project) into applying machine learning techniques to information security monitoring.
AI & ML in Cyber Security - Welcome Back to 1999 - Security Hasn't Changed (Raffael Marty)
It is the year 2017. Cyber security has been a discipline for many years, and thousands of security companies offer solutions to deter and block malicious actors in order to keep our businesses operating and our data confidential. But fundamentally, cyber security has not changed during the last two decades. We are still running Snort and Bro. Firewalls are fundamentally still the same. People get hacked for their poor passwords, and we collect logs that we don't know what to do with. In this talk I will paint a slightly provocative and dark picture of security. Fundamentally, nothing has really changed. We'll have a look at machine learning and artificial intelligence and see how those techniques are used today. Do they have the potential to change anything? How will the future look with those technologies? I will show some practical examples of machine learning and argue that simpler approaches generally win. Maybe we find some hope in visualization? Or maybe augmented reality? We still have a ways to go.
Introduction to Modern Data Virtualization 2021 (APAC) (Denodo)
Watch full webinar here: https://bit.ly/2XXyc3R
“Through 2022, 60% of all organisations will implement data virtualization as one key delivery style in their data integration architecture," according to Gartner. What is data virtualization and why is its adoption growing so quickly? Modern data virtualization accelerates time to insights and data services without copying or moving data.
Watch on-demand this webinar to learn:
- Why organizations across the world are adopting data virtualization
- What is modern data virtualization
- How data virtualization works and how it compares to alternative approaches to data integration and management
- How modern data virtualization can significantly increase agility while reducing costs
Advanced Analytics and Machine Learning with Data Virtualization (Denodo)
Watch full webinar here: https://bit.ly/32c6TnG
Advanced data science techniques, like machine learning, have proven to be an extremely useful tool for deriving valuable insights from existing data. Platforms like Spark, and complex libraries for R, Python and Scala, put advanced techniques at the fingertips of data scientists. However, these data scientists spend most of their time looking for the right data and massaging it into a usable format. Data virtualization offers a new alternative to address these issues in a more efficient and agile way.
Attend this webinar and learn:
- How data virtualization can accelerate data acquisition and massaging, providing the data scientist with a powerful tool to complement their practice
- How popular tools from the data science ecosystem: Spark, Python, Zeppelin, Jupyter, etc. integrate with Denodo
- How you can use the Denodo Platform with large data volumes in an efficient way
- About the success McCormick has had as a result of seasoning the Machine Learning and Blockchain Landscape with data virtualization
In this presentation we review the basic architecture behind SQL Server StreamInsight.
Regards,
Ing. Eduardo Castro Martínez, PhD – Microsoft SQL Server MVP
http://mswindowscr.org
http://comunidadwindows.org
Costa Rica
Watch full webinar here: https://bit.ly/2xc6IO0
To solve these challenges, according to Gartner "through 2022, 60% of all organizations will implement data virtualization as one key delivery style in their data integration architecture". It is clear that data virtualization has become a driving force for companies to implement agile, real-time and flexible enterprise data architecture.
In this session we will look at the data integration challenges solved by data virtualization and its main use cases, and examine why this technology is growing so fast. You will learn:
- What data virtualization really is
- How it differs from other enterprise data integration technologies
- Why data virtualization is finding enterprise-wide deployment inside some of the largest organizations
Presentation from Chesapeake Regional Tech Council's TechFocus Seminar on Cloud Security; presented by Scott C Sadler, Business Development Executive - Cloud Computing, IBM US East Mid-Market & Channels, on Thursday, October 27, 2011. http://www.chesapeaketech.org
Cloud & Big Data - Digital Transformation in Banking (Sutedjo Tjahjadi)
Datacomm Cloud Business Overview
Making Indonesia 4.0
Digital Transformation in Banking Industry
Introduction to Cloud Computing
Big Data Analytics Introduction
Big Data Analytics Application in Banking
How Data Virtualization Puts Enterprise Machine Learning Programs into Produc... (Denodo)
Watch full webinar here: https://bit.ly/3offv7G
Presented at AI Live APAC
Advanced data science techniques, like machine learning, have proven to be an extremely useful tool for deriving valuable insights from existing data. Platforms like Spark, and complex libraries for R, Python and Scala, put advanced techniques at the fingertips of data scientists. However, these data scientists spend most of their time looking for the right data and massaging it into a usable format. Data virtualization offers a new alternative to address these issues in a more efficient and agile way.
Watch this on-demand session to learn how companies can use data virtualization to:
- Create a logical architecture to make all enterprise data available for advanced analytics exercise
- Accelerate data acquisition and massaging, providing the data scientist with a powerful tool to complement their practice
- Integrate popular tools from the data science ecosystem: Spark, Python, Zeppelin, Jupyter, etc.
Bridging the Last Mile: Getting Data to the People Who Need It (Denodo)
Watch full webinar here: https://bit.ly/3cUA0Qi
Many organizations are embarking on strategically important journeys to embrace data and analytics. The goal can be to improve internal efficiencies, improve the customer experience, drive new business models and revenue streams, or – in the public sector – provide better services. All of these goals require empowering employees to act on data and analytics and to make data-driven decisions. However, getting data – the right data at the right time – to these employees is a huge challenge and traditional technologies and data architectures are simply not up to this task. This webinar will look at how organizations are using Data Virtualization to quickly and efficiently get data to the people that need it.
Attend this session to learn:
- The challenges organizations face when trying to get data to the business users in a timely manner
- How Data Virtualization can accelerate time-to-value for an organization’s data assets
- Examples of leading companies that used data virtualization to get the right data to the users at the right time
IT-AAC and CISQ are the two leading authorities on how to manage risk in IT-intensive programs. Join us and some 220 colleagues on March 15th at the Reston Hyatt.
Cyber Resilience Summit Briefing March 15, 2016 (John Weiler)
Two leading public service institutes, IT-AAC and CISQ, will present how emerging standards for IT risk management have been adopted and proven to mitigate the most common vulnerabilities and weaknesses in IT-intensive programs.
A Logical Architecture is Always a Flexible Architecture (ASEAN) (Denodo)
Watch full webinar here: https://bit.ly/3joZa0a
The current data landscape is fragmented, not just in location but also in terms of processing paradigms: data lakes, IoT architectures, NoSQL, and graph data stores, SaaS applications, etc. are found coexisting with relational databases to fuel the needs of modern analytics, ML, and AI. The physical consolidation of enterprise data into a central repository, although possible, is both expensive and time-consuming. A logical data warehouse is a modern data architecture that allows organizations to leverage all of their data irrespective of where the data is stored, what format it is stored in, and what technologies or protocols are used to store and access the data.
Watch this session to understand:
- What is a logical data warehouse and how to architect one
- The benefits of logical data warehouse – speed with agility
- Customer use case depicting logical architecture implementation
Low Complexity Secure Code Design for Big Data in Cloud Storage Systems (Reza Rahimi)
In the era of big data, reducing the computational complexity of servers in data centers will be an important goal. We propose Low Complexity Secure Codes (LCSCs) that are specifically designed to provide information-theoretic security in cloud distributed storage systems. Unlike traditional coding schemes that are designed for error correction, these codes are designed solely to provide security with low decoding complexity. These sparse codes are able to provide (asymptotic) perfect secrecy similar to the Shannon cipher. The simultaneous promise of low decoding complexity and perfect secrecy makes these codes very desirable for cloud storage systems with large amounts of data. The design is particularly suitable for large archival data such as movies and pictures. The complexity of these codes is compared with traditional encryption techniques.
Multi-cluster Kubernetes Networking - Patterns, Projects and Guidelines (Sanjeev Rampal)
Talk presented at Kubernetes Community Day, New York, May 2024.
Technical summary of Multi-Cluster Kubernetes Networking architectures with focus on 4 key topics.
1) Key patterns for multi-cluster architectures
2) Architectural comparison of several OSS/CNCF projects to address these patterns
3) Evolution trends for the APIs of these projects
4) Some design recommendations & guidelines for adopting/deploying these solutions.
# Internet Security: Safeguarding Your Digital World
In the contemporary digital age, the internet is a cornerstone of our daily lives. It connects us to vast amounts of information, provides platforms for communication, enables commerce, and offers endless entertainment. However, with these conveniences come significant security challenges. Internet security is essential to protect our digital identities, sensitive data, and overall online experience. This comprehensive guide explores the multifaceted world of internet security, providing insights into its importance, common threats, and effective strategies to safeguard your digital world.
## Understanding Internet Security
Internet security encompasses the measures and protocols used to protect information, devices, and networks from unauthorized access, attacks, and damage. It involves a wide range of practices designed to safeguard data confidentiality, integrity, and availability. Effective internet security is crucial for individuals, businesses, and governments alike, as cyber threats continue to evolve in complexity and scale.
### Key Components of Internet Security
1. **Confidentiality**: Ensuring that information is accessible only to those authorized to access it.
2. **Integrity**: Protecting information from being altered or tampered with by unauthorized parties.
3. **Availability**: Ensuring that authorized users have reliable access to information and resources when needed.
## Common Internet Security Threats
Cyber threats are numerous and constantly evolving. Understanding these threats is the first step in protecting against them. Some of the most common internet security threats include:
### Malware
Malware, or malicious software, is designed to harm, exploit, or otherwise compromise a device, network, or service. Common types of malware include:
- **Viruses**: Programs that attach themselves to legitimate software and replicate, spreading to other programs and files.
- **Worms**: Standalone malware that replicates itself to spread to other computers.
- **Trojan Horses**: Malicious software disguised as legitimate software.
- **Ransomware**: Malware that encrypts a user's files and demands a ransom for the decryption key.
- **Spyware**: Software that secretly monitors and collects user information.
### Phishing
Phishing is a social engineering attack that aims to steal sensitive information such as usernames, passwords, and credit card details. Attackers often masquerade as trusted entities in email or other communication channels, tricking victims into providing their information.
### Man-in-the-Middle (MitM) Attacks
MitM attacks occur when an attacker intercepts and potentially alters communication between two parties without their knowledge. This can lead to the unauthorized acquisition of sensitive information.
### Denial-of-Service (DoS) and Distributed Denial-of-Service (DDoS) Attacks
DoS attacks flood a target system or network with traffic or requests to exhaust its resources, making it unavailable to legitimate users. In a DDoS attack, the traffic originates from many compromised devices (a botnet) at once, which makes the attack harder to trace and to block.
Bridging the Digital Gap - Brad Spiegel Macon, GA Initiative.pptx (Brad Spiegel Macon GA)
Brad Spiegel Macon GA’s journey exemplifies the profound impact that one individual can have on their community. Through his unwavering dedication to digital inclusion, he’s not only bridging the gap in Macon but also setting an example for others to follow.
1. Wireless Communication System_Wireless communication is a broad term that i... (JeyaPerumal1)
Wireless communication involves the transmission of information over a distance without the help of wires, cables or any other form of electrical conductor. It is a broad term that incorporates all procedures and forms of connecting and communicating between two or more devices using a wireless signal through wireless communication technologies and devices.
Features of Wireless Communication
The evolution of wireless technology has brought many advancements and effective features.
The transmitted distance can be anywhere between a few meters (for example, a television's remote control) and thousands of kilometers (for example, radio communication).
Wireless communication can be used for cellular telephony, wireless access to the internet, wireless home networking, and so on.
APNIC Foundation, presented by Ellisha Heppner at the PNG DNS Forum 2024 (APNIC)
Ellisha Heppner, Grant Management Lead, presented an update on APNIC Foundation to the PNG DNS Forum held from 6 to 10 May, 2024 in Port Moresby, Papua New Guinea.
3. Business Challenge: Software Defined Business
- Examples: Software Defined Transportation, Software Defined Video Streaming, Software Defined Leasing/Hosting, Software Defined Data Centers
- Layers: Control, Management and Analytics Tier; Resource Pool Tier
4. Engineering Challenge:
Big Data Problem
[Slide diagram: customers/service users sit above a control and management tier (monitoring, analytics, management, prediction/optimization), which draws on a resource pool tier of resource/supply providers; every tier generates a large amount of data and metadata.]
6. Applications Spectrum
[Slide diagram: applications such as self-driving cars, robotics/AI, video streaming, and IoT will be fully or partially supported by cloud-based data center services, which combine computing (CPU, GPU, DSP, FPGA, ...), storage (DRAM, SSD, HDD, ...), networking (wired, Wi-Fi, 4G, ...), and data management systems.]
7. Typical Data Center Architecture
As a simple rule of thumb, an enterprise data center has:
~100 hosts,
~1,000 VMs,
~40 GB of logs per day.
[Slide diagram: a data center management layer oversees hosts 1..n; each host runs VMs (VM-1-1 ... VM-n-l) on which the applications run; hosts and VMs stream logs into a storage pool that feeds big data engineering and science.]
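Those rule-of-thumb figures imply the per-VM and per-host log rates a monitoring pipeline must absorb. A quick sanity check (assuming, for illustration, a uniform log rate across VMs and hosts):

```python
HOSTS, VMS = 100, 1_000
LOGS_GB_PER_DAY = 40.0

per_vm_mb_day = LOGS_GB_PER_DAY * 1024 / VMS      # ~41 MB of logs per VM per day
per_host_mb_day = LOGS_GB_PER_DAY * 1024 / HOSTS  # ~410 MB of logs per host per day
print(per_vm_mb_day, per_host_mb_day)
```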
9. Challenges in Data Center Management
Service Level Agreement (SLA):
Throughput/Latency (e-commerce applications):
► US e-commerce reached $304 billion in 2014 and is growing 15.4% yearly [1],
► 100 ms of added latency costs about a 1% decrease in sales [3],
► Pages should load in less than 2 seconds to avoid losing customers; slow pages can decrease overall sales by 7% [2].
Availability and Fault Tolerance:
► Example: Huawei public cloud at 99.999% availability [4] allows at most this much downtime:
Daily: 0.9 s
Weekly: 6.0 s
Monthly: 26.3 s
Yearly: 5 m 15.6 s
Scalable and Elastic (on Demand):
► Should know when and how to scale to satisfy the SLA dynamically.
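The downtime budget follows directly from the availability percentage: allowed downtime = (1 − availability) × period. Note that the listed figures (0.9 s/day up to 5 m 15.6 s/year) correspond to five-nines, i.e. 99.999% availability. A quick sketch:

```python
def downtime_seconds(availability: float, period_seconds: float) -> float:
    """Maximum allowed downtime for a given availability over a period."""
    return (1.0 - availability) * period_seconds

DAY = 86_400
for name, secs in {"daily": DAY, "weekly": 7 * DAY,
                   "monthly": 30.44 * DAY, "yearly": 365.25 * DAY}.items():
    print(f"{name}: {downtime_seconds(0.99999, secs):.1f} s")
# daily: 0.9 s, weekly: 6.0 s, monthly: 26.3 s, yearly: 315.6 s (= 5 m 15.6 s)
```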
10. Challenges in Data Center Management
Energy Efficiency and Resource Utilization:
► By 2020, energy costs are to be reduced by 30% under European Green DC legislation [5],
► US data centers consume ~90 billion kilowatt-hours annually, the equivalent of powering New York households for two years,
► They emit over 150 million tons of carbon yearly in the USA [5],
► ~90 percent of VMs utilize < 15% of their assigned cores [9],
► ~90 percent of VMs perform < 10 IOPS [9],
► The average server runs at 12%–18% of its capacity most of the time while still consuming 30% to 60% of its maximum power [6,7],
► High utilization -> savings in power consumption -> low carbon footprint.
Security and Privacy:
► Should guarantee data privacy (e.g. medical and financial data) and security against attacks, data ownership, ...
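The utilization and power figures above are captured by the commonly used linear server power model, P(u) ≈ P_idle + (P_max − P_idle)·u: because idle draw is a large fraction of peak, consolidating load onto fewer, busier hosts saves energy. A minimal sketch with illustrative (assumed) wattages:

```python
def server_power_w(util: float, p_idle: float = 120.0, p_max: float = 300.0) -> float:
    """Linear power model: idle draw plus a utilization-proportional part (watts)."""
    return p_idle + (p_max - p_idle) * util

# Spread out: 10 hosts each at 15% utilization.
spread = 10 * server_power_w(0.15)       # ~1470 W
# Consolidated: the same total work on 2 hosts at 75%; 8 hosts powered off.
consolidated = 2 * server_power_w(0.75)  # ~510 W
print(spread, consolidated)
```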
11. Challenges in Data Center Management
Software Compliance and Licensing:
► ~$500,000 is spent on software licensing in an average-size data center,
► Licenses can be charged per user/device/VM/core/...,
► Different license models and policies exist, e.g. [8]:
1) Running the licensed workload on bare metal (no virtualization),
2) Running the licensed workload on a dedicated cluster,
3) Migrating the licensed workload,
4) ...
► Workload and cluster growth bring challenges for software licensing,
► The challenge is to minimize the cost of software in the data center while not violating license policies.
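Minimizing license cost means evaluating the available charging models against the actual workload shape. A toy sketch in which all fees and the two model names are hypothetical, not vendor prices:

```python
def annual_license_cost(model: str, vms: int, cores_per_vm: int,
                        per_vm_fee: float = 500.0, per_core_fee: float = 300.0) -> float:
    """Annual cost under two common charging models (illustrative fees)."""
    if model == "per_vm":
        return vms * per_vm_fee
    if model == "per_core":
        return vms * cores_per_vm * per_core_fee
    raise ValueError(f"unknown model: {model}")

def cheapest(vms: int, cores_per_vm: int) -> str:
    """Pick the cheaper charging model for a fleet of identical VMs."""
    return min(("per_vm", "per_core"),
               key=lambda m: annual_license_cost(m, vms, cores_per_vm))

print(cheapest(100, 1))  # small VMs: per-core licensing wins
print(cheapest(100, 4))  # big VMs: per-VM licensing wins
```

The same shape extends to the other policies on the slide (bare metal, dedicated cluster, migration) by adding branches and constraints.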
Dynamic Service Pricing:
► Computing, network and storage are utilities for workloads.
► A model is needed for dynamic pricing with a good policy in the competitive cloud-provider market, while increasing revenue.
12. Self-Tuning Data Center:
Simplified Service Architecture
Services: Real-time Log and Monitoring Service; Alert and Policy Service; Recommendation Service; VM Scheduling and Orchestrating Service (for services and resources).
Initial State:
1) A resource is requested,
2) The recommendation service is asked for the correct size, type and location for the resource, based on the request,
3) It returns the correct configuration, resource size and placement,
4) The required resources are allocated.
Operational and Recovery State:
1) Telemetry and logs are sent,
2) Logs are queried for policy and alert checks,
3) The collected data is returned,
4) Violations and warnings are checked for,
5) Violations raise alerts,
6) A recommendation is asked for,
7) Recommendations and recipes are sent,
8) The recommendation is applied.
Self-Tuning State:
1) A self-tuning recommendation is requested (for example, in a low-traffic state),
2) A tuning plan and recommendation are sent (such as VM migration or resizing),
3) The self-tuning recommendation is applied.
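The states above fit a monitor → check → recommend → apply control loop. A minimal sketch of one operational/recovery pass; all class names, thresholds and actions are illustrative, not an actual product API:

```python
from dataclasses import dataclass, field

@dataclass
class Monitor:
    """Real-time log and monitoring service: collects telemetry per VM."""
    telemetry: dict = field(default_factory=dict)

    def ingest(self, vm: str, cpu_util: float) -> None:
        self.telemetry[vm] = cpu_util

@dataclass
class AlertPolicy:
    """Alert and policy service: flags VMs outside the allowed utilization band."""
    high: float = 0.9
    low: float = 0.1

    def violations(self, telemetry: dict) -> dict:
        return {vm: u for vm, u in telemetry.items()
                if u > self.high or u < self.low}

def recommend(vm: str, util: float) -> str:
    """Recommendation service: scale up hot VMs, downsize cold ones."""
    return f"scale-up {vm}" if util > 0.5 else f"downsize {vm}"

# One pass of the operational/recovery loop.
mon = Monitor()
mon.ingest("vm-1", 0.95)   # overloaded
mon.ingest("vm-2", 0.05)   # mostly idle
mon.ingest("vm-3", 0.40)   # within policy
actions = [recommend(vm, u)
           for vm, u in AlertPolicy().violations(mon.telemetry).items()]
print(actions)  # ['scale-up vm-1', 'downsize vm-2']
```

In the self-tuning state, the same recommendation step runs proactively (e.g. in a low-traffic window) instead of being triggered by an alert.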
13. Huawei Position in Self-Tuning Data Centers
► Huawei Cloud is growing very fast: > 50% year-over-year revenue increase.
► Huawei launched its first public cloud outside China in Europe (announced at CeBIT 2016) with 50,000 hosts.
► The Huawei R&D Storage Lab in the USA is working on intelligent services to address self-tuning data centers and provide solutions for Huawei customers and their needs.
► Ideas are used from, and contributed back to, the open-source big data community.
14. Conclusions and Future Directions
► A cloud-based ecosystem is the future of IT.
► Cloud data centers are composed of different resources to satisfy application requirements.
► Managing these resources is too complicated a task for humans to perform manually.
► Machines in data centers generate large amounts of logs that describe what happens in the data center.
► Data scientists and engineers are needed to study system behavior and optimize data centers.
► This will lead to the next generation of data centers, which are self-tuning and need minimal human effort.
15. References
[1] U.S. Census Bureau News: http://www2.census.gov/retail/releases/historical/ecomm/14q4.pdf
[2] Akamai Newsroom: http://www.akamai.com/html/about/press/releases/2009/press_091409.html
[3] High Scalability Blog: http://highscalability.com/latency-everywhere-and-it-costs-you-sales-how-crush-it
[4] High Availability: https://en.wikipedia.org/wiki/High_availability
[5] European Commission on Renewable Energy: https://ec.europa.eu/energy/en/topics/renewable-energy
[6] NRDC Issue Brief: https://www.nrdc.org/sites/default/files/data-center-efficiency-assessment-IB.pdf
[7] NRDC Issue Paper: https://www.nrdc.org/sites/default/files/data-center-efficiency-assessment-IP.pdf
[8] Turbonomic white paper, "Licensing, Compliance & Audits in the Cloud Era", 2015.
[9] CloudPhysics, Global IT Data Lake Report, Q4 2016.