Cloud computing and green computing are two of the most rapidly emerging areas in information and communication technology (ICT), with applications across the globe. Thanks to tremendous improvements in computer networks, people increasingly prefer network-based computing to in-house computing. Across business sectors, daily business and individual computing are migrating from local hard drives to internet servers, and more and more companies are therefore investing in large datacenters to host cloud services. These datacenters not only consume huge amounts of energy but are also very complex in their infrastructure. Several studies propose making these datacenters energy efficient using technologies such as virtualization and consolidation. These solutions, however, are mostly cost driven and do not directly address environmental sustainability in terms of CO2 emissions. Hence, in this work we propose a user-oriented cloud architectural framework, the Carbon-Aware Green Cloud Architecture, which addresses this environmental problem across the overall usage of cloud computing resources.
Green cloud computing aims to minimize environmental impact by optimizing computing resource usage. It focuses on reducing materials, energy, water and e-waste through techniques like virtualization, consolidation, automation and multitenancy. These improvements lead to greater efficiency and resource utilization in cloud data centers and networks. Metrics like PUE, CUE and DCP are used to measure a cloud's environmental footprint and productivity.
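The PUE and CUE metrics mentioned above are simple ratios, so they are easy to compute once a facility's meter readings are available. A minimal sketch (the sample readings are hypothetical; the formulas follow the standard Green Grid definitions):

```python
def pue(total_facility_kwh: float, it_equipment_kwh: float) -> float:
    """Power Usage Effectiveness: total facility energy divided by
    IT equipment energy. 1.0 is ideal (all power reaches IT gear)."""
    return total_facility_kwh / it_equipment_kwh

def cue(total_co2_kg: float, it_equipment_kwh: float) -> float:
    """Carbon Usage Effectiveness: kg of CO2 emitted per kWh of IT energy."""
    return total_co2_kg / it_equipment_kwh

# Hypothetical annual readings for a small facility:
print(pue(1_500_000, 1_000_000))  # 1.5
print(cue(700_000, 1_000_000))    # 0.7
```

DCiE, which appears in some of the summaries below, is simply the reciprocal of PUE expressed as a percentage.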
This document discusses green cloud computing. It begins by defining cloud computing and green computing, noting that cloud computing requires large data centers that consume significant energy. It then discusses how green cloud computing aims to reduce this energy usage through techniques like server virtualization and energy-aware resource allocation. Specific strategies that cloud providers and data centers are taking to improve energy efficiency are also summarized, such as geographic placement of data centers and measures to optimize cooling.
Green cloud computing aims to make cloud infrastructure more energy efficient and environmentally friendly. Adopting measures like using more renewable energy sources, virtualizing servers, and improving data center cooling can help reduce carbon emissions and operational costs. Virtualizing servers allows multiple virtual machines to run on a single physical server, increasing efficiency and hardware utilization. Data centers also aim to lower their power usage effectiveness rating by implementing designs with hot-aisle/cold-aisle configurations and adopting newer technologies. Transitioning to renewable energy sources for power can further reduce the carbon footprint of cloud infrastructure and lead to more stable energy prices over time.
Cloud computing has the potential to improve energy efficiency through server consolidation and switching off unused servers; however, increasing internet traffic and data storage demands driven by cloud services could negate these savings. While Microsoft claims its cloud solutions reduce energy use by 30-90% compared to on-premise installations, Greenpeace argues that collective cloud demand will increase CO2 emissions even with efficient data centers. The presentation analyzes the environmental sustainability of cloud computing by exploring technologies and mechanisms that support this goal, as well as studies with differing views on cloud computing's impact.
The document discusses green cloud computing and describes a technical seminar presented by S.Sai Madhuri. It defines cloud computing and discusses types including SaaS, PaaS, and IaaS. It then explains green computing and green cloud computing, describing the core components and architecture of data centers. The document outlines the objective of calculating energy consumption using a green cloud simulator in VMWare Player to analyze existing systems and develop more efficient solutions.
Green cloud computing aims to make cloud computing more environmentally sustainable by reducing energy consumption and carbon emissions. The document discusses how cloud data centers use significant amounts of energy. It then introduces green cloud computing and the Green Cloud Simulator tool, which can model a data center's energy usage. The document provides steps to build a new virtual data center in the simulator and view statistics on device energy consumption and graphs of the results. The summary highlights the goal of reducing cloud computing's environmental impact.
Green cloud computing using heuristic algorithms (Iliad Mnd)
Green computing is defined as the study and practice of designing, manufacturing, using, and disposing of computers, servers, and associated subsystems such as monitors, printers, storage devices, and networking and communications systems efficiently and effectively, with minimal or no impact on the environment. Research continues into key areas such as making the use of computers as energy efficient as possible, and designing algorithms and systems for efficiency-related computer technologies.
Energy Saving by Virtual Machine Migration in Green Cloud Computing (ijtsrd)
Nowadays, technology has become so fast-moving and advanced that almost all large enterprises have to move to the cloud. The cloud provides a wide range of services, from high-performance computing to storage. The datacenter, consisting of servers, networks, cabling, cooling systems, and so on, is a very important part of the cloud, as it carries business information onto the servers. Cloud computing relies on large data centers, but these cause serious environmental issues such as heat emission, heavy energy consumption, and the release of greenhouse gases like methane, nitrous oxide, and carbon dioxide. High energy consumption leads to high operational cost and low profit. We therefore need green cloud computing, an environmentally friendly and energy-efficient version of cloud computing. This paper discusses the major issues related to cloud computing and the various techniques used to minimize power consumption. Ruhi D. Viroja | Dharmendra H. Viroja, "Energy Saving by Virtual Machine Migration in Green Cloud Computing", published in International Journal of Trend in Scientific Research and Development (ijtsrd), ISSN: 2456-6470, Volume-1, Issue-1, December 2016. URL: http://www.ijtsrd.com/papers/ijtsrd104.pdf http://www.ijtsrd.com/engineering/computer-engineering/104/energy-saving-by-virtual-machine-migration-in-green-cloud-computing/ruhi-d-viroja
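The VM-migration idea in this abstract can be illustrated with a toy consolidation pass that tries to drain lightly loaded hosts so they can be switched off. This is only a sketch under assumed data structures (a host-to-VM-load mapping, a uniform host capacity, and an 80% utilisation cap), not the algorithm from the paper:

```python
def consolidate(hosts, capacity=100, cap=0.8):
    """hosts: dict mapping host name -> list of VM loads.
    Returns (migration_plan, hosts_that_can_be_powered_off)."""
    hosts = {h: list(v) for h, v in hosts.items()}   # work on a copy
    plan = []
    # Try to drain the least-loaded hosts first.
    for src in sorted(hosts, key=lambda h: sum(hosts[h])):
        if not hosts[src]:
            continue
        trial = {h: list(v) for h, v in hosts.items()}
        moves, ok = [], True
        for vm in sorted(trial[src], reverse=True):  # place biggest VMs first
            # First-fit: any other host that stays under the utilisation cap.
            dst = next((h for h in trial
                        if h != src and sum(trial[h]) + vm <= cap * capacity),
                       None)
            if dst is None:
                ok = False
                break
            trial[dst].append(vm)
            moves.append((vm, src, dst))
        if ok:                       # commit only if the host fully drains
            trial[src] = []
            hosts, plan = trial, plan + moves
    off = [h for h, vms in hosts.items() if not vms]
    return plan, off

plan, off = consolidate({"h1": [10], "h2": [50], "h3": [60]})
print(off)   # ['h3'] can be powered down in this toy scenario
```

Real consolidation algorithms additionally weigh migration cost and SLA risk; this sketch only shows the packing intuition.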
In today’s world, the growing demand for knowledge has made cloud computing a center of attraction. Cloud computing provides utility-based services to users worldwide and enables the hosting of applications from consumer, scientific, and business domains. However, the data centers built for cloud computing applications consume huge amounts of energy, contributing to high operational costs and large carbon dioxide emissions. As data centers grow, power consumption is increasing at such a rate that it has become a key concern, ultimately leading to energy shortages and global climate change. We therefore need green cloud computing solutions that can not only save energy but also reduce operational costs.
What began as tiny baby steps to bridge the gaps of human limitations, like memory and cognitive fluctuations, has today turned into a virtual revolution where an invisible world of 0’s and 1’s is home to the most important bytes and pieces of our lives. Replacing the need for local servers, cloud computing emerged as the matrix within which multiple remote servers coordinate to store, manage, and administer data. The two other baits of the cloud computing movement are live, real-time information updating and light-speed transfer of data. Initially it was the preserve of large-scale corporations, but in the last few years small and medium-sized organizations have also joined the cloud computing era. Analyses of the cloud computing culture swing across a spectrum ranging from the sustainability of cloud computing to the increased carbon emissions due to servers.
Read the full blog here:
http://suyati.com/culture-of-cloud-computing-a-green-move-or-eco-death/
Or reach us at: jghosh@suyati.com
This presentation brings insights on cloud and green cloud computing and briefs readers on its potential in India and how it can be achieved. Numerous insights have been collectively put into this presentation.
Energy Saving by Migrating Virtual Machine to Green Cloud Computing (ijtsrd)
Green computing is characterized as the study and practice of designing, manufacturing, using, and disposing of computers, servers, and related subsystems, such as monitors, printers, storage devices, and networking and communications systems, efficiently and effectively, with minimal or no impact on the environment. The objective of green computing is to reduce the use of hazardous materials, maximize energy efficiency during a product's lifetime, and promote the recyclability of obsolete products and factory waste. Green computing can be achieved through product longevity, resource allocation, virtualization, or power management. Power is the bottleneck in improving system performance, and among all industries the information and communication technology (ICT) industry is arguably responsible for a larger share of the overall growth in energy consumption. The objective of green cloud computing is to promote the recyclability or biodegradability of outdated products and factory waste by reducing the use of hazardous materials and maximizing energy efficiency during a product's lifetime. Stephen Fernandes, "Energy Saving by Migrating Virtual Machine to Green Cloud Computing", published in International Journal of Trend in Scientific Research and Development (ijtsrd), ISSN: 2456-6470, Volume-4, Issue-3, April 2020. URL: https://www.ijtsrd.com/papers/ijtsrd30422.pdf https://www.ijtsrd.com/computer-science/distributed-computing/30422/energy-saving-by-migrating-virtual-machine-to-green-cloud-computing/stephen-fernandes
This document discusses green cloud computing and data centers from an energy efficiency perspective. It begins with basics of green computing and cloud computing concepts. It then discusses green cloud computing in the context of data centers, describing steps to make data centers more energy efficient through server virtualization, energy-aware workload consolidation, and using more efficient cooling methods. Case studies from Senegal, South Africa, and India are presented showing how green cloud computing approaches have helped reduce energy use and costs for organizations in developing areas. Key metrics like PUE (power usage effectiveness) for measuring data center efficiency are also covered.
This document discusses green cloud computing. It begins by defining green computing and cloud computing individually. Green computing aims to reduce power consumption and environmental impact of IT, while cloud computing involves virtualized and interconnected computers. Green cloud computing combines these concepts by making cloud infrastructure and operations more energy efficient. The document then covers benefits like reduced energy use, the role of dynamic provisioning and multi-tenancy in cloud enabling green computing, and a case study on a green cloud architecture and scheduling policies that can reduce carbon emissions by 20%.
On June 24th I presented to the Dependable Systems Engineering group here in the School of Computer Science, St Andrews. The group meets once a month for a presentation from one of its members over lunch. The presenter talks about their current research, providing a good opportunity to keep up to date with other work within the group.
Cloud computing provides scalable and elastic IT capabilities as a service over the internet. Ericsson benefited from Amazon Web Services (AWS) through cost reductions, automated software updates, and enhanced remote access capabilities. This paper evaluates the adaptability, dependability, manageability, and scalability of Amazon EC2, S3, and RightScale services. It examines security concerns for cloud services and makes recommendations to address issues of cost, reliability, and scalability.
This document discusses green cloud computing and data centers. It provides an overview of green computing principles like efficiency and virtualization. Cloud computing is described as a virtualized and scalable computing platform. Green cloud computing from a data center perspective involves diagnosing issues, measuring energy usage, server virtualization, and building efficiently. Case studies from Senegal, South Africa, and India show how green data center approaches and private clouds can reduce energy costs and increase efficiency. The document advocates for more research on maximizing green data center efficiency to benefit developing regions.
Green cloud computing aims to minimize environmental impact by optimizing computing resource usage. It focuses on reducing materials, energy, water and e-waste through techniques like virtualization, consolidation, automation and multitenancy. These improvements lead to greater efficiency and resource utilization in cloud data centers. Common metrics for measuring a cloud's environmental footprint include PUE, CUE and DCIE, which evaluate energy and power usage effectiveness.
This document discusses energy efficiency in cloud computing. It notes that cloud computing has led to large data centers with significant energy usage and carbon footprints. The resource allocation problem in cloud computing is treated as a linear programming problem aimed at minimizing energy consumption. Several heuristic algorithms are adopted and analyzed for resource allocation using an expected time to compute task model to develop green cloud computing solutions that reduce costs and environmental impacts.
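The heuristic scheduling on an Expected Time to Compute (ETC) model described above can be illustrated with the classic min-min heuristic, which is commonly used with ETC matrices. Whether the paper uses exactly this variant is not stated, so treat this as a representative sketch only:

```python
def min_min(etc):
    """Greedy min-min mapping on an ETC matrix, where etc[t][m] is the
    predicted run time of task t on machine m. Repeatedly schedules the
    task whose earliest possible completion time is smallest.
    Returns (task -> machine mapping, makespan)."""
    n_tasks, n_machines = len(etc), len(etc[0])
    ready = [0.0] * n_machines          # machine availability times
    unassigned = set(range(n_tasks))
    mapping = {}
    while unassigned:
        # Over all (task, machine) pairs, pick the smallest completion time.
        ct, t, m = min((ready[m] + etc[t][m], t, m)
                       for t in unassigned for m in range(n_machines))
        mapping[t] = m
        ready[m] = ct
        unassigned.remove(t)
    return mapping, max(ready)

# Three tasks, two machines (times are illustrative):
mapping, makespan = min_min([[3, 5], [4, 2], [6, 6]])
print(mapping, makespan)  # {1: 1, 0: 0, 2: 1} 8
```

Energy-aware variants typically replace the completion-time objective with an energy term per (task, machine) pair; the greedy structure stays the same.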
This document summarizes a study on green cloud computing. It defines green computing and cloud computing, noting that green cloud computing aims to minimize energy consumption through cloud infrastructure. It outlines different cloud service models and analyzes their energy usage. The document also summarizes a Microsoft study finding cloud can reduce energy usage by 30-60% compared to on-premise systems, but a Greenpeace study argues cloud could increase energy demands significantly if usage grows rapidly. In conclusion, cloud services can be more efficient than local systems depending on usage levels and transport energy costs.
This document summarizes a research paper on green cloud computing. It discusses how cloud computing can help reduce energy consumption and promote more environmentally sustainable IT practices compared to traditional on-site computing. Virtualization is highlighted as a key technology for improving resource utilization and reducing hardware needs. The CLEER model is presented as a tool for analyzing the potential energy savings from transitioning applications to the cloud. Based on this model, a case study found that moving common business software like email and CRM to the cloud in the US could save over 300 petajoules of energy annually.
The document analyzes a study conducted by Microsoft, Accenture, and WSP Atmosphere & Energy to compare the environmental impact of hosting three Microsoft business programs (Exchange, Dynamics CRM, SharePoint) on cloud servers versus on-premise installations. The study found that for large deployments, cloud servers can reduce energy use and carbon emissions by over 30% compared to on-premise installations. For small deployments, cloud servers can reduce energy use and carbon emissions by over 90%. Several factors allow cloud servers to be more efficient, including optimized resource allocation, server utilization, and data center design. Moving to cloud solutions can help organizations improve sustainability while outsourcing IT infrastructure investments.
The document discusses cloud computing and green computing. It defines cloud computing as using remote servers and the internet to maintain data and applications that users can access from any device with an internet connection. Cloud computing provides efficient computing through centralized storage, memory, processing and bandwidth. Green computing refers to environmentally sustainable IT that aims to minimize environmental impact and optimize energy efficiency and cost effectiveness.
This document discusses green cloud computing from the perspective of data centers. It begins with background on green computing and cloud computing. It then discusses how green cloud computing can help balance energy usage in data centers through server virtualization, energy-aware consolidation, and locating data centers in developing regions. The document presents two case studies, one on a green data center in Senegal and another on benefits realized by a cell phone company in South Africa from implementing a private cloud. It concludes with sections on the Indian scenario for green IT standardization and a call to continue research efforts to maximize efficiency of green data centers.
Green computing refers to using computing resources efficiently and minimizing environmental impact. It involves implementing energy-efficient policies and practices when setting up and operating IT systems. The goals of green computing include minimizing energy consumption, purchasing green energy, and reducing employee/customer travel requirements. Green cloud computing aims to achieve efficient infrastructure utilization and processing while minimizing energy usage. It uses techniques like dynamic resource allocation and powering down underutilized servers.
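The "powering down underutilized servers" technique mentioned above reduces, at its simplest, to sizing the active pool against current demand. A sketch under assumed numbers (the per-server capacity of 1000 requests/s and the 70% target utilisation are illustrative, not from the document):

```python
import math

def servers_needed(demand_rps, per_server_rps=1000, target_util=0.7):
    """Smallest server count keeping average utilisation <= target_util."""
    return max(1, math.ceil(demand_rps / (per_server_rps * target_util)))

def plan_power(total_servers, demand_rps):
    """Decide how many servers stay active and how many can sleep."""
    active = min(total_servers, servers_needed(demand_rps))
    return {"active": active, "powered_off": total_servers - active}

print(plan_power(10, 2800))  # {'active': 4, 'powered_off': 6}
```

Production systems add hysteresis and wake-up latency margins so servers are not cycled on every demand fluctuation; this sketch shows only the sizing arithmetic.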
Cloud computing offers utility-oriented IT services worldwide, enabling the hosting of applications from various domains. However, data centers consume huge amounts of energy, contributing to high costs and carbon footprints. Green cloud computing solutions are needed to save energy and reduce costs, for example by powering down servers when they are not in use.
Green networking aims to reduce the carbon footprint of information and communication technology (ICT) networks by improving energy efficiency. Key strategies include optimizing network infrastructure utilization through technologies like virtualization, improving equipment energy efficiency, and locating network resources closer to renewable energy sources. Measurement of energy savings is important to track progress towards a lower carbon "Green Network".
In today’s world the growing demand for knowledge has made cloud computing a center of attraction. Cloud computing is providing utility based services to all the users worldwide. It enables presentation of applications from consumers, scientific and business domains. However, data centers created for cloud computing applications consume huge amounts of energy, contributing to high operational costs and a large amount of carbon dioxide emission to the environment. With enhancement of data center, the power consumption is increasing at such a rate that it has become a key concern these days because it is ultimately leading to energy shortcomings and global climatic change. Therefore, we need green cloud computing solutions that can not only save energy, but also reduce operational costs.
What began as tiny baby steps to bridge gaps of human limitations like memory, cognitive fluctuations has today turned into a virtual revolution where an invisible world of 0’s and 1’s is home to the most important bytes and pieces of our lives. Replacing the need for local servers, cloud computing emerged as the matrix within which multiple remote servers coordinated together to store, manage and administer data. The two other baits in the cloud computing movement is the live/real-time information updating and light-speed transfer of data. Initially, it was the center for large-scale corporations. In the last few years, small-scale and medium-sized organizations have also joined the army of cloud computing era. The analytics of the cloud computing culture swing between the spectrums ranging from the sustainability of cloud computing to the increased carbon emissions due to servers.
Read the full blog here:
http://suyati.com/culture-of-cloud-computing-a-green-move-or-eco-death/
Or reach us at: jghosh@suyati.com
This presentation brings insights on cloud and green cloud computing and briefs the readers with its potential in india and how it can be achieved. Numerous insights have been collectively put in into this presentation.
Energy Saving by Migrating Virtual Machine to Green Cloud Computingijtsrd
Green computing is characterized as the examination and practice of structuring, assembling, utilizing, and discarding PCs, servers, and related subsystems, for example, screens, printers, storage gadgets, and systems administration and interchanges frameworks proficiently and successfully with negligible or no effect on the earth. The objective of green computing is to diminish the utilization of hazardous materials, amplify energy proficiency during the items lifetime, and advance the recyclability of obsolete items and factory waste. Green computing can be accomplished by either Product Longevity Resource distribution or Virtualization or Power management. power is the bottleneck of improving the system execution. Among all industries, the information communication technology ICT industry is seemingly answerable for a bigger segment of the overall development in energy utilization. The objective of green cloud computing is to advance the recyclability or biodegradability of outdated items and factory waste by diminishing the utilization of hazardous materials and amplifying the energy productivity during the items lifetime. Stephen Fernandes "Energy Saving by Migrating Virtual Machine to Green Cloud Computing" Published in International Journal of Trend in Scientific Research and Development (ijtsrd), ISSN: 2456-6470, Volume-4 | Issue-3 , April 2020, URL: https://www.ijtsrd.com/papers/ijtsrd30422.pdf Paper Url :https://www.ijtsrd.com/computer-science/distributed-computing/30422/energy-saving-by-migrating-virtual-machine-to-green-cloud-computing/stephen-fernandes
This document discusses green cloud computing and data centers from an energy efficiency perspective. It begins with basics of green computing and cloud computing concepts. It then discusses green cloud computing in the context of data centers, describing steps to make data centers more energy efficient through server virtualization, energy-aware workload consolidation, and using more efficient cooling methods. Case studies from Senegal, South Africa, and India are presented showing how green cloud computing approaches have helped reduce energy use and costs for organizations in developing areas. Key metrics like PUE (power usage effectiveness) for measuring data center efficiency are also covered.
This document discusses green cloud computing. It begins by defining green computing and cloud computing individually. Green computing aims to reduce power consumption and environmental impact of IT, while cloud computing involves virtualized and interconnected computers. Green cloud computing combines these concepts by making cloud infrastructure and operations more energy efficient. The document then covers benefits like reduced energy use, the role of dynamic provisioning and multi-tenancy in cloud enabling green computing, and a case study on a green cloud architecture and scheduling policies that can reduce carbon emissions by 20%.
On June 24th I presented to the Dependable Systems Engineering group here in the School of Computer Science, St Andrews. The group meets once a month for a presentation from one of its members over lunch. The presenter talks about their current research, providing a good opportunity to keep up to date with other work within the group.On June 24th I presented to the Dependable Systems Engineering group here in the School of Computer Science, St Andrews. The group meets once a month for a presentation from one of its members over lunch. The presenter talks about their current research, providing a good opportunity to keep up to date with other work within the group.
Cloud computing provides scalable and elastic IT capabilities as a service over the internet. Ericsson benefited from Amazon Web Services (AWS) through cost reductions, automated software updates, and enhanced remote access capabilities. This paper evaluates the adaptability, dependability, manageability, and scalability of Amazon EC2, S3, and RightScale services. It examines security concerns for cloud services and makes recommendations to address issues of cost, reliability, and scalability.
This document discusses green cloud computing and data centers. It provides an overview of green computing principles like efficiency and virtualization. Cloud computing is described as a virtualized and scalable computing platform. Green cloud computing from a data center perspective involves diagnosing issues, measuring energy usage, server virtualization, and building efficiently. Case studies from Senegal, South Africa, and India show how green data center approaches and private clouds can reduce energy costs and increase efficiency. The document advocates for more research on maximizing green data center efficiency to benefit developing regions.
Green cloud computing aims to minimize environmental impact by optimizing computing resource usage. It focuses on reducing materials, energy, water and e-waste through techniques like virtualization, consolidation, automation and multitenancy. These improvements lead to greater efficiency and resource utilization in cloud data centers. Common metrics for measuring a cloud's environmental footprint include PUE, CUE and DCIE, which evaluate energy and power usage effectiveness.
This document discusses energy efficiency in cloud computing. It notes that cloud computing has led to large data centers with significant energy usage and carbon footprints. The resource allocation problem in cloud computing is treated as a linear programming problem aimed at minimizing energy consumption. Several heuristic algorithms are adopted and analyzed for resource allocation using an expected time to compute task model to develop green cloud computing solutions that reduce costs and environmental impacts.
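One simple member of the heuristic family described above is a greedy minimum-energy rule: for each task, consult the expected-time-to-compute (ETC) matrix, multiply by each machine's power rating, and pick the machine that adds the least energy. This sketch is an illustration of the general idea, not the paper's specific algorithms; the ETC values and power figures are made up:

```python
def greedy_min_energy(etc, power):
    """Assign each task to the machine where it adds the least energy.

    etc[t][m]  - expected time to compute task t on machine m (seconds)
    power[m]   - power draw of machine m while busy (watts)
    Returns (assignment list, total energy in joules).
    """
    assignment = []
    total_energy = 0.0
    for task_times in etc:
        # Energy for this task on each machine: time * power
        costs = [t * p for t, p in zip(task_times, power)]
        best = min(range(len(power)), key=lambda m: costs[m])
        assignment.append(best)
        total_energy += costs[best]
    return assignment, total_energy

# Two machines: machine 1 is slower on some tasks but much more frugal
etc = [[10.0, 14.0], [5.0, 20.0]]
power = [200.0, 120.0]
plan, energy = greedy_min_energy(etc, power)
print(plan, energy)  # [1, 0] 2680.0
```

Note how the greedy rule sends the first task to the slower but frugal machine and the second to the fast one, which is exactly the kind of trade-off a pure makespan-minimizing scheduler would miss.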
This document summarizes a study on green cloud computing. It defines green computing and cloud computing, noting that green cloud computing aims to minimize energy consumption through cloud infrastructure. It outlines different cloud service models and analyzes their energy usage. The document also summarizes a Microsoft study finding cloud can reduce energy usage by 30-60% compared to on-premise systems, but a Greenpeace study argues cloud could increase energy demands significantly if usage grows rapidly. In conclusion, cloud services can be more efficient than local systems depending on usage levels and transport energy costs.
This document summarizes a research paper on green cloud computing. It discusses how cloud computing can help reduce energy consumption and promote more environmentally sustainable IT practices compared to traditional on-site computing. Virtualization is highlighted as a key technology for improving resource utilization and reducing hardware needs. The CLEER model is presented as a tool for analyzing the potential energy savings from transitioning applications to the cloud. Based on this model, a case study found that moving common business software like email and CRM to the cloud in the US could save over 300 petajoules of energy annually.
The document analyzes a study conducted by Microsoft, Accenture, and WSP Atmosphere & Energy to compare the environmental impact of hosting three Microsoft business programs (Exchange, Dynamics CRM, SharePoint) on cloud servers versus on-premise installations. The study found that for large deployments, cloud servers can reduce energy use and carbon emissions by over 30% compared to on-premise installations. For small deployments, cloud servers can reduce energy use and carbon emissions by over 90%. Several factors allow cloud servers to be more efficient, including optimized resource allocation, server utilization, and data center design. Moving to cloud solutions can help organizations improve sustainability while outsourcing IT infrastructure investments.
The document discusses cloud computing and green computing. It defines cloud computing as using remote servers and the internet to maintain data and applications that users can access from any device with an internet connection. Cloud computing provides efficient computing through centralized storage, memory, processing and bandwidth. Green computing refers to environmentally sustainable IT that aims to minimize environmental impact and optimize energy efficiency and cost effectiveness.
This document discusses green cloud computing from the perspective of data centers. It begins with background on green computing and cloud computing. It then discusses how green cloud computing can help balance energy usage in data centers through server virtualization, energy-aware consolidation, and locating data centers in developing regions. The document presents two case studies, one on a green data center in Senegal and another on benefits realized by a cell phone company in South Africa from implementing a private cloud. It concludes with sections on the Indian scenario for green IT standardization and a call to continue research efforts to maximize efficiency of green data centers.
Green computing refers to using computing resources efficiently and minimizing environmental impact. It involves implementing energy-efficient policies and practices when setting up and operating IT systems. The goals of green computing include minimizing energy consumption, purchasing green energy, and reducing employee/customer travel requirements. Green cloud computing aims to achieve efficient infrastructure utilization and processing while minimizing energy usage. It uses techniques like dynamic resource allocation and powering down underutilized servers.
Cloud computing offers utility-oriented IT services worldwide, enabling the hosting of applications from many domains. However, data centers consume huge amounts of energy, contributing to high operating costs and large carbon footprints. Green cloud computing solutions are needed to save energy and reduce costs, for example by powering down servers when they are not in use.
Green networking aims to reduce the carbon footprint of information and communication technology (ICT) networks by improving energy efficiency. Key strategies include optimizing network infrastructure utilization through technologies like virtualization, improving equipment energy efficiency, and locating network resources closer to renewable energy sources. Measurement of energy savings is important to track progress towards a lower carbon "Green Network".
Green Cloud Computing: Emerging Technology (IRJET Journal)
This document discusses green cloud computing and how cloud infrastructure contributes to high energy consumption. It summarizes that while cloud computing provides cost and scalability benefits, the growing demand on data centers has increased energy usage and carbon emissions. However, the document also explains that cloud computing technologies like dynamic provisioning, multi-tenancy, high server utilization, and efficient data center design can help reduce the environmental impact and enable more sustainable "green" cloud computing through higher efficiency. Future research directions are needed to further optimize cloud resource usage and energy efficiency from a holistic perspective.
Sukhpal Singh Gill and Rajkumar Buyya, "Cloud Data Centers and the Challenge of Sustainable Energy", Cutter Business Technology Journal, Volume 31, Issue 4, Pages 1-2, Publisher Cutter, 2018.
Fog Computing – Enhancing the Maximum Energy Consumption of Data Servers (dbpublications)
Fog computing and IoT systems use devices on end-user premises as local servers. Here, we identify the scenarios in which running applications from NDCs is more energy-efficient than running the same applications from the MDC. A survey and analysis of various energy consumption factors, such as different flow variants and time variants with respect to the network equipment, yielded two energy-consumption use cases and their results. Parameters such as current load, Pmax, Cmax, and incremental energy were evaluated with respect to the system structure and various data-related parameters, leading to the conclusion that the NDC uses comparatively less energy than the MDC. The study reveals that NDCs, as part of the fog, complement MDCs in supporting their applications, especially in IoT scenarios where end users are the source data providers and server utilization can be maximized.
Enterprise-level Green ICT: Using Virtualization to Balance Energy Economics (IJARIDEA Journal)
Abstract— The computing industry has been a significant contributor to global warming ever since its inception. Performance maximization per unit cost has remained the prime focus of academic and industrial research alike, ignoring any environmental impacts in the process. However, the global energy crisis has inevitably pushed power and energy management up the priority list of computing design and management activities, for purely economic reasons. Green IT lays emphasis on including the dimensions of environmental sustainability, the offsets of energy efficiency, and the total cost of disposal and recycling. A green computing initiative must be adaptive and flexible enough to address problems that keep increasing in size and complexity over time. Cloud computing concepts can be applied to reduce e-waste generation. The service-oriented architecture lends itself to incorporating green computing as a process rather than a product. Reusability, extensibility, and flexibility are some of the key characteristics inherent to the cloud that directly help address the vertical-specific challenges of reducing energy consumption in the long run.
Keywords— Cloud computing, Electronic waste, Green Information Technology, Service oriented architecture.
Cloud infrastructure addresses two critical elements of a green IT approach (angeldresses)
Cloud infrastructure addresses two critical elements of a green IT approach: energy efficiency
and resource efficiency. Whether done in a private or public cloud configuration, as-a-service
computing will be greener for (at least) the following four reasons.
1. Resource virtualization, enabling energy and resource efficiencies.
Virtualization is a foundational technology for deploying cloud-based infrastructure that allows a
single physical server to run multiple operating system images concurrently. As an enabler of
consolidation, server virtualization reduces the total physical server footprint, which has inherent
green benefits.
From a resource-efficiency perspective, less equipment is needed to run workloads, which
proactively reduces data center space and the eventual e-waste footprint. From an energy-
efficiency perspective, with less physical equipment plugged in, a data center will consume less
electricity.
It's worth noting that server virtualization is the most widely adopted green IT project, implemented or planned at 90 percent of IT organizations globally going into 2011.
2. Automation software, maximizing consolidation and utilization to drive efficiencies.
The presence of virtualization alone doesn't maximize energy and resource efficiencies. To
rapidly provision, move, and scale workloads, cloud-based infrastructure relies on automation
software.
Combined with the right skills and operational and architectural standards, automation allows IT
professionals to make the most of their cloud-based infrastructure investment by pushing the
limits of traditional consolidation and utilization ratios.
The higher these ratios are, the less physical infrastructure is needed, which in turn maximizes
the energy and resource efficiencies from server virtualization.
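The arithmetic behind that claim is straightforward: at a fixed total workload, the number of physical servers needed scales inversely with the average utilization each server is allowed to reach. A sketch, with illustrative capacity and utilization figures (not from the source):

```python
import math

def servers_needed(total_load, server_capacity, target_utilization):
    """Physical servers required to carry a workload at a given utilization ceiling."""
    return math.ceil(total_load / (server_capacity * target_utilization))

# 400 units of load on servers of capacity 10 each:
print(servers_needed(400, 10, 0.15))  # 267 servers at 15% average utilization
print(servers_needed(400, 10, 0.60))  # 67 servers at 60% average utilization
```

Raising average utilization from the low levels typical of unconsolidated servers to the ratios automation makes practical cuts the physical footprint, and with it the embodied energy and e-waste, by roughly the same factor.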
3. Pay-per-use and self-service, encouraging more efficient behavior and life-cycle management.
The pay-as-you-go nature of cloud-based infrastructure encourages users to only consume what
they need and nothing more. Combined with self-service, life-cycle management will improve,
since users can consume infrastructure resources only when they need them -- and "turn off" these
resources with set expiration times.
In concert, the pay-per-use and self-service capabilities of cloud-based infrastructure drive
energy and resource efficiencies simultaneously, since users consume only the computing
resources they need, when they need them.
4. Multitenancy, delivering efficiencies of scale to benefit many organizations or business units.
Multitenancy allows many different organizations (public cloud) or many different business units
within the same organization (private cloud) to benefit from a common cloud-based
infrastructure.
By combining demand patterns across many organizations and business units, the peaks and
troughs of compute requirements flatten out. Combined with automation, the ratio between peak
and average loads becomes smaller, which in turn reduces the need for extra infrastructure.
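The peak-flattening effect of multitenancy can be seen in a toy demand model: provisioning each tenant for its own peak requires far more capacity than provisioning one shared pool for the peak of the combined load. The tenant demand figures below are made up for illustration:

```python
# Hourly demand for three tenants whose peaks fall at different times
tenants = [
    [8, 2, 1, 2],   # tenant A peaks in hour 0
    [1, 9, 2, 1],   # tenant B peaks in hour 1
    [2, 1, 2, 8],   # tenant C peaks in hour 3
]

# Siloed: each tenant provisions capacity for its own peak
siloed_capacity = sum(max(t) for t in tenants)

# Shared: one pool provisions for the peak of the combined load
combined = [sum(hour) for hour in zip(*tenants)]
shared_capacity = max(combined)

print(siloed_capacity, shared_capacity)  # 25 12
```

Because the three peaks do not coincide, the shared pool needs less than half the siloed capacity, which is exactly the peak-to-average flattening the text describes.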
Energy Efficient Resource Allocation in Cloud Computing (Divaynshu Totla)
This document discusses energy efficiency in cloud computing. It first provides background on the rising energy consumption of data centers due to increased cloud usage. It then discusses various approaches for improving energy efficiency in clouds, including virtualization and energy-aware scheduling algorithms like round-robin and first-come first-serve. The document proposes an energy-aware VM scheduler that uses these algorithms to minimize server usage and reduce energy consumption while meeting performance requirements. Overall the document analyzes the problem of high cloud energy usage and proposes a scheduler to improve efficiency through virtualization and algorithmic approaches.
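A minimal sketch of the kind of energy-aware VM scheduler described above: pack VMs onto as few hosts as possible (here with a first-fit-decreasing rule) so that the remaining hosts can stay powered down. This is one common way to realize such a scheduler, not the document's specific algorithm; host capacity and VM loads are illustrative:

```python
def schedule_vms(vm_loads, host_capacity):
    """First-fit-decreasing placement: fill existing hosts before waking new ones."""
    hosts = []       # each entry is the remaining capacity of one active host
    placement = {}   # vm index -> host index
    for vm, load in sorted(enumerate(vm_loads), key=lambda x: -x[1]):
        for h, free in enumerate(hosts):
            if load <= free:
                hosts[h] -= load
                placement[vm] = h
                break
        else:
            hosts.append(host_capacity - load)  # power on another host
            placement[vm] = len(hosts) - 1
    return placement, len(hosts)

placement, active_hosts = schedule_vms([5, 3, 7, 2, 4, 6], 10)
print(active_hosts)  # 3 active hosts instead of one per VM
```

With 27 units of total load on hosts of capacity 10, three hosts is the theoretical minimum, so the other hosts can be suspended, which is where the energy saving comes from.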
Building Blocks for Eco-efficient Cloud Computing for Higher Learning Institutions (Editor IJCATR)
Owning and managing a cloud-computing infrastructure, i.e. private data centers (DC), is a feasible way forward for an organization to ensure security of data when opting for cloud computing services. However, the cost associated with operating and managing a DC is a challenge because of the huge amount of power consumed and the carbon dioxide added to the environment. In particular, Higher Learning Institutions in Tanzania (HLIT) are among the institutions which need efficient computing infrastructure. This paper proposes eco-efficient cloud computing building blocks that ensure environment protection and optimal operational costs of a cloud computing framework that suffices HLIT computing needs. The proposed building blocks are in a form of power usage (renewable and nonrenewable); cloud deployment model and data center location; ambient climatic conditions and data center cooling; network coverage; quality of service and HLIT cloud software. The blocks are identified by considering HLIT computing requirements and challenges that exist in managing and operating cloud data centers. Moreover, this work identifies the challenges associated with optimization of resource usage in the proposed approach; and suggests related solutions as future work.
IRJET – Recent Trends in Green Cloud Computing (IRJET Journal)
- Green computing aims to reduce the environmental impact of computing through more efficient use of computing resources and lower energy consumption. As data storage needs increase, more servers are required, leading to higher power usage and carbon emissions.
- Virtualization and docker are recent trends in green computing that help optimize resource utilization. Virtualization allows multiple operating systems to run on a single machine, reducing the number of physical servers needed. Docker provides a more efficient way to distribute applications using containers.
- Adopting green computing practices like virtualization can help companies reduce computing costs, lower power consumption by 30-70%, and decrease their environmental footprint by using resources more efficiently.
Cloud computing offers users worldwide low-cost, on-demand services according to their requirements. In recent years, the rapid growth and service quality of cloud computing have made it an attractive technology for many tech companies. However, with the growing number of data center resources, high levels of energy are being consumed, putting more carbon emissions into the air. For instance, the estimated electric power consumption of a Google data center is equivalent to the energy requirement of a small city. Also, even though the virtualization of resources in cloud data centers may reduce the number of physical machines and the cost of hardware equipment, it is still constrained by the energy consumption issue. Energy efficiency has become a major concern for today's cloud data center researchers, alongside simultaneous improvement of cloud service quality and reduction of operating costs. This paper analyses and discusses the literature on enhancing energy efficiency in cloud computing data centers. The main objective is the best possible management of the physical machines that host the virtual ones in cloud data centers.
This document discusses green cloud computing and the need to develop optimized algorithms and applications to improve energy efficiency. It notes that while cloud computing provides economic benefits through shared infrastructure, the growing demand has increased energy consumption and carbon emissions. The document examines various technologies that enable green computing in clouds, such as virtualization, and proposes a green cloud architecture framework to improve efficiency from both user and provider perspectives. It stresses the importance of developing optimized algorithms and applications to minimize resource usage and route data to lower-cost energy regions.
Green Computing: A Methodology of Saving Energy by Resource Virtualization (IJCERT)
This document discusses green computing and methods for saving energy through resource virtualization. It begins with an abstract that introduces green computing and its goal of reducing energy consumption and environmental impact. The introduction provides more details on cloud computing and how green computing aims to make data center services more energy efficient. The document then discusses various energy saving approaches of green computing like virtualization, cloud computing, and N-computing systems. It also discusses challenges and the future of green computing, concluding that more energy can be saved through continued development and use of virtualization and other green computing methodologies.
To understand the potential advantage of cloud computing in more detail, it is important to look at the different factors contributing to a lower per-user carbon footprint. These factors apply across cloud providers in general and are relevant for many on-premise scenarios.
An Energy Optimization with Improved QoS Approach for Adaptive Cloud Resources (IJECEIAES)
In recent times, the utilization of cloud computing VMs has greatly increased in our day-to-day life due to the ample use of digital applications, network appliances, portable gadgets, and information devices. On these cloud computing VMs, numerous different schemes can be implemented, such as multimedia signal processing methods, so efficient performance of these VMs becomes an obligatory constraint, precisely for such methods. However, the large amount of energy consumed by these VMs and the reduction in their efficiency are the key issues faced by cloud computing organizations. Therefore, we introduce a dynamic voltage and frequency scaling (DVFS) based adaptive cloud resource re-configurability (ACRR) technique for cloud computing devices, which efficiently reduces energy consumption while performing operations in very little time. We demonstrate an efficient resource allocation and utilization technique that optimizes the model by reducing its different costs, as well as energy optimization techniques that reduce task loads. Our experimental outcomes show the superiority of our proposed ACRR model in terms of average run time, power consumption, and average power required over other state-of-the-art techniques.
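The DVFS technique that the abstract above relies on trades clock frequency for supply voltage. The standard dynamic-power model for CMOS logic is P = C·V²·f, so lowering voltage and frequency together reduces power superlinearly. A sketch with illustrative capacitance and voltage/frequency operating points (not from the paper):

```python
def dynamic_power(c_eff, voltage, freq_hz):
    """Dynamic CMOS power: P = C_eff * V^2 * f (watts)."""
    return c_eff * voltage ** 2 * freq_hz

# Scaling from 1.2 V / 2.0 GHz down to 0.9 V / 1.0 GHz
c = 1e-9  # effective switched capacitance (farads), illustrative
p_high = dynamic_power(c, 1.2, 2.0e9)
p_low = dynamic_power(c, 0.9, 1.0e9)
print(p_high, p_low)   # 2.88 W vs 0.81 W
print(p_low / p_high)  # ~28% of full power at half the frequency
```

Halving frequency alone would halve power, but because the lower frequency also permits a lower voltage, the combined saving is much larger; the scheduler's job is to apply this only where the added run time does not violate deadlines.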
An Energy Aware Resource Utilization Framework to Control Traffic in Cloud Networks (IJECEIAES)
Energy consumption in cloud computing occurs due to the unreasonable way in which tasks are scheduled, so energy-aware task scheduling is a major concern: wasteful energy consumption reduces profit margins and causes high carbon emissions, which is not environmentally sustainable. Hence, energy-efficient task scheduling solutions are required to attain variable resource management, live migration, minimal virtual machine design, overall system efficiency, reduced operating costs, increased system reliability, and environmental protection, all with minimal performance overhead. This paper provides a comprehensive overview of energy-efficient techniques and approaches and proposes an energy-aware resource utilization framework to control traffic and overloads in cloud networks.
This document discusses how information and communication technology (ICT) accounts for approximately 2% of global carbon dioxide emissions and how the carbon footprint of the ICT sector is projected to double by 2020. It then provides various ways for businesses to adopt green IT technologies like server virtualization, cloud computing, desktop virtualization, cloud data backup and disaster recovery to reduce costs and carbon footprint by increasing efficiency and replacing physical infrastructure with virtual solutions.
This chapter discusses approaches to green computing, including virtualization, server virtualization and consolidation, storage consolidation, and desktop virtualization. These approaches improve cost and energy efficiency through optimized use of computing and storage capacity, electricity, cooling, and real estate. Moving to thin clients and virtual desktops reduces energy consumption compared to traditional desktop computers. Server room upgrades are also discussed to improve cooling/ventilation systems and increase capacity for virtualized servers.
Improvement of signal coverage using wcdma signal repeater for 3 g systemsIjrdt Journal
Wireless communication has become an indispensable technology for the society. In broadband wireless data transmission technique, 3G cellular systems are expected to provide high data rate and less probability of error. This repeater finds application in areas of poor signal coverage and connectivity. The repeater consists of a patch panel antenna for receiving WCDMA signals from the base station and amplifying the signals using a wideband RF amplifier. The signals are then retransmitted to the weak coverage area using a directional Yagi-Uda antenna. The antenna characteristics such as return loss and VSWR are measured using a Network analyzer. The component of the repeater are mounted in a stand and the performance of the entire unit was evaluated using a WCDMA generator, act as a base station, transmitting at 869 MHz and 5dBm output power. A spectrum analyzer with WCDMA analyzer is used as a receiver, the RF signal level and constellation plots with error vector magnitude are determined
Applications of artificial Intelligence in Mechanical Engineering.pdfAtif Razi
Historically, mechanical engineering has relied heavily on human expertise and empirical methods to solve complex problems. With the introduction of computer-aided design (CAD) and finite element analysis (FEA), the field took its first steps towards digitization. These tools allowed engineers to simulate and analyze mechanical systems with greater accuracy and efficiency. However, the sheer volume of data generated by modern engineering systems and the increasing complexity of these systems have necessitated more advanced analytical tools, paving the way for AI.
AI offers the capability to process vast amounts of data, identify patterns, and make predictions with a level of speed and accuracy unattainable by traditional methods. This has profound implications for mechanical engineering, enabling more efficient design processes, predictive maintenance strategies, and optimized manufacturing operations. AI-driven tools can learn from historical data, adapt to new information, and continuously improve their performance, making them invaluable in tackling the multifaceted challenges of modern mechanical engineering.
Tools & Techniques for Commissioning and Maintaining PV Systems W-Animations ...Transcat
Join us for this solutions-based webinar on the tools and techniques for commissioning and maintaining PV Systems. In this session, we'll review the process of building and maintaining a solar array, starting with installation and commissioning, then reviewing operations and maintenance of the system. This course will review insulation resistance testing, I-V curve testing, earth-bond continuity, ground resistance testing, performance tests, visual inspections, ground and arc fault testing procedures, and power quality analysis.
Fluke Solar Application Specialist Will White is presenting on this engaging topic:
Will has worked in the renewable energy industry since 2005, first as an installer for a small east coast solar integrator before adding sales, design, and project management to his skillset. In 2022, Will joined Fluke as a solar application specialist, where he supports their renewable energy testing equipment like IV-curve tracers, electrical meters, and thermal imaging cameras. Experienced in wind power, solar thermal, energy storage, and all scales of PV, Will has primarily focused on residential and small commercial systems. He is passionate about implementing high-quality, code-compliant installation techniques.
Supermarket Management System Project Report.pdfKamal Acharya
Supermarket management is a stand-alone J2EE using Eclipse Juno program.
This project contains all the necessary required information about maintaining
the supermarket billing system.
The core idea of this project to minimize the paper work and centralize the
data. Here all the communication is taken in secure manner. That is, in this
application the information will be stored in client itself. For further security the
data base is stored in the back-end oracle and so no intruders can access it.
Optimizing Gradle Builds - Gradle DPE Tour Berlin 2024Sinan KOZAK
Sinan from the Delivery Hero mobile infrastructure engineering team shares a deep dive into performance acceleration with Gradle build cache optimizations. Sinan shares their journey into solving complex build-cache problems that affect Gradle builds. By understanding the challenges and solutions found in our journey, we aim to demonstrate the possibilities for faster builds. The case study reveals how overlapping outputs and cache misconfigurations led to significant increases in build times, especially as the project scaled up with numerous modules using Paparazzi tests. The journey from diagnosing to defeating cache issues offers invaluable lessons on maintaining cache integrity without sacrificing functionality.
Discover the latest insights on Data Driven Maintenance with our comprehensive webinar presentation. Learn about traditional maintenance challenges, the right approach to utilizing data, and the benefits of adopting a Data Driven Maintenance strategy. Explore real-world examples, industry best practices, and innovative solutions like FMECA and the D3M model. This presentation, led by expert Jules Oudmans, is essential for asset owners looking to optimize their maintenance processes and leverage digital technologies for improved efficiency and performance. Download now to stay ahead in the evolving maintenance landscape.
International Journal For Research & Development in Technology
Volume: 2, Issue: 2, AUGUST-2014 ISSN (Online): 2349-3585
Copyright 2014 IJRDT, www.ijrdt.org
Improved Performance through Carbon Aware Green Cloud Policy

Indra Priyadharshini. S (Assistant Professor), Niranjan. V (Student), Dr. Sankar Ram. N (Professor)
RMK College of Engineering and Technology, Chennai
ABSTRACT: Cloud computing and Green computing are two of the most rapidly emerging areas in information and communication technology (ICT), with immense applications across the globe. Owing to tremendous improvements in computer networks, people increasingly prefer network-based computing over in-house computing. Across business sectors, daily business and individual computing are migrating from individual hard drives to internet servers, so more and more companies are investing in large datacenters to host Cloud services. These datacenters not only consume huge amounts of energy but are also highly complex in their infrastructure. Several studies propose making these datacenters energy efficient using technologies such as virtualization and consolidation. These solutions, however, are mostly cost driven and do not directly address environmental sustainability in terms of CO2 emissions. Hence, in this work we propose a user-oriented Cloud architectural framework, the Carbon Aware Green Cloud Architecture, which addresses this environmental problem from the perspective of overall Cloud Computing resource usage.
Keywords: Cloud Computing, Green IT, Datacenter
I. INTRODUCTION
Cloud Computing provides a highly scalable and cost-
effective computing infrastructure for running IT applications
such as High Performance Computing (HPC), Web and
enterprise applications which require ever-increasing
computational resources. The emergence of Cloud Computing
has rapidly changed the paradigm of ownership-based
computing approach to subscription-oriented computing by
providing access to scalable infrastructure and services on-
demand. The cloud users can store, access, and share any
amount of information online. Similarly, small and medium enterprises and organizations do not have to worry about purchasing, configuring, administering, and maintaining their own computing infrastructure. They can instead focus on improving their core competencies by exploiting a number of Cloud Computing benefits such as low cost, datacenter expertise, on-demand computing resources, and faster and cheaper software development capabilities. Cloud computing has only just begun to catch on and is expected to spread rapidly.
However, Clouds are essentially datacenters hosting application services offered on a subscription basis, and they require high energy usage to maintain their operations. Today, a typical datacenter with 1000 racks needs about 10 megawatts of power to operate. Thus, for a datacenter, the energy cost is a significant component of its operating and upfront costs. In addition, in April 2007 it was estimated that the Information and Communication Technologies (ICT) industry generates about 2% of total global CO2 emissions. In other words,
the internet releases around 300 million tons of CO2, which is
equivalent to the total consumption of coal, oil and gas by
some of the smaller countries in one year. According to a
report published by the European Union, a decrease in
emission volume of 15-30% is required before the year 2020
to keep the global temperature increase below 2°C. Thus, the
rapidly growing energy consumption and CO2 emissions of Cloud infrastructure have become a key environmental concern.
Hence, energy efficient solutions are required to ensure the
environmental sustainability of this new computing paradigm.
Up to now, as datacenters are the major elements of Cloud
Computing resources, most solutions primarily focus on
minimizing the energy consumption of datacenters which
indirectly minimizes the CO2 emission. However, although
such solutions can decrease the energy consumption to a great
degree, they do not ensure the minimization of CO2 emissions
as a whole. For example, consider a Cloud datacenter which
uses cheap energy generated by coal. The usage of such a
datacenter will only increase CO2 emissions.
Therefore, we propose a user-oriented Carbon Aware Green
Cloud Architecture for reducing the carbon footprint of Cloud
Computing in a wholesome manner without sacrificing the
Quality of Service (QoS) (such as performance,
responsiveness and availability) offered by multiple Cloud
providers. Our architecture is designed such that it provides
incentives to both users and providers to utilize and deliver the
most “Green" services respectively. Our evaluation results in
the context of IaaS Clouds show that a large amount of CO2
savings can be gained using our proposed architecture.
II. RELATED WORK
Most works improve the energy efficiency of Clouds by
addressing the issue within a particular datacenter and not
from the usage of Clouds as a whole. They focus on
scheduling and resource management within a single
datacenter to reduce the amount of active resources executing
the workload. The consolidation of Virtual Machines (VMs),
VM migration, scheduling, demand projection, heat
management, temperature aware allocation, and load
balancing are used as basic techniques for minimizing energy
consumption. Virtualization plays an important role in these
techniques due to its several benefits such as consolidation,
live migration and performance isolation.
Some works also propose frameworks to enable the energy
efficiency of Clouds from user and provider perspectives.
From the provider perspective, Green Cloud architecture aims
to reduce virtualized datacenter energy consumption by
supporting optimized VM migration and VM placement.
A similar approach is presented by Green Open Cloud (GOC). GOC is designed for next-generation Cloud datacenters that support facilities like advance reservation. GOC aggregates the workload by negotiating with users so that idle servers can be switched off for longer.
Although these works maximize the energy efficiency of Cloud datacenters, they do not consider CO2 emission, which measures the environmental sustainability of Cloud Computing. Even if a cloud provider has used the most energy efficient solutions for building its datacenter, it is still not assured that Cloud Computing will be carbon efficient. Greenpeace indicates that current datacenters are really not environmentally friendly, as Cloud providers are more concerned about reducing energy cost than CO2 emission. For instance, the Google datacenter in Lenoir, NC, USA, uses 50.5% dirty energy generated by coal. Thus, our
previous work proposes policies to simultaneously maximize
the Cloud provider's profit and minimize the CO2 emission of
its non-virtualized datacenters. Let us consider a similar multi-
datacenter scenario, but with a different perspective of
leveraging green energy by capping the brown energy. In
contrast, here we propose an architectural framework which
focuses on reducing the carbon footprint of Cloud Computing
as a whole. Specifically, we consider all the elements of Cloud computing, including Software, Platform, and Infrastructure as a Service. We also present a carbon aware policy for IaaS providers.
III. THE GREEN CLOUD ARCHITECTURE
We propose Carbon Aware Green Cloud Architecture, which
considers the goals of both users and providers while reducing
the CO2 emission of Clouds. Its elements include:
1. Third Party: Green Offer Directory and Carbon
Emission Directory listing available green Cloud services and
their energy efficiency respectively.
2. User: Green Broker accepting Cloud service
requests (i.e. software, platform, or infrastructure) and
selecting the best green Cloud provider.
3. Provider: Green Middleware enabling the most
carbon efficient operation of Clouds. The components of this
middleware vary depending on the Cloud offerings (i.e. SaaS,
PaaS, or IaaS).
Fig. 1: Carbon aware Green Cloud Architecture
3.1 Third Party: Green Offer Directory and Carbon
Emission Directory
We propose two new elements, i.e. Green Offer Directory and
Carbon Emission Directory, which are essential to enforce the
green usage of Cloud Computing. Governments have already
introduced energy ratings for datacenters and various laws to
cap the energy usage of these datacenters. There is also
increasing awareness on the impact of greenhouse gases on
climate change. Therefore, users will likely prefer using Cloud
services of providers which ensure the minimum carbon
footprint. Cloud providers can also use these directories as an
advertising tool to attract more users. Hence, the introduction
of such directories is practical in the current context of Cloud
Computing. Cloud providers register their services in the form
of `Green Offers' to a Green Offer Directory which is accessed
by Green Broker. These offers consist of the type of service
provided, pricing, and time when it can be accessed for the
least CO2 emission.
The Carbon Emission Directory maintains data related to the
energy efficiency of Cloud services, which include the Power
Usage Effectiveness (PUE) and cooling efficiency of Cloud
datacenters which are providing the service, network cost, and
CO2 emission rate of electricity. Hence, Green Broker can get
the current status of energy parameters for using various
Cloud services from Carbon Emission Directory.
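As an illustration of how a Green Broker might consume this directory, the following minimal Python sketch models emission records and a greenest-provider lookup. The record fields, class names, and the ranking rule (lower CO2 rate times PUE is greener) are illustrative assumptions, not an interface defined in this paper:

```python
from dataclasses import dataclass

@dataclass
class EmissionRecord:
    provider: str
    co2_rate: float      # CO2 emission rate of local electricity (kg/kWh)
    pue: float           # Power Usage Effectiveness of the datacenter
    network_cost: float  # relative network cost to reach this datacenter

class CarbonEmissionDirectory:
    def __init__(self):
        self._records = {}

    def publish(self, record):
        # Providers advertise/refresh their energy parameters here.
        self._records[record.provider] = record

    def greenest(self):
        # The broker ranks providers; here, lower co2_rate * pue is greener.
        return min(self._records.values(), key=lambda r: r.co2_rate * r.pue)

directory = CarbonEmissionDirectory()
directory.publish(EmissionRecord("dc-coal", 0.9, 1.8, 1.0))
directory.publish(EmissionRecord("dc-hydro", 0.1, 1.2, 1.0))
print(directory.greenest().provider)  # the hydro-powered datacenter wins
```

In practice the ranking would combine all directory parameters (network cost, cooling efficiency, emission rate) rather than a single product, but the publish/query structure is the same.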
3.2 User: Green Broker
Fig 2(a): Green Broker
Fig 2(b): Green Middleware components for each cloud
service
Green Broker (Figure 2) has a similar responsibility to a typical Cloud broker, i.e. to lease Cloud services on behalf of users
and schedule their applications. Its first layer comprises Cloud
request services that analyze the requests and their QoS
requirements. Its second layer calculates the cost and carbon
footprint of leasing particular Cloud services based on
information about various Cloud offerings and current CO2
emission factors obtained from Green Offer Directory and
Carbon Emission Directory respectively. With these
calculations, Green Policies make the decisions of leasing
Cloud services. If no exact match is found for a request,
alternate `Green Offers' are suggested to users by Cloud
Request Services. The carbon footprint of a user request
depends on the type of Cloud service it requires, i.e. SaaS,
PaaS and IaaS, and is computed as the sum of the CO2 emission due to data transfer and the CO2 emission due to service execution at the datacenter. SaaS and PaaS requests use CO2 emission per second (CO2PS) to reflect long-term usage, while an IaaS request uses total CO2 emission, since its data transfer occurs mostly once.
SaaS and PaaS Request (CO2 emission per second):

CO2PS(SaaS/PaaS) = (rCO2dT × EdT × adT) + (rCO2 × (1/DCiE) × Eserv)    (1)
Where rCO2dT is the CO2 emission rate per joule of energy
spent from the user's machine to the datacenter, EdT is the
per-bit energy consumption of data transfer, adT is the data
bits transferred per second, rCO2 is the CO2 emission rate
where the datacenter is located, DCiE is the power efficiency
of the datacenter defined as the fraction of total power
dissipated that is used for IT resources, and Eserv is the
energy spent per second by the server for executing the user's
request. The total power dissipated by a Cloud provider is
used not only for computers, but also for other purposes,
including power conditioning, HVAC (Heating, Ventilating,
and Air Conditioning), lighting, and wiring. Therefore, DCiE
is the most appropriate parameter for selecting Cloud
providers.
IaaS Request (CO2 emission):

CO2(IaaS) = (rCO2dT × EdT × IOdata) + (rCO2 × (1/DCiE) × Eserv × Vtime)    (2)
Where IOdata is the data transferred to run the application on the VM leased from the Cloud, and Vtime is the time for which the VM is active.
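Equations (1) and (2) translate directly into code. The following sketch computes both emission figures; the parameter names mirror the symbols in the text, while the example numeric values and units are assumptions for illustration, not prescribed by the paper:

```python
def co2ps_saas_paas(r_co2_dt, e_dt, a_dt, r_co2, dcie, e_serv):
    """Eq. (1): CO2 emission per second for a SaaS/PaaS request.

    r_co2_dt: CO2 emission rate per joule on the user-to-datacenter path
    e_dt:     per-bit energy consumption of data transfer (J/bit)
    a_dt:     data bits transferred per second (bit/s)
    r_co2:    CO2 emission rate where the datacenter is located
    dcie:     datacenter power efficiency, 0 < DCiE <= 1
    e_serv:   energy spent per second by the server (J/s)
    """
    return r_co2_dt * e_dt * a_dt + r_co2 * (1.0 / dcie) * e_serv

def co2_iaas(r_co2_dt, e_dt, io_data, r_co2, dcie, e_serv, v_time):
    """Eq. (2): total CO2 emission for an IaaS request.

    io_data: total data transferred to the leased VM (bits)
    v_time:  time for which the VM is active (s)
    """
    return r_co2_dt * e_dt * io_data + r_co2 * (1.0 / dcie) * e_serv * v_time

# Illustrative values only: a stream of 1 Mbit/s to a datacenter with DCiE 0.5.
rate_per_second = co2ps_saas_paas(1e-7, 1e-9, 1e6, 2e-7, 0.5, 100.0)
total_for_vm = co2_iaas(1e-7, 1e-9, 1e9, 2e-7, 0.5, 100.0, 3600.0)
```

Note how a lower DCiE inflates both terms' datacenter component: the 1/DCiE factor charges the request for cooling, lighting, and power-conditioning overhead on top of the IT load.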
3.3 Provider: Green Middleware
To support carbon aware Cloud Computing, a Cloud provider
must implement “Green" conscious middleware at various
layers depending on the type of Cloud service offered (SaaS,
PaaS, or IaaS) (Figure 2) as follows:
SaaS Level: SaaS providers mainly offer software installed in
their own datacenters or resources leased from IaaS providers.
Therefore, they require Power Capping component to limit the
usage of software services by each user. This is especially important for social networking and gaming applications, where users are completely unaware of the impact of their actions on environmental sustainability. SaaS providers can also offer
Green Software Services deployed on carbon efficient
datacenters with less replications.
PaaS Level: PaaS providers in general offer platform services for application development and deployment. Thus, to ensure energy efficient development of applications, relevant components are required, such as a Green Compiler that compiles applications with the minimum carbon footprint, and carbon measuring tools that let users monitor the carbon footprint of their applications.
IaaS level: IaaS providers play the most crucial role in the
success of Green Cloud Architecture since IaaS not only
offers independent infrastructure services, but also supports other services (SaaS and PaaS) offered by Clouds. They use the latest technologies for IT and cooling systems to have the most energy efficient infrastructure. By using virtualization and consolidation, energy consumption is further reduced by switching off unutilized servers. Energy and temperature sensors are installed to calculate the current energy efficiency of each IaaS provider and their datacenters. This information is
advertised regularly by Cloud providers in the Carbon
Emission Directory. Various green scheduling and resource provisioning policies will ensure minimum energy usage. In addition, IaaS providers can design attractive 'Green Offers' and pricing schemes that provide incentives for users to use their services during off-peak or maximum energy efficiency hours.
Cloud datacenters have different CO2 emission rates and energy costs based on their locations. Each datacenter updates this data in the Carbon Emission Directory to facilitate carbon efficient scheduling. For this study, we consider three CO2 emission related parameters: CO2 emission rate (kg/kWh) (rCO2i), average DCiE (Ieffi), and VM power efficiency (VMeffi). The VM power efficiency is the amount of power dissipated by a fully active VM running at maximum utilization level. In the Green Offer Directory, IaaS providers specify the maximum number of VMs that can be initiated at a particular time for achieving the highest energy efficiency, owing to the variation of datacenter efficiency with time and load and the power capping technologies used within the datacenter.
IV. CARBON EFFICIENT GREEN POLICY (CEGP)
We develop the Carbon Efficient Green Policy (CEGP) for Green Broker to periodically select the Cloud provider with the minimum carbon footprint and initiate VMs to run the jobs (Algorithm 1). Based on user requests at each scheduling interval, Green Broker obtains information from the Carbon Emission Directory about the current CO2 emission related parameters of providers, as described in Section III. The QoS requirements of a job j are defined by a tuple (dj, nj, ej, fj^m), where dj is the deadline to complete job j, nj is the number of CPUs required for job execution, and ej is the job execution time when operating at the CPU frequency fj^m. CEGP then sorts the incoming jobs based on Earliest Deadline First (EDF), before sorting the Cloud datacenters based on their carbon footprint. CEGP schedules jobs to IaaS Clouds in a greedy manner to reduce the overall CO2 emission. For IaaS providers, CEGP uses three main factors to calculate the CO2 emission: CO2 emission rate, DCiE, and CPU power efficiency. The carbon footprint of an IaaS Cloud i is given by:

rCO2i × (1/Ieffi) × (1/VMeffi)

where VMeffi can be calculated by Cloud providers based on the proportion of resources on a server utilized by the VM, using tools such as PowerMeter. If a VM consumes the power equivalent of a processor running at frequency level fi, then we can use the following power model to calculate its power efficiency: βi + αi(fi)^3, where βi is the static power dissipated by the CPU and αi is the proportionality constant. Therefore, the approximate energy efficiency of the VM is:

VMeffi = fi / (βi + αi(fi)^3)

If job j executes at CPU frequency f, then its CO2 emission will be minimum when it is allocated to the datacenter with the minimum CO2 emission rate rCO2i, maximum DCiE value Ieffi, and maximum CPU power efficiency VMeffi. CEGP then assigns jobs to VMs initiated on each Cloud datacenter according to this ordering.
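The CEGP ordering above (EDF-sorted jobs assigned greedily to datacenters sorted by rCO2i × (1/Ieffi) × (1/VMeffi)) can be sketched as follows. The dictionary-based job and datacenter records and the simple capacity check are simplifying assumptions; the full policy also enforces deadlines and the per-provider VM limits advertised in the Green Offer Directory:

```python
def vm_efficiency(f, alpha, beta):
    # Approximate VM power efficiency: VMeff = f / (beta + alpha * f**3)
    return f / (beta + alpha * f ** 3)

def carbon_footprint(dc):
    # Footprint factor of IaaS Cloud i: rCO2_i * (1/Ieff_i) * (1/VMeff_i)
    return dc["r_co2"] * (1.0 / dc["ieff"]) * (1.0 / dc["vmeff"])

def cegp_schedule(jobs, datacenters):
    """Greedy CEGP sketch: EDF-sorted jobs onto carbon-sorted datacenters."""
    jobs = sorted(jobs, key=lambda j: j["deadline"])    # Earliest Deadline First
    dcs = sorted(datacenters, key=carbon_footprint)     # greenest datacenter first
    schedule = {}
    for job in jobs:
        for dc in dcs:
            if dc["free_cpus"] >= job["cpus"]:          # simple capacity check
                dc["free_cpus"] -= job["cpus"]
                schedule[job["id"]] = dc["name"]
                break
    return schedule
```

With this ordering, urgent jobs claim the carbon-cheapest capacity first, and later jobs spill over to dirtier datacenters only when the green ones are full, which is exactly the behavior the evaluation in Section V measures against EST.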
V. PERFORMANCE EVALUATION AND RESULTS
Experimental Scenarios: We compare the carbon efficiency
of CEGP with a performance-based scheduling algorithm
(Earliest Start Time (EST)) using two metrics: average energy
consumption and CO2 emissions. EST schedules jobs to the
datacenter where jobs can start as early as possible with the
least waiting time. The average energy consumption shows the
amount of energy saved by our green framework using CEGP
compared to an existing approach using EST which just
focuses on performance, whereas the average CO2 emission
shows its corresponding environmental impact. We examine
two experimental scenarios: 1) comparison of CEGP with EST
and 2) effect of the relationship between CO2 emission rate and
datacenter power efficiency DCiE. The first scenario
demonstrates how our proposed architecture can achieve
higher carbon efficiency. The second scenario reveals how the
relationship between CO2 emission rate and DCiE can affect
the achievement of carbon efficiency. Hence, we consider two
types of relationship between CO2 emission rate and DCiE: 1) the datacenter with the highest CO2 emission rate has the highest DCiE (HH), and 2) the datacenter with the highest CO2 emission rate has the lowest DCiE (HL).
5.1 Comparison of CEGP with Performance-based
Algorithm (EST)
We compare CEGP with EST for datacenters with HH
configuration. The effect of job urgency on energy
consumption and CO2 emission is prominent. As the
percentage of HU (High Urgency) jobs with more urgent
(shorter) deadlines increases, the energy consumption (Figure
3(a)) and CO2 emission (Figure 3(b)) also increase due to
more urgent jobs running on datacenters with lower DCiE
value and at the highest CPU frequency to avoid deadline
violations. It is clear that our proposed architecture using
CEGP (EDF-CEGP) can reduce up to 23% of the energy
consumption (Figure 3(a)) and 25% of the CO2 emission
(Figure 3(b)) compared to an existing approach using EST
(EDF-EST) across all datacenters. CEGP is also able to complete a very similar amount of workload as EST (Figure 3(c)), but with much less energy consumption and CO2
emission. This highlights the importance of considering the
DCiE and CO2 emission related factors in achieving the
carbon efficient usage of Cloud Computing. In particular,
CEGP can reduce energy consumption (Figure 3(a)) and CO2
emission (Figure 3(b)) even more when there are more LU
jobs with less urgent (longer) deadline.
Fig.3. Comparison of CEGP with performance-based EST
5.2 Effect of Relationship between CO2 Emission Rate and
Datacenter Power Efficiency DCiE
This experiment analyzes the impact of different
configurations (HH and HL) of datacenters with respect to
CO2 emission rate and datacenter power efficiency DCiE
based on 40% of high urgency jobs. In both HH and HL
configurations, CEGP reduces CO2 emission and energy
consumption between 23% and 25% (Figure 4(a) and 4(b)).
Therefore, we infer that for other configurations, we will also
achieve similar carbon efficiency in Cloud Computing by
using CEGP. Moreover, in Figure 4(a), there is a decrease in
energy consumption of all the Cloud datacenters from HH to
HL configuration by using EST, while there is almost no
corresponding decrease by using CEGP. This shows how important the consideration of global factors such as DCiE and CO2 emission rate is for improving the carbon footprint of Cloud Computing.
Fig.4. Effect of relationship between CO2 emission rate and
DCiE
VI. CONCLUSION
In this paper, we present a Carbon Aware Green Cloud
Architecture to improve the carbon footprint of Cloud
Computing taking into account its global view. Our
architecture is designed such that it provides incentives to both
users and providers to utilize and deliver the most “Green"
services respectively. Therefore, it embeds components such
as Green broker from user side to ensure the execution of their
applications with the minimum carbon footprint. Similarly,
from provider side, we propose features for next generation
Cloud providers who will publish the carbon footprint of their
services in public directories and provide `Green Offers' to
minimize their overall energy consumption. We also propose a
Carbon Efficient Green Policy (CEGP) for Green broker
which schedules user application workloads with urgent deadlines on Cloud datacenters with higher energy efficiency and lower carbon footprint. Further, the simulation-based
evaluation of our architecture is done in multiple IaaS Cloud
provider scenarios. We compare two scheduling approaches to
prove how our proposed architecture helps in improving
carbon and energy footprint of Cloud Computing.
Performance evaluation results show how our proposed
architecture using a Green Policy CEGP can save up to 23%
energy while improving the carbon footprint by about 25%.
Therefore, these promising results show that by using our
architectural framework, the carbon footprint and energy
consumption of Cloud Computing can be improved. In the
future, we will investigate different `Green Policies' for Green
broker and also how Cloud providers can design various
`Green Offers' based on their internal power efficiency
techniques such as VM consolidation and migration. We will
also conduct experiments for our architecture using real
Clouds.
REFERENCES
[1]. Saurabh Kumar Garg, Chee Shin Yeo and Rajkumar Buyya: Green Cloud Framework for Improving Carbon Efficiency of Clouds. Institute of High Performance Computing, Singapore
[2]. Beloglazov, A., Buyya, R., Lee, Y., Zomaya, A.: A Taxonomy and Survey of Energy-Efficient Data Centers and Cloud Computing Systems. Advances in Computers, M. Zelkowitz (editor). Elsevier, San Francisco, USA (2011)
[3]. Bohra, A., Chaudhary, V.: VMeter: Power modelling for virtualized clouds. In: Proc. of 24th IEEE IPDPS Workshops. Atlanta, USA (2010)
[4]. Cameron, K.: Trading in Green IT. Computer 43(3), 83-85 (2010)
[5]. Chen, Y., et al.: Managing server energy and operational costs in hosting centers. ACM SIGMETRICS Performance Evaluation Review 33(1), 303-314 (2005)
[6]. Feitelson, D.: Parallel workloads archive. http://www.cs.huji.ac.il/labs/parallel/workload (2011)
[7]. Garg, S., Yeo, C., Anandasivam, A., Buyya, R.: Environment-conscious scheduling of HPC applications on distributed cloud-oriented data centers. Journal of Parallel and Distributed Computing 71(6), 732-749 (2011)
[8]. Gartner: Gartner Estimates ICT Industry Accounts for 2 Percent of Global CO2 Emissions. http://www.gartner.com/it/page.jsp?id=503867 (Apr 2007)
[9]. Greenberg, S., et al.: Best practices for data centers: Results from benchmarking 22 data centers. In: ACEEE Summer Study on Energy Efficiency in Buildings (2006)
[10]. Greenpeace International: Make IT Green: Cloud computing and its contribution to climate change (2010)
[11]. Irwin, D., Grit, L., Chase, J.: Balancing risk and reward in a market-based task service. In: Proc. of 13th IEEE HPDC. Honolulu, USA (2004)
[12]. Kurp, P.: Green computing. Commun. ACM 51, 11-13 (2008)
[13]. Le, K., et al.: Managing the cost, energy consumption, and carbon footprint of internet services. ACM SIGMETRICS Perf. Eval. Review 38(1), 357-358 (2010)