This document discusses the emergence of cloud libraries and cloud computing applications for science and technology (S&T) libraries. It begins by describing the evolution from traditional paper-based libraries to digital libraries without physical walls. It then defines cloud computing and explains how many popular web services already rely on it. The document outlines the main types of cloud services, including SaaS, PaaS, IaaS, and DaaS, and provides examples of how libraries currently use cloud computing applications. It raises questions about data ownership, costs, and technical requirements for libraries adopting cloud-based systems and services.
The Reality of the Cloud: Implications of Cloud Computing for Mobile Library ... (University of Missouri)
The document discusses how cloud computing enables mobile library technologies and services. It defines cloud computing and explains how software, products, and data can reside in the cloud. Examples are given of library services, products, and data that are being accessed remotely through cloud-based platforms. Challenges of cloud computing like loss of local control and data security are addressed, but benefits like increased access and lower costs are also outlined. The cloud allows the library to become a platform for sharing content and data between librarians and patrons on their mobile devices.
This document discusses big data and cloud computing. It introduces cloud storage and computing models. It then discusses how big data requires distributed systems that can scale out across many commodity machines to handle large volumes and varieties of data with high velocity. The document outlines some famous cloud products and their technologies. Finally, it provides an overview of the company's focus on enterprise big data management leveraging cloud technologies, and lists some of its cloud products and services including data storage, object storage, MapReduce and compute cloud services.
The document discusses Digital Asset Management (DAM) and defines it as the management of digital assets and their associated metadata. It describes how DAM involves ingesting, cataloging, storing, retrieving and distributing digital assets like photos, videos and music. It also discusses how DAM systems provide features for classifying, indexing, versioning and securing digital assets to transform them into managed assets.
We are in the midst of a computing revolution. As the cost of provisioning hardware and software stacks grows, and the cost of securing and administering these complex systems grows even faster, we're seeing a shift towards computing clouds. For cloud service providers, there is efficiency from amortizing costs and averaging usage peaks. Internet portals like Yahoo! have long offered application services, such as email for individuals and organizations. Companies are now offering services such as storage and compute cycles, enabling higher-level services to be built on top. In this talk, I will discuss Yahoo!'s vision of cloud computing, and describe some of the key initiatives, highlighting the technical challenges involved in designing hosted, multi-tenanted data management systems.
Cloud computing provides on-demand access to shared computing resources like networks, servers, storage, applications and services over the internet. It delivers these resources as a service and allows users to access applications from anywhere in the world. Cloud computing benefits libraries by increasing visibility and accessibility of collections, reducing duplication of effort, and making resources available remotely. It also saves libraries money by charging incrementally based on usage. Libraries can utilize various cloud services and providers to host websites, backup media collections, and build digital library systems accessible to users.
Richard McDougall discusses trends in big data and frameworks for building big data applications. He outlines the growth of data, how big data is driving real-world benefits, and early adopter industries. McDougall also summarizes batch processing frameworks like Hadoop and Spark, graph processing frameworks like Pregel, and real-time processing frameworks like Storm. Finally, he discusses interactive processing frameworks such as Hive, Impala, and Shark and how to unify the big data platform using virtualization.
The document is a presentation about scaling clouds for startups. It discusses the speaker's experience using various cloud providers and platforms. It provides an overview of cloud computing models and components. It also covers best practices like automating operations, architecting for failure and elasticity, and controlling cloud costs through metrics like utilization and reserved instances. The presentation emphasizes that choosing the right cloud stack and provider depends on the application needs and that planning for failures is essential given the cloud's dynamic nature.
This document discusses cloud computing concepts, technologies, and business implications. It provides an introduction to cloud models including Infrastructure as a Service (IaaS), Platform as a Service (PaaS), and Software as a Service (SaaS). It also demonstrates cloud capabilities through examples of cloud models, data and computing models using MapReduce, and graph processing using Amazon Elastic MapReduce. The document discusses enabling cloud technologies including virtualization, multi-core architectures, and web services interfaces.
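The MapReduce data and computing model mentioned above can be illustrated with a minimal word count, the canonical MapReduce example. This is a single-process sketch in plain Python showing only the map, shuffle, and reduce data flow; a real framework such as Hadoop or Amazon Elastic MapReduce distributes these phases across many machines:

```python
from collections import defaultdict

def map_phase(documents):
    """Emit a (word, 1) pair for every word in every document."""
    for doc in documents:
        for word in doc.lower().split():
            yield (word, 1)

def shuffle_phase(pairs):
    """Group values by key, as the framework does between map and reduce."""
    grouped = defaultdict(list)
    for key, value in pairs:
        grouped[key].append(value)
    return grouped

def reduce_phase(grouped):
    """Sum the counts for each word."""
    return {word: sum(counts) for word, counts in grouped.items()}

docs = ["cloud computing in libraries", "cloud storage and cloud services"]
counts = reduce_phase(shuffle_phase(map_phase(docs)))
print(counts["cloud"])  # 3
```

Because each map call and each per-key reduce is independent, the framework can run them in parallel on commodity machines, which is what gives the model its scalability.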
This document discusses a presentation on cloud computing concepts, technologies, and business implications. It provides an outline of the talk including an introduction to cloud models like IaaS, PaaS, and SaaS. It also discusses demonstrating cloud capabilities through examples and a case study of applying cloud computing to a real business application. The speakers' backgrounds in cloud computing are introduced and the document concludes with an introduction stating that cloud computing represents a new golden era in computing.
A novel solution of distributed memory NoSQL database for cloud computing (João Gabriel Lima)
This document proposes a new distributed memory NoSQL database architecture called CDSA for cloud computing. CDSA aims to improve the performance of querying and storing large amounts of distributed data in the cloud. The architecture stores redundant copies of data across nodes to ensure high availability and reliability. It also allows nodes to be dynamically added or removed without interrupting service.
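The redundant-copy idea described above can be sketched as a toy in-memory store; this is an illustrative assumption, not CDSA's actual design, which the summary does not detail. Each key is hashed to a starting node and written to the next N nodes in a fixed ring order, so a read survives the loss of a node:

```python
import hashlib

class ReplicatedStore:
    """Toy in-memory key-value store that writes each key to N replica nodes."""
    def __init__(self, node_names, replicas=2):
        self.nodes = {name: {} for name in node_names}
        self.replicas = replicas

    def _replica_nodes(self, key):
        # Hash the key to pick a starting node, then take the next
        # `replicas` nodes in a fixed ring order.
        names = sorted(self.nodes)
        start = int(hashlib.md5(key.encode()).hexdigest(), 16) % len(names)
        return [names[(start + i) % len(names)] for i in range(self.replicas)]

    def put(self, key, value):
        for name in self._replica_nodes(key):
            self.nodes[name][key] = value

    def get(self, key):
        # Any surviving replica can answer; try them in order.
        for name in self._replica_nodes(key):
            if key in self.nodes[name]:
                return self.nodes[name][key]
        return None

store = ReplicatedStore(["node-a", "node-b", "node-c"], replicas=2)
store.put("patron:42", {"name": "Ada"})
# Simulate one replica failing by dropping its data entirely.
failed = store._replica_nodes("patron:42")[0]
store.nodes[failed].clear()
print(store.get("patron:42"))  # still readable from the surviving replica
```

Dynamically adding or removing nodes, as the architecture claims to support, would additionally require re-hashing or consistent hashing so that only a fraction of keys move; that part is omitted here.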
The document summarizes Terapot, a commercial email archiving system that uses Hadoop. It discusses how Terapot addresses the challenges of archiving massive amounts of email data at low cost and high scalability. Terapot leverages Hadoop's distributed architecture for crawling, indexing, and searching emails across thousands of servers. Key components include batch processing for archiving, real-time indexing, distributed search, and analysis tools that mine the archived email data.
AWS Partner Presentation - PetaByte Scale Computing on Amazon EC2 with BigDat... (Amazon Web Services)
The document discusses challenges with processing and storing large amounts of data at petabyte scales. Traditional relational databases do not scale well for this amount of data. The solution proposed uses Hadoop for distributed processing of large datasets and Amazon S3 and NoSQL databases for scalable storage. A software-based storage engine called iMoveS is suggested to intelligently migrate data between different storage options based on policies and access patterns to make management easier and improve performance, availability, and lower costs.
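The policy-driven migration idea behind an engine like iMoveS can be sketched as a rule table consulted per data object. The tiers, thresholds, and rule shapes below are illustrative assumptions for exposition, not details taken from the presentation:

```python
# Hypothetical policy table: data moves to cheaper storage as it
# ages and cools off. First matching rule wins.
POLICIES = [
    {"max_age_days": 7,    "min_daily_reads": 0,  "tier": "ssd"},
    {"max_age_days": 90,   "min_daily_reads": 10, "tier": "ssd"},
    {"max_age_days": 90,   "min_daily_reads": 0,  "tier": "object-store"},
    {"max_age_days": None, "min_daily_reads": 0,  "tier": "archive"},
]

def choose_tier(age_days, daily_reads):
    """Return the first tier whose policy matches the object's age and heat."""
    for rule in POLICIES:
        within_age = rule["max_age_days"] is None or age_days <= rule["max_age_days"]
        if within_age and daily_reads >= rule["min_daily_reads"]:
            return rule["tier"]
    return "archive"

print(choose_tier(age_days=2, daily_reads=500))  # ssd (new and hot)
print(choose_tier(age_days=60, daily_reads=1))   # object-store (cooling)
print(choose_tier(age_days=400, daily_reads=0))  # archive (cold)
```

A real engine would evaluate such rules in the background and copy the object between backends (e.g. local disk, S3, a NoSQL store) before deleting the source, so that access patterns drive placement rather than manual administration.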
What do Azure, AWS, IBM, and Dell EMC ECS have in common? All are leveraging Nasuni UniFS® for scalable file storage and cross-site synchronisation. Nasuni offers the first global file system designed for private and public cloud object storage, so it scales without the limits of legacy controller-based file systems. Learn how Nasuni consolidates distributed file servers and NAS, enables high-speed file collaboration across any number of locations, improves file recovery points and times, simplifies DR, and accelerates business growth. All while reducing costs up to 60% compared to traditional file infrastructure.
The document summarizes insights from a 2008 workshop between researchers and leaders in the commercial cloud computing community. It discusses how the researchers had to revise their definition of cloud computing based on the perspectives shared by speakers from IBM, Microsoft, and eBay. While some current research topics seemed less important to the companies, the speakers also brought up new questions for researchers regarding challenges like managing infrastructure at massive scales.
Open Source Paving the Future of Cloud and Big Data Strategies (SKALI Group)
This document discusses how open source technologies are paving the future of cloud and big data strategies. It defines key open source technologies like hypervisors Xen and KVM that power major cloud platforms. The document explains how open source lowers barriers to cloud adoption through cost savings, customization opportunities, and collaboration. Open source is poised to significantly impact cloud computing by offering free and flexible alternatives to proprietary software license models. Finally, the document briefly touches on how cloud and big data are related through cloud's ability to handle the volume, velocity and variety of big data through massive resource pooling and rapid elasticity.
Big data application using hadoop in cloud [Smart Refrigerator] (Pushkar Bhandari)
This document proposes a smart refrigerator concept that uses cloud computing and big data techniques. Sensors in the refrigerator would generate and store data in the cloud. This data could then be used to detect malfunctions and provide notifications to users. It also allows third-party vendors regulated access to analyze the data for purposes like sending discount offers or analyzing refrigerator use patterns.
The document introduces NuoDB, a new cloud data management system that takes a completely new approach to databases by rewriting the rules for relational databases to work in the cloud. NuoDB uses a distributed, shared-nothing architecture that elastically scales to handle large transaction volumes and global users. It provides the reliability of ACID transactions while scaling out simply on commodity cloud resources to meet the demands of modern web-scale applications. Capacity can be adjusted by adding or removing nodes; NuoDB automatically redistributes data and transactions across available resources to maintain high performance and availability.
Yahoo uses Apache Hadoop extensively to power many of its products and services. Hadoop allows Yahoo to gain insights from massive amounts of data, including user data from services like Flickr and Yahoo Mail. Yahoo has contributed over 70% of the code to the Apache Hadoop project to date. Hadoop is critical to Yahoo's business by enabling personalization, spam filtering, content optimization, and other data-driven features. Yahoo runs Hadoop on tens of thousands of servers storing over 100 petabytes of data. The company continues working to enhance Hadoop's scalability, flexibility, and performance to make it more suitable for enterprise use.
IBM Storage for Analytics, Cognitive and Cloud (Tony Pearson)
Presentation on Software-Defined Storage, Spectrum Scale for Analytics with Hadoop and Hortonworks, and IBM Cloud Object Storage, presented March 15 in San Juan, Puerto Rico
Lovett introducing cloud computing nov 2009 (Hilde Lovett)
Cloud computing involves providing dynamically scalable computing resources as a service over the Internet. Major players like Google, Amazon, and Microsoft have large server parks and are expanding into computing services. Telecom operators are also pursuing opportunities in cloud computing by leveraging their network infrastructure and data centers. The presentation identifies several potential roles for Telenor in cloud services such as a network service provider, cloud customer, cloud reseller, PaaS or IaaS provider, and SaaS enabler.
The document provides an overview and introduction to NoSQL databases. It discusses what triggered the NoSQL movement, common characteristics of NoSQL systems, and business benefits. The agenda covers topics such as what NoSQL is, differences from big data and cloud computing, core concepts, example implementations, and selecting the right NoSQL system for a project.
This document provides an overview of cloud computing, including its basic functioning, characteristics, service models (IaaS, PaaS, SaaS), types of clouds (private, public, hybrid, multi-cloud, community), and advantages and disadvantages. Cloud computing allows on-demand access to shared configurable computing resources via the internet. It provides various capabilities for users to store and process data in third-party data centers. The main service models are infrastructure as a service, platform as a service, and software as a service.
The document discusses emerging trends in virtual library networks and services, including increased mobility, wireless access, and cloud computing. It examines how libraries' traditional roles of selection, organization, access, and preservation are adapting to new technologies. The rise of the internet and new trends like recommender systems, predictive text, and collaborative tools are changing how library networks can empower members and provide value through resource sharing.
Resource sharing network protocol in Library Science (presentation) (Muhammad Kashif)
This document discusses resource sharing between libraries. It outlines two protocols for resource sharing: conventional and advanced. The conventional protocol involves sharing printed materials through interlibrary loan based on union catalogs and lists. The advanced automated protocol utilizes technologies like the World Wide Web, online public access catalogs, electronic formats, email, MARC standards, Z39.50 for database searching, and digital libraries to share resources electronically. Resource sharing networks allow libraries to provide extensive access to information with limited budgets by collaborating and pooling resources.
This document summarizes an analysis of course syllabi from the Chemistry and History departments at a university, undertaken to evaluate the library instruction program. In Chemistry, the analysis found inconsistent understanding of library resources and few library instruction sessions. In History, instruction had been assumed to be successful, but most mentions of the library appeared in first-year courses, calling that assumption into question. The analysis identified gaps such as the absence of required independent library research, critical thinking outcomes, or any mention of the library in many courses. It provides the questions used to examine the syllabi and compares findings between the two departments.
Library consortia allow libraries to share resources and achieve objectives through cooperation. They are a method for sharing electronic resources among libraries with common goals. Consortia provide advantages like access to a large number of resources at lower costs. They also facilitate functions like cataloguing, consulting, collection development, purchasing, digitization and resource sharing. Consortia are easily formed without requiring capital and allow flexible membership.
The document discusses library consortia, which are cooperative arrangements that allow groups like academic institutions to share resources. It provides background on what consortia are, outlines their key features and benefits. These include reducing costs, expanding access to publications, and addressing issues like rising journal prices and shrinking budgets. Various Indian library consortia initiatives are also described, such as UGC-INFONET, INDEST, and CSIR-DST. Different types of consortia models are covered.
Library networking involves cooperation between libraries to share resources and provide maximum access to users. It requires creating tools like union catalogs to make each library's collections accessible. Rational acquisition and fast interlibrary loan are important. Participating libraries must be willing to contribute records, train staff, and adopt standards. Networks aim to expand access and services while reducing costs through collaborative collection development and resource sharing. They allow libraries to offer more than they could individually.
Library networking in India for resource sharing (TiqueRebecca)
The document discusses library networking in India for resource sharing. It outlines reasons why resource sharing is needed, such as the deluge of information and the declining purchasing power of libraries. It describes how networking connects computers to share information and resources; networks were built to meet increasing demands for better services under financial pressure. The ultimate goal of library networks is to interlink information resources across location, format, medium, language, and script. Several library networks established in major Indian cities are described, including DELNET, CALIBNET, MALIBNET, MYLIBNET, BONET, PUNENET, and ADINET. The national network, INFLIBNET, aims to give end users a mechanism for sharing and using information resources through modern information technology.
1) Library consortia allow libraries to share resources and reduce costs through cooperation instead of competing. They have formed in countries like the UK, South Africa, and Nigeria.
2) Key elements of successful consortia include having mutual objectives, joint decision making, and continuous improvement. Critical success factors include a shared vision, cost effectiveness, accessible resources, and staff commitment.
3) Advantages include comprehensive collections, avoiding duplication, reduced costs, enhanced services, and staff development opportunities. Challenges include developing teamwork, trust, openness, and adopting a win-win approach.
This is a PowerPoint presentation about Networking and Resource Sharing in Library and Information Services: a case study of consortium building.
Prepared By: May Joyce M. Dulnuan
Cloud computing provides on-demand access to shared computing resources like servers, storage, databases, networking, software and analytics over the internet. It allows libraries to access applications from anywhere in the world. Cloud computing offers computing, storage and software as a service. It provides libraries benefits like reduced costs, increased storage, automation and accessibility of collections. Libraries can use cloud services to host websites, digital libraries and integrated library systems. Issues around security, standards and regulations still need to be fully resolved for cloud computing in libraries.
Are cloud computing and NDT a good mix? NDT has its own specific requirements. Clouds can genuinely simplify file management, but is any cloud solution suited to NDT? For example, Dropbox may not work right out of the box for this market. The presentation surveys the different cloud avenues (IaaS, PaaS, and SaaS) and highlights NDT's critical requirements (constraints and needs). A list of different levels of cloud services (component, option, security, ...) is defined. It is worth remembering that private and public servers are two possible avenues; NDT was an early user of private servers even before they were called a cloud. The overall aim is to optimize the operational process to reduce OPEX and to increase the availability and accuracy of data.
See: www.amotus-solutions.com or www.nubitus.com
This document discusses cloud computing, including what it is, the services it provides, and its advantages and disadvantages. Cloud computing relies on sharing computing resources over the internet rather than local hardware. The main types of cloud services are Software as a Service (SaaS), Platform as a Service (PaaS), and Infrastructure as a Service (IaaS). Cloud computing offers benefits like lower costs, improved performance and access, but also risks like security and reliability depending on internet connectivity.
Clould Computing and its application in LibrariesAmit Shaw
Cloud computing offers several potential benefits for libraries, including lower costs, increased storage capacity, improved mobility and access, and more flexible workflows. Key aspects of cloud computing include deployment models like private, public and hybrid clouds. Issues include security, data ownership, and lack of control. Recent trends include the use of cloud-based library services and products, as well as research into cloud computing architectures and management. Overall, cloud computing can help libraries modernize services in a cost-effective manner.
Cloud computing platforms like Amazon Web Services, Microsoft Azure, and Rackspace offer libraries services like automated management, digital content hosting, and data storage. OCLC promotes its Webscale platform for library management. Ex-Libris and OSS Labs provide cloud-based library systems. Duraspace offers digital preservation through DuraCloud. Libraries use cloud computing for applications like automation, digital libraries, office software, storage, search tools, and website hosting to save costs compared to maintaining own servers and software.
The document provides an overview of cloud computing including its popularity, definitions, benefits, key technology drivers like virtualization and SOA, top cloud providers like Amazon and Google, different cloud services and types, challenges, and real-world case studies demonstrating benefits like cost savings and faster deployment times.
This document discusses how cloud computing is changing library services. It begins with an overview of how core library services, data, content and user experience are evolving due to new technologies and user needs/expectations. Case studies are presented that showcase how libraries are using cloud computing for low-barrier solutions, innovative projects, and large-scale IT issues. Examples of cloud computing applications, platforms and infrastructure are also described.
This document provides an outline for a talk on cloud computing. It begins with an introduction to cloud concepts and technologies like virtualization and parallel computing models. It then discusses different cloud models including IaaS, PaaS and SaaS. The outline includes demonstrations of cloud capabilities with Amazon AWS and Microsoft Azure, as well as data and computing models using MapReduce. It concludes with a case study of a real business application of the cloud and a question and answer section.
Cloud computing provides on-demand access to computing resources and applications via the internet. There are different types of cloud services and deployment models. Key cloud characteristics include on-demand self-service, broad network access, resource pooling, and rapid elasticity. Amazon Web Services (AWS) is a major public cloud provider that operates across multiple regions and availability zones to provide scalable infrastructure to customers. AWS Elastic Compute Cloud (EC2) allows customers to launch virtual server instances from machine images to run applications.
The document provides an overview of cloud computing including:
- Definitions of distributed computing, cluster computing, utility computing, and cloud computing as trends in computing.
- A brief history of cloud computing including early concepts in the 1960s and milestones like Salesforce.com in 1999 and Amazon Web Services in 2002.
- Descriptions of the types of cloud including public, private, hybrid, and community clouds.
- Explanations of cloud service models including Software as a Service (SaaS), Platform as a Service (PaaS), and Infrastructure as a Service (IaaS).
- Discussions of cloud storage and advantages and disadvantages of cloud computing.
- Real-life examples of
O'Reilly Webcast: Architecting Applications For The CloudO'Reilly Media
This presentation analyzes aspects of the Amazon EC2 IaaS cloud environment that differ from a traditional data center and introduces general best practices for ensuring data privacy, storage persistence, and reliable DBMS backup. Presented by Jorge Noa, CTO of Hyperstratus
This document provides an overview of cloud computing including definitions, types of clouds, cloud models, benefits, challenges, and applications in libraries. It defines cloud computing as enabling on-demand access to configurable shared computing resources via the internet. The main types are public, private, and hybrid clouds. Cloud models include Software as a Service, Platform as a Service, and Infrastructure as a Service. Benefits include reduced costs, increased storage, and flexibility, while challenges include data protection, security, and dependency issues. The document discusses how libraries can use clouds to facilitate research, collaboration, and remote access, while reducing costs and redundancy.
Storage side computing for data management in private and public cloudsNoqaiya Ali
This document summarizes a presentation about storage-side computing for data management in private and public clouds. It discusses cloud computing and various online storage services like Amazon S3, Dropbox, and SkyDrive. It then explains the differences between public and private clouds. The presentation introduces the concept of storage-side computing and evaluates its effectiveness using a digital library system example. It concludes that storage-side computing is an effective way to store and access data from the cloud.
DCPL uses cloud computing services from Amazon EC2 and Google to host several digital initiatives. This includes Dclibrary.org, their digital image archive built on Alfresco, and Google Docs for staff collaboration. They chose the cloud for its speed, lower costs compared to alternatives, and ability to maximize performance and technology flexibility. Benefits include enhanced staff collaboration, quick expansion of sites and services, and more time spent on content rather than infrastructure support. Future plans include migrating more systems to the cloud like the OPAC and providing cloud-based services directly to customers.
The origin of the term cloud computing is unclear but it refers to computing resources that are dynamically provisioned over the internet. Early concepts of cloud computing involved time-sharing mainframe computers in the 1950s and virtual machines in the 1970s. Telecommunications companies started offering virtual private networks in the 1990s. Grid computing, utility computing, SaaS, and cloud computing evolved the concept further, providing on-demand access to computing resources and applications delivered as a service.
Microsoft Azure is a cloud computing service that provides infrastructure, platform and software services through global data centers. It supports virtual machines, web apps, storage, databases, analytics and more. Azure uses a specialized operating system called Microsoft Azure to manage computing resources across its global fabric layer.
Facebook's data center fabric provides scalable networking infrastructure to support increasing traffic and new products. It uses ECMP routing and multi-speed links for load balancing. The fabric is designed as a non-oversubscribed environment and uses automation tools to manage topology changes.
Google's first data centers used donated hardware from Sun, Intel and IBM. It has numerous centers worldwide with large facilities in the US, Europe and Asia. Google developed software for
This presentation covers the work done by Erik Mitchell, Kevin Gilbertson, Jean Paul Bessou, Barry Davis and Tim Mitchell in moving the ZSR library to Amazon servers
Cloud computing is the on-demand delivery of IT resources and applications via the Internet with pay-as-you-go pricing. It evolved from earlier technologies like grid computing and utility computing by providing greater ease of use and on-demand scaling. A cloud broker acts as an intermediary between cloud service providers and customers, providing a unified interface and moving workloads between public and private clouds for improved performance and redundancy.
5 Comparing Microsoft Big Data Technologies for Analytics
Science & Technology (S&T) Cloud
1. A Science & Technology (S&T) Cloud Library: New Trends
By J.A. Amaraweera
Member – NSF National Committee on LIS
(SLSTINET Seminar – 06-03-2013)
2. Libraries – Then, Now & Future
Libraries within walls
i.e. Traditional Paper-based Libraries (Book Libraries)
Libraries without walls & Paperless Libraries
i.e. Hybrid Libraries
Electronic Libraries
Digital Libraries
Invisible or Virtual Libraries
In the near future – Cloud Libraries
3. Cloud computing: “a style of computing in which massively scalable and elastic IT-enabled capabilities are delivered as a service to external customers using Internet technologies.” – Gartner
4. “‘Cloud’ is a metaphor for the internet. ‘Cloud computing’ is a phrase that is being used today to describe the act of storing, accessing, and sharing data, applications, and computing power in cyberspace” (Anderson & Rainie, 2010).
5. If we are using any of the popular Web 2.0 services for our day-to-day information and communication activities (e.g. Gmail, Wikipedia, Flickr, YouTube or Twitter), we already have some experience with cloud computing, since most of these applications are hosted in the large online data centers that are the hallmark of cloud computing.
7. Introduction to Cloud Computing
Public Cloud vs. Private Cloud vs. Hybrid Cloud
Public cloud
Applications delivered over the Internet in the software-as-a-service model
Computing resources such as storage or compute cycles delivered in the infrastructure-as-a-service model
Application development platforms provided in the platform-as-a-service model
Private cloud, also known as a corporate cloud
Uses cloud-like infrastructure and technology, such as virtualized servers in a scalable architecture, to run applications behind the corporate firewall
Hybrid cloud
A hybrid model takes advantage of both public and private structures. An organization may choose, for example, to run its e-mail system in the public cloud while keeping highly sensitive, customer-oriented applications behind the firewall.
8. Like water and electricity, a computing cloud is a communally shared resource that you lease on a metered basis, paying for as little or as much as you need, when you need it.
Likely effects of cloud computing on libraries:
Cost savings
Flexibility and innovation
Broad, general IT skills vs. deep, specialized skills
Cloud OPAC and Cloud ILS (e.g. WorldCat and FirstSearch)
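The metered, pay-for-what-you-use model described above can be sketched in a few lines. This is only an illustration of the billing idea; the unit prices and usage figures below are hypothetical placeholders, not any vendor's actual rates.

```python
# Sketch of utility-style metered billing: cost scales with actual usage,
# like a water or electricity meter. All rates are hypothetical examples.

def monthly_bill(compute_hours, storage_gb, transfer_gb,
                 hour_rate=0.05, storage_rate=0.023, transfer_rate=0.09):
    """Pay-as-you-go total: each resource is billed per unit consumed."""
    return (compute_hours * hour_rate
            + storage_gb * storage_rate
            + transfer_gb * transfer_rate)

# A small library OPAC: one modest server running all month (720 hours),
# 200 GB of catalogue data, 50 GB of monthly traffic.
print(round(monthly_bill(720, 200, 50), 2))  # 45.1
```

Nothing is paid for idle capacity: halving the usage figures halves the bill, which is the contrast with owning servers outright.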
11. State of Cloud Computing in Libraries
Cloud computing can be divided into four (4) categories:
Software-as-a-Service (SaaS),
Platform-as-a-Service (PaaS),
Infrastructure-as-a-Service (IaaS), and
Desktop-as-a-Service (DaaS)
12. Desktop as a Service (DaaS)
is a cloud computing solution in which virtual desktop infrastructure is outsourced to a third-party provider. DaaS functionality relies on the virtual desktop, a user-controlled session or dedicated machine delivered as an on-demand cloud service to users and organizations around the world. This is an efficient model in which the service provider manages all the back-end responsibilities that would normally be handled in-house.
Desktop as a Service is also known as virtual desktop or hosted desktop services.
[Techopedia.com]
17. 3M Cloud Library
3M launched its cloud library in June 2011.
It currently has a stockpile of 100,000 ebook titles (other formats will be forthcoming) from 40 publishers.
Small public libraries in a consortial group can be easily accommodated by the 3M service.
It also has pricing terms for small libraries that wish to remain independent.
3M allows libraries to transfer content to another platform once a contract has expired, if they wish to do so. It also features cloud delivery of content.
The company is engaged in discussions with Amazon and hopes to offer downloads to Kindle devices in the future.
19. Amazon CloudFront
Amazon CloudFront is a web service for content delivery.
It integrates with other Amazon Web Services to give developers and businesses an easy way to distribute content to end users with low latency, high data transfer speeds, and no commitments.
Amazon CloudFront can be used to deliver your entire website, including dynamic, static and streaming content, using a global network of edge locations.
Requests for your content are automatically routed to the nearest edge location, so content is delivered with the best possible performance.
Amazon CloudFront is optimized to work with other Amazon Web Services, like Amazon Simple Storage Service (Amazon S3), Amazon Elastic Compute Cloud (Amazon EC2), Amazon Elastic Load Balancing, and Amazon Route 53.
20. Contd…/
Amazon CloudFront uses a global network of edge locations, located near your end users in the United States, Europe, Asia, and South America.
Amazon CloudFront has a simple web services interface that lets you get started in minutes.
In Amazon CloudFront, your content is organized into distributions. A distribution specifies the location or locations of the original version of your files. A distribution has a unique CloudFront.net domain name (e.g. abc123.cloudfront.net) that you can use to reference your objects through the global network of edge locations.
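The distribution naming scheme above can be shown with a minimal sketch. The domain abc123.cloudfront.net is the slide's own example; the object path below is a hypothetical placeholder, and real deployments would first create the distribution in the AWS console or API.

```python
# Sketch: referencing an object through a CloudFront distribution's
# domain name. The request then resolves to the nearest edge location.

from urllib.parse import quote

def cloudfront_url(distribution_domain: str, object_path: str) -> str:
    """Build the edge-delivered URL for one object in a distribution."""
    # quote() percent-encodes unsafe characters but leaves "/" intact.
    return f"https://{distribution_domain}/{quote(object_path.lstrip('/'))}"

# Hypothetical digitized page stored behind the example distribution:
url = cloudfront_url("abc123.cloudfront.net", "scans/page-001.jpg")
print(url)  # https://abc123.cloudfront.net/scans/page-001.jpg
```

The library's application only ever emits the distribution's domain name; which edge location actually serves the file is decided by CloudFront at request time.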
25. cloud computing for libraries
Requirements for Libraries
• separation between front end and back end
• separation of services
– account management
– financial (licensing, fees, fines)
– bibliographic services (OPAC, indexes, SDI)
• standardise (MARC21, RDA, DDC or UDC…)
• know your functional requirements (ILL, DDS)
• collaborate closely with IT
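The "separation of services" requirement above can be sketched as small, independent back-end components behind a thin front end, so each one can be moved to a cloud host on its own. All class and method names here are hypothetical illustrations, not any vendor's API.

```python
# Sketch: each back-end concern (accounts, finance, bibliographic data)
# lives behind its own interface; the front end wires them together but
# owns none of the data, so any service can be replaced or cloud-hosted
# independently. Names are illustrative only.

class AccountService:
    def __init__(self):
        self.patrons = {}
    def register(self, patron_id, name):
        self.patrons[patron_id] = {"name": name, "fines": 0.0}

class FinancialService:          # licensing, fees, fines
    def __init__(self, accounts):
        self.accounts = accounts
    def add_fine(self, patron_id, amount):
        self.accounts.patrons[patron_id]["fines"] += amount

class BibliographicService:      # OPAC, indexes, SDI
    def __init__(self):
        self.catalogue = {}      # e.g. MARC21 records keyed by control no.
    def add_record(self, control_no, title):
        self.catalogue[control_no] = {"245": title}   # MARC 245 = title

# The "front end" below is just glue code over the three services.
accounts = AccountService()
finance = FinancialService(accounts)
opac = BibliographicService()
accounts.register("P001", "A. Reader")
finance.add_fine("P001", 1.50)
opac.add_record("000001", "Cloud Computing for Libraries")
print(accounts.patrons["P001"]["fines"])  # 1.5
```

Keeping the interfaces this narrow is what makes it practical to, say, move only the OPAC into a vendor's cloud while account and financial data stay behind the firewall.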
26. cloud computing for libraries
[Diagram: global content (Google Scholar/Books, journal databases, WorldCat local–global, Wikipedia, Amazon, Bol) and local content (KB/depot/special collections) connected through harvesting, licensing and digitization, including digitizing on demand from print]
27. Cloud computing examples
IaaS:
• Amazon Elastic Compute Cloud (Amazon EC2)
• Amazon Simple Storage Service (S3)
PaaS:
• Heroku (Ruby on Rails, PostgreSQL)
• Google App Engine
DaaS*:
• Serials Solutions Summon
• Ex Libris Primo Central
SaaS:
• Microsoft Office 365
• Google Docs
28. Cloud services for libraries
• Document sharing in libraries
– Dropbox, Google Docs, Evernote, SugarSync
• Web conferencing software
– Skype, Adobe Connect
• Web publishing
– WordPress, Google Sites
• Marketing, branding, communication
– Facebook, Twitter, YouTube, mobile social apps
29. Library systems in the cloud
• Integrated library systems
– Koha in the Cloud
– OCLC WMS
• Repository software
– Archives hosted in the cloud
– Institutional repositories (bepress DigitalCommons)
• Discovery systems
– Ebsco Discovery Service
– Ex Libris Primo Central
30. In Libraries
Remote access to data for librarians
– Saves on electricity, servers, workstations, and maintenance
– Means dumb terminals that aren't replaced frequently can be preferred over high-performance individual workstations
31. Questions about cloud computing for libraries
Do libraries need the same type of scalability as a retailer that specializes in Indian costumes?
Do vendors from outside the library world understand library needs?
Can we effectively negotiate a contract if we don't have a sufficient budget?
Are we OK with many (most?) contracts being take-it-or-leave-it?
Are there bottom-line cost savings?
Should we evaluate cloud services in the same way as other services?
32. Data ownership questions
• What rights do the library and vendor have to data in the cloud?
• How do you access data?
• How, and in what format, do you get your data back when a contract ends?
• What happens to the data should the vendor go out of business?
33. Advantages of cloud computing for libraries
• Greater efficiency
• Increased flexibility
• Scalability
• A way to deal with lack of technical expertise
• A way to do something a single library simply could not do alone (e.g. CARI)
– Aggregation of library-land data
• Lower computing costs:
– Often free or low-cost solutions are available (e.g. Google Apps for EDU)
• Uptime vs. downtime (cloud may or may not be better than local IT)
34. [Diagram: the proposed NSF S&T Cloud on the Internet, linking SLSTINET member networks – Technonet, AGRINET, ENVINET and NatLib – and their S&T collections]