This document discusses setting up a private cloud using a Raspberry Pi. It begins by defining cloud computing according to NIST standards, distinguishing between software, platform and infrastructure as a service models. It then discusses setting up a Raspberry Pi with the operating system Raspbian, and installing OwnCloud for file hosting and Nginx as the web server. Step-by-step instructions are provided for configuring the Raspberry Pi, OwnCloud, dynamic DNS service No-IP, and AFP file sharing to enable external access to the private cloud. The setup is then compared to commercial cloud services and the NIST cloud computing characteristics.
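The installation steps such a guide walks through can be sketched in a few shell commands. This is a hedged outline under stated assumptions, not the document's exact procedure: the package names (nginx, php-fpm, php-sqlite3), the ownCloud download URL, and the /var/www paths are illustrative placeholders that vary by Raspbian release and ownCloud version.

```shell
# Sketch: ownCloud behind Nginx on Raspbian (package names and paths are assumptions).
sudo apt-get update
sudo apt-get install -y nginx php-fpm php-sqlite3   # web server + PHP runtime

# Fetch and unpack ownCloud into the web root (URL/filename are placeholders).
wget https://download.owncloud.com/community/owncloud-latest.tar.bz2
sudo tar -xjf owncloud-latest.tar.bz2 -C /var/www/
sudo chown -R www-data:www-data /var/www/owncloud   # let the web server own the tree

# After pointing an Nginx server block at /var/www/owncloud, restart and
# finish the setup in ownCloud's browser-based wizard.
sudo systemctl restart nginx
```

External access then depends on the No-IP dynamic DNS and router port-forwarding steps the document describes, which are specific to each home network.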
Linux is an open-source operating system modeled after UNIX. It is widely used on servers and on systems such as mainframes and supercomputers. Linux code can be freely used, modified, and distributed under the GNU General Public License. Linux is also expanding into new niches such as laptop firmware, where it can bring up a streamlined environment quickly. Popular Linux distributions offer application management with a few clicks. Code contributions must be compatible with GPLv2. Developers worldwide can improve and influence Linux. Revenue comes from support services, the convenience of pre-built distributions, and expertise in integrating free Linux code.
This document provides an overview of managing user accounts in a Microsoft Windows Server 2003 environment. It discusses the purpose of user accounts and the authentication process. It also describes how to create and manage local, roaming, and mandatory user profiles. Various methods for creating and modifying user accounts using tools like Active Directory Users and Computers and command line utilities are presented.
The document discusses the history and development of Linux. It begins with Richard Stallman starting the GNU project in 1983 with the goal of creating a free and open-source Unix-like operating system. Linus Torvalds then began developing the Linux kernel in 1991. Linux grew out of earlier Unix-like systems such as BSD and MINIX, and adopted its mascot Tux in 1996. Today, Linux powers over 90% of the world's fastest supercomputers and has a large community of open-source software and distributions supported by organizations like the Free Software Foundation.
This document provides an overview of cloud computing. It begins with learning objectives and defines cloud computing according to NIST as a model for enabling network access to a shared pool of configurable computing resources that can be rapidly provisioned with minimal management effort. It describes the five essential cloud characteristics, three service models (SaaS, PaaS, IaaS), and four deployment models (private, public, hybrid, community). Examples are given for each along with issues and benefits of cloud computing. The document provides a comprehensive introduction to cloud computing concepts.
Video training: Passing the Linux LPIC-1 certification (1) — SmartnSkilled
Prepare for the Linux LPIC-1 certification with expert Anis Hachani.
This course enables you to:
- Build solid foundations with Linux.
- Become operational with any Linux distribution.
- Prepare for the LPIC-1 certification.
- Understand and prepare the topics required for the LPIC-1 101 & 102 exams as defined by LPI.
Follow the video course here:
https://www.smartnskilled.com/tutoriel/formation-en-ligne-reussir-la-certification-linux-lpic-11
Linux was created in 1991 by Linus Torvalds as a hobby project and free operating system. It gained popularity through distributions like Slackware and Red Hat Linux. Key developments included the Linux kernel version 1.0 release in 1994, establishment of major desktop environments like KDE and GNOME in the 1990s, and Ubuntu's first release in 2004 which helped popularize Linux for desktop users.
Debian is an open source operating system and community started in 1993. It has over 1000 volunteer developers creating over 23,000 binary packages. Debian uses a volunteer-driven structure and consensus-based decision making process. It emphasizes free software and stability through processes like package testing and policy compliance. Many other distributions are based on Debian, benefiting from its large package collection and stable base.
Debian is a Linux operating system initiated in 1993 by Ian Murdock. There are three main versions: Old Stable, Stable, and Experimental. A new Stable version is released every two years with three years of official support. Debian has influenced over 136 Linux distributions. It includes desktop environments like GNOME, KDE, XFCE, LXDE, MATE, and Cinnamon. Debian can be used for purposes like notebooks, desktops, workstations, servers, and more. It offers both text-based and graphical installers.
What is Linux?
Command-line Interface, Shell & BASH
Popular commands
File Permissions and Owners
Installing programs
Piping and Scripting
Variables
Common applications in bioinformatics
Conclusion
NCM is a network configuration management tool that allows users to discover devices, backup configurations, detect configuration changes, compare configurations, manage compliance policies, and generate detailed reports. It offers benefits such as being user friendly, easy to install and use, affordable pricing, and 24/7 technical support. NCM provides a complete solution for network configuration management.
This document discusses user and file permissions in Linux. It covers how every file is owned by a user and group, and how file access is defined using file mode bits. These bits determine read, write and execute permissions for the file owner, group and others. An example of a file with permissions -rw-rw-r-- is provided to demonstrate this. User accounts are configured in /etc/passwd, while passwords are securely stored in /etc/shadow. Common commands for managing users, groups, permissions and default file access (umask) are also outlined.
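The mode-bit example above maps directly onto everyday commands. The following sketch, assuming GNU coreutils (the `stat -c` format option is not available on BSD/macOS), shows how umask, chmod, and stat interact:

```shell
# Demonstrate Linux file mode bits in a throwaway directory.
tmpdir=$(mktemp -d)
cd "$tmpdir"

# With the default umask 022, a new file gets rw-r--r-- (644).
umask 022
touch notes.txt
stat -c '%A' notes.txt          # -rw-r--r--

# Grant the group write access, matching the -rw-rw-r-- example above.
chmod g+w notes.txt
stat -c '%A' notes.txt          # -rw-rw-r--

# The numeric (octal) form does the same thing: 664 = rw-rw-r--.
chmod 664 notes.txt
stat -c '%a' notes.txt          # 664

cd - >/dev/null && rm -r "$tmpdir"
```

The umask is subtracted from the default creation mode (666 for files), which is why 022 yields 644; a umask of 002 would have produced the 664 example directly.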
The document traces the origins and evolution of UNIX and Linux operating systems, culminating in the development of the Kali Linux operating system. It discusses how UNIX was developed at Bell Labs in the 1960s, and how Linux was later created by Linus Torvalds in the 1990s. It then focuses on the development of Kali Linux, which originated from the BackTrack Linux security and penetration testing distribution, and has become the premier operating system for penetration testing and security auditing.
This document provides an overview of the history and architecture of the Linux operating system. It discusses how Linux originated as a free alternative to proprietary operating systems like DOS, Mac OS, and UNIX. Key individuals in the development of Linux include Linus Torvalds and Richard Stallman. The document also outlines the core components of Linux like the kernel, shell, and system utilities, as well as common uses of Linux as a desktop, server, and firewall platform.
The document provides information about server training, including the basics of servers and server hardware components. It defines what a server is, describes different types of servers based on size and use (e.g. rack mount, tower, blade). It also outlines the client-server model. The hardware components of a server are explained, such as the motherboard, hard drives, fans, power supplies, memory, RAID controllers. Different types of RAID configurations are defined. Finally, it discusses server processor diagnostics using tools like light path diagnostics and baseboard management controllers.
Arch Linux is defined by simplicity, modernity, pragmatism, and user centrality. It provides only a command line interface upon installation, allowing users to build a custom system by choosing from over 58,000 packages. As a rolling release distribution, it maintains the latest stable versions of software. Arch Linux requires proficient users who are willing to read documentation and solve their own problems. It is inspired by the CRUX distribution and focuses on simplicity rather than ideology or popularity.
Introduction to the world of Amazon Web Services (AWS) — ShiwaForce
ShiwaForce's Hungarian-language training material introduces the basic concepts of the cloud and briefly presents the most important core services, then walks you through registering for the AWS Web Console, where an S3 bucket is created and command-line access (AWS CLI) is tried out as well.
The complete, free, Hungarian-language training material is available on Udemy, where hundreds of people have already completed the introductory AWS course put together by Mádi Gábor and Kora András: https://www.udemy.com/bevezetes-az-amazon-web-services-vilagaba-shiwaforce-aws-magyarul/
This lecture covers an introduction to cloud computing. It discusses key topics like cloud types, architecture, services, platforms, security, and applications. Specifically, it defines cloud computing, compares delivery models like Infrastructure as a Service (IaaS), Platform as a Service (PaaS), and Software as a Service (SaaS). It also discusses using major cloud platforms from Amazon Web Services, Google, and Microsoft and exploring concepts like virtualization, capacity planning, and establishing identity/security in the cloud. The lecture concludes by discussing mobile cloud integration and streaming media/video applications in cloud computing.
This document discusses cloud computing. It begins with an introduction defining cloud computing as internet-based computing where shared resources are provided on demand. It then covers the history, components, architecture, types (public, private, hybrid), advantages of flexibility, low cost and easy management, and disadvantages of dependency and security issues of cloud computing.
This document provides an introduction to Ubuntu, an open-source Linux operating system. It discusses what Ubuntu is, why users would want to use it, its default applications, and recent Ubuntu releases. It then provides overviews of the Ubuntu desktop, panels, menus, icons, virtual desktops, and the Nautilus file browser. It discusses how files are handled in Ubuntu and basic day-to-day file management tasks. The document concludes with exercises for the reader to complete.
Linux is an open-source operating system that originated as a personal project by Linus Torvalds in 1991. It can run on a variety of devices from servers and desktop computers to smartphones. Some key advantages of Linux include low cost, high performance, strong security, and versatility in being able to run on many system types. Popular Linux distributions include Red Hat Enterprise Linux, Debian, Ubuntu, and Mint. The document provides an overview of the history and development of Linux as well as common myths and facts about the operating system.
Kali Linux is an operating system based on Debian Linux designed for digital forensics and penetration testing. It contains over 600 security and forensics tools, runs on both 32-bit and 64-bit architectures, and is free and open source. Kali Linux is commonly used by ethical hackers, penetration testers, and digital forensics investigators. It contains more security tools than other Linux distributions and is optimized for tasks such as vulnerability assessment, security auditing, and penetration testing.
A presentation on the Debian operating system. The author describes Debian, its facilities and features point by point, and how Debian came to be. It is also useful for CSE students preparing a presentation for their operating systems course.
Linux is the best-known and most-used open source operating system. As an operating system, Linux is software that sits underneath all of the other software on a computer, receiving requests from those programs and relaying these requests to the computer's hardware.
Active Directory domain trusts on Windows Server 2008 — medfaye
The objective of an Active Directory domain trust is to let users of a previously created domain easily access the trusted domain, near or far, and to use whatever has been assigned to them in any domain, without noticing the change of domain, wherever they are on the network.
User objects can represent employees, customers, or students. Groups are collections of users that permissions or rights can be applied to collectively rather than individually. There are two types of user accounts: local accounts stored on individual computers and domain accounts stored centrally in Active Directory. Domain accounts are replicated across domain controllers for shared management.
This short document lists photo credits to five photographers (emmettanderson, G.Garcia, hyeenus, Rzom_, and deep_schismic) and encourages the reader to create their own Haiku Deck presentation on SlideShare.
Deon Puzey is an experienced licensed aircraft mechanic with over 13 years of experience in aviation maintenance. He has extensive experience in aircraft structural repair, composite repair, component overhaul, and aircraft painting. Puzey values honesty, punctuality, dependability, and efficiency in his work. He served six years of active duty in the Navy as an aviation structural mechanic, completing three wartime deployments. Puzey aims to gain management experience and leadership skills to advance his career in commercial aviation.
Participatory museum experiences and performative practices in museum education — Marco Peri
Generally, in a museum hall, "sight" is the predominant sense: how can I offer visitors a more dynamic sensory experience?
How can I turn visitors into active protagonists of the experience?
The key I chose to bring people closer to art is the body.
Putting the body at the centre is the key to engaging feelings.
This form of 'embodiment' is the premise for designing participatory museum experiences, educational projects, and workshops.
More information: www.marcoperi.it/arteducation
[These topics were presented during the conference: Le musée par la scène. Le spectacle vivant au musée : pratiques, publics, médiations. Paris, 18-20 November 2015]
01 mid-term assignment - Christmas Land project khawar-v3 — Reza Khawar
The Christmas Land project is a major opportunity for Salford City Council to promote the city and attract the attention of stakeholders, not only for the project itself but also for the future of the city. Christmas is an event when people across the country want to spend their holidays at the best recreational facilities with their families, which makes such projects attractive to businesses and investors. Notwithstanding its benefits and future impact on the city, the project entails its own complexities and risks that need careful study during planning. The high number of stakeholders, each with different objectives, is a challenge: the project objective should be narrowed down to mitigate the risk of losing focus, while still encompassing stakeholder expectations and objectives. The project's success or failure therefore depends heavily on the identification and analysis of stakeholders and on the selection of project objectives. Initial planning requires care and concentration: the scope should be carefully defined, covering the project's boundaries, its activities, the timeframe, and any expected uncertainties.
When planning the project, a detailed work breakdown structure should be defined so that activities sharing the same resources do not overlap. Dependencies between activities should be considered, and resource allocation and time boundaries for each activity should be defined. A professional quality-control mechanism should be established so that each activity's performance is evaluated at several stages, mitigating the risk of low-quality service delivery. The project's risks should be clearly defined, and approaches to mitigate them considered beforehand.
After implementation, project closure is a very important phase in which lessons learned should be documented for the success of future projects. The city council can further develop the project into a permanent recreational facility for city residents, where festivals and events can be held. Reviewing the project afterwards allows the performance of activities to be assessed and the best practices that brought the project to success to be listed.
Damien Daly is a tech lead on the analytics team at Newsweaver, which provides a service for organizations to inform and engage employees. The document discusses Newsweaver's service-oriented architecture with bounded contexts and microservices. It also outlines their development approach, including using Docker containers, Kubernetes for deployments, and running automated tests during their bugfix process before deploying updates.
The document discusses the benefits of exercise for mental health. It states that regular physical activity can help reduce anxiety and depression and improve mood and cognitive functioning. Exercise has also been shown to enhance self-esteem and quality of life.
Microsoft Project is a project-management tool that lets you define tasks, resources, costs, schedules, and critical paths. It provides information on the project's scope, including stages, tasks, and milestones, as well as deadlines and due dates. It is useful when the details of a project need to be managed and communicated effectively.
Family abuse comes in many forms, including physical, emotional, financial, and sexual abuse. Abuse can happen to people of any age, from children to seniors, and most often occurs at the hands of someone close like a parent, spouse, or other family member. Abuse has significant costs on individuals and society, and changing social attitudes is important for addressing the root causes and reducing its occurrence. Support for victims, education, and social programs can help prevent family abuse.
An electromagnetic pulse (EMP) is a burst of electromagnetic radiation caused by large explosions, especially nuclear explosions, that can damage electrical systems by inducing current and voltage surges. EMPs are categorized by the altitude of the explosion: subsurface, surface, air, and high-altitude EMP, which occurs above 30 km.
This document presents a seven-question quiz on the safe use of hand and power tools. The questions cover topics such as the types of tools most often involved in accidents, the use of gloves with power tools, the relative danger of sharp versus dull knives, and common ways of being electrocuted by power tools. The quiz appears designed to assess knowledge of safety when using different types of tools.
https://www.intex-pool-shop.de
We offer a wide selection of pools, Intex pool liners, Intex spas, Intex boats, air beds, and accessories. Intex is one of the market leaders for swimming pools in Germany. With Intex pools, summer can come.
Khizar Hayat has over 10 years of experience in construction, fabrication, erection, inspection, testing, pre-commissioning and commissioning of oil, gas, chemical and power plants. He has worked as a mechanical engineer on various projects in Pakistan and the UAE, where his responsibilities included fabrication, erection, inspection, supervision of testing, preparation of as-built drawings, and maintenance of HVAC, plumbing and other mechanical systems. Currently he works as an HVAC mechanical engineer in Sheikh Khalifa Medical Hospital in the UAE, maintaining and performing repairs on refrigeration equipment and HVAC systems.
Cloud Computing Without The Hype An Executive Guide (1.00 Slideshare)Lustratus REPAMA
Author: Steve Craggs - Lustratus Research Limited.
Defining Cloud Computing and identifying the current players
This document offers a high-level summary of Cloud Computing, targeted at Executives who find themselves bombarded with Cloud Computing and need to cut through the hype to get a clear understanding of what cloud is all about.
Cloud is defined in simple terms and the main categories of cloud are identified. A high level segmentation of the cloud marketplace is also offered, and includes a reasonably comprehensive index of suppliers in the Cloud Computing marketplace and the Cloud segments in which they operate.
This document is a thesis on lock-in issues with Platform as a Service (PaaS) cloud computing. Chapter 1 provides background on cloud computing, defining it as on-demand access to shared computing resources over a network. It describes the benefits of cloud computing like reduced costs and ability to quickly scale resources. It outlines the three service models of Infrastructure as a Service (IaaS), Platform as a Service (PaaS), and Software as a Service (SaaS). Chapter 2 will discuss the problem of vendor lock-in that exists in cloud computing, particularly in PaaS, and the issues it causes for users.
This document is an introduction to cloud computing architecture that discusses:
- Cloud computing builds on established trends like virtual machines, on-demand services delivered over the network, and open source software.
- There are three cloud infrastructure models: public clouds, private clouds, and hybrid clouds. Architectural layers include SaaS, PaaS, and IaaS.
- Cloud computing benefits include reducing costs and response times while increasing agility and the pace of innovation.
This document is an introduction to cloud computing architecture that discusses:
- Cloud computing builds on established trends like virtual machines, on-demand services delivered over the network, and open source software.
- There are three cloud infrastructure models: public clouds, private clouds, and hybrid clouds. Architectural layers include SaaS, PaaS, and IaaS.
- Cloud computing benefits include reducing costs and response times while increasing agility and the pace of innovation.
Cloud computing builds upon established trends like virtualization, on-demand resources, self-service models, and internet-delivered services. It transforms how applications are designed, developed, deployed and managed. Key aspects include using virtual machines as standard deployment objects, consuming computing resources as a pay-per-use service, and programming the infrastructure through APIs to dynamically configure and scale applications. This allows applications to be rapidly and automatically deployed across many servers in public or private clouds.
The document discusses some security considerations related to cloud computing. It notes that while cloud computing offers benefits, it also poses some unique security challenges including availability issues from outages or data loss. Other security risks addressed include the trustworthiness and transparency of cloud providers, the ability to monitor systems in the cloud, and challenges around incident response. The document emphasizes the importance of assessing these risks for any application or data hosted in the cloud.
The document discusses several concepts related to cloud computing including:
- The evolving definition of cloud computing with 3 service models and 4 deployment models and 5 essential characteristics
- Diffusion of innovation theory and how new technologies are adopted over time through innovators, early adopters, early majority, late majority, and laggards
- The concept of ubiquity and how innovations progress from niche to mainstream as they become more established, standardized, and commoditized over time
The document discusses the evolving concept of cloud computing, comparing it to historical notions of computer utilities. It describes cloud computing as having 3 service models (SaaS, PaaS, IaaS) and 4 deployment models (private, public, hybrid, community clouds) with 5 essential characteristics including on-demand access and elastic scaling. However, the exact definition of cloud computing remains unclear as the technology continues to change.
This article discusses how cloud computing is impacting different types of IT jobs in both positive and negative ways. It finds that cloud computing will change how most IT jobs are performed, with some jobs being eliminated and others growing in importance. Specifically, jobs focused on operations and infrastructure will decline as those tasks are increasingly automated by the cloud. Meanwhile, jobs requiring skills in areas like security, integration, and strategy/governance will become more important for managing applications and services in hybrid cloud environments. The cloud is transforming IT work from hands-on technical roles to more project-based roles focused on collaboration and management. Overall, while some jobs will be lost, cloud computing also creates new opportunities for IT professionals to adapt their skills.
Cloud computing allows companies to access scalable IT resources located in data centers around the world in a flexible and cost-effective manner, helping improve the creation and delivery of IT solutions. Security issues are a main concern in cloud computing, so companies must ensure their data and systems comply with applicable security, privacy and compliance requirements when using cloud services. Different types of cloud computing include private, public and hybrid clouds that offer varying levels of access and control over computing resources and data.
The document provides an introduction to cloud computing, defining it as "pay-as-you-go computing on the Internet" where users can rent infrastructure, platforms, and software instead of purchasing them. It discusses how cloud computing encompasses hardware, operating systems, and software delivered as services over the Internet. The cloud allows users to access computing resources without large upfront costs and manage resources remotely. While the terminology can be confusing, cloud computing represents a significant shift in how organizations access technology.
Cloud computing builds on established trends like virtualization, on-demand deployment models, and delivering services over the network. It transforms application development by allowing applications to be rapidly deployed on virtual machines in a self-service manner and scaled up automatically as needed. Developers now take on more of an architect role in programming how their applications dynamically compose and scale across infrastructure resources. This approach can significantly reduce costs and speed up innovation cycles compared to traditional enterprise computing.
This document provides an overview of cloud computing, including its benefits, risks, and considerations for implementation. It discusses how cloud computing can reduce costs while accelerating innovation by providing on-demand access to resources and applications. However, it also notes security and compliance risks that must be addressed. The document provides guidance on evaluating cloud computing options and applications, asking the right questions of providers, and testing solutions before full deployment. The goal is to help organizations strategically decide when and how to adopt cloud computing services.
Cloud computing provides on-demand access to computing resources and IT services on a pay-per-use basis. The document discusses the basics of cloud computing including its architecture, types of cloud services, and its advantages over traditional computing. It also explores potential applications of cloud computing in e-governance and rural development in India by providing cost-effective and scalable IT services.
The document discusses the evolution of cloud computing from its early conceptualization to its current form. It explores how cloud computing has progressed from an undefined concept to widespread ubiquity due to increasing demand and continuous improvements by suppliers. Key factors that have driven this transition include the commoditization of infrastructure, the delivery of software and platforms as standardized services, and a shift towards viewing these resources as utilities rather than custom-built products.
The document is a thesis proposal for a cloud-based microservice architecture for the Skolrutiner system. It begins with an introduction that outlines the motivation and contributions of the thesis. It then provides background on cloud computing and microservices. The current system design is discussed along with its drawbacks. A new proposed cloud-based microservice architecture is then presented and evaluated, with the goal of addressing issues with the current centralized system. The thesis concludes by summarizing the contributions and discussing potential future work.
The document discusses cloud computing, including its history, types, models, advantages, and disadvantages. It also discusses cloud intrusion detection and proposes a model. Specifically, it provides information on public, private, and hybrid cloud models and infrastructure as a service (IaaS), platform as a service (PaaS), and software as a service (SaaS) cloud service models. The document aims to introduce cloud computing concepts and cloud intrusion detection.
Security & Compliance in the Cloud - Hadoop Magazine Column Version 1.2 by Ja...Jarrett Neil Ridlinghafer
This document discusses risks associated with using public cloud services and provides recommendations for mitigating those risks. It addresses issues around data transmission and storage, encryption, key management, email security, application access controls, identity management, employee termination procedures, disaster recovery, and compliance with cloud service provider agreements. The goal is to help companies enter the cloud in a secure and compliant manner based on proven security practices and principles.
Cloud Computing Adoption and the Impact of Information SecurityBelinda Edwards
This document provides an analysis of cloud computing adoption and the impact of information security. It discusses the competitive industry structure of cloud computing and analyzes internal strengths and weaknesses as well as external opportunities and threats. Specifically, it notes that while cloud computing provides economic benefits and centralized infrastructure, weaknesses include a lack of uniform security measurements and regulations. Opportunities include collaborating on cloud computing standards to address security concerns, while threats include economic crises and issues with centralization. The document recommends developing specific security policies and governance approaches to consistently apply information security standards within cloud computing.
UiPath Test Automation using UiPath Test Suite series, part 6DianaGray10
Welcome to UiPath Test Automation using UiPath Test Suite series part 6. In this session, we will cover Test Automation with generative AI and Open AI.
UiPath Test Automation with generative AI and Open AI webinar offers an in-depth exploration of leveraging cutting-edge technologies for test automation within the UiPath platform. Attendees will delve into the integration of generative AI, a test automation solution, with Open AI advanced natural language processing capabilities.
Throughout the session, participants will discover how this synergy empowers testers to automate repetitive tasks, enhance testing accuracy, and expedite the software testing life cycle. Topics covered include the seamless integration process, practical use cases, and the benefits of harnessing AI-driven automation for UiPath testing initiatives. By attending this webinar, testers, and automation professionals can gain valuable insights into harnessing the power of AI to optimize their test automation workflows within the UiPath ecosystem, ultimately driving efficiency and quality in software development processes.
What will you get from this session?
1. Insights into integrating generative AI.
2. Understanding how this integration enhances test automation within the UiPath platform
3. Practical demonstrations
4. Exploration of real-world use cases illustrating the benefits of AI-driven test automation for UiPath
Topics covered:
What is generative AI
Test Automation with generative AI and Open AI.
UiPath integration with generative AI
Speaker:
Deepak Rai, Automation Practice Lead, Boundaryless Group and UiPath MVP
20 Comprehensive Checklist of Designing and Developing a WebsitePixlogix Infotech
Dive into the world of Website Designing and Developing with Pixlogix! Looking to create a stunning online presence? Look no further! Our comprehensive checklist covers everything you need to know to craft a website that stands out. From user-friendly design to seamless functionality, we've got you covered. Don't miss out on this invaluable resource! Check out our checklist now at Pixlogix and start your journey towards a captivating online presence today.
Unlock the Future of Search with MongoDB Atlas_ Vector Search Unleashed.pdfMalak Abu Hammad
Discover how MongoDB Atlas and vector search technology can revolutionize your application's search capabilities. This comprehensive presentation covers:
* What is Vector Search?
* Importance and benefits of vector search
* Practical use cases across various industries
* Step-by-step implementation guide
* Live demos with code snippets
* Enhancing LLM capabilities with vector search
* Best practices and optimization strategies
Perfect for developers, AI enthusiasts, and tech leaders. Learn how to leverage MongoDB Atlas to deliver highly relevant, context-aware search results, transforming your data retrieval process. Stay ahead in tech innovation and maximize the potential of your applications.
#MongoDB #VectorSearch #AI #SemanticSearch #TechInnovation #DataScience #LLM #MachineLearning #SearchTechnology
Climate Impact of Software Testing at Nordic Testing DaysKari Kakkonen
My slides at Nordic Testing Days 6.6.2024
Climate impact / sustainability of software testing discussed on the talk. ICT and testing must carry their part of global responsibility to help with the climat warming. We can minimize the carbon footprint but we can also have a carbon handprint, a positive impact on the climate. Quality characteristics can be added with sustainability, and then measured continuously. Test environments can be used less, and in smaller scale and on demand. Test techniques can be used in optimizing or minimizing number of tests. Test automation can be used to speed up testing.
Full-RAG: A modern architecture for hyper-personalizationZilliz
Mike Del Balso, CEO & Co-Founder at Tecton, presents "Full RAG," a novel approach to AI recommendation systems, aiming to push beyond the limitations of traditional models through a deep integration of contextual insights and real-time data, leveraging the Retrieval-Augmented Generation architecture. This talk will outline Full RAG's potential to significantly enhance personalization, address engineering challenges such as data management and model training, and introduce data enrichment with reranking as a key solution. Attendees will gain crucial insights into the importance of hyperpersonalization in AI, the capabilities of Full RAG for advanced personalization, and strategies for managing complex data integrations for deploying cutting-edge AI solutions.
Generative AI Deep Dive: Advancing from Proof of Concept to ProductionAggregage
Join Maher Hanafi, VP of Engineering at Betterworks, in this new session where he'll share a practical framework to transform Gen AI prototypes into impactful products! He'll delve into the complexities of data collection and management, model selection and optimization, and ensuring security, scalability, and responsible use.
GraphSummit Singapore | The Future of Agility: Supercharging Digital Transfor...Neo4j
Leonard Jayamohan, Partner & Generative AI Lead, Deloitte
This keynote will reveal how Deloitte leverages Neo4j’s graph power for groundbreaking digital twin solutions, achieving a staggering 100x performance boost. Discover the essential role knowledge graphs play in successful generative AI implementations. Plus, get an exclusive look at an innovative Neo4j + Generative AI solution Deloitte is developing in-house.
Threats to mobile devices are more prevalent and increasing in scope and complexity. Users of mobile devices desire to take full advantage of the features
available on those devices, but many of the features provide convenience and capability but sacrifice security. This best practices guide outlines steps the users can take to better protect personal devices and information.
Alt. GDG Cloud Southlake #33: Boule & Rebala: Effective AppSec in SDLC using ...James Anderson
Effective Application Security in Software Delivery lifecycle using Deployment Firewall and DBOM
The modern software delivery process (or the CI/CD process) includes many tools, distributed teams, open-source code, and cloud platforms. Constant focus on speed to release software to market, along with the traditional slow and manual security checks has caused gaps in continuous security as an important piece in the software supply chain. Today organizations feel more susceptible to external and internal cyber threats due to the vast attack surface in their applications supply chain and the lack of end-to-end governance and risk management.
The software team must secure its software delivery process to avoid vulnerability and security breaches. This needs to be achieved with existing tool chains and without extensive rework of the delivery processes. This talk will present strategies and techniques for providing visibility into the true risk of the existing vulnerabilities, preventing the introduction of security issues in the software, resolving vulnerabilities in production environments quickly, and capturing the deployment bill of materials (DBOM).
Speakers:
Bob Boule
Robert Boule is a technology enthusiast with PASSION for technology and making things work along with a knack for helping others understand how things work. He comes with around 20 years of solution engineering experience in application security, software continuous delivery, and SaaS platforms. He is known for his dynamic presentations in CI/CD and application security integrated in software delivery lifecycle.
Gopinath Rebala
Gopinath Rebala is the CTO of OpsMx, where he has overall responsibility for the machine learning and data processing architectures for Secure Software Delivery. Gopi also has a strong connection with our customers, leading design and architecture for strategic implementations. Gopi is a frequent speaker and well-known leader in continuous delivery and integrating security into software delivery.
Sudheer Mechineni, Head of Application Frameworks, Standard Chartered Bank
Discover how Standard Chartered Bank harnessed the power of Neo4j to transform complex data access challenges into a dynamic, scalable graph database solution. This keynote will cover their journey from initial adoption to deploying a fully automated, enterprise-grade causal cluster, highlighting key strategies for modelling organisational changes and ensuring robust disaster recovery. Learn how these innovations have not only enhanced Standard Chartered Bank’s data infrastructure but also positioned them as pioneers in the banking sector’s adoption of graph technology.
Pushing the limits of ePRTC: 100ns holdover for 100 daysAdtran
At WSTS 2024, Alon Stern explored the topic of parametric holdover and explained how recent research findings can be implemented in real-world PNT networks to achieve 100 nanoseconds of accuracy for up to 100 days.
Observability Concepts EVERY Developer Should Know -- DeveloperWeek Europe.pdfPaige Cruz
Monitoring and observability aren’t traditionally found in software curriculums and many of us cobble this knowledge together from whatever vendor or ecosystem we were first introduced to and whatever is a part of your current company’s observability stack.
While the dev and ops silo continues to crumble….many organizations still relegate monitoring & observability as the purview of ops, infra and SRE teams. This is a mistake - achieving a highly observable system requires collaboration up and down the stack.
I, a former op, would like to extend an invitation to all application developers to join the observability party will share these foundational concepts to build on:
Maruthi Prithivirajan, Head of ASEAN & IN Solution Architecture, Neo4j
Get an inside look at the latest Neo4j innovations that enable relationship-driven intelligence at scale. Learn more about the newest cloud integrations and product enhancements that make Neo4j an essential choice for developers building apps with interconnected data and generative AI.
1. Introduction
1.1. Cloud Computing defined
The Cloud has become a heavily overloaded term over the last few years. It covers many services and technologies that have been around for some time and that are not necessarily offered together (Winans & Brown, 2009). Large cloud vendors offer data center hosting, sometimes combined with web applications, interoperability with other applications and much more. Sometimes even the Internet as a whole is referred to as 'The Cloud'. To provide clarity, the cloud concept, shown in the figure below, has been defined by NIST (the National Institute of Standards and Technology) as a framework that can be used to classify specific services (Mell & Grance, 2011). It splits the cloud into three service models: Software as a Service (SaaS), Platform as a Service (PaaS) and Infrastructure as a Service (IaaS).
In SaaS, access to end-user applications, such as Gmail, is provided over the Internet or an intranet through a thin-client interface. The applications themselves actually run in a datacenter, not on the local terminal (Apprenda, 2016). The PaaS model is aimed at software developers: it offers development tools for web and mobile applications over the Internet or an intranet. Examples are Red Hat OpenShift and Google App Engine. Finally, IaaS can be compared to home utilities such as electricity. Storage and processing power are delivered over the Internet or an intranet, which allows for on-demand sharing of data, online backups and much more. However, it is the customer's responsibility to manage everything that runs on the server (Kabir, Islam, & Hossain, 2015).
Figure 1: Visual Model of the NIST Definition of Cloud Computing
A further distinction must be made regarding deployment models: public, private and hybrid clouds. In public clouds, the providers offer their physical resources to multiple
organizations and/or consumers simultaneously. A public cloud can host both a collection of services and individual services. In private clouds, these resources are not shared: the service provider dedicates specific computing power, storage and other resources to a single client. We also speak of a private cloud when a company or consumer has created its own cloud by investing in resources such as a datacenter; this is then referred to as an on-premises cloud. Public clouds can be offered more cheaply, since the providers can take advantage of economies of scale. However, compared to private clouds, clients have less control over things such as resource allocation and other configuration. Solutions exist that combine the cost-effectiveness of public clouds with the security and configurability of private clouds. These are called hybrid clouds.
Characteristics that apply to all cloud service models and deployment models are on-demand self-service, resource pooling, broad network access, rapid elasticity and measured service. According to the NIST definition of cloud computing, a service must have these essential characteristics to be labeled a cloud solution. 'On-demand self-service' means that users can provision computing capabilities themselves, automatically, without requiring human interaction with the provider. 'Resource pooling' means that the resources are combined and assigned dynamically across multiple users to improve efficiency. With 'broad network access', the resources are available over the Internet through multiple platforms. 'Rapid elasticity' is achieved when the resources can be scaled easily to cope with changing demand, so that to the user the available capacity can appear unlimited. Finally, 'measured service' can be seen on any cloud service: the user receives certain limited resources based on the subscription model, and the usage of these resources is monitored and reported to the user.
1.2. Industrial Context
In today's fast-paced markets, it is essential that companies remain flexible enough to adapt to new opportunities and threats. Focusing on the core business is therefore increasingly important. Services that merely exist to support the delivery of the core products and services become subject to outsourcing: in many cases a specialized company can provide the same service at lower cost, with higher quality and efficiency. When this is the case, outsourcing is pursued as long as it does not pose a threat to the company's intellectual property. For almost all companies, IT infrastructure forms the backbone of their entire operation, sometimes including an in-house built cloud. Factories and other facilities across borders rely on secure connections for day-to-day communication and operation. Establishing and maintaining such infrastructure requires significant investments and in-house knowledge. Although important, in most cases it is not the core business of a company. Cloud vendors today are able to offer the same, and better, infrastructure in a much more cost-effective way.
1.2.1. Current issues for corporate IT
Data centers require large investments and are thus fairly fixed assets. With a fixed capacity, they cannot cope with the unbalanced workloads that arise from the complexity of current global markets. For startups and SMEs, moreover, it is difficult, if not impossible, to finance such an investment at all. Because business applications are costly to create or buy and to integrate within the company, many of them are outdated and lack the ability to communicate with application platforms outside the company. This, in turn, can put a strain on overall business performance.
1.2.2. Cloud Solutions
The cloud addresses these issues (Kabir et al., 2015). Cloud vendors, thanks to their large scale, are able to handle spikes in demand efficiently. They offer scalability and agility, and can afford to keep updating their systems to state-of-the-art technology. In addition, geographical distribution offers many opportunities for network efficiency, but also for energy efficiency and cost savings: demand can easily be shifted to parts of the world where it is night, or to a location that can currently draw on solar power or another cost-effective source of energy (green computing).
Application portfolio management is also improved thanks to virtualisation: partners can be presented with a standardized platform that, in the short term, draws upon all their current applications, so that the transition to newer technologies goes as smoothly as possible.
Overall, cloud computing offers resource efficiency and a modernized, up-to-date IT application portfolio in the short term. Moreover, it offers long-term business advantages such as agility. Take, for example, the acquisition of another company: today it is a painstaking process for the IT department to integrate the different systems, whereas these problems are easily overcome by the use of a distributed platform. Financially, the business also becomes more agile because high capital investments are exchanged for much lower operational costs. This leaves financial room for other business needs.
1.3. Security concerns
Many of the basic features that make cloud services so appealing also form the foundation of some major security concerns (Ali, Khan, & Vasilakos, 2015). Only some of these concerns are described here. Cloud solutions offer scalability by pooling resources, so many users are tied to the same resource set. This leads to the risk of data exposure to other users. Moreover, in this shared environment, the security of the entire service is only as good as its weakest link: an intruder might compromise the whole service through this weak link. These shared resources, together with scalability, inevitably lead to one client using resources that
were once allocated to another client. Using data recovery techniques, sensitive data from previous users can easily be exposed (Balduzzi, Zaddach, Balzarotti, & Loureiro, 2012).
In addition, the different cloud service models are interdependent. IaaS can be seen as the basis upon which PaaS is built; similarly, SaaS applications are developed using PaaS tools. Intruders can take advantage of these dependencies by attacking one model and thereby compromising the other service models that are left exposed. A more obvious source of risk is the service providers themselves: employees with bad intentions might be granted sufficient access to pose a potential threat.
From this, it is clear that there are still many security, privacy and regulatory threats that need to be addressed by cloud vendors. For large corporations, many of these concerns can be tackled in service level agreements (SLAs) with the vendor (Ali et al., 2015). This way, they can already make use of many cloud technologies before planning a full transition to cloud applications and services. Still, legal issues become harder to tackle when the data is migrated to a datacenter located in a country with different laws.
1.4. Consumer Context
Large corporations have the power to mitigate many potential risks of cloud computing by means of tailored contracts negotiated with cloud vendors, covering ownership of data, IP protection, law enforcement in case of security breaches and much more. In contrast, most individual users don't have this negotiating power and are left with many concerns, including the reliability of the service, data security, privacy, increased dependency upon a third party, and the fairness and transparency of the non-negotiable terms and conditions (Cunningham, 2013).
Most cloud service providers operate under a freemium model: limited services are offered for free, and the user is given the option to pay subscription fees depending on the services needed, such as extra storage. The user, however, becomes increasingly dependent upon services that are linked to one another, often including cloud storage, webmail and calendar applications. Consumers are therefore increasingly led into paying more for cloud services.
Another problem with free cloud services stems from the need for a revenue stream. The service provider can create one by collecting and selling user data. This in itself is not necessarily a problem; the problem arises when the transparency of these transactions is questionable. Most users are not aware of which data is shared with whom, and might find themselves more exposed than anticipated.
Dynamic web pages provide a solution for this: they create a live user experience. They can be built in two ways, either by client-side scripting or by server-side scripting.
A server-side dynamic web page is built on demand by the web server, using a scripting language such as PHP, ASP or Perl to retrieve the necessary information and combine it into an HTML document. Most web servers support scripting languages for this purpose. PHP5 will be used to create our cloud environment.
With a client-side dynamic web page, the browser itself creates the interaction within the page. Scripting languages such as JavaScript are used to process the web page as it loads and to update its variable content afterwards without fetching a whole new document.
2.6. Dynamic DNS service: no-ip
Every Internet-connected household receives an IP address from its Internet service provider
(ISP). These IP addresses are usually dynamic, meaning that the provider redistributes them
after a period of time, using the Dynamic Host Configuration Protocol (DHCP). This is done
partly to minimize the number of IP addresses in use, given that not all clients will be online
at the same time. However, it poses a problem when trying to access your private cloud from
outside the local network. Every Internet domain name is linked to an IP address through the
Domain Name System (DNS). When the IP address of the server behind that domain name
changes, the DNS record is not updated automatically, and after such a change you will no
longer be able to connect.
A Dynamic DNS (DDNS or DynDNS) service keeps the DNS record for a hostname up to date
with the server's current IP address. This works by installing a client program on the device
running the server and configuring it with the same login credentials used on the DDNS
provider's website. The DDNS client then sends the current IP address to the service, which
updates the binding to the domain name, ready to serve it to anyone who enters that name.
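As a sketch of what such a client does on each update cycle, the snippet below builds an update request in the style of the DynDNS v2 protocol that No-IP supports. The hostname and IP address are placeholders, and a real client would also supply the account credentials:

```shell
# Sketch of a DDNS update request (DynDNS v2 style, as used by No-IP).
# The hostname and IP below are placeholders for illustration only.
build_update_url() {
  local hostname="$1"
  local ip="$2"
  echo "https://dynupdate.no-ip.com/nic/update?hostname=${hostname}&myip=${ip}"
}

# A real client detects the current public IP first, then sends this
# request with the account credentials, e.g. via curl -u user:pass.
build_update_url "notdropbox.hopto.org" "203.0.113.7"
```

In practice the bundled noip2 client installed later handles IP detection and authentication itself; this only illustrates the mechanism behind it.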
2.7. File sharing protocol: AFP (Netatalk)
SMB (“Server Message Block”) has been the main Windows file-sharing protocol since the
90s. The protocol, although originally developed by IBM, is proprietary to Microsoft. It was,
however, reverse engineered to create Samba, a free implementation that allows Unix-based
systems to share files in a Windows environment; Apple adopted Samba in OS X 10.2. Apple
itself, however, already used another protocol for file sharing among Macs, called AFP
(“Apple[Talk] Filing Protocol”). AFP outperforms SMB in numerous ways, including read/write
performance and sleep/suspend support (JPY, 2015).
sudo nano /etc/php5/fpm/pool.d/www.conf
Change listen = /var/run/php-fpm.sock to listen = 127.0.0.1:9000
Save and Exit.
We will also edit the dphys-swapfile to increase the swap size.
sudo nano /etc/dphys-swapfile
Change CONF_SWAPSIZE=100 to CONF_SWAPSIZE=512
Save and Exit
Restart the Raspberry Pi with the following command:
sudo reboot
3.2.4. Setting up ownCloud
Install ownCloud. To do this, we create a directory for it, download and unpack the software,
and move it to the right place. We then change the ownership of the web directory to the
www-data user so that the web server can access the ownCloud folder. After that, we perform
some cleanup, that is, deleting the downloaded archive.
sudo mkdir -p /var/www/owncloud
sudo wget https://download.owncloud.org/community/owncloud-9.0.2.tar.bz2
sudo tar xvf owncloud-9.0.2.tar.bz2
sudo mv owncloud/ /var/www/
sudo chown -R www-data:www-data /var/www
rm -rf owncloud owncloud-9.0.2.tar.bz2
Adjust the .htaccess file to support the same file size that we set earlier, namely 2 GB.
cd /var/www/owncloud
sudo nano .htaccess
Update the following three parameters like this:
php_value upload_max_filesize 2000M
php_value post_max_size 2000M
php_value memory_limit 2000M
Save and exit.
Do the same with the .user.ini file:
sudo nano .user.ini
Update the following three parameters like this: upload_max_filesize=2000M
post_max_size=2000M
memory_limit=2000M
Save and exit.
3.2.5. Setting up an external drive
We want ownCloud to use the external hard drive or USB drive automatically to store all the
data. Therefore, it is very useful to automatically mount the drive. Also, the path of that drive
needs to be linked to ownCloud combined with the necessary permissions. This is done in the
following instructions:
If the drive is NTFS formatted, the NTFS package needs to be installed first:
sudo apt-get install ntfs-3g
Make a directory that you can mount to; we call ours ‘ownclouddrive’.
sudo mkdir /media/ownclouddrive
We will need information about the server user and the drive to update the fstab file. The
fstab (or file systems table) file is a configuration file that lists all available disk partitions and
the specifications on how to initialize each of them in order to fit in the general file system
structure.
Retrieve gid (group identifier) of the www-data user
id -g www-data
Retrieve UUID of the hard drive. This is unique for each drive, so the system will recognize it
even if it is plugged into a different USB port.
ls -l /dev/disk/by-uuid
We need the entry that has sda1 at the end of its link target; the string of letters and numbers
in front of the arrow is the drive’s UUID.
Now, adjust the fstab file based on the information we just retrieved.
sudo nano /etc/fstab
Insert and adjust the following line at the bottom of the file, using the gid and UUID found above.
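The exact line depends on the drive and its filesystem; as an illustration only, with a placeholder UUID and assuming the gid returned above was 33, an entry could look like:

```
UUID=DC72-0315  /media/ownclouddrive  auto  nofail,uid=pi,gid=33,umask=0027,dmask=0027,fmask=0137  0  0
```

The nofail option lets the Pi boot even if the drive is unplugged. After saving, sudo mount -a mounts everything listed in fstab without a reboot, which is a quick way to test the entry.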
NOIP client installation
Many routers already include a DDNS client. This can be checked by logging in to the router’s
settings page, which can often be reached at 192.168.1.1 or 192.168.0.1. Follow the
guidelines on the DDNS settings page to synchronize it with your DDNS service provider. If
the router does not provide this feature, a client needs to be installed so that the Raspberry
Pi can provide it itself.
First, create a directory for the No-IP client, go to that directory and download the No-IP
software into it. After extracting, the downloaded file can be removed. Next, the client can be
installed and started. During installation, it prompts you for the login credentials used for the
No-IP service and then detects the domain name notdropbox.hopto.org. The other
parameters are left at their defaults, so the client will send the IP address every 30 minutes.
The following commands are required:
mkdir /home/pi/noip
cd /home/pi/noip
wget http://www.no-ip.com/client/linux/noip-duc-linux.tar.gz
tar vzxf noip-duc-linux.tar.gz
rm noip-duc-linux.tar.gz
cd noip-2.1.9-1
sudo make
sudo make install
Figure 7 : no-ip DDNS configuration
Start on reboot
To start it automatically, the rc.local file needs to be adapted:
sudo nano /etc/rc.local
add the line /usr/local/bin/noip2 between the final fi and exit 0, so that the end of the file
looks like this:
fi
/usr/local/bin/noip2
exit 0
NGINX reconfiguration
The NGINX configuration file should be changed as follows:
sudo nano /etc/nginx/sites-available/default
The server_name value currently points to the local IP address of the Raspberry Pi. This needs
to be changed to your external IP address or your domain name:
Previously: 192.168.1.18
Now: notdropbox.hopto.org
Also, some ISPs block the use of port 80 (http) or port 443 (https) for personal use; for me,
that was the case. It can save you a lot of trouble to change the listening port for https to
something else, such as port 444, in the same configuration file. Check the availability of ports
first to avoid conflicts. We will only use the secure https protocol, which is why port 80 can
be left unchanged.
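After both changes, the relevant part of the server block might look like the sketch below; only the lines discussed are shown, and port 444 is the example alternative mentioned above:

```
server {
    listen 80;
    listen 444 ssl;                    # moved from 443, which my ISP blocks
    server_name notdropbox.hopto.org;  # was 192.168.1.18
    ...
}
```

Remember to reload NGINX (sudo service nginx reload) for the change to take effect, and to forward the chosen port on your router.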
Figure 8 : noip client configuration
Figure 9 : rc.local modification
4. Results and Conclusion
4.1. Raspberry Pi setup vs. commercial cloud
Following all the steps provided above has led to a product that can serve as a true
alternative to commercial cloud services. The Raspberry Pi offers a convenient way to access
and share your files at home and on the go. With the ownCloud platform, the administrator
has a wide set of tools to create user groups and appropriate permissions, link calendars and
contacts, offer live collaboration, include chat, and more. In this way, ownCloud outperforms
competitors such as Google Drive, which do not provide these administrator tools to adapt
the service to your needs. As explained, the disadvantage of having the files reside on your
own network is that the speed at which you can access them externally is limited by your
upload speed. For external access, commercial cloud services clearly outperform the
Raspberry Pi.
With the AFP file-sharing protocol, the Raspberry Pi offers the possibility to access your files
anywhere right within your file explorer, as if they were located on your hard drive. At home,
this means you can work at LAN speeds, which are, under normal circumstances, higher than
download or upload speeds. Also, unlike with Dropbox, the files are not stored on your local
drive, leaving storage capacity free for other data. Although less safe than Dropbox, the daily
backup provides a simple way to avoid losing data in case one of the drives crashes. Added to
this is the security argument of keeping your data in a private environment: as the owner of
your data, you know exactly where it is and who might have access to it, which avoids most
privacy issues.
The use of the low-cost, low-powered Raspberry Pi means that interaction with the system
can take quite some time. The latest version, the Raspberry Pi 3, provides a much faster
1.2 GHz 64-bit quad-core ARMv8 CPU, doubles the RAM to 1 GB, adds connectivity features
such as BLE and doubles the number of USB ports to four. This model is much better suited
to computationally expensive cloud tasks such as video streaming and photo editing.
4.2. Comparison to NIST definition for cloud computing
In order to qualify as a cloud solution, the Raspberry Pi configuration should meet the
criteria stated in section 1.1. To comply with the NIST definition, the following
characteristics need to be fulfilled: on-demand self-service, broad network access, resource
pooling, rapid elasticity and measured service. The Raspberry Pi offers only limited on-demand
self-service, as it requires intervention from the administrator to add extra storage capacity
when the current configuration becomes insufficient. This means that this requirement is
only fulfilled in the case of multiple users. Next, the resource pooling
characteristic is not fulfilled. The system is not able to dynamically reassign its resources in a
multi-tenant model; when multiple users access the system simultaneously, it will treat
these requests in the same way. Measured service, however, is a feature that ownCloud
does provide. The administrator can assign different storage capacity limits to different users
and groups, and each user can see how much has been used so far. The Raspberry Pi set-up
provides limited rapid elasticity in the sense that it is relatively easy to change the storage
capacity, but the processing power is more or less fixed. Had an expandable computer been
used to set up the service, this would have been easier. Finally, the system is available over
the Internet via two different interfaces, which fulfills the broad network access
requirement.
In practice, the Raspberry Pi with ownCloud and AFP file sharing can be seen as a true cloud
computing service that is worth the effort of setting up.
6. References
Ali, M., Dhamotharan, R., Khan, E., Khan, S. U., Vasilakos, A. V, Li, K., & Zomaya, A. Y. (2015).
SeDaSC: Secure Data Sharing in Clouds. IEEE Systems Journal.
http://doi.org/10.1109/JSYST.2014.2379646
Ali, M., Khan, S. U., & Vasilakos, A. V. (2015). Security in cloud computing: Opportunities and
challenges. Information Sciences, 305, 357–383.
http://doi.org/10.1016/j.ins.2015.01.025
Apprenda. (2016). IaaS, PaaS, SaaS (Explained and Compared). Retrieved February 1, 2016,
from https://apprenda.com/library/paas/iaas-paas-saas-explained-compared/
Auti, D. (2014). How to Setup NOIP Client on Raspberry Pi. Retrieved April 1, 2016, from
http://www.awesomeweirdness.com/projects-diy/raspberrypi/setup-noip-client-
raspberry-pi/
Balduzzi, M., Zaddach, J., Balzarotti, D., & Loureiro, S. (2012). A Security Analysis of Amazon's
Elastic Compute Cloud Service – Long Version. Security, 1427–1434.
http://doi.org/10.1145/2245276.2232005
Cunningham, A. (2013). Caveat Consumer? – Consumer Protection and Cloud Computing –
Part 2. SSRN Electronic Journal, 1–41. http://doi.org/10.2139/ssrn.2212051
Damontimm. (2010). How to set up AFP filesharing on Ubuntu. Retrieved October 2, 2016,
from https://missingreadme.wordpress.com/2010/05/08/how-to-set-up-afp-
filesharing-on-ubuntu/
Dilger, D. E. (2013). Apple shifts from AFP file sharing to SMB2 in OS X 10.9 Mavericks.
Retrieved April 10, 2016, from http://appleinsider.com/articles/13/06/11/apple-shifts-
from-afp-file-sharing-to-smb2-in-os-x-109-mavericks
Ellingwood, J. (2015). Apache vs Nginx: Practical Considerations.
Fitzpatrick, J. (2013). How to Turn a Raspberry Pi into a Low-Power Network Storage Device.
Retrieved June 8, 2016, from http://www.howtogeek.com/139433/how-to-turn-a-
raspberry-pi-into-a-low-power-network-storage-device/
Gus. (2015a). Raspberry Pi OwnCloud: Your Own Personal Cloud Storage. Retrieved March 7,
2016, from https://pimylifeup.com/raspberry-pi-owncloud/
Gus. (2015b). Raspberry Pi Port Forwarding & Dynamic DNS. Retrieved March 16, 2016, from
https://pimylifeup.com/raspberry-pi-port-forwarding/
JPY. (2015). AFP versus SMB/SAMBA. Retrieved May 20, 2016, from
http://news.jpy.com/kbase/support-articles/2015/9/8/afp-versus-smbsamba
Kabir, M. H., Islam, S., & Hossain, S. (2015). A Detail Overview of Cloud Computing with its
Opportunities and Obstacles in Developing Countries, 4(4), 52–63.
Koff1979. (2012). Raspberry Pi Owncloud (dropbox clone). Retrieved March 10, 2016, from
http://www.instructables.com/id/Raspberry-Pi-Owncloud-dropbox-clone/?ALLSTEPS
Makershed. (2015). Raspberry Pi Comparison Chart. Retrieved March 1, 2016, from
http://www.makershed.com/pages/raspberry-pi-comparison-chart
Mayank, S. (2016). How to set up a Raspberry Pi-powered cloud service. Retrieved April 1,
2016, from http://www.techradar.com/how-to/computing/how-to-set-up-a-raspberry-
pi-powered-cloud-service-1316017
Mell, P., & Grance, T. (2011). The NIST Definition of Cloud Computing Recommendations of
the National Institute of Standards and Technology. National Institute of Standards and
Technology, Information Technology Laboratory, 145, 7.
http://doi.org/10.1136/emj.2010.096966
Rafacz, R. (2012). Adding Linux Users and Groups. Retrieved May 13, 2016, from
https://www.pluralsight.com/blog/tutorials/linux-add-user-command
Ryan, H. (2016). OS X Yosemite (Server 4.03) and SMB3. Retrieved April 10, 2016, from
https://www.murage.ca/os-x-yosemite-server-4-03-smb3/
Shrivastava, T. (2013). Rsync (Remote Sync): 10 Practical Examples of Rsync Command in
Linux. Retrieved June 9, 2016, from http://www.tecmint.com/rsync-local-remote-file-
synchronization-commands/
Siddiqui, F., Zeadally, S., Kacem, T., & Fowler, S. (2012). Zero configuration networking:
Implementation, performance, and security. Computers and Electrical Engineering,
38(5), 1129–1145. http://doi.org/10.1016/j.compeleceng.2012.02.011
Steiner, P. (2010). Why FTP is no longer up to the job. Network Security, 2010(5), 7–9.
http://doi.org/10.1016/S1353-4858(10)70055-6
Thijxx. (2012). File sharing with OSX. Retrieved June 10, 2016, from
https://www.raspberrypi.org/forums/viewtopic.php?f=36&t=26826
Whitson, G. (2014). What Operating System Should I Use for My DIY Home Server? Retrieved
March 1, 2016, from http://lifehacker.com/what-operating-system-should-i-use-for-my-
diy-home-serv-1671385076
Winans, T. B., & Brown, J. S. (2009). Cloud Computing: A Collection of Working Papers.
http://www.johnseelybrown.com, 1–39.
http://doi.org/10.1017/CBO9781107415324.004
Young, D. (2015). Create your own cloud server on the Raspberry Pi. Retrieved March 7, 2016,
from https://www.element14.com/community/community/raspberry-
pi/raspberrypi_projects/blog/2015/05/05/owncloud-v8-server-on-raspberry-pi-2-
create-your-own-dropbox