This document outlines a research project examining digital forensic analysis of popular cloud storage services such as Dropbox, Microsoft SkyDrive, and Google Drive. The research aims to identify what data remnants remain on devices after cloud storage use, develop a forensic analysis framework, and determine how to preserve cloud data for investigations. Virtual machines running Windows 7 were used to simulate cloud storage access and to analyze the forensic artifacts left behind. The analysis found software files, operating system remnants, and browser history items that could identify cloud accounts and files. Downloading files did not alter their internal data but did change metadata such as timestamps.
Client Forensics: An Assessment of Existing Research and Future Directions (CSCJournals)
This paper provides an assessment of processes for identifying artifacts left on client devices after a cloud storage interaction. It focuses on those artifacts that may be used to prove usage of a cloud service, as proposed by the current research. Besides providing the current state of knowledge in client forensics, this paper (1) provides a summary of current research in the area of client forensics, (2) presents similarities and differences among proposed processes and identified artifacts, and (3) presents some possible future work. Investigators need to understand how devices and cloud storage services interact, the types of evidential artifacts that are likely to remain on the devices after cloud storage interaction, and how those artifacts may be used to prove usage. Not knowing whether a cloud service was accessed, which cloud service was used, or where the digital evidence is located can potentially impede an investigation.
Hybrid Cloud Approach for Secure Authorized Deduplication (Prem Rao)
This document proposes a hybrid cloud approach for secure authorized data deduplication. It discusses existing systems that use data deduplication to reduce storage usage but lack security features. The proposed system uses convergent encryption for data confidentiality while allowing deduplication. It also aims to support authorized duplicate checks by encrypting files with differential privilege keys. The system design involves data owner, encryption/decryption, private cloud, public cloud, and cloud server modules. Cryptographic techniques like hashing and encryption are used along with communication via HTTP. The development follows a waterfall model with phases for requirements analysis, design, implementation, testing, and maintenance.
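The key idea behind convergent encryption, mentioned above, is that the encryption key is derived from the file's own content, so identical plaintexts always yield identical ciphertexts and the server can deduplicate them without ever seeing the plaintext. The following is a minimal illustrative sketch, not the paper's actual scheme: the XOR keystream cipher here is a toy stand-in for a real block cipher, and all function names are our own.

```python
import hashlib

def _keystream(key: bytes, length: bytes) -> bytes:
    """Derive a keystream from the key via SHA-256 in counter mode
    (illustration only -- a real scheme would use AES or similar)."""
    out = b""
    counter = 0
    while len(out) < length:
        out += hashlib.sha256(key + counter.to_bytes(8, "big")).digest()
        counter += 1
    return out[:length]

def convergent_encrypt(plaintext: bytes) -> tuple[bytes, bytes]:
    # The convergent key is the hash of the content itself, so two users
    # encrypting the same file independently derive the same key.
    key = hashlib.sha256(plaintext).digest()
    ciphertext = bytes(p ^ k for p, k in zip(plaintext, _keystream(key, len(plaintext))))
    return key, ciphertext

def convergent_decrypt(key: bytes, ciphertext: bytes) -> bytes:
    return bytes(c ^ k for c, k in zip(ciphertext, _keystream(key, len(ciphertext))))

# Two users encrypting the same file produce identical ciphertexts,
# so the server can store a single copy and deduplicate the rest.
k1, c1 = convergent_encrypt(b"design-spec v1")
k2, c2 = convergent_encrypt(b"design-spec v1")
assert c1 == c2
assert convergent_decrypt(k1, c1) == b"design-spec v1"
```

The deterministic key derivation is exactly what enables deduplication, but it is also why plain convergent encryption leaks whether two users hold the same file; the paper's authorized duplicate checks with differential privilege keys address that gap.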
Secure Auditing and Deduplicating Data on Cloud (IJMTST Journal)
Cloud computing is an internet-based, self-service, on-demand technology for storing and accessing data at remote locations, which is why it is now so widely used. Enterprises and organizations use cloud storage to delegate data access to a third party, while individual users can reach their confidential data anywhere, anytime; it is fast becoming a business standard. It simplifies user access and is cost-saving and flexible, offering better performance over the internet. However, it also has drawbacks concerning data security and integrity: in many cases the data being uploaded already exists in storage with only slight differences. To overcome these problems we introduce two secure systems, SecCloud and SecCloud+. SecCloud generates tags on data before uploading, while SecCloud+ maintains integrity auditing and secure deduplication over encrypted data, since customers want to encrypt their data before uploading. Data integrity and storage efficiency are two important aspects of cloud storage: Proof of Retrievability (POR) and Proof of Data Possession (PDP) techniques assure data integrity, while Proof of Ownership (POW) improves storage efficiency by securely removing unnecessarily duplicated data from the storage server. Cloud computing is one of the most talked-about IT trends today, with ever more applications available on the cloud and continued growth in the market.
The document proposes a secure client-side deduplication scheme called KeyD that uses identity-based broadcast encryption instead of independent key management to effectively manage convergent keys for deduplication. KeyD ensures data confidentiality and convergent key security while providing ownership privacy. Experimental results show that KeyD achieves better tradeoffs between storage costs, communication overhead, and computation overhead compared to traditional deduplication schemes.
Storage Made Easy - File Fabric Use Cases (Hybrid Cloud)
The File Fabric provides a multi-cloud solution for on-site and on-cloud data and can be used for solutions as diverse as data governance and compliance through to Big Data / Object Storage use cases.
Trying to Bottle the Cloud: Forensic Challenges with Cloud Computing (Brent Muir)
This document discusses the forensic challenges associated with cloud computing. It covers the different types of cloud technologies including cloud processing and cloud storage. It also discusses the challenges of accessing cloud data which can be stored across various backend infrastructures and datacenters around the world. The document outlines the different types of data that may be stored in the cloud including VM files systems, loose files, emails and more. It discusses issues around jurisdiction and legislation when cloud data is involved in an investigation. Recommendations are provided for forensically sound procedures when acquiring cloud evidence. Real-world examples involving an Australian cloud storage provider and Microsoft SkyDrive are also summarized.
This is a talk that I gave at BioIT World West on March 12, 2019. The talk was called: A Gen3 Perspective of Disparate Data: From Pipelines in Data Commons to AI in Data Ecosystems.
The research proposed in this paper focuses on gathering evidence from devices running UNIX/Linux systems (in particular Ubuntu 14.04 and Android OS) and Windows 8.1, in order to find artifacts left by cloud storage applications that suggest their use even after the applications have been deleted. The work performed aims to expand upon prior work done by other researchers in the field of cloud forensics and to show an example of analysis. We show where and what type of data remnants can be found using our analysis and how this information can be used as evidence in a digital forensic investigation.
Cloud computing involves delivering computing resources as a service over the internet. It allows users to access software, storage, and computing power from any device with an internet connection. Major advantages include lower costs compared to owning software/hardware, access from any device, automatic updates, and not needing large internal storage on devices. However, security, privacy, and reliance on consistent internet access are potential disadvantages.
Automating Research Data Management at Scale with Globus (Globus)
Research computing facilities, such as the national supercomputing centers, and shared instruments, such as cryo electron microscopes and advanced light sources, are generating large volumes of data daily. These growing data volumes make it challenging for researchers to perform what should be mundane tasks: move data reliably, describe data for subsequent discovery, and make data accessible to geographically distributed collaborators. Most employ some set of ad hoc methods, which are not scalable, and it is clear that some level of automation is required for these tasks.
Globus is an established service from the University of Chicago that is widely used for managing research data in national laboratories, campus computing centers, and HPC facilities. While its intuitive web app addresses simple file transfer and sharing scenarios, automation at scale requires integrating Globus data management platform services into custom science gateways, data portals and other web applications in service of research. Such applications should enable automated ingest of data from diverse sources, launching of analysis runs on diverse computing resources, extraction and addition of metadata for creating search indexes, assignment of persistent identifiers, faceted search for rapid data discovery, and point-and-click downloading of datasets by authorized users — all protected by an authentication and authorization substrate that allows the implementation of flexible data access policies for both metadata and data alike.
We describe current and emerging Globus services that facilitate these automated data flows while ensuring a streamlined user experience. We also demonstrate Petreldata.net, a data management portal and gateway to multiple computing resources, that supports large-scale research at the Advanced Photon Source.
The document discusses various digital preservation activities the author undertook as part of an assignment, including archiving, harvesting, mirroring files, extracting metadata, and verifying checksums. The author learned how to use tools like PeaZip, Xena, emulators, and metadata extraction software. They created disk images and analyzed them using bulk extractor to identify sensitive data. The author automated a workflow to generate checksums and write them to an Excel file. Overall, the assignment helped the author gain hands-on experience with digital preservation concepts and tools.
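The checksum-generation workflow described above — hash every file in a collection and record the digests in a spreadsheet-style manifest for later fixity verification — can be sketched in a few lines of standard-library Python. This is a generic illustration of the idea, not the author's actual automation; the function names and CSV layout are our own.

```python
import csv
import hashlib
import os

def sha256_of(path: str) -> str:
    """Hash a file incrementally so large disk images don't exhaust memory."""
    h = hashlib.sha256()
    with open(path, "rb") as f:
        for chunk in iter(lambda: f.read(65536), b""):
            h.update(chunk)
    return h.hexdigest()

def checksum_manifest(root: str, out_csv: str) -> int:
    """Walk a directory tree, hash every file, and write a path/digest
    manifest. Returns the number of files recorded. Re-running the walk
    later and comparing digests detects any bit-level change (fixity check)."""
    count = 0
    with open(out_csv, "w", newline="") as out:
        writer = csv.writer(out)
        writer.writerow(["path", "sha256"])
        for dirpath, _, filenames in os.walk(root):
            for name in sorted(filenames):
                full = os.path.join(dirpath, name)
                writer.writerow([full, sha256_of(full)])
                count += 1
    return count
```

Writing the manifest outside the tree being hashed avoids the manifest hashing itself; a preservation workflow would typically also record file size and hash algorithm per row.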
This document summarizes a study that benchmarked the performance of personal cloud storage services like Dropbox, Google Drive, SkyDrive, Wuala, and Amazon Cloud Drive. The study developed a methodology to test each service's system architecture, data center locations, file synchronization capabilities, and performance. Key findings include: Dropbox implemented the most capabilities but had high overhead, while Google Drive and Wuala had the fastest completion times due to data centers near the testbed. The study provided insights into how each service's design impacts performance.
The document provides an overview of Hadoop including:
- A brief history of Hadoop and its origins from Nutch.
- An overview of the Hadoop architecture including HDFS and MapReduce.
- Examples of how companies like Yahoo, Facebook and Amazon use Hadoop at large scales to process petabytes of data.
IRJET - A Secure Access Policies Based on Data Deduplication System (IRJET Journal)
This document summarizes a research paper on a secure access policies based data deduplication system. The system uses attribute-based encryption and a hybrid cloud model with a private cloud for deduplication and a public cloud for storage. It allows defining access policies for encrypted data files. When a user uploads a duplicate file, the system checks for a matching file and replaces it with a reference to the existing copy to save storage. The system provides file and block-level deduplication for efficient storage and uses cryptographic techniques like MD5, 3DES and RSA for encryption, tagging and access control of encrypted duplicate data across clouds.
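The block-level deduplication described above — check whether an uploaded block already exists and, if so, store only a reference to the existing copy — reduces to a content-addressed store keyed by block hash. The sketch below is a toy in-memory model of that mechanism only (no encryption, access policies, or hybrid-cloud split); class and method names are our own.

```python
import hashlib

class DedupStore:
    """Toy block-level deduplicating store: files are split into fixed-size
    blocks, each unique block is stored once under its hash tag, and a file
    becomes just an ordered list of block tags."""
    BLOCK = 4096

    def __init__(self):
        self.blocks = {}   # tag -> block bytes (stored once)
        self.files = {}    # name -> list of block tags (references)

    def put(self, name: str, data: bytes) -> int:
        """Store a file; return the number of NEW blocks actually written.
        A duplicate upload writes zero new blocks -- only references."""
        new = 0
        refs = []
        for i in range(0, len(data), self.BLOCK):
            block = data[i:i + self.BLOCK]
            tag = hashlib.sha256(block).hexdigest()
            if tag not in self.blocks:
                self.blocks[tag] = block
                new += 1
            refs.append(tag)
        self.files[name] = refs
        return new

    def get(self, name: str) -> bytes:
        """Reassemble a file from its block references."""
        return b"".join(self.blocks[t] for t in self.files[name])
```

When a second user uploads the same file, `put` returns 0: every block tag already exists, so the server stores only the reference list, which is the storage saving the paper targets.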
GlobusWorld 2021 Tutorial: Introduction to Globus (Globus)
An introduction to the core features of the Globus data management service. This tutorial was presented at the GlobusWorld 2021 conference in Chicago, IL by Greg Nawrocki.
The document describes a proposed system called FogDrive Disaster Backup as a Service that uses fog computing to provide backup for cloud server data. It aims to provide an easy-to-use and secure backup system. The system would use cloud and encryption techniques to backup and store data on a FogDrive server located within the local network. It would include OpenPGP encryption of backups and allow users to manage their encrypted backups through a dashboard. The system architecture was presented, including modules for the cloud provider, FogDrive server, and backup software service.
Introduction to Globus - GlobusWorld Tour West (Globus)
This document introduces Globus, which provides fast and reliable data transfer, sharing, and platform services across different storage systems and resources. It does this through software-as-a-service that uses existing user identities, with the goal of unifying access to data across different tiers like HPC, storage, cloud, and personal resources. Key features include secure data transfers without moving files, access control and sharing capabilities, and tools for building automations and integrating with science gateways. It also discusses options for handling protected data like health information with additional security controls and business agreements.
Implementing a Data Mesh with Apache Kafka with Adam Bellemare | Kafka Summit... (HostedbyConfluent)
Have you heard about Data Mesh but never really understood how you actually build one? Data mesh is a relatively recent term that describes a set of principles that good modern data systems uphold. Although the data mesh is not a technology-specific pattern, it requires that organizations make choices and investments into specific technologies and operational policies when implementing the mesh. Establishing "paved roads" for creating, publishing, evolving, deprecating, and discovering data products is essential for bringing the benefits of the mesh to those who would use it.
In this talk, Adam covers implementing a self-service data mesh with events streams in Apache Kafka®. Event streams as a data product are an essential part of a real-world data mesh, as they enable both operational and analytical workloads from a common source of truth. Event streams provide full historical data along with realtime updates, letting each individual data product consumer decide what to consume, how to remodel it, and where to store it to best suit their needs.
Adam structures this talk by seeking to answer a hypothetical SaaS business question of "what is the relationship between feature usage and user retention?" This example explores each team's role in the data mesh, including the data products they would (and wouldn't) publish, how other teams could use the products, and the organizational dynamics and principles underpinning it all.
PDP is a technique designed for a CSP (cloud service provider) to show that a client's complete file is stored securely, without the client downloading the whole file. A multi-cloud PDP scheme involves multiple CSPs that jointly store and maintain the client's data. CPDP (Cooperative Provable Data Possession) was later proposed, adding homomorphic verifiable responses and a hash index hierarchy, but it has a security flaw: a malicious CSP or organizer can generate a valid response that passes the verification process even after the stored data has been deleted. In effect, an attacker passes the audit without actually storing the client's data. In this paper, we discuss these security flaws in the CPDP scheme.
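The challenge-response idea underlying PDP can be sketched as follows. Before outsourcing the file, the client precomputes a set of random challenges, each binding a fresh nonce to a block; the nonce stops the server from caching plain block hashes and discarding the data. This is a simplified sentinel-style sketch with names of our own invention — real PDP schemes use homomorphic tags so the client can issue an unbounded number of challenges, which this toy version cannot.

```python
import hashlib
import secrets

BLOCK = 1024  # bytes per challenged block

def precompute_challenges(data: bytes, n: int) -> list[tuple[int, bytes, str]]:
    """Before outsourcing, the client precomputes n random challenges:
    (block index, fresh nonce, expected keyed digest). Only the challenge
    list is kept locally -- the file itself can be deleted."""
    blocks = max(1, (len(data) + BLOCK - 1) // BLOCK)
    challenges = []
    for _ in range(n):
        i = secrets.randbelow(blocks)
        nonce = secrets.token_bytes(16)
        expected = hashlib.sha256(nonce + data[i * BLOCK:(i + 1) * BLOCK]).hexdigest()
        challenges.append((i, nonce, expected))
    return challenges

def server_respond(stored: bytes, i: int, nonce: bytes) -> str:
    """The server proves possession of block i: it must hash the nonce with
    the actual block contents, which it can only do if it still holds them."""
    return hashlib.sha256(nonce + stored[i * BLOCK:(i + 1) * BLOCK]).hexdigest()
```

The CPDP flaw discussed in the paper is precisely a failure of this guarantee: a malicious CSP or organizer can fabricate a response that verifies even though the underlying blocks no longer exist.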
Final Project Milestone One: Draft of Report
To complete this assignment, review the prompt and grading rubric in the
Milestone One Guidelines and Rubric
document. When you have finished your work, submit the assignment here for grading and instructor feedback.
ISE 640 Final Project Forensic Notes
Use the information in this document to help you complete your final project.
Drew Patrick, a director-level employee, is stealing intellectual property from a manufacturing company. The company is heavily involved in high-end development of widgets. Drew has access to corporate secrets and files. He is planning on leaving the company, taking the intellectual property with him, and going to work for a competitor. There is suspicion of him doing this, so human resources (HR) notified the information technology (IT) department to monitor Drew's activity. An internal investigation is launched due to Drew's abnormal behavior. The IT department confirms that they have found large files and emails. Forensics identified unauthorized access, transmission, and storage of intellectual property by Drew. Evidence found will be used to support legal civil and criminal proceedings.
Scenario ACME Construction Company designs, manufactures, and sells large construction vehicles that can cost upwards of a million dollars. They spent hundreds of thousands of hours redesigning their premier excavator. Every piece that goes into the excavator is individually designed to maximize the longevity of the equipment. Known for attention to detail, high-quality work, and industry innovation, this painstaking work is what sets ACME Construction company apart and is attributed for the excellent reputation they enjoy. This, in turn, allows them to charge a premium on their exceptionally well-built products.
Drew Patrick is a senior manager directly involved with the overall development of ACME’s excavators. His role provides him with access to design documentation, schematics, support documents, and any other technical references maintained in the company’s research and development (R&D) database. The R&D database is maintained by ACME’s information technology (IT) department, which is supported by a security operations center (SOC). The SOC uses Snort as a core component of their security information and event management (SIEM) system to keep tabs on network traffic, authentication requests, file access, and log file analysis.
The SIEM alerted SOC personnel of potential peer-to-peer (P2P) traffic originating from the internet protocol (IP) address associated with Drew’s computer. However, analysis of Active Directory logs indicated that Drew was not logged into his account at the time the files were transferred via the P2P application. ACME enforces two-factor authentication and does not allow for computer sharing. The SOC person ...
This document summarizes a research paper on developing a cloud-based storage system similar to Dropbox called "Unbox". It describes the proposed system's architecture which includes users uploading files via a web browser that are stored on servers using the OwnCloud storage system. The storage is implemented using a Raspberry Pi server with Samba file sharing enabled to allow access from multiple devices via CIFS/SMB protocol. The system aims to provide a cost-efficient cloud storage option for users with features like file encryption, backup plans and access from multiple devices.
Throughout this course, you will be keeping an investigative journal (marilynnhoare)
Throughout this course, you will be keeping an investigative journal. The purpose of this journal is to archive any artifacts and information that may support your final projects. You will submit it as part of Milestone One and receive points within the milestone rubric for this. Additionally, it will assist you by allowing you to organize information in a chronological order that you can easily retrieve when completing the final projects in the later modules. This journal can be kept as a Word document. You can compile journal entries within the same document and submit this document as one file submission at the end of the course with your Milestone One submission.
In your investigative journal, develop a chain of custody form to be used within a business based on forensic notes for the final project.
In your investigative journal, record how data is acquired and the tools used in the final project scenario.
In your investigative journal, record network analysis for the final project scenario.
What Are You Looking For? (alanfhall8953)
What Are You Looking For?
The variety of operating systems, application programs, and storage methods available today means that when it comes to looking for evidence there are a multitude of places to look. Digital evidence can be found in numerous sources, including stored data, applications used to create data, and the computer system that produced the activity. Systems can be huge and complex, and they can change rapidly. Data can be hidden in many different locations and formats. After you find such data, you may have to process it to make it readable by people.
Discovering Evidence Using Connectors
In recent years, manufacturers have developed branded forensic workstations that provide external native connectors for a variety of media, such as Serial ATA (SATA), SCSI (Small Computer System Interface), flash media, and the older IDE (Integrated Drive Electronics) drives. SATA hard drives are more commonly used by individuals, while SCSI hard drives are more likely to be found in a corporate environment. As a forensic investigator, you will encounter and work with many different types of media. You may also encounter connectors that hook up FireWire to SATA, SCSI, or IDE, and that hook up USB to SATA, SCSI, or IDE. A forensic investigator will determine what media the suspect has been using to store data and will have a variety of connectors on hand to aid the investigation.
connector
The part of a cable that plugs into a port or interface to connect devices. Male connectors are identified by exposed pins. Female connectors are identified by holes into which the male connector can be inserted.
The general discovery process is the same whether you are working with a SATA, SCSI, or IDE drive. You should adapt your techniques to suit the hardware you encounter.
To begin the discovery process for a drive, copy the image file onto your forensic workstation and then process it using one or more forensic tools such as FTK, EnCase, or ProDiscover.
Network Activity Files
Let's use an example case that involves the Internet and pictures. During your career as a forensic investigator, you may be called upon to investigate situations where an employee has illegally accessed and downloaded pictures of proprietary designs from a competitor's internal Web site and then used these designs in his or her own work.
After the forensic image has been added to your forensic computer, open your forensic software and start a case. Figure 6.1 shows the New Case Wizard from the AccessData Forensic Toolkit (FTK).
Figure 6.1: AccessData's Forensic Toolkit New Case Wizard
When a user logs on to a Windows XP, Vista, or Windows 7 system for the first time, a directory structure is created to hold that individual user's files and settings. This structure is called the profile. The profile creates a directory that has the same name as the user, along with various other folders and files.
Because this case involves searching for images that were downloaded .
This document provides an overview of a workshop on Google Cloud Platform presented by Javed Habib, GDSC Lead at IIT Bhilai. The workshop covers introduction to cloud computing and Google Cloud architecture, hands-on labs for Google Cloud storage options, APIs, Pub/Sub, security, big data analysis using Dataflow and BigQuery, machine learning with Vertex AI and AutoML, and networking and security on Google Cloud including VPCs, load balancing, and firewalls.
Integrating On-premises Enterprise Storage Workloads with AWS (ENT301) | AWS ... (Amazon Web Services)
AWS gives designers of enterprise storage systems a completely new set of options. Aimed at enterprise storage specialists and managers of cloud-integration teams, this session gives you the tools and perspective to confidently integrate your storage workloads with AWS. We show working use cases, a thorough TCO model, and detailed customer blueprints. Throughout we analyze how data-tiering options measure up to the design criteria that matter most: performance, efficiency, cost, security, and integration.
Embedded machine learning-based road conditions and driving behavior monitoring (IJECEIAES)
Car accident rates have increased in recent years, resulting in losses in human lives, properties, and other financial costs. An embedded machine learning-based system is developed to address this critical issue. The system can monitor road conditions, detect driving patterns, and identify aggressive driving behaviors. The system is based on neural networks trained on a comprehensive dataset of driving events, driving styles, and road conditions. The system effectively detects potential risks and helps mitigate the frequency and impact of accidents. The primary goal is to ensure the safety of drivers and vehicles. Collecting data involved gathering information on three key road events: normal street and normal drive, speed bumps, circular yellow speed bumps, and three aggressive driving actions: sudden start, sudden stop, and sudden entry. The gathered data is processed and analyzed using a machine learning system designed for limited power and memory devices. The developed system resulted in 91.9% accuracy, 93.6% precision, and 92% recall. The achieved inference time on an Arduino Nano 33 BLE Sense with a 32-bit CPU running at 64 MHz is 34 ms and requires 2.6 kB peak RAM and 139.9 kB program flash memory, making it suitable for resource-constrained embedded systems.
This is a talk that I gave at BioIT World West on March 12, 2019. The talk was called: A Gen3 Perspective of Disparate Data: From Pipelines in Data Commons to AI in Data Ecosystems.
The research proposed in this paper focuses on gathering evidence from devices with UNIX/Linux systems (in particular Ubuntu 14.04 and Android OS) and Windows 8.1, in order to find artifacts left by cloud storage applications that suggest their use even after the deletion of the applications. The work performed aims to expand upon prior work done by other researchers in the field of cloud forensics and to show an example of analysis. We show where and what type of data remnants can be found using our analysis and how this information can be used as evidence in a digital forensic investigation.
Cloud computing involves delivering computing resources as a service over the internet. It allows users to access software, storage, and computing power from any device with an internet connection. Major advantages include lower costs compared to owning software/hardware, access from any device, automatic updates, and not needing large internal storage on devices. However, security, privacy, and reliance on consistent internet access are potential disadvantages.
Automating Research Data Management at Scale with GlobusGlobus
Research computing facilities, such as the national supercomputing centers, and shared instruments, such as cryo electron microscopes and advanced light sources, are generating large volumes of data daily. These growing data volumes make it challenging for researchers to perform what should be mundane tasks: move data reliably, describe data for subsequent discovery, and make data accessible to geographically distributed collaborators. Most employ some set of ad hoc methods, which are not scalable, and it is clear that some level of automation is required for these tasks.
Globus is an established service from the University of Chicago that is widely used for managing research data in national laboratories, campus computing centers, and HPC facilities. While its intuitive web app addresses simple file transfer and sharing scenarios, automation at scale requires integrating Globus data management platform services into custom science gateways, data portals, and other web applications in service of research. Such applications should enable automated ingest of data from diverse sources, launching of analysis runs on diverse computing resources, extraction and addition of metadata for creating search indexes, assignment of persistent identifiers, faceted search for rapid data discovery, and point-and-click downloading of datasets by authorized users, all protected by an authentication and authorization substrate that allows the implementation of flexible data access policies for both metadata and data alike.
We describe current and emerging Globus services that facilitate these automated data flows while ensuring a streamlined user experience. We also demonstrate Petreldata.net, a data management portal and gateway to multiple computing resources, that supports large-scale research at the Advanced Photon Source.
The document discusses various digital preservation activities the author undertook as part of an assignment, including archiving, harvesting, mirroring files, extracting metadata, and verifying checksums. The author learned how to use tools like PeaZip, Xena, emulators, and metadata extraction software. They created disk images and analyzed them using bulk extractor to identify sensitive data. The author automated a workflow to generate checksums and write them to an Excel file. Overall, the assignment helped the author gain hands-on experience with digital preservation concepts and tools.
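The checksum workflow described above can be sketched in a few lines. This is an illustrative sketch only, not the author's actual tool: it writes a plain CSV manifest rather than an Excel file, and names like `checksum_manifest` are invented for the example.

```python
import csv
import hashlib
from pathlib import Path

def sha256_of(path: Path, chunk_size: int = 65536) -> str:
    """Stream the file in chunks so large disk images do not exhaust memory."""
    digest = hashlib.sha256()
    with path.open("rb") as fh:
        for chunk in iter(lambda: fh.read(chunk_size), b""):
            digest.update(chunk)
    return digest.hexdigest()

def checksum_manifest(root: Path, out_csv: Path) -> int:
    """Walk a directory tree and record one checksum row per file.

    Returns the number of files recorded.
    """
    rows = 0
    with out_csv.open("w", newline="") as fh:
        writer = csv.writer(fh)
        writer.writerow(["path", "bytes", "sha256"])
        for f in sorted(root.rglob("*")):
            if f.is_file():
                writer.writerow([str(f), f.stat().st_size, sha256_of(f)])
                rows += 1
    return rows
```

The manifest can later be re-generated and diffed against the original to verify fixity, which is the same idea behind the checksum-verification step mentioned in the assignment.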
This document summarizes a study that benchmarked the performance of personal cloud storage services like Dropbox, Google Drive, SkyDrive, Wuala, and Amazon Cloud Drive. The study developed a methodology to test each service's system architecture, data center locations, file synchronization capabilities, and performance. Key findings include: Dropbox implemented the most capabilities but had high overhead, while Google Drive and Wuala had the fastest completion times due to data centers near the testbed. The study provided insights into how each service's design impacts performance.
The document provides an overview of Hadoop including:
- A brief history of Hadoop and its origins from Nutch.
- An overview of the Hadoop architecture including HDFS and MapReduce.
- Examples of how companies like Yahoo, Facebook and Amazon use Hadoop at large scales to process petabytes of data.
IRJET - A Secure Access Policies based on Data Deduplication SystemIRJET Journal
This document summarizes a research paper on a secure access policies based data deduplication system. The system uses attribute-based encryption and a hybrid cloud model with a private cloud for deduplication and a public cloud for storage. It allows defining access policies for encrypted data files. When a user uploads a duplicate file, the system checks for a matching file and replaces it with a reference to the existing copy to save storage. The system provides file and block-level deduplication for efficient storage and uses cryptographic techniques like MD5, 3DES and RSA for encryption, tagging and access control of encrypted duplicate data across clouds.
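The duplicate-check step such a system performs can be illustrated with a toy in-memory store. This is a hedged sketch, not the paper's implementation: it uses SHA-256 in place of the MD5 tagging the paper mentions, omits encryption and access policies entirely, and all names (`DedupStore`, `upload`) are invented for illustration.

```python
import hashlib

class DedupStore:
    """Toy file-level deduplicating store: each distinct content blob is
    kept once, keyed by its digest, and duplicate uploads become mere
    references to the existing copy."""

    def __init__(self):
        self._blobs = {}   # digest -> content (stored once)
        self._files = {}   # filename -> digest (the "reference")

    def upload(self, name: str, content: bytes) -> bool:
        """Store a file; return True if this was a deduplication hit."""
        digest = hashlib.sha256(content).hexdigest()
        hit = digest in self._blobs
        if not hit:
            self._blobs[digest] = content
        self._files[name] = digest
        return hit

    def read(self, name: str) -> bytes:
        """Resolve the reference back to the shared content."""
        return self._blobs[self._files[name]]
```

Uploading the same bytes under two names stores the content only once, which is the storage saving the paper's block- and file-level deduplication aims for.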
GlobusWorld 2021 Tutorial: Introduction to GlobusGlobus
An introduction to the core features of the Globus data management service. This tutorial was presented at the GlobusWorld 2021 conference in Chicago, IL by Greg Nawrocki.
The document describes a proposed system called FogDrive Disaster Backup as a Service that uses fog computing to provide backup for cloud server data. It aims to provide an easy-to-use and secure backup system. The system would use cloud and encryption techniques to backup and store data on a FogDrive server located within the local network. It would include OpenPGP encryption of backups and allow users to manage their encrypted backups through a dashboard. The system architecture was presented, including modules for the cloud provider, FogDrive server, and backup software service.
Introduction to Globus (GlobusWorld Tour West)Globus
This document introduces Globus, which provides fast and reliable data transfer, sharing, and platform services across different storage systems and resources. It does this through software-as-a-service that uses existing user identities, with the goal of unifying access to data across different tiers like HPC, storage, cloud, and personal resources. Key features include secure data transfers without moving files, access control and sharing capabilities, and tools for building automations and integrating with science gateways. It also discusses options for handling protected data like health information with additional security controls and business agreements.
Implementing a Data Mesh with Apache Kafka with Adam Bellemare | Kafka Summit...HostedbyConfluent
Have you heard about Data Mesh but never really understood how you actually build one? Data mesh is a relatively recent term that describes a set of principles that good modern data systems uphold. Although the data mesh is not a technology specific pattern, it requires that organizations make choices and investments into specific technologies and operational policies when implementing the mesh. Establishing "paved roads" for creating, publishing, evolving, deprecating, and discovering data products is essential for bringing the benefits of the mesh to those who would use it.
In this talk, Adam covers implementing a self-service data mesh with events streams in Apache Kafka®. Event streams as a data product are an essential part of a real-world data mesh, as they enable both operational and analytical workloads from a common source of truth. Event streams provide full historical data along with realtime updates, letting each individual data product consumer decide what to consume, how to remodel it, and where to store it to best suit their needs.
Adam structures this talk by seeking to answer a hypothetical SaaS business question of "what is the relationship between feature usage and user retention?" This example explores each team's role in the data mesh, including the data products they would (and wouldn't) publish, how other teams could use the products, and the organizational dynamics and principles underpinning it all.
PDP (Provable Data Possession) is a technique that lets a CSP demonstrate that a client's complete file is stored intact without the client downloading the whole file. The multi-cloud PDP scheme involves multiple CSPs that jointly store and maintain the client's data. Later, CPDP (Cooperative Provable Data Possession) was proposed, which adds the homomorphic verifiable response and hash index hierarchy properties, but it has a security flaw: a malicious CSP or organizer can generate a valid response that passes the verification process even after the stored data has been deleted. This means, simply, that an attacker can pass the audit without actually storing the client's data. In this paper, we discuss the security flaws in the CPDP scheme.
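The challenge-response shape of possession checking can be sketched as a simple spot-check. This is only an illustrative approximation: a real PDP scheme uses homomorphic tags so that not even the challenged blocks need to be returned, whereas this toy verifier retrieves the sampled blocks and checks them against keyed tags computed before upload. All function names and the block size are assumptions.

```python
import hashlib
import hmac
import secrets

BLOCK = 256  # illustrative block size in bytes

def make_tags(data: bytes, key: bytes) -> list:
    """Before upload: one keyed tag per block; the client keeps only
    these tags (and the key), not the file itself."""
    n = (len(data) + BLOCK - 1) // BLOCK
    return [
        hmac.new(key, i.to_bytes(4, "big") + data[i * BLOCK:(i + 1) * BLOCK],
                 hashlib.sha256).digest()
        for i in range(n)
    ]

def challenge(num_blocks: int, sample: int = 3) -> list:
    """Verifier picks a few random block indices to spot-check."""
    return [secrets.randbelow(num_blocks) for _ in range(sample)]

def respond(data: bytes, indices: list) -> list:
    """Server returns just the challenged blocks, not the whole file."""
    return [data[i * BLOCK:(i + 1) * BLOCK] for i in indices]

def verify(key: bytes, tags: list, indices: list, blocks: list) -> bool:
    """Verifier recomputes each tag from the returned block; any
    deleted or tampered block fails the comparison."""
    return all(
        hmac.compare_digest(
            tags[i],
            hmac.new(key, i.to_bytes(4, "big") + b, hashlib.sha256).digest())
        for i, b in zip(indices, blocks))
```

A server that has discarded the data cannot answer a fresh challenge on the missing blocks, which is exactly the guarantee the flawed CPDP response generation undermines.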
Final Project Milestone One: Draft of Report
To complete this assignment, review the prompt and grading rubric in the Milestone One Guidelines and Rubric document. When you have finished your work, submit the assignment here for grading and instructor feedback.
ISE 640 Final Project Forensic Notes
Use the information in this document to help you complete your final project.
Drew Patrick, a director-level employee, is stealing intellectual property from a manufacturing company. The company is heavily involved in high-end development of widgets. Drew has access to corporate secrets and files. He is planning on leaving the company, taking the intellectual property with him, and going to work for a competitor. There is suspicion of him doing this, so human resources (HR) notified the information technology (IT) department to monitor Drew’s past history. An internal investigation is launched due to Drew’s abnormal behavior. The IT department confirms that they have found large files and emails. Forensics identified unauthorized access, transmission, and storage of intellectual property by Drew. Evidence found will be used to support legal civil and criminal proceedings.
Scenario ACME Construction Company designs, manufactures, and sells large construction vehicles that can cost upwards of a million dollars. They spent hundreds of thousands of hours redesigning their premier excavator. Every piece that goes into the excavator is individually designed to maximize the longevity of the equipment. Known for attention to detail, high-quality work, and industry innovation, this painstaking work is what sets ACME Construction company apart and is attributed for the excellent reputation they enjoy. This, in turn, allows them to charge a premium on their exceptionally well-built products.
Drew Patrick is a senior manager directly involved with the overall development of ACME’s excavators. His role provides him with access to design documentation, schematics, support documents, and any other technical references maintained in the company’s research and development (R&D) database. The R&D database is maintained by ACME’s information technology (IT) department, which is supported by a security operations center (SOC). The SOC uses Snort as a core component of their security information and event management (SIEM) system to keep tabs on network traffic, authentication requests, file access, and log file analysis.
The SIEM alerted SOC personnel of potential peer-to-peer (P2P) traffic originating from the internet protocol (IP) address associated with Drew’s computer. However, analysis of Active Directory logs indicated that Drew was not logged into his account at the time the files were transferred via the P2P application. ACME enforces two-factor authentication and does not allow for computer sharing. The SOC person ...
This document summarizes a research paper on developing a cloud-based storage system similar to Dropbox called "Unbox". It describes the proposed system's architecture which includes users uploading files via a web browser that are stored on servers using the OwnCloud storage system. The storage is implemented using a Raspberry Pi server with Samba file sharing enabled to allow access from multiple devices via CIFS/SMB protocol. The system aims to provide a cost-efficient cloud storage option for users with features like file encryption, backup plans and access from multiple devices.
Throughout this course, you will be keeping an investigative journal marilynnhoare
Throughout this course, you will be keeping an investigative journal. The purpose of this journal is to archive any artifacts and information that may support your final projects. You will submit it as part of Milestone One and receive points within the milestone rubric for this. Additionally, it will assist you by allowing you to organize information in a chronological order that you can easily retrieve when completing the final projects in the later modules. This journal can be kept as a Word document. You can compile journal entries within the same document and submit this document as one file submission at the end of the course with your Milestone One submission.
In your investigative journal, develop a chain of custody form to be used within a business based on forensic notes for the final project.
In your investigative journal, record how data is acquired and the tools used in the final project scenario.
In your investigative journal, record network analysis for the final project scenario.
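The chain-of-custody form asked for in the first journal task could be kept as a simple append-only CSV log. The sketch below is illustrative only; the field names are plausible assumptions, not taken from the course materials.

```python
import csv
from datetime import datetime, timezone

# Illustrative chain-of-custody fields; a real form would follow the
# organization's evidence-handling policy.
FIELDS = ["item_id", "description", "released_by", "received_by",
          "datetime_utc", "purpose"]

def new_custody_log(path: str) -> None:
    """Create an empty chain-of-custody log with a header row."""
    with open(path, "w", newline="") as fh:
        csv.DictWriter(fh, FIELDS).writeheader()

def log_transfer(path: str, item_id: str, description: str,
                 released_by: str, received_by: str, purpose: str) -> None:
    """Append one custody transfer; the timestamp is recorded in UTC so
    entries from different time zones stay comparable."""
    with open(path, "a", newline="") as fh:
        csv.DictWriter(fh, FIELDS).writerow({
            "item_id": item_id, "description": description,
            "released_by": released_by, "received_by": received_by,
            "datetime_utc": datetime.now(timezone.utc).isoformat(),
            "purpose": purpose,
        })
```

Each hand-off of an evidence item appends one row, giving the chronological record the journal is meant to preserve.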
What Are You Looking For? The variety of operating systems, appli.docx alanfhall8953
What Are You Looking For?
The variety of operating systems, application programs, and storage methods available today means that when it comes to looking for evidence there are a multitude of places to look. Digital evidence can be found in numerous sources, including stored data, applications used to create data, and the computer system that produced the activity. Systems can be huge and complex, and they can change rapidly. Data can be hidden in many different locations and formats. After you find such data, you may have to process it to make it readable by people.
Discovering Evidence Using Connectors
In recent years, manufacturers have developed branded forensic workstations that provide external native connectors for a variety of media, such as Serial ATA (SATA), SCSI (Small Computer System Interface), flash media, and the older IDE (Integrated Drive Electronics) drives. (SATA hard drives are more commonly used by individuals, while SCSI hard drives are more likely to be found in a corporate environment.) As a forensic investigator, you will encounter and work with many different types of media. You may also encounter connectors that hook up FireWire to SATA, SCSI, or IDE, and that hook up USB to SATA, SCSI, or IDE. A forensic investigator will determine what media the suspect has been using to store data and will have a variety of connectors on hand to aid the investigation.
connector
The part of a cable that plugs into a port or interface to connect devices. Male connectors are identified by exposed pins. Female connectors are identified by holes into which the male connector can be inserted.
The general discovery process is the same whether you are working with a SATA, SCSI, or IDE drive. You should adapt your techniques to suit the hardware you encounter.
To begin the discovery process for a drive, copy the image file onto your forensic workstation and then process it using one or several different forensic tools such as FTK, EnCase, or ProDiscover.
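Before any processing, examiners typically verify that the working copy of the image matches the digest recorded at acquisition, so the copy can be shown to be unaltered. A minimal sketch of that verification step follows; the function names are illustrative and not part of FTK or any named tool.

```python
import hashlib
from pathlib import Path

def image_digest(image: Path, algo: str = "sha256") -> str:
    """Hash the evidence image in streamed chunks, since forensic
    images are often far larger than available memory."""
    h = hashlib.new(algo)
    with image.open("rb") as fh:
        for chunk in iter(lambda: fh.read(1 << 20), b""):
            h.update(chunk)
    return h.hexdigest()

def verify_copy(image: Path, recorded_digest: str) -> bool:
    """Compare the working copy against the digest recorded at
    acquisition time."""
    return image_digest(image) == recorded_digest.lower()
```

A mismatch here means the copy cannot be relied upon as evidence and a fresh copy must be taken from the original image.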
Network Activity Files
Let's use an example case that involves the Internet and pictures. During your career as a forensic investigator, you may be called upon to investigate situations where an employee has illegally accessed and downloaded pictures of proprietary designs from a competitor's internal Web site and then used these designs in his or her own work.
After the forensic image has been added to your forensic computer, open your forensic software and start a case. Figure 6.1 shows the New Case Wizard from the AccessData Forensic Toolkit (FTK).
Figure 6.1: AccessData's Forensic Toolkit New Case Wizard
When a user logs on to a Windows XP, Vista, or Windows 7 system for the first time, a directory structure is created to hold that individual user's files and settings. This structure is called the profile. The profile creates a directory that has the same name as the user, along with various other folders and files.
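The profile layout described above suggests a handful of per-user folders worth checking first. The sketch below is a rough starting point only: the artifact paths listed are typical Windows 7 locations, but exact paths vary by browser and version, and the function names are invented for the example.

```python
from pathlib import Path

# Typical per-user artifact locations inside a Windows 7 profile,
# given as relative paths from the profile root. Treat these as
# starting points, not an exhaustive or authoritative list.
PROFILE_ARTIFACTS = {
    "ie_history":    "AppData/Local/Microsoft/Windows/History",
    "temp_internet": "AppData/Local/Microsoft/Windows/Temporary Internet Files",
    "firefox":       "AppData/Roaming/Mozilla/Firefox/Profiles",
    "chrome":        "AppData/Local/Google/Chrome/User Data/Default",
    "desktop":       "Desktop",
    "downloads":     "Downloads",
}

def existing_artifacts(mounted_image_root: str, user: str) -> dict:
    """Given the root of a mounted image, report which artifact folders
    exist for one user's profile."""
    profile = Path(mounted_image_root) / "Users" / user
    return {name: str(profile / rel)
            for name, rel in PROFILE_ARTIFACTS.items()
            if (profile / rel).exists()}
```

Running this against the mounted image narrows the search to the folders that actually exist before deeper keyword or carving passes begin.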
Because this case involves searching for images that were downloaded .
3. Cloud computing
Cloud storage
Gartner report (Kleynhans 2012): personal cloud will replace the PC as the main storage by 2014
Dropbox, Microsoft SkyDrive, and Google Drive
PC: client software or browser
Portable devices: browser or apps
4. Criminals' and victims' data of interest
Virtualised, geographically dispersed, and transient
Technical and legal issues for investigators:
◦ Identification of data, i.e. the service provider
◦ Username
◦ Data in the account
◦ Difficult to prove ownership
◦ Data may be moved or erased before it can be preserved
5. Objective 1: To examine current research published in the literature relating to cloud storage and identify cloud storage analysis methodologies.
Objective 2: To develop a digital forensic analysis framework that will assist practitioners, examiners, and researchers to follow a standard process when undertaking forensic analysis of cloud storage services.
Objective 3: To conduct research using popular cloud storage services (Dropbox, Microsoft SkyDrive, and Google Drive) and determine whether there are any data remnants which assist digital forensic analysis and investigations.
Objective 4: To examine the forensic implications of accessing and downloading cloud-stored data from popular cloud storage services: Dropbox, Microsoft SkyDrive, and Google Drive.
6. NIST (2011) definition of cloud computing
IaaS – Infrastructure as a Service – user
control
PaaS – Platform as a Service – OS provided
SaaS – Software as a Service – User has
limited control
Criminal use
Security of cloud services is well addressed
Mobile devices
7. Digital forensic analysis process
Common procedures for investigation
McClain (2011) Dropbox analysis
Chung et al. (2012) Dropbox, Google Docs,
Amazon S3 and Evernote
Zhu (2011) examines Skype, Viber, Mail,
Dropbox
Reese (2010) examines Amazon EBS
Clark (2011) examines Exif metadata in
pictures
8. Objectives not answered in literature
Need to conduct primary research
Q1 What data remnants result from the use of
cloud storage to identify its use?
H0 - There are no data remnants from cloud
storage use
H1 – There are remnants from cloud storage use
9. a) What data remains on a Windows 7 computer hard drive
after cloud storage client software is installed and used
to upload and store data with each hosting provider.
b) What data remains on a Windows 7 computer hard drive
after cloud storage services are accessed via a web
browser with each hosting provider?
c) What data is observed in network traffic when client
software or browser access is undertaken?
d) What data remains in memory when client software or
browser access is undertaken?
e) What data remains on an Apple iPhone 3G after cloud
storage services are accessed via a web browser with
each hosting provider?
f) What data remains on an Apple iPhone 3G after cloud
storage services are accessed via an installed application
from each hosting provider?
10. Q2 What forensically sound methods are
available to preserve data stored in a cloud
storage account?
◦ H0 the process of downloading files from cloud storage
does not alter the internal data or the associated file
metadata.
◦ H1 the process of downloading files from cloud storage
alters the internal file data and the associated file metadata.
◦ H2 the process of downloading files from cloud storage
does not alter the internal data, but does alter the file
metadata.
◦ H3 the process of downloading files from cloud storage
alters the internal data, but not the
associated file metadata.
11. Q2a) What data can be acquired and preserved
from a cloud storage account using existing
forensic tools, methodologies, and procedures
when applied to cloud storage investigations?
13. Prepare virtual PCs with Windows 7
Base (control) clean installation
Install Browser (Internet Explorer, Mozilla
Firefox, Google Chrome, Apple Safari)
Install Client Software and upload test files
Use browser to access account and view files
Use browser to access and download files
Use Eraser to erase files
Use CCleaner to remove browsing history
Use DBAN to erase virtual hard drive
15. Using the Framework to guide the process
Analysis of the VM images
In the Control VMs; ‘Dropbox’ references
Client Software 1.2.52; encrypted, sample files
System Tray link to ‘launch Dropbox website’
Browser remnants
OS remnants; Prefetch information, Link Files,
$MFT, Registry, Thumbcache, Event logs
Network traffic; IPs, URL client/web
RAM; password in cleartext
Eraser/CCleaner; left remnants
DBAN; all erased
16. iPhone 3G iOS 4.2.1 (using the framework)
◦ Base (control); nil located
◦ Browser; filenames in History.plist + URL
◦ Dropbox App; username in keychain.plist
Case study (used to illustrate findings)
◦ ‘Botnet’ hypothetical example describing finding
information on PC and iPhone re Dropbox
use
17. Conclusion;
◦ dbx files are now encrypted, earlier versions;
Filecache.db and config.db
◦ Password in cleartext in memory
◦ Process of booting a forensic image in a virtual
PC will synchronise and provide access to the
account without requiring a username or
password
Current Police investigation; located illicit
data being stored in a Dropbox account
(real world application of the research)
18. Using the Framework to guide the process
Analysis of the VM images
In the Control VMs; ‘skydrive’ references
Client Software; SyncDiagnostics.log,
OwnerID.dat
OS remnants; Prefetch information, Link Files,
$MFT, Registry, Thumbcache, Event logs
Network traffic; IPs, filenames
RAM; password in cleartext
Eraser/CCleaner; left remnants
DBAN; all erased
19. iPhone 3G iOS 4.2.1 (using the framework)
◦ Base (control); nil located
◦ Browser; OwnerID in URL, filenames in History.plist
◦ SkyDrive App; username in keychain.plist
Case study (used to illustrate findings)
◦ ‘IP Theft’ hypothetical example describing finding
information on PC and iPhone re SkyDrive
use
20. Conclusion;
◦ SyncDiagnostics.log and OwnerID.dat files
◦ Password in cleartext in memory
◦ Process of booting a forensic image in a virtual
PC may synchronise the files in an account.
Access to the account requires a password.
21. Using the Framework to guide the process
Analysis of the VM images
In the Control VMs; ‘drive google’ references
Client Software; Sync_config.db and snapshot.db
Password in cleartext stored on Hard Drive
System Tray link to ‘visit Google Drive on the web’
OS remnants; Prefetch information, Link Files,
$MFT, Registry, Thumbcache, Event logs
Network traffic; IPs, username
Eraser/CCleaner; left remnants
DBAN; all erased
22. iPhone 3G iOS 4.2.1 (using the framework)
◦ Base (control); nil located
◦ Browser; username in cookies, filenames in
History.plist
◦ Google Drive App; unable to install, need iOS 5
Case study (used to illustrate findings)
◦ ‘Steroid importation’ hypothetical example
describing finding information on PC and
iPhone re Google Drive use
23. Conclusion;
◦ sync_config.db and snapshot.db files
◦ Password in cleartext in RAM and on Hard Drive
◦ System Tray link to ‘visit Google Drive on the
web’
◦ Process of booting a forensic image in a virtual
PC will give full access to an account without
requiring a username or password
24. No documented process to collect data once
identified
Some jurisdictions have legal power to
secure data accessible at the time of serving
a warrant, such as s 3LA of the Crimes Act 1914 (Cth)
Tested in VM with Dropbox, Microsoft
SkyDrive, and Google Drive
Access via Browser and Client Software
No change to files (Hash values same after
downloading when compared with original)
25. Times and dates change;
Service (access)        Last Accessed       File Created        Last Written  Entry Modified
Dropbox (browser)       Last Written (UTC)  Last Written (UTC)  unZIP time    unZIP time
Dropbox (client)        download time       download time       same          download time
Google Drive (browser)  1/01/1980           1/01/1980           unZIP time    unZIP time
Google Drive (client)   last written        download time       same          download time
SkyDrive (browser)      upload date/time    upload date/time    unZIP time    unZIP time
SkyDrive (client)       download time       download time       same          download time
26. Q1 = H1
There are remnants from cloud storage use
which enable the identification of the service,
a username, or file details.
Q2 = H2
The process of downloading files from cloud
storage does not alter the internal data, but
does alter the file metadata.
27. Identified software files for each service, e.g.
◦ SyncDiagnostics.log – SkyDrive
◦ Snapshot.db – Google Drive
◦ Filecache.db – Dropbox
Identified OS remnants;
◦ Prefetch
◦ Link files
◦ Registry
Identified Browser History remnants
No change to file contents when accessed and downloaded
Difference in timestamps for downloaded files
Process to boot PC in a VM
28. Other cloud storage services;
◦ Amazon S3, iCloud, and UbuntuOne
Physical iPhone extract compared to logical
extract
Android, Windows Mobile devices
Apple iOS 5 devices
Further test the framework
29. Quick, D & Choo, K-K R 2012. ‘Dropbox Analysis: Data
Remnants on User Machines’. Submitted to Digital
Investigation
Quick, D & Choo, K-K R 2012. ‘Digital Droplets: Microsoft
SkyDrive forensic data remnants’. Submitted to Future
Generation Computer Systems
Quick, D & Choo, K-K R 2012. ‘Forensic Collection of Cloud
Storage Data from a Law Enforcement Perspective’. Submitted
to Computers & Security
Quick, D & Choo, K-K R 2012. ‘Google Drive: Forensic
Analysis of data remnants’. Submitted to Journal of Network
and Computer Applications
30. Chung, H, Park, J, Lee, S & Kang, C (2012), Digital Forensic Investigation of
Cloud Storage Services, Digital Investigation
Clark, P (2011), 'Digital Forensics Tool Testing–Image Metadata in the Cloud',
Department of Computer Science and Media Technology, Gjøvik University
College.
Kleynhans, S (2012), The New PC Era: The Personal Cloud, Gartner Inc.
McClain, F (2011), Dropbox Forensics, updated 31 May 2011, Forensic Focus
McKemmish, R (1999), 'What Is Forensic Computing?', Trends and Issues in
Crime and Criminal Justice, Australian Institute of Criminology, vol. 118, pp.
1-6.
NIST (2011), Challenging Security Requirements for US Government Cloud
Computing Adoption (Draft), U.S. Department of Commerce.
Ratcliffe, J (2003), 'Intelligence-Led Policing', Trends and Issues in Crime and
Criminal Justice vol. 248, pp. 1-6
Reese, G (2010), Cloud Forensics Using EBS Boot Volumes, Oreilly.com
Zhu, M (2011), 'Mobile Cloud Computing: Implications to Smartphone
Forensic Procedures and Methodologies', AUT University.
Editor's Notes
This presentation provides an overview of the thesis ‘Cloud Storage Forensic Analysis’ by Darren Quick - 28 October 2012. Supervised by Dr Kim-Kwang Raymond Choo.
This presentation follows the same structure as the thesis;
The first section introduces the topic; cloud storage forensic analysis.
Section two explains the literature review.
Section three details the research method, questions and hypotheses.
Section four outlines the proposed Digital Forensic Analysis Cycle
Sections 5, 6 and 7 explain the findings in relation to the experiments involving Dropbox, Microsoft SkyDrive, and Google Drive
Section 8 details the preservation experiment and results
Section 9 summarises the findings and the results of the experiments
Cloud computing describes computer resources available as a service over a network.
Cloud storage is a popular option for users to store electronic data and be able to access it via a range of Internet connected devices.
Gartner highlights that the trend is shifting from a focus on PCs to portable devices, and that a personal cloud will replace PCs as the main storage by 2014 (Kleynhans 2012).
Dropbox, Microsoft SkyDrive, and Google Drive are all popular services that offer free storage.
These can be accessed from a PC, via client software or a browser, and from portable devices, via a browser or apps.
Criminals' and victims' data may be stored in the cloud.
Data of interest may be virtualised, geographically dispersed, and transient.
This presents technical and legal issues for law enforcement and security agencies.
Issues in relation to identification of data; including the associated service provider, username, and data held in the account.
In addition, it becomes difficult to prove ownership and who has accessed data.
If not identified in a timely manner, data may be moved or erased before it can be preserved.
The objectives of the research are outlined in the thesis introduction and consist of the following;
Objective 1: To examine current research relating to cloud storage and identified cloud storage analysis methodologies.
Objective 2: To develop a digital forensic analysis framework that will assist practitioners to follow a standard process when undertaking forensic analysis of cloud storage services.
Objective 3: To conduct research using popular cloud storage services and determine whether there are any data remnants which assist digital forensic analysis and investigations.
Objective 4: To examine the forensic implications of accessing and downloading cloud stored data from popular cloud storage services; Dropbox, Microsoft SkyDrive, and Google Drive.
The literature review examines current literature focusing on cloud storage and digital investigations. The first section in the thesis examines cloud computing and storage. The next sections provide an overview of digital investigations and implications with cloud storage.
The definition from NIST (2011) is used, which is: convenient, on-demand network access to shared resources that can be rapidly provisioned with minimal management effort.
These are divided into IaaS, PaaS, and SaaS. With IaaS, the user has a lot of control such as choosing and managing the OS and software. With PaaS the OS is provided and the user installs and manages software, and with SaaS the software is provided, and the user has minimal control.
Criminals use cloud storage to store illicit data, and also target the services and data of victims.
The security of cloud services is well addressed, but forensic response and analysis remains an issue.
The growth in the use of mobile devices and the ability to access cloud storage is also an issue for investigators.
The digital forensic analysis process, as defined by McKemmish (1999) is a process of; identification, preservation, analysis, and presentation.
It has been identified there is a need for common processes and procedures for cloud storage investigation.
Literature of note include;
McClain (2011) who examines Dropbox analysis, but the focus is on a previous version of the client software, and since October 2011 the database files are encrypted.
Chung et al. (2012) examine Dropbox, Google Docs, Amazon S3 and Evernote. Their research is of a wide scope, but doesn’t include SkyDrive, Google Drive, or other browsers, and is also an earlier version of Dropbox which is not encrypted.
Zhu (2011) examines Skype, Viber, Mail, Dropbox; but the focus is on mobile devices only.
Reese (2010) examines Amazon EBS, but this is not applicable to cloud storage.
Clark (2011) examines Exif metadata in pictures, so is quite narrow in its focus.
It is concluded that the four objectives were not answered in literature.
Hence there is a need to conduct primary research.
From the objectives, two research questions were outlined;
Question 1 - What data remnants result from the use of cloud storage to identify its use?
This leads to the two hypotheses;
H0 - There are no data remnants from cloud storage use to identify the service provider, username, or file details.
H1 – There are remnants from cloud storage use which enable the identification of the service, a username, or file details.
The following sub questions from Q1 were also outlined;
Q1a) What data remains on a Windows 7 computer hard drive after cloud storage client software is installed and used to upload and store data with each hosting provider.
Q1b) What data remains on a Windows 7 computer hard drive after cloud storage services are accessed via a web browser with each hosting provider?
Q1c) What data is observed in network traffic when client software or browser access is undertaken?
Q1d) What data remains in memory when client software or browser access is undertaken?
Q1e) What data remains on an Apple iPhone 3G after cloud storage services are accessed via a web browser with each hosting provider?
Q1f) What data remains on an Apple iPhone 3G after cloud storage services are accessed via an installed application from each hosting provider?
Research Question Two is;
What forensically sound methods are available to preserve data stored in a cloud storage account?
This leads to the following four alternative hypotheses;
H0 the process of downloading files from cloud storage does not alter the internal data or the associated file metadata.
H1 the process of downloading files from cloud storage alters the internal file data and the associated file metadata.
H2 the process of downloading files from cloud storage does not alter the internal data, but does alter the file metadata.
H3 the process of downloading files from cloud storage alters the internal data, but not the associated file metadata.
A sub question from Q2 is “What data can be acquired and preserved from a cloud storage account using existing forensic tools, methodologies, and procedures when applied to cloud storage investigations?”
The research experiment was undertaken using virtual PCs to create various circumstances of accessing cloud storage services. The use of virtual systems allowed for a wider range of circumstances to be created and analysed than would be possible with physical hardware.
In the experiment, the VMs are forensically preserved and analysed for data remnants.
The block diagram summarises the scope: from a control installation, each popular service is chosen, and VMs are created with control data using the client software and four popular browsers. The memory, network data, and hard drives are preserved for analysis.
An Apple iPhone is also used to conduct analysis of the client applications and browser access to the three services.
The experiment encompasses a range of circumstances;
Using virtual PCs with Windows 7 Home Basic;
Start with a Base (control) clean installation
Install the selected browser (Internet Explorer, Mozilla Firefox, Google Chrome, and Apple Safari)
Install Client Software and upload test files
Use the browser to access the research account and view the files
Use the browser to access the research account and download files
Use Eraser to erase the downloaded files
Use CCleaner to remove file and browsing history
Use DBAN to erase the virtual hard drive
This was done for each service with each browser, resulting in 96 VMs, memory captures, and network capture files.
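One way the 96 capture sets can arise is as the cross-product of the three services, the four browsers, and the eight circumstances listed above. A minimal sketch; the step labels are shorthand for those circumstances, not terminology from the thesis:

```python
from itertools import product

SERVICES = ["Dropbox", "SkyDrive", "Google Drive"]
BROWSERS = ["Internet Explorer", "Mozilla Firefox", "Google Chrome", "Apple Safari"]
STEPS = [  # the eight circumstances applied per service/browser pairing
    "base-control", "install-browser", "install-client-upload",
    "browser-view", "browser-download", "eraser", "ccleaner", "dban",
]

def experiment_matrix():
    """Enumerate every (service, browser, step) VM to be preserved."""
    return list(product(SERVICES, BROWSERS, STEPS))

# 3 services x 4 browsers x 8 steps = 96 virtual machines
assert len(experiment_matrix()) == 96
```

Each tuple corresponds to one VM whose hard drive, memory, and network traffic are captured for analysis.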
As identified in the literature review, there is a need to define a process for analysis of cloud environments. The proposed framework builds upon the process outlined by McKemmish (1999) and includes processes from intelligence analysis (Ratcliffe 2003).
The process is cyclic, and can break off from the main cycle to return to previous steps for newly identified data, as indicated with the internal arrows.
The scope is outlined to focus the investigation.
Preparation of equipment and practitioners, and response if necessary.
Data is identified and collected.
Data is preserved in forensically sound methods, such as write blocking and hash comparisons.
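The hash-comparison step can be sketched as follows; a minimal example, assuming SHA-256 as the digest (the framework does not mandate a particular algorithm):

```python
import hashlib

def sha256_file(path, chunk=1 << 20):
    """Stream a file through SHA-256 so large images need not fit in RAM."""
    h = hashlib.sha256()
    with open(path, "rb") as f:
        while block := f.read(chunk):
            h.update(block)
    return h.hexdigest()

def verify_copy(original, duplicate):
    """A preserved copy is forensically sound only if the digests match."""
    return sha256_file(original) == sha256_file(duplicate)
```

In practice the source would sit behind a write blocker and both digests would be recorded in the case notes.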
Analysis is conducted, which may identify additional data, hence the process breaks for the new data to prepare, identify, collect and preserve.
Meanwhile, the analysis continues.
Presentation is a standard step, and usually completes the process.
However, Feedback and review is important to ensure the investigation is complete.
A final decision should also be made to finalise the investigation, and decide if further enquiries are necessary, otherwise the files and data are archived for retrieval if needed.
The proposed framework was applied to the analysis of Dropbox, using the methodology outlined earlier.
Dropbox references were found in the control media, hence undertaking a keyword search for ‘dropbox’ alone will not be conclusive.
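Because ‘dropbox’ hits appear even on control media, a raw keyword scan can only surface candidates for manual review rather than prove use. A minimal sketch of such a scan; the chunked reading and context window are implementation choices, not from the thesis:

```python
def keyword_hits(image_path, keyword, context=16, chunk=1 << 20):
    """Scan a raw image for an ASCII keyword (case-insensitive) and return
    (absolute_offset, surrounding_bytes) pairs for manual review."""
    needle = keyword.lower().encode("ascii")
    hits = []
    tail = b""           # carry-over so hits spanning chunk boundaries are found
    read_so_far = 0      # absolute offset of the next byte to read
    with open(image_path, "rb") as f:
        while block := f.read(chunk):
            data = tail + block.lower()
            data_start = read_so_far - len(tail)
            i = data.find(needle)
            while i != -1:
                ctx = data[max(0, i - context): i + len(needle) + context]
                hits.append((data_start + i, ctx))
                i = data.find(needle, i + 1)
            read_so_far += len(block)
            tail = data[-(len(needle) - 1):] if len(needle) > 1 else b""
    return hits
```

Each offset can then be examined in a hex viewer to judge whether the hit reflects actual use or a stock OS reference.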
The client software database files appear to be encrypted in version 1.2.52, unlike previous versions of the software.
There is an icon in the system tray which when selected launches a browser with full access to the account, without needing a password or username.
Sample files were installed in the process which can be used to identify the presence of the software.
There was a range of remnants when a browser was used to access an account; in addition, many remnants were found in OS files such as prefetch, the $MFT, link files, and the registry.
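Prefetch remnants can often be triaged by filename alone, since Windows names each trace after the executable (e.g. DROPBOX.EXE-&lt;hash&gt;.pf). A sketch over a mounted image; the Windows/Prefetch location is the default and may differ per image:

```python
from pathlib import Path

def prefetch_traces(mount_root, exe_keyword):
    """List prefetch files under a mounted image whose names mention
    exe_keyword, e.g. 'DROPBOX' matching DROPBOX.EXE-7E34AA11.pf
    (the hash shown is illustrative)."""
    prefetch_dir = Path(mount_root) / "Windows" / "Prefetch"
    if not prefetch_dir.is_dir():
        return []
    kw = exe_keyword.upper()
    return sorted(p.name for p in prefetch_dir.glob("*.pf") if kw in p.name.upper())
```

A matching trace shows the executable ran on the system; dedicated prefetch parsers can then recover run counts and times.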
Data was observed in the network traffic, but was mainly encrypted.
The password was observed in cleartext in memory captures.
Anti-forensic software did not remove the data remnants.
A full erase of the hard drive did remove the remnants.
Next an iPhone was used to identify remnants, again using the proposed analysis cycle.
There was no information in the control image.
Filenames were located when the browser was used.
The username and filenames were located when the client software was used.
In the thesis, a case study was used to illustrate the findings in relation to Dropbox.
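Because the internal structure of History.plist varies between iOS versions, a cautious approach is to collect every string value and filter, rather than assume particular keys. A sketch using Python's plistlib:

```python
import plistlib

def _walk(node, out):
    """Collect every string value from a nested plist structure."""
    if isinstance(node, str):
        out.append(node)
    elif isinstance(node, dict):
        for v in node.values():
            _walk(v, out)
    elif isinstance(node, list):
        for v in node:
            _walk(v, out)

def plist_strings(path, keyword=None):
    """Return string values found in a plist (e.g. Safari's History.plist),
    optionally filtered by a case-insensitive keyword such as 'dropbox'."""
    with open(path, "rb") as f:
        root = plistlib.load(f)
    out = []
    _walk(root, out)
    if keyword:
        k = keyword.lower()
        out = [s for s in out if k in s.lower()]
    return out
```

Filtering for a service name surfaces the URL and filename remnants described above without depending on a specific plist schema.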
In relation to Dropbox, the conclusion reached was that there are data remnants of interest, as outlined in the thesis. For the earlier versions of the client software, the two database files are important, but in version 1.2.52 the files are encrypted.
The password was observed in cleartext.
A process of booting a forensic image in a virtual system allows for access to a user account without knowing the username or password.
A real world application of the research was in a current Police investigation, illicit data being hosted in a Dropbox account was identified using the information from this research. The investigation is ongoing, hence details of the investigation cannot be discussed.
Again, using the framework, this time with SkyDrive, using the methodology outlined earlier.
SkyDrive references were found in the control media, hence undertaking a keyword search for ‘skydrive’ will not be conclusive.
SyncDiagnostics.log lists the files uploaded and downloaded, Owner information and dates and times. The OwnerID file lists the storage locations on the hard drive.
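Lines of SyncDiagnostics.log mentioning known filenames can be surfaced for review. The log's exact format is not assumed here, so this sketch is a plain substring filter rather than a parser:

```python
def log_mentions(log_path, filenames, encoding="utf-8"):
    """Return (line_number, line) pairs from a client log, e.g.
    SyncDiagnostics.log, that mention any of the given filenames.
    This only surfaces candidate lines for manual review."""
    wanted = [name.lower() for name in filenames]
    matches = []
    with open(log_path, "r", encoding=encoding, errors="replace") as f:
        for lineno, line in enumerate(f, 1):
            low = line.lower()
            if any(w in low for w in wanted):
                matches.append((lineno, line.rstrip("\n")))
    return matches
```

The surrounding text of each matched line can then be read for the upload/download direction and the associated dates and times.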
There was a range of remnants when a browser was used to access an account; in addition, many remnants were found in OS files such as prefetch, the $MFT, link files, and the registry.
Data was observed in the network traffic, but was mainly encrypted.
The password was observed in cleartext in memory captures.
Anti-forensic software did not remove the data remnants.
A full erase of the hard drive did remove the remnants.
Next an iPhone was used to identify remnants, again using the proposed analysis cycle.
There was no information in the control image.
The OwnerID and filenames were located when the browser was used.
The username and filenames were located when the client software was used.
In the thesis, a case study was used to illustrate the findings in relation to SkyDrive.
In relation to SkyDrive, the conclusion reached was that there are data remnants of interest, as outlined in the thesis.
The two files identified have data which may be important to an investigation.
The password was observed in cleartext in memory.
Booting a forensic image in a VM may synchronise the files in an account, however, access to the account requires a password – which is good for security.
Again, using the framework, this time with Google Drive, using the methodology outlined earlier.
“drive google” references were found in the control media, hence undertaking a keyword search for this will not be conclusive.
Sync_config.db and snapshot.db list the files uploaded and downloaded, owner information and dates and times.
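Both files are SQLite databases, so their contents can be inspected with standard tooling. Since table schemas change between client versions, this sketch reads the table names from the database itself rather than assuming them:

```python
import sqlite3

def dump_sqlite(db_path, limit=20):
    """Enumerate user tables in a client database such as snapshot.db and
    return up to `limit` rows from each, opened read-only so the evidence
    file is not modified."""
    con = sqlite3.connect(f"file:{db_path}?mode=ro", uri=True)
    try:
        tables = [r[0] for r in con.execute(
            "SELECT name FROM sqlite_master WHERE type='table' "
            "AND name NOT LIKE 'sqlite_%'")]
        return {t: con.execute(f'SELECT * FROM "{t}" LIMIT ?', (limit,)).fetchall()
                for t in tables}
    finally:
        con.close()
```

In a real examination this would be run against a working copy whose hash has already been recorded, never against the original.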
There was a range of remnants when a browser was used to access an account; in addition, many remnants were found in OS files such as prefetch, the $MFT, link files, and the registry.
The password was observed in cleartext on the hard drive and in memory captures.
When running a forensic image as a VM, selecting a link in the system tray allowed full access to an account without requiring a username or password.
Data was observed in the network traffic, but was mainly encrypted.
Anti-forensic software did not remove the data remnants.
A full erase of the hard drive did remove the remnants.
Next an iPhone was used to identify remnants, again using the proposed analysis cycle.
There was no information in the control image.
Filenames were located when the browser was used.
The client software could not be installed on the iPhone used, presenting an opportunity for future research.
In the thesis, a case study was used to illustrate the findings in relation to Google Drive.
In relation to Google Drive, the conclusion reached was that there are data remnants of interest, as outlined in the thesis.
The two files identified have data which may be important to an investigation.
The password was observed in cleartext on the hard drive and in memory.
It is possible to run a forensic image in a VM and get full access to an account without knowing the username or password from the client software.
As identified in the literature review, there is a need for a process to collect identified data.
Australia has legislation to collect data, such as section 3LA of the Crimes Act 1914.
Experiments were conducted with control VM systems to preserve data from research accounts with the three providers.
Access was undertaken using a browser and using client software, which was then compared with the original files.
There were no changes to the original files hash values, hence no change to the internal data.
There were changes to the associated dates and times, as per the table. For example, when downloading a file from a Google Drive account using a browser, the created date on the file will be 1/01/1980, not the created date of the original file. The only date/time value that matched the original occurred when using the client software: the last written times were the same as the original file.
These changes must be understood by an examiner, otherwise the information may be misinterpreted, and incorrect conclusions made.
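The timestamp differences summarised in the slide-25 table can be checked programmatically by comparing stat values for an original file and its downloaded copy; note that st_ctime is creation time on Windows but metadata-change time on POSIX systems:

```python
import os
from datetime import datetime, timezone

def timestamp_report(original, downloaded):
    """Compare filesystem timestamps of an original file and a downloaded
    copy. Matching content (verified separately by hash) alongside differing
    timestamps is the expected outcome for cloud-downloaded files."""
    def stamps(path):
        st = os.stat(path)
        return {
            "accessed": datetime.fromtimestamp(st.st_atime, timezone.utc),
            "modified": datetime.fromtimestamp(st.st_mtime, timezone.utc),
            # creation time on Windows; metadata-change time on POSIX
            "created_or_changed": datetime.fromtimestamp(st.st_ctime, timezone.utc),
        }
    a, b = stamps(original), stamps(downloaded)
    return {k: (a[k], b[k], a[k] == b[k]) for k in a}
```

An examiner can use such a report to document which values survived the download and which reflect the download or unzip event itself.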
In the thesis, the final chapter lists each question and sub question and how each has been addressed.
To summarise;
In relation to research question 1, the correct hypothesis is H1: there are remnants from cloud storage use.
In relation to question 2, preserving data by accessing an account does not change the internal data, but there are changes to the associated timestamps of the files when they are downloaded, therefore H2 is correct.
The main contributions of the thesis are;
The identification of files which store information which may be relevant to an investigation for each service provider, for example the SyncDiagnostics.log file for SkyDrive.
Identified that there are a range of data remnants on a Windows 7 PC hard drive, such as in Prefetch files, link files, $MFT, Registry, etc.
Identified there are a range of data remnants in the browser histories for the popular browsers.
Identified that accessing and downloading files from an account does not alter the contents of the files, verified with the hash value.
However, the timestamps of the downloaded files are different to the original files, and must be considered when forming conclusions. As per the table on slide 24.
A process to access an account in a forensically sound manner was also outlined. If client software has been pre-installed, it will provide access to the files in an account for Google Drive and Dropbox; alternatively, if the username and password were located during analysis, these can be used. Legal considerations must be met to ensure accessing the account is permitted, e.g. s 3LA of the Crimes Act 1914 (Cth).
Research opportunities identified include;
Other (less) popular cloud storage providers, such as Amazon S3, Apple iCloud, and UbuntuOne.
Compare a physical iPhone extract to the logical extracts undertaken in this research.
Examine other portable device operating systems, such as Android and Windows Mobile.
Examine the latest Apple iOS
These could all serve to further assess the proposed framework.
The listed four papers were based on chapters in the thesis, and have been submitted for peer review. All four are currently under consideration.