The document describes a new model called the investigative lab for conducting digital investigations more efficiently through collaboration. The model involves assembling all available digital evidence into a central location, using automated workflows to process the most relevant evidence, and then dividing the processed evidence into review sets for different investigators and experts to analyze based on their specialties. This allows an investigative team to dramatically increase the volume of digital evidence analyzed within set timeframes compared to traditional methods relying on solo forensic experts.
The EDRM Enron data set is an industry-standard collection of email data that the
legal profession has used for many years for electronic discovery training and testing.
Since this data set was published, it has been an open secret that it contained many
instances of private, health and financial data.
In this paper, we will discuss a model for setting up an investigative lab that allows digital forensic specialists, non-technical investigators and subject matter experts to collaborate on digital evidence. The end result is a dramatic increase in the volume and quality of digital
evidence an investigative team can analyze within a fixed time.
Bridging the gap between mobile and computer forensics (Nina Ananiasvili)
Mobile devices are becoming an increasingly integral part of criminal, legal, and regulatory investigations and disclosures.
However, computers and mobile devices are often examined separately by different people, usually for technical and procedural reasons. That can make it almost impossible to identify and review evidence and intelligence across multiple data sources, devices, and crime scenes. Only when we look at all of the devices at the same time will we start to see the complete picture.
In this webinar, we will look at some of the trends and challenges in acquiring and analyzing mobile devices and will discuss:
- What we can expect to recover from mobile devices today
- What this data looks like when reviewed using Nuix
- Techniques and workflows for optimising investigations that include mobile devices, computers, and cloud-based evidence.
Interplay of Digital Forensics in eDiscovery (CSCJournals)
Digital forensics is often confused with eDiscovery (electronic discovery). In reality, the two fields are largely independent but overlap slightly, assisting each other in a symbiotic relationship. With the decreasing cost of cloud storage, growing Internet speeds, and the growing capacity of portable storage media, the chances of these technologies being used in a crime have grown. Sifting through large volumes of evidential data during eDiscovery, or forensically investigating that data, requires teams from both fields to work together on a case. In this paper, the authors discuss the relationship between these disciplines and highlight the digital forensic skills required, the sub-disciplines of digital forensics, the electronic artifacts that can be encountered in a case, and the forensic opportunities relative to the eDiscovery industry. Lastly, the authors touch on best practices in digital evidence management during the eDiscovery process.
Surviving Technology 2009 & The Paralegal (Aubrey Owens)
Talking technology with Paralegal Studies students at J. Sargent Reynolds Community College on February 25, 2009. Using software as a solution for information management through trial presentation.
Electronic Document Management And Discovery (Ronald Coleman)
Presentation given as part of Delaware Bar Association Computer Law Section CLE program, "E-Commerce law: Critical Legal and Business Issues."
Many of the particulars of this presentation are relatively obsolete now.
The goal of this white paper is to provide an introduction to the key areas involved in developing an e-discovery capability and to help organizations plan to become better prepared for the rigors of the e-discovery process. Note that the goal of this report is not to offer legal advice or legal opinions on specific legal issues related to e-discovery, and it should not be used in this manner.
Design for A Network Centric Enterprise Forensic System (CSCJournals)
The increased profitability and exposure of enterprise information incite more attackers to attempt exploitation of enterprise networks while striving not to leave any evidence. Although digital forensic analysis is maturing within modern criminology, the scope of network and computer forensics in large-scale commercial environments is still vague. Conventional forensic techniques, consisting largely of manual operations and isolated processes, are not adequately compatible with the modern enterprise context. Enterprise data volumes are usually overwhelming, and interference with business operations during an investigation is unwelcome. To evidence and monitor these increasing and evolving cyber offences, forensic investigators are calling for a more comprehensive forensic methodology. To understand the current insufficiencies, this paper starts by probing the weaknesses of various preliminary forensic techniques. It then proposes an approach to designing an enhanced forensic system that integrates the network distributed system concept and information fusion theory as a remedy to the drawbacks of existing techniques.
BoyarMiller presentation to the Houston Young Lawyers Association (HYLA) about best practices for electronic forensic examinations. http://www.boyarmiller.com/news-and-publications/presentations/trade-secret-theft-digital-age-electronic-forensic-examination/
Lima - Digital Forensic Case Management System (IntaForensics)
The most effective Digital Forensic & E-Discovery Case Management System available commercially. "Best Buy" Award Winner in SC Magazine's Digital Forensic Tools Group Test 2013. Widely used in Law Enforcement, Commercial and Government Organisations in the USA, UK, EU, Canada & India
Josh Moulin describes his experience building a mobile digital forensic lab on a small budget. This article discusses the effectiveness and efficiencies gained by having a mobile digital lab as well as some of the considerations when building one.
A Formal Two Stage Triage Process Model (FTSTPM) for Digital Forensic Practice (CSCJournals)
Due to the rapid increase of digital based evidence, the requirement for the timely identification, examination and interpretation of digital evidence is becoming more essential. In certain investigations such as child abductions, pedophiles, missing or exploited persons, time becomes extremely important as in some cases, it is the difference between life and death for the victim. Moreover, the growing number of computer systems being submitted to digital forensic laboratories is creating a backlog of cases that can delay investigations and negatively affect public safety and the criminal justice system. To deal with these problems, there is a need for more effective ‘onsite’ triage methods to enable the investigators to acquire information in a timely manner, and to reduce the number of computer systems that are submitted to DFLs for analysis. This paper presents a Formal Two-Stage Triage Process Model fulfilling the needs of an onsite triage examination process.
Predict Conference: Data Analytics for Digital Forensics and Cybersecurity (Mark Scanlon)
Information overload is one of the biggest problems facing professionals working in the fields of Digital Forensics and Cybersecurity. The sheer volume of cases requiring digital forensic analysis in law enforcement agencies throughout the world is outstripping the capacities of digital forensic laboratories. This has resulted in huge digital evidence backlogs becoming commonplace and cases being ruled upon in court without the inclusion of potentially pertinent information, which is sitting idle in some evidence store. As is commonly relayed in the media, the frequency of cyberattacks being faced by governments, law enforcement agencies, and industry is increasing, alongside the sophistication of the techniques used. Current rules-based network intrusion detection systems are predominantly based on historic, known threat vectors and result in a very high amount of false positive alerts being generated. Intelligent, real-time, automated data processing and event categorisation is one solution that shows great promise to combat this information overload.
The final section of the "Digital Forensics" journal article by Garfinkel (jyothimuppasani1)
The final section of the "Digital Forensics" journal article by Garfinkel lists many of the technical and personnel challenges to modern digital forensics. Using that section (Forging Ahead, page 4) and your text, write a discussion board post describing the challenges to digital forensics that you think are the most difficult to solve. Can you think of any other challenges to digital forensics that are not mentioned? Could you see yourself working in the digital forensics space as a future career?
Forging Ahead: For all its power, digital forensics faces stark challenges that are likely to grow in the coming years. Today's computers have on average 1,000 times more storage but are only 100 times faster than the high-end workstations of the early 1990s, so there is less computing power available to process each byte of memory. The number of cases in which digital evidence is collected is rising far faster than the number of forensic investigators available to do the examinations. And police now realize that digital evidence can be used to solve crimes, that is, as part of the investigation process, whereas in the past it was mainly a tool for assisting in convictions. Cell phones may be equipped with "self-destruct" applications that wipe their data if they receive a particular text, so it is now standard practice to store phones in a shielded metal box, called a Faraday cage, which blocks radio waves. But many cell phones will forget their stored memory if left off for too long, so the Faraday cages must be equipped with power strips and cell phone chargers. Because many low-end cell phones have proprietary plugs, police must seize chargers as well. However, some phones will wipe their data if they can't call home, whereas others will encrypt their data with algorithms too powerful for law enforcement to decipher. Further complicating the investigator's job is the emergence of cloud computing and other technologies for storing data on the Internet. As a result of the cloud, there is no way to ensure that a seized cell phone actually holds the suspect's data; the phone might simply be a tool for accessing a remote server. A law enforcement professional who is authorized to search a device may not have legal authority to use information on that device to access remotely stored data. Worse still, the data might be deleted in the meantime by one of the suspect's collaborators. Despite its technical sophistication and reliance on the minutiae of digital systems, the single biggest challenge facing digital forensics practice today has a decidedly human dimension: the lack of qualified people to serve as researchers and practitioners. Not merely the result of the general tech shortage, the very nature of digital forensics makes staffing significantly harder than in other disciplines. Because the field's mission is to understand any data that might be stored, we need individuals who have knowledge of both current and past computer systems, applications, and data formats.
Digital forensics research: The next 10 years (Mehedi Hasan)
Today’s Golden Age of computer forensics is quickly coming to an end. Without a clear strategy for enabling research efforts that build upon one another, forensic research will fall behind the market, tools will become increasingly obsolete, and law enforcement, military and other users of computer forensics products will be unable to rely on the results of forensic analysis. This article summarizes current forensic research directions and argues that to move forward the community needs to adopt standardized, modular approaches for data representation and forensic processing.
© 2010 Digital Forensic Research Workshop. Published by Elsevier Ltd. All rights reserved.
Process of Digital Forensics
1. Identification
2. Preservation
3. Analysis
4. Presentation and Reporting
5. Disseminating the Case
What is acquisition in digital forensics?
How to handle data acquisition in digital forensics
Types of Digital Forensics
Disk Forensics
Network Forensics
Wireless Forensics
Database Forensics
What is digital evidence? Sources of digital evidence, types of digital evidence, the procedure for collecting digital evidence, records, digital vs. physical evidence, and controlling contamination.
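The acquisition topics above center on producing an exact, verifiable copy of the original media. A minimal sketch of the verification step in Python, assuming a raw (dd-style) image file; a real acquisition would also use a hardware write blocker and record the hashes in the chain-of-custody log:

```python
import hashlib

def hash_file(path, algorithm="sha256", chunk_size=1 << 20):
    """Hash a file in fixed-size chunks so large disk images don't exhaust memory."""
    digest = hashlib.new(algorithm)
    with open(path, "rb") as f:
        for chunk in iter(lambda: f.read(chunk_size), b""):
            digest.update(chunk)
    return digest.hexdigest()

def verify_acquisition(source_path, image_path):
    """An acquisition is sound only if the source and image hashes match exactly."""
    return hash_file(source_path) == hash_file(image_path)
```

A single flipped bit in the image changes the digest, so this check catches both copy errors and post-acquisition tampering.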
The Emergence of Digital Forensics in Bangalore (ehackacademy)
Digital forensics in Bangalore has emerged as a crucial field in the investigation and analysis of digital evidence for solving crimes and providing evidence in legal proceedings.
The Investigative Lab - White Paper
NUIX WHITE PAPER
THE INVESTIGATIVE LAB:
A MODEL FOR EFFICIENT COLLABORATIVE
DIGITAL INVESTIGATIONS
WHITE PAPER THE INVESTIGATIVE LAB: A MODEL FOR EFFICIENT COLLABORATIVE DIGITAL INVESTIGATIONS
PAGE 2
CONTENTS
Executive summary
The digital forensic investigation impasse
Lessons from eDiscovery
Case study: Serious Fraud Office
The investigative lab workflow model for digital investigations
Share the workload: a production-line methodology
Ensure the right data gets reviewed by the right people
Use advanced investigative techniques
Perform deep forensics only when necessary
EXECUTIVE SUMMARY
In this paper, we will discuss a model for setting up an investigative
lab that allows digital forensic specialists, non-technical investigators
and subject matter experts to collaborate on digital evidence. The
end result is a dramatic increase in the volume and quality of digital
evidence an investigative team can analyze within a fixed time.
For many years, digital forensic investigators have used specialist
forensic tools to “dig deep” into a handful of computers and other
evidence sources. Typically, investigations rely on a single digital
forensic specialist to examine these evidence sources one by one.
This approach relies on the digital forensic investigator, who is usually
not familiar with all the details or context of the case, to extract the
information he or she thinks is relevant from each device. As a result,
non-technical investigators and subject matter experts must rely on an
incomplete and subjective slice of the evidence.
By contrast, the model discussed in this paper enables investigative
teams to divide up digital evidence and spread the review workload
between many people. They can distribute different types of evidence
to the people most qualified to understand it and its context.
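The routing step described here can be sketched in a few lines of Python. The evidence types and team names below are hypothetical examples, not part of the model; the point is simply that processed items are grouped into review sets keyed to the team best placed to assess them:

```python
from collections import defaultdict

# Hypothetical routing table: evidence type -> reviewing team.
ROUTING = {
    "email": "fraud-analysts",
    "spreadsheet": "forensic-accountants",
    "chat": "case-investigators",
    "registry": "digital-forensic-specialists",
}

def build_review_sets(items, routing=ROUTING, default="digital-forensic-specialists"):
    """Group processed (item_id, evidence_type) pairs into review sets per team.

    Unrecognized evidence types fall back to the forensic specialists."""
    review_sets = defaultdict(list)
    for item_id, evidence_type in items:
        review_sets[routing.get(evidence_type, default)].append(item_id)
    return dict(review_sets)
```

Each resulting review set can then be assigned and worked in parallel, which is what lets the team spread the workload.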
This methodology also provides opportunities to apply advanced
investigative techniques such as data visualization and near-duplicate
analysis, helping investigators look at the evidence from different
angles. While in-depth forensic analysis remains an essential tool, this
workflow limits its use to specific circumstances when it can deliver the
most value.
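Near-duplicate analysis, one of the techniques mentioned above, can be approximated with word shingling and Jaccard similarity. This is a simplified illustration, not the algorithm any particular product uses; a production system would typically hash shingles (e.g. MinHash) rather than compare all pairs directly:

```python
def shingles(text, k=3):
    """Lower-cased word k-grams; the basis for comparing documents."""
    words = text.lower().split()
    return {tuple(words[i:i + k]) for i in range(max(len(words) - k + 1, 1))}

def jaccard(a, b):
    """Jaccard similarity of two documents' shingle sets (1.0 = identical)."""
    sa, sb = shingles(a), shingles(b)
    return len(sa & sb) / len(sa | sb) if sa | sb else 1.0

def near_duplicates(documents, threshold=0.4):
    """Return index pairs of documents whose similarity meets the threshold."""
    pairs = []
    for i in range(len(documents)):
        for j in range(i + 1, len(documents)):
            if jaccard(documents[i], documents[j]) >= threshold:
                pairs.append((i, j))
    return pairs
```

Grouping near-duplicates this way lets a reviewer read one representative of a cluster and skim the variants, rather than reading every copy in full.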
By adopting a collaborative investigation model like the one in this
paper, the United Kingdom’s Serious Fraud Office (SFO) increased the
volume of data it could process each year 20-fold. This methodology
also enabled the SFO’s investigation teams to respond much faster to
information requests from courts.
3. WHITE PAPER THE INVESTIGATIVE LAB: A MODEL FOR EFFICIENT COLLABORATIVE DIGITAL INVESTIGATIONS
PAGE 3
Digital investigators face a constant battle to find the truth
in ever larger, more varied and increasingly complex stores
of electronic evidence. At the same time, they must contend
with business pressures such as reduced budgets and resources,
spiraling case backlogs and ever-decreasing timescales.
These demands are made all the more challenging because
many digital investigations rely on workflows, processes and
tools that were designed before this mass explosion of data and
devices—in fact, they often hark back to a pre-digital age.
As the growing volume of data stretches traditional forensic
tools to capacity, it has become impossible to examine every
evidence source. Digital forensic investigators may take arbitrary
decisions as to which sources they analyze first, or at all.
Many investigative organizations do not follow standard
processes for each stage of this investigative workflow. For
example, digital forensic investigators may not handle all data
sources consistently during processing and analysis. They
might deliver the relevant information in different ways, such as
printing it out, copying it to a CD or flash drive, or placing it on a
computer where other stakeholders can search it.
A traditional investigation relies on two very different groups
of people:
• Case investigators—such as police detectives and corporate
fraud analysts—understand the wider context of the
investigation and examine crimes or investigative matters
from all angles. If an investigation has a digital component,
such as a computer recovered at a crime scene, an
investigator would call on the support of one or more digital
forensic investigators, who would examine the computer and
report their findings back to the case team
• Digital forensic investigators collect, image, process and
examine the collected data. They typically identify and extract
the digital material that they consider potentially
relevant to the case.
This division of roles (see Figure 1) is a major source of
inefficiency. More worryingly, it highlights a disconnection in
the investigation process. Very often case investigators view
digital forensic investigators as providing a supporting service,
rather than working with them collaboratively. For example, it
is common for a single digital forensic investigator to handle all
devices involved in an investigation. This solo digital forensic
investigator will typically examine each storage device in turn,
extracting the information he or she thinks is relevant then
preparing a report for the less technical investigative team.
This piecemeal approach is very inefficient, especially in
larger cases. Digital forensic investigators must make critical
decisions about the relevance of particular documents, email
messages or images—or even entire evidence sources—
often without knowing the broader details of the case.
This is something like the ancient parable about six
blindfolded men trying to describe an elephant. One is
touching the elephant’s trunk and thinks the elephant is like
a snake. Another is holding the elephant’s leg and thinks it
is like a tree, and so on. While they are all correct about the
individual parts, none of them can see the bigger picture
(see Figure 2).
The inability to understand the evidence in its broader
context can have knock-on effects when digital forensic
investigators must provide their findings to larger teams or
other stakeholders such as prosecutors or human resources,
legal or internal audit departments. These stakeholders
must often rely on the pieces of the puzzle that the forensic
investigator decided were most relevant.
THE DIGITAL FORENSIC INVESTIGATION IMPASSE
Figure 1: Typical division of roles in investigations with very little overlap
between those performing technical and non-technical tasks.
Figure 2: Focusing on one part at a time makes it hard to see the
elephant in the room.
Figure 1 labels: digital forensic investigators, often working behind closed doors; case investigators, managing the entire investigation and relying on digital evidence to help join the dots.
Some investigative organizations have streamlined their processes
for handling digital evidence (see for example, the United Kingdom
Serious Fraud Office case study). In this area, investigators have
a great deal to learn from the way legal teams handle electronic
discovery, which typically involves even larger volumes of digital
evidence than investigations.
In many jurisdictions, courts require litigants to collaborate and
consider eDiscovery at the earliest opportunity. Doing so can help
keep the costs of the case proportionate to the amount at dispute. To
achieve this, discovery practitioners must quickly identify potentially
relevant material among massive volumes of digital evidence. They
must then place this potentially relevant material into the hands of
the experts such as senior lawyers, financial investigators or subject
matter experts.
Legal teams often use a tiered review system, assigning junior staff to
perform a “first cut” review of the material to eliminate the material
that is clearly not relevant. Rather than allowing these reviewers to
make arbitrary decisions, someone who has in-depth knowledge of the
case would create a pre-defined set of guidelines for them to follow.
This person may also review, validate or amend these decisions.
In this way, smaller and smaller volumes of more and more relevant
material are passed up the chain. The highly knowledgeable—and
usually highly paid—experts need only see the “hot” documents,
safe in the knowledge that someone has reviewed and classified all
other material.
This process is a very efficient way of classifying huge volumes of
material into relevant or not relevant bundles. For it to work, legal
teams must be able to:
• Divide up the available evidence into parcels for multiple people
to review
• Ensure each reviewer understands the ground rules for deciding
what is relevant
• Make the most relevant documents available for experts to analyze
and examine.
This approach is not new, even in investigative circles. For example, in
many complex criminal matters, rank-and-file detectives do the ground
work, such as identifying witnesses and evidence, before passing on
their findings to senior officers and subject matter experts for review.
However, it is rare for digital forensic investigators to follow this
process when dealing with electronic evidence, often because
traditional tools make it difficult to combine information from
multiple sources and make it available to non-technical investigators
or subject matter experts for review.
LESSONS FROM eDISCOVERY
CASE STUDY:
SERIOUS FRAUD OFFICE
The collaborative investigation model and lab
workflow discussed in this paper draws on
the experience of Nuix’s Director of Forensic
Solutions Paul Slater at the United Kingdom
Serious Fraud Office (SFO). By adopting this
model, the SFO increased the volume of data
it could process each year 20-fold and made
it possible to deliver timely responses to
information requests from courts.
As interim head of the Digital Forensics Unit
from 2009 to the end of 2010, Slater helped to
standardize and streamline the SFO’s digital
investigative processes. The SFO reduced
its focus on in-depth forensics; created and
automated investigative workflows; and
developed a more collaborative approach
to investigations. This change in approach
helped to transform the SFO’s capabilities.
“While traditional computer forensics
techniques dig deep into a handful of
computers, [the SFO can now] quickly distil
the huge volumes of data captured in our
search operations and to focus on material
likely to have greatest evidential yield,” wrote
the SFO’s Chief Executive in its 2010-11 Annual
Report and Accounts.i
“We can now handle
up to 100GB of new information each day—a
2,000% increase year on year.
“It is also allowing us to respond quickly to
court requirements—in one case we were
able to identify and produce over 47,000
emails overnight to satisfy a judge’s order.
Such speed of response would have been
impossible before.”
TIER 1: High volume of material, relevance unknown, for initial filtering and review (images and videos; emails; financial records; documents and spreadsheets)
TIER 2: Potentially relevant material for expert review (potentially relevant images and videos; potentially relevant documents; potentially relevant emails)
TIER 3: Highly relevant material (relevant images and videos; relevant emails and documents)
The investigative lab model is a way for investigators
to combine the efficiencies of the eDiscovery process
with the forensic rigor and provenance of traditional
digital investigation methodologies. It offers a more
efficient way of utilizing available resources by
making it possible to spread work between digital
and non-digital investigators and subject matter
experts, rather than leaving it the sole responsibility
of a single person. It also ensures that digital
forensic investigators handle each piece of evidence
using an agreed set of repeatable processes.
Although this workflow model is technology
agnostic, it requires a number of capabilities that are
not available in traditional forensic tools, such as:
• Conducting a light metadata scan of data sources
to rapidly assess their content and value to the
investigation
• Combining multiple evidence sources into a single
data set for analysis
• Processing multiple terabytes of evidence within
a reasonable time
• Supporting a wide range of file formats including
forensic image formats from multiple competing
technologies
• Dividing a large data set into sub-cases based
on criteria such as date, custodian, file format,
location or language, then recombining the
results into a single case.
Share the workload—a production-line methodology
A key shortcoming of the traditional approach is that digital forensic
investigators examine each evidence source individually. However,
cases often hinge on a particular type of evidence—such as
documents, emails or text messages—and the connections between
them across multiple sources. In addition, certain evidence types must
be reviewed by people who have particular expertise.
As a result, a necessary first step of this methodology is to assemble
all available evidence—forensic images, email and mobile phone
communications, loose files, documents and the rest—into a single
location.
Conducting a light metadata scan on all these evidence sources can
quickly help investigators choose the items they want to process in
greater depth. This forms part of the content-based forensic triage
process Nuix has discussed in a previous white paper, Content-Based
Forensic Triage: Managing Digital Investigations in the Age of Big Data.ii
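As a rough illustration, a metadata-only scan of this kind can be sketched in a few lines of Python. The `metadata_scan` function and the fields it collects (path, extension, size, modification date) are hypothetical assumptions for this sketch, not the attributes any specific triage tool gathers:

```python
import os
import time

def metadata_scan(root):
    """Walk an evidence mount point and collect file metadata only,
    without reading file contents, to support rapid triage."""
    records = []
    for dirpath, _dirnames, filenames in os.walk(root):
        for name in filenames:
            path = os.path.join(dirpath, name)
            try:
                st = os.stat(path)
            except OSError:
                continue  # unreadable entry; skip and move on
            records.append({
                "path": path,
                "extension": os.path.splitext(name)[1].lower(),
                "size_bytes": st.st_size,
                "modified": time.strftime("%Y-%m-%d", time.gmtime(st.st_mtime)),
            })
    return records
```

Because no file contents are read, a scan like this can cover a large evidence store quickly and give investigators the date ranges, file types and volumes they need to decide what to process in depth.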
Once the investigative team has chosen the evidence sources most likely
to be relevant, they can then process these devices following a set of
previously agreed standards and settings. Investigative organizations
can build a series of best practices or case-specific workflows which
automate many of the time-consuming and error-prone tasks that are
performed on each case. These include date range filtering; keyword
searching and tagging; and identification and optical character
recognition (OCR) of non-text documents. The workflow can also include
steps to filter out irrelevant information such as duplicate items or system files.
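A minimal sketch of such an automated workflow follows, assuming a simple dict-based item schema (`id`, `date`, `text`) invented purely for illustration:

```python
import hashlib
from datetime import date

def run_workflow(items, start, end, keywords):
    """Apply an agreed, repeatable set of processing steps:
    date-range filtering, duplicate suppression and keyword tagging."""
    seen_hashes = set()
    results = []
    for item in items:
        # Date-range filter: skip material outside the period of interest.
        if not (start <= item["date"] <= end):
            continue
        # Duplicate suppression: hash content, keep first occurrence only.
        digest = hashlib.sha256(item["text"].encode("utf-8")).hexdigest()
        if digest in seen_hashes:
            continue
        seen_hashes.add(digest)
        # Keyword tagging: record which search terms each item matches.
        text = item["text"].lower()
        item = dict(item, tags=sorted(k for k in keywords if k in text))
        results.append(item)
    return results
```

Because the filters, hash function and keyword list are fixed up front, every evidence source is handled the same way, which is the repeatability the workflow model depends on.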
This approach significantly reduces operator-level decisions and
inconsistencies around which files are processed and how, leading
to a consistent and repeatable outcome. By using this methodology,
investigative teams can rapidly trim down large evidence sets into
small numbers of highly relevant items for expert review (see Figure 3).
Figure 3: A tiered approach to
reviewing evidence.
THE INVESTIGATIVE LAB WORKFLOW MODEL FOR DIGITAL INVESTIGATIONS
Ensure the right data gets reviewed by the right people
The next phase of this investigative workflow involves dividing the
processed evidence into review sets. At its most basic, it can be a way
of sharing the work between multiple investigators to complete the
task faster. They may choose to divide up the evidence by date ranges,
custodians, location, language or content. However, there are many
other options.
In a fraud case, for example, investigators could pass on financial
records to forensic accountants and internet activity to technical
specialists. In an inappropriate images investigation, detectives
could package potentially relevant pictures and videos for specialist
child protection teams, while leaving other file types for their
digital forensic investigators. In multi-jurisdictional investigations,
investigative teams can produce evidence or intelligence packages for
third parties to review, comment on and return.
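The splitting step itself can be sketched as a simple partition over whatever grouping key suits the case; the `build_review_sets` helper and its item schema are illustrative assumptions, not a specific tool's API:

```python
from collections import defaultdict

def build_review_sets(items, key="custodian"):
    """Partition processed evidence items into review sets so the
    workload can be shared between multiple reviewers. The grouping
    key (custodian, date range, language, ...) is chosen per case."""
    review_sets = defaultdict(list)
    for item in items:
        review_sets[item.get(key, "unassigned")].append(item)
    return dict(review_sets)
```

Each resulting set can then be handed to the reviewer best qualified for it, with an "unassigned" bucket catching items that lack the chosen attribute.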
Use advanced investigative techniques
As we have discussed, this workflow borrows a number of ideas
from eDiscovery. Investigators can also use a number of technology-
assisted analysis techniques from the legal world to look at the
evidence from different angles.
For example, some investigative tools offer ways to visualize the data
and metadata. Common visualizations include timelines of document
or user histories; network maps of communications between
people; and mapping locations of photos or phone calls. These allow
investigators to quickly understand the who, what, where, when and
why of the evidence.
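The raw data behind a communication network map is essentially a tally of sender-to-recipient message counts. A minimal sketch, assuming a simple message schema with `from` and `to` fields invented for illustration:

```python
from collections import Counter

def communication_map(messages):
    """Tally who communicates with whom and how often: the edge
    weights behind a network-map visualization of message traffic."""
    edges = Counter()
    for msg in messages:
        for recipient in msg["to"]:
            edges[(msg["from"], recipient)] += 1
    return edges
```

Feeding these weighted edges into any graph-drawing library produces the kind of network map described above.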
Another emerging technique is the ability to identify similar or near-
duplicate documents within a data set. By extracting and comparing
lists of overlapping phrases called “shingles,” investigative tools can
identify documents with similar content, and gauge how similar they
are. This can help investigators identify who created, received or sent
key emails, documents or attachments; analyze how documents have
changed over time; or find related documents that use similar language.
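A minimal sketch of shingle-based near-duplicate comparison, using word-level shingles and Jaccard similarity (a common approach, though the exact shingling scheme varies between tools):

```python
def shingles(text, k=4):
    """Extract the set of overlapping k-word phrases ("shingles")."""
    words = text.lower().split()
    return {" ".join(words[i:i + k]) for i in range(max(len(words) - k + 1, 1))}

def similarity(a, b, k=4):
    """Jaccard similarity of two documents' shingle sets: 1.0 means
    identical wording; values near 1.0 indicate near-duplicates."""
    sa, sb = shingles(a, k), shingles(b, k)
    return len(sa & sb) / len(sa | sb)
```

Two drafts of the same document share most of their shingles and score close to 1.0, while unrelated documents share almost none, which is what lets tools both detect near-duplicates and gauge how similar they are.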
Near-duplicate technology can also speed up the process of
identifying relevant evidence by connecting file fragments and
recovered deleted data to similar content found within live files to
build up the picture of events over time.
The list of shingles used to make near-duplicate comparisons has
another powerful use: increasing the relevance of keyword searches.
For example, fraud analysts search for words and phrases that indicate
an employee may have motivation, opportunity and rationalization
to commit fraud. However, these keyword searches may also bring
up many results that are not relevant. By instead searching the list
of shingles that contain words such as “cover up,” “write off,” “grey
area,” “not ethical” or “off the books,” investigators can quickly locate
longer phrases that will deliver highly relevant search results.
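This shingle-based search can be sketched as follows; the `flag_phrases` helper is hypothetical and simply returns the longer phrases surrounding each flag word:

```python
def flag_phrases(documents, flag_words, k=4):
    """Search the shingle lists rather than the raw text: return the
    longer phrases that contain a flag word, giving more context than
    a bare keyword hit and suggesting more precise follow-up searches."""
    hits = set()
    for doc in documents:
        words = doc.lower().split()
        for i in range(max(len(words) - k + 1, 1)):
            phrase = " ".join(words[i:i + k])
            if any(w in phrase for w in flag_words):
                hits.add(phrase)
    return sorted(hits)
```

Instead of thousands of bare hits on "write off", the investigator sees the distinct four-word phrases in which the term actually appears, and can search on the most suspicious of those.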
Perform deep forensics only when necessary
Many digital forensic investigators have been
reluctant to change their processes even as they
struggle with masses of digital material. One reason
is they believe the old-fashioned technique is the only
way to achieve the forensic rigor and deep technical
analysis that courts and other authorities require.
In our experience, nowadays the key evidence is
more often found “hidden in plain sight”—typically
in email, documents, spreadsheets or images. The
major impediment for digital forensic investigators
is that they cannot work through the volume of evidence
to find the facts that will prove or disprove the
case, especially if they are conducting time-consuming
deep forensic analysis on every data source. As a
result, in-depth forensic analysis must become the
exception rather than the rule.
Using the collaborative approach and tiered review
methodology in the paper, investigators can
quickly distil huge volumes of data into smaller,
more manageable chunks. Once this technique
identifies the smoking gun or the elephant in the
room, investigators can pass this piece of data
back to specialist digital forensic investigators for
them to examine and provide provenance, validate
authenticity and produce to the court or authorities.
Alternatively, the review workflow may not locate the
crucial evidence but may provide strong clues as to
where digital forensic investigators should deep-dive
to try and find it.
i http://www.sfo.gov.uk/media/175084/resource-accounts-2010-11.pdf
ii http://nuix.com/PDFDownload.aspx?f=images/resources/White_Paper_Nuix_Content-Based_Triage_US_WEB.pdf