This document summarizes the examiner's analysis of various SQLite files using the SQLitebrowser and SQLiteman tools to identify internet artifacts. Key findings include:
1) Both tools allowed data to be filtered through SQL queries to view timestamps in a human-readable format and to identify visited URLs and search terms.
2) SQLiteman provided additional visualization features, such as an "Item View" tab and a BLOB previewer, that are not available in SQLitebrowser.
3) While both tools produced identical results, SQLiteman presented its output in a more organized format and offered extra analysis capabilities.
Table of Contents
List of Illustrative Materials
  Figures
Graded Lab Assessment
  Observations of Results and Findings
    SQLitebrowser
    SQLitebrowser vs SQLiteman
  Discussion of Results
    SQLitebrowser and SQLiteman
    SQLitebrowser vs SQLiteman
Conclusion
List of Illustrative Materials
Figures
Figure 1: Filtered SQLitebrowser output data for the cookies.sqlite file
Figure 2: Output for the keyword "pumpkin" on the cookies.sqlite file
Figure 3: Output for visit count to websites and words used for Google search
Figure 4: Output for values entered found on the formhistory.sqlite file
Figure 5: SQLitebrowser output for formhistory.sqlite
Figure 6: SQLiteman output for formhistory.sqlite
Figure 7: SQLitebrowser time data in epoch time
Figure 8: Converted epoch output date and time
Figure 9: Converted epoch date and time and URL visit count
Figure 10: Item View tab within the SQLiteman tool
Figure 11: BLOB Preview Window within the SQLiteman tool
Graded Lab Assessment
Observations of Results and Findings
To identify a user's activities, an examiner analyzes the cookies, formhistory,
extensions, places, webappsstore, and search .sqlite files for internet artifacts. The artifacts
discovered during this investigation provide the examiner with information and insight
into the activities performed by the user.
Throughout the lab the examiner performed various exercises with the SQLiteman and
SQLitebrowser tools in order to gain familiarity with the examination capabilities of each
tool. Using each tool on various .sqlite files, the examiner was able to examine and
identify internet artifacts created by the user's previous activities. In the lab
the examiner was provided with "test.zip" and was tasked with performing an analysis on the
provided .zip file. After the examiner unzipped test.zip, the folder "Firefox
Portable" was created within the examiner's WK6 directory. Prior to performing any analysis on
the provided data, the examiner calculated and recorded the MD5 sum
(d0e7388cf7a5b68e9b65dfdd0ba0695a) of the extracted folder. By knowing the MD5 sum before
the analysis, the examiner can recompute the hash upon completion, compare the two values, and
verify that none of the data was altered during the investigation.
SQLitebrowser
Using the SQLitebrowser tool, the examiner examined the sqlite files named cookies, places,
and formhistory for specific internet artifacts within the sqlite data. The tool processed the
data files and displayed the information in a human-readable format, allowing the examiner to
view and examine the data. To filter the output data displayed by the tool,
the examiner selected the "Execute SQL" tab and entered SQL queries that filtered
and altered the tool's output. For example, entering "SELECT host, value,
datetime(lastAccessed/1000000,'unixepoch') FROM moz_cookies" into the SQL
window of the tool converted the lastAccessed timestamp into a human-readable format, as seen
in figure 1.
Figure 1: Filtered SQLitebrowser output data for the cookies.sqlite file
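The same query can be reproduced outside the GUI with Python's built-in sqlite3 module. The table and column names below follow the moz_cookies query quoted above; the sample row and its timestamp are invented for illustration:

```python
import sqlite3

# In-memory mock of the Firefox moz_cookies table, limited to the
# columns the query touches; the row below is invented sample data.
con = sqlite3.connect(":memory:")
con.execute("CREATE TABLE moz_cookies (host TEXT, value TEXT, lastAccessed INTEGER)")
# lastAccessed is stored as microseconds since the Unix epoch,
# hence the division by 1000000 in the query.
con.execute("INSERT INTO moz_cookies VALUES ('example.com', 'abc123', 1319202000000000)")

# The same query the examiner entered in the "Execute SQL" tab
rows = con.execute(
    "SELECT host, value, datetime(lastAccessed/1000000,'unixepoch') FROM moz_cookies"
).fetchall()
for host, value, ts in rows:
    print(host, value, ts)
```

Running a query like this against the real cookies.sqlite file yields the same human-readable output shown in figure 1.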
Using such queries to filter the .sqlite data allows the examiner to quickly search for and
identify specific internet artifacts such as website visits, keyword searches, URL
information, and values entered into forms. Various queries were entered into the SQL
window to isolate specific artifacts; examples of the resulting output from the SQLitebrowser
tool can be seen in figures 2-4.
Figure 2: Output for the keyword “pumpkin” on the cookies.sqlite file
Figure 3: Output for visit count to websites and words used for Google search
Figure 4: Output for values entered found on the formhistory.sqlite file
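Keyword filtering like the "pumpkin" search in figure 2 rests on a SQL LIKE clause with wildcards. A hedged sketch follows; the fieldname column and the sample rows are assumptions for illustration, not data from the lab's files:

```python
import sqlite3

# Mock of a form-history table (fieldname/value columns are assumed);
# the rows are invented to illustrate keyword filtering.
con = sqlite3.connect(":memory:")
con.execute("CREATE TABLE moz_formhistory (fieldname TEXT, value TEXT)")
con.executemany(
    "INSERT INTO moz_formhistory VALUES (?, ?)",
    [("searchbar-history", "pumpkin pie recipe"),
     ("searchbar-history", "weather forecast"),
     ("email", "user@example.com")],
)

# LIKE with % wildcards isolates rows containing a keyword of interest
hits = con.execute(
    "SELECT fieldname, value FROM moz_formhistory WHERE value LIKE '%pumpkin%'"
).fetchall()
```

The same WHERE clause entered into either tool's SQL window reduces the output to only the rows containing the keyword.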
SQLitebrowser vs SQLiteman
Within the lab the examiner was instructed to use the SQLiteman tool to process and analyze the
formhistory.sqlite file that had previously been analyzed using the SQLitebrowser tool. The goal
of using both tools on the same sqlite data was to demonstrate the differences between
them. From initial observations and comparisons, the examiner found no variances between
the outputs of the two tools. The biggest difference was how each tool presented its
output data for the formhistory.sqlite file: SQLiteman presented the output in a more
organized manner and had additional tabs and features that SQLitebrowser did not. As seen in
figures 5 and 6, the SQLiteman tool possesses additional features, which SQLitebrowser lacks,
for performing further analysis on sqlite data.
Figure 5: SQLitebrowser output for formhistory.sqlite
Figure 6: SQLiteman output for formhistory.sqlite
Discussion of Results
SQLitebrowser and SQLiteman
During the lab the examiner used SQLitebrowser and SQLiteman to examine the
provided sqlite files for specific internet artifacts. Using the ability to enter SQL queries
into both tools, the examiner was able to examine and identify all of the specific internet
artifacts required by the lab. The examiner was tasked with identifying words and phrases
entered into Google searches, specific keywords, URLs visited, the visit count of each URL, and
form data entered, all within the provided cookies, places, and formhistory sqlite files.
This was done by entering specific SQL queries into both tools to filter out unnecessary data.
Being able to control and alter the output display of either tool helps the examiner analyze
and process the data in a timely manner. An example of how SQL queries can assist the
examiner's analysis can be seen in figure 7.
Figure 7: SQLitebrowser time data in epoch time
Before any SQL queries are entered to convert the time, both tools display timestamps in
epoch time. Reading times in epoch format is not an efficient way to view and analyze the
time information for internet artifacts during an investigation. By applying the expression
datetime(xxxx/1000000,'unixepoch') to any time field (lastUsed, firstUsed, etc.),
the epoch output is converted into a human-readable format, as seen in figure 8.
Figure 8: Converted Epoch output date and time
Throughout the analysis of the provided sqlite files there were several occasions when multiple
SQL clauses were combined to display the output data in a desired format. When
analyzing the places.sqlite file, the examiner was tasked with discovering how many
times specific URLs were visited by the user. By entering the query "SELECT
url, datetime(last_visit_date/1000000,'unixepoch'), visit_count FROM moz_places" into the
SQL window, the output window displayed the times in a human-readable format along with the
number of visits for each URL, as seen in figure 9.
Figure 9: Converted Epoch date and time and URL visit count
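The moz_places query above can be sketched with sqlite3 as well. Only the columns named in the query are mocked, and the URLs, timestamps, and counts below are invented sample data, not values from the lab:

```python
import sqlite3

# Minimal mock of moz_places with only the columns the query touches;
# timestamps are microseconds since the Unix epoch, as in Firefox.
con = sqlite3.connect(":memory:")
con.execute(
    "CREATE TABLE moz_places (url TEXT, last_visit_date INTEGER, visit_count INTEGER)"
)
con.executemany(
    "INSERT INTO moz_places VALUES (?, ?, ?)",
    [("https://www.google.com/", 1319202000000000, 5),
     ("https://example.org/", 1319100000000000, 1)],
)

# Readable last-visit time plus the per-URL visit count, busiest first
rows = con.execute(
    "SELECT url, datetime(last_visit_date/1000000,'unixepoch'), visit_count "
    "FROM moz_places ORDER BY visit_count DESC"
).fetchall()
```

Adding an ORDER BY clause, as here, is one way to surface the most-visited URLs at the top of the output.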
Viewing the data in this organized manner allows the examiner to sort through all of the data
efficiently, which can help the examiner identify other artifacts that may be relevant to the
investigation.
SQLitebrowser vs SQLiteman
When analyzing and comparing SQLitebrowser and SQLiteman, the examiner identified
several differences between the tools. Beyond visual differences, the examiner found that
the SQLiteman tool can easily isolate and view the output data via
the "Item View" tab, as seen in figure 10.
Figure 10: Item View tab within the SQLiteman tool
An additional feature that SQLiteman has and SQLitebrowser does not is the
BLOB viewer, as seen in figure 11.
Figure 11: BLOB Preview Window within the SQLiteman tool
BLOB is short for Binary Large OBject: a large amount of binary data stored
and collected as a single object in a database management system. Common BLOB artifacts
include images and other multimedia objects. The BLOB viewer allows the examiner to view a
visual preview of BLOB data identified during the examination.
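BLOBs can also be pulled out of a sqlite file programmatically and identified by their leading magic bytes, which is roughly what a BLOB previewer does before rendering. The table below is invented for illustration; the file signatures themselves are well known:

```python
import sqlite3

# A PNG file always begins with this 8-byte signature
PNG_MAGIC = b"\x89PNG\r\n\x1a\n"

def sniff_blob(blob):
    """Guess a BLOB's type from well-known leading magic bytes."""
    if blob.startswith(PNG_MAGIC):
        return "png"
    if blob.startswith(b"\xff\xd8\xff"):   # JPEG
        return "jpeg"
    if blob.startswith(b"GIF8"):           # GIF87a / GIF89a
        return "gif"
    return "unknown"

# Invented table holding one BLOB, standing in for data a viewer would preview
con = sqlite3.connect(":memory:")
con.execute("CREATE TABLE items (data BLOB)")
con.execute("INSERT INTO items VALUES (?)", (PNG_MAGIC + b"...rest of image...",))
blob = con.execute("SELECT data FROM items").fetchone()[0]
```

Checking magic bytes this way lets an examiner classify embedded images even when the database schema gives no hint of the content type.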
Conclusion
After completing the analysis of the sqlite files with the various tools, the examiner gained
insight into and an understanding of the full capabilities of each tool. Throughout the
lab the examiner exercised the capabilities of each tool on the sqlite files in order to
discover internet artifacts within each of the data files. Using SQL queries within each
tool, the examiner was able to identify words and phrases entered into Google searches,
URLs associated with keywords, URLs visited more than once, suspicious browsing behavior, and
data entered into forms, and to convert Unix epoch time into human-readable time. Using these
queries when analyzing sqlite files allows the examiner to accurately expedite the analysis of
specific internet artifacts of interest during an investigation. By comparing and
understanding the capabilities of SQLiteman and SQLitebrowser, the examiner can
decide how best to apply each tool when analyzing various sqlite
data files. With all of the findings discovered through internet artifact analysis, the
examiner can create a detailed timeline report of the events and actions performed by a
specific user.