Digital forensics is a discipline used to search for evidence of events that have occurred, with the aim of revealing hidden facts. Digital forensic activity exists because crimes are committed both in the computer field and elsewhere, and the legal treatment of digital evidence makes this branch of science an indispensable instrument for uncovering crimes involving computers. In general, cybercrime leaves a digital footprint, so a computer forensics expert is needed to secure the digital evidence. Computer forensics requires a standard operating procedure for acquiring digital evidence so that the data is not contaminated or modified during analysis. Applying digital forensics properly helps the legal process proceed correctly.
Live forensics of tools on android devices for email forensics (TELKOMNIKA JOURNAL)
Email is a communication technology used to exchange information and data. As email technology has developed, email can be opened not only on a computer but also on a smartphone, and the smartphone most widely used in Indonesian society is Android. In step with this development, cybercrime such as email fraud has grown. Catching cybercrime offenders requires evidence that can be submitted to a court, and to obtain that evidence, tools such as Wireshark and NetworkMiner can be used to analyze traffic on live networks. Accordingly, we compare these forensic tools in acquiring digital evidence. The subject of this research is an Android-based email service, with the goal of obtaining as much digital evidence as possible with both tools. The process follows the National Institute of Standards and Technology (NIST) method. The results show that NetworkMiner managed to recover the receiving port, while Wireshark did not.
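Both tools in the comparison consume packet captures in the classic pcap format. As an illustration outside the study itself, the sketch below parses the 24-byte pcap global header; the magic number at the start also reveals the capture file's byte order:

```python
import struct

def read_pcap_header(data: bytes) -> dict:
    """Parse the 24-byte global header of a classic pcap capture file.

    Tools such as Wireshark and NetworkMiner read captures in this
    format; the magic number determines the byte order of the file.
    """
    if len(data) < 24:
        raise ValueError("not a complete pcap global header")
    magic = data[:4]
    if magic == b"\xd4\xc3\xb2\xa1":      # little-endian, microsecond timestamps
        endian = "<"
    elif magic == b"\xa1\xb2\xc3\xd4":    # big-endian
        endian = ">"
    else:
        raise ValueError("unknown pcap magic number")
    major, minor, _tz, _sigfigs, snaplen, linktype = struct.unpack(
        endian + "HHiIII", data[4:24])
    return {"version": (major, minor), "snaplen": snaplen, "linktype": linktype}
```

For example, a standard little-endian capture with link type 1 (Ethernet) yields version (2, 4), the format version both tools expect.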
TOOLS AND TECHNIQUES FOR NETWORK FORENSICS (IJNSA Journal)
Network forensics deals with the capture, recording, and analysis of network events in order to discover evidential information about the source of security attacks for use in a court of law. This paper discusses the different tools and techniques available to conduct network forensics. Some of the tools discussed include: eMailTrackerPro, to identify the physical location of an email sender; Web Historian, to find the duration of each visit and the files uploaded to and downloaded from the visited website; and packet sniffers like Ethereal, to capture and analyze the data exchanged among the different computers in the network. The second half of the paper presents a survey of different IP traceback techniques, such as packet marking, that help a forensic investigator identify the true sources of attacking IP packets. We also discuss the use of honeypots and honeynets to gather intelligence about the enemy and the tools and tactics of network intruders.
Experimental Analysis of Web Browser Sessions Using Live Forensics Method (IJECE IAES)
In today's digital era almost every aspect of life involves the internet, and one way to access the internet is through a web browser. For privacy reasons, browsers have developed a private mode. Unfortunately, some users employ this feature for cybercrime, since it minimizes the digital evidence left behind. The standard NIST investigative techniques need to be extended to uncover ever-varied cybercrime. Live forensics is an investigative model developed for obtaining evidence of computer usage while the system is running. This research offers an effective and efficient forensic investigation approach using live forensics and proposes a framework for web browser analysis. Live forensics allows investigators to obtain data from RAM, which contains computer usage sessions.
Computer Forensic: A Reactive Strategy for Fighting Computer Crime (CSC Journals)
Computer forensics is the science of obtaining, preserving, documenting, and presenting digital evidence, stored in the form of encoded information, from digital electronic storage devices such as computers, personal digital assistants (PDAs), digital cameras, mobile phones, and various memory storage devices. All of this must be done in a manner designed to preserve the probative value of the evidence and to assure its admissibility in legal proceedings. The word forensics means "to bring to the court". Forensics deals primarily with the recovery and analysis of latent evidence, which can take many forms, from fingerprints left on a window, to deoxyribonucleic acid (DNA) evidence recovered from blood stains, to the files on a hard drive. This paper provides a high-level overview of the phases of a computer forensics investigation for both technical and nontechnical audiences. Although the term "computer" is used, the concepts apply to any device capable of storing digital information.
The development of computer technology has led to an increase in cybercrime cases, both direct and indirect. Cybercrime today is able to steal sensitive and confidential digital information, such as emails, user IDs, and passwords. Besides the browser cookies stored on a computer or laptop hard drive, the user_id, email, and password are also held in random access memory (RAM). Because RAM is volatile, its analysis requires an appropriate and effective method: data must be acquired from RAM with live forensics, that is, while the system is running, because if the computer or laptop is powered off or shut down, the information stored in RAM is lost. This study successfully acquired from RAM the user_id and password entered into login forms on websites such as Facebook, PayPal, internet banking, and bitcoin services. The tools used for the acquisition were Linux Memory Extractor (LiME) and FTK Imager.
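Once a RAM image has been acquired with a tool such as LiME, credential-like strings can be located in the raw bytes. The sketch below is only an illustration of that search step; the regular expressions and field names (`user_id`, `password`) are assumptions for this example, since real login forms differ per site, and the study's actual extraction workflow is not reproduced here:

```python
import re

# Illustrative patterns for credential-like artifacts in a raw memory image.
# Real login forms vary, so patterns must be adapted per target site.
PATTERNS = {
    "email":    re.compile(rb"[A-Za-z0-9._%+-]+@[A-Za-z0-9.-]+\.[A-Za-z]{2,}"),
    "user_id":  re.compile(rb"user_id=([^&\x00\s]+)"),
    "password": re.compile(rb"password=([^&\x00\s]+)"),
}

def scan_memory_dump(dump: bytes) -> dict:
    """Return every match for each credential pattern found in the dump."""
    return {name: pat.findall(dump) for name, pat in PATTERNS.items()}
```

In practice the dump would be read from the LiME output file in chunks; a single `bytes` argument keeps the sketch simple.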
AN EMPIRICAL ANALYSIS OF EMAIL FORENSICS TOOLS (IJNSA Journal)
Email is the most common service on the Internet for communication and sending documents, used not only from computers but also from many other electronic devices such as tablets and smartphones. Email can also be used for criminal activities. Email forensics refers to the study of email details and content as evidence, identifying the actual sender and recipient of a message, the date and time of transmission, a detailed record of the email transaction, the intent of the sender, and so on. It involves investigating metadata, keyword searching, port scanning, and generating reports based on the investigator's needs. Many tools are available for investigations that involve email forensics. Investigators should be careful not to violate users' privacy; to this end, they should run keyword searches that reveal only the relevant emails. Knowledge of a tool's features, and of its search features in particular, is therefore necessary for tool selection. In this research, we experimentally compare the performance of several email forensics tools, aiming to help investigators with the tool selection task. We evaluate the tools in terms of their keyword search, report generation, and other features such as supported email formats, accepted file sizes, whether they work online or offline, and the format of the reports. We use the Enron email dataset for our experiments.
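The privacy guidance above, surfacing only keyword-matched messages, can be sketched with Python's standard `email` package. This is not one of the tools evaluated in the paper, merely an illustration of keyword search over raw RFC 822 messages such as those in the Enron dataset:

```python
import email
from email import policy

def keyword_search(raw_messages, keywords):
    """Return Message-IDs of messages whose subject or body mentions a keyword.

    Restricting output to keyword hits mirrors the privacy guidance:
    only relevant emails are surfaced to the investigator.
    """
    lowered = [k.lower() for k in keywords]
    hits = []
    for raw in raw_messages:
        msg = email.message_from_string(raw, policy=policy.default)
        body = msg.get_body(preferencelist=("plain",))
        text = (msg["Subject"] or "") + "\n" + (body.get_content() if body else "")
        if any(k in text.lower() for k in lowered):
            hits.append(str(msg["Message-ID"]))
    return hits
```

A production tool would also index attachments and handle multiple encodings; the sketch covers only plain-text bodies.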
A novel cloud storage system with support of sensitive data application (IJMNCT)
Most users are willing to store their data in a cloud storage system and use the many facilities of the cloud, but their sensitive data applications face potentially serious security threats. In this paper, the security requirements of sensitive data applications in the cloud are analyzed, and an improved structure for the typical cloud storage system architecture is proposed. A hardware USB key is used in the proposed architecture to strengthen the security of user identities and of the interaction between users and the cloud storage system. Moreover, drawing on the idea of active data protection, a data security container is introduced into the system to secure the data transmission process: it encapsulates the encrypted data and adds appropriate access control and data management functions, so that static data blocks are replaced with a dynamic, executable data security container. An enhanced security architecture for the cloud storage terminal software is then proposed that adapts better to users' specific requirements, with customizable functions and components. The proposed architecture can also detect whether the execution environment conforms to the pre-defined environment requirements.
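The "data security container" idea, wrapping already-encrypted data with access-control metadata plus a tamper check, can be sketched as follows. The field names and the HMAC-based integrity tag are illustrative assumptions of this example; the paper's container is an executable component, which this static sketch does not capture:

```python
import base64
import hashlib
import hmac
import json

def build_container(encrypted_payload: bytes, owner: str, key: bytes) -> dict:
    """Wrap pre-encrypted data with access metadata and an integrity tag.

    The payload stays opaque; the wrapper adds access-control fields and
    lets the terminal detect tampering before decryption or execution.
    """
    body = {
        "owner": owner,
        "access": ["read"],  # illustrative access-control policy field
        "payload": base64.b64encode(encrypted_payload).decode(),
    }
    serialized = json.dumps(body, sort_keys=True).encode()
    body["tag"] = hmac.new(key, serialized, hashlib.sha256).hexdigest()
    return body

def verify_container(container: dict, key: bytes) -> bool:
    """Recompute the tag over everything except the tag itself."""
    body = {k: v for k, v in container.items() if k != "tag"}
    serialized = json.dumps(body, sort_keys=True).encode()
    expected = hmac.new(key, serialized, hashlib.sha256).hexdigest()
    return hmac.compare_digest(expected, container["tag"])
```

Keyed HMAC rather than a plain hash is used so that an attacker who can modify the container cannot also recompute a valid tag.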
Forensic Tools Performance Analysis on Android-based Blackberry Messenger usi... (IJECE IAES)
Blackberry Messenger is one of the most popular instant messaging applications on Android, with a user base that increases significantly each year. This growth may lead to misuse of the application, such as for committing digital crimes. To conduct investigations involving smartphone devices, investigators need forensic tools, so research on the performance of current forensic tools in handling digital crime cases involving Android smartphones, and Blackberry Messenger in particular, is needed. This research evaluates and compares three forensic tools for obtaining digital evidence from Blackberry Messenger on Android smartphones, using parameters from the National Institute of Standards and Technology (NIST) together with the digital evidence actually acquired from Blackberry Messenger. The comparative analysis shows that, on the acquired-evidence measure, Andriller has a performance value of 25%, Oxygen Forensic Suite 100%, and Autopsy 4.1.1 0%. Against the NIST parameter criteria, Andriller has a performance value of 47.61%, Oxygen Forensic Suite 61.90%, and Autopsy 4.1.1 9.52%.
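The NIST-parameter figures quoted above are consistent with each tool being scored against a checklist and the ratio truncated to two decimals. With an inferred checklist of 21 parameters (our inference, not stated in the abstract), 10/21, 13/21, and 2/21 reproduce 47.61%, 61.90%, and 9.52% exactly:

```python
import math

def performance_value(parameters_met: int, total_parameters: int) -> float:
    """Performance value: percentage of checklist parameters satisfied,
    truncated (not rounded) to two decimal places to match the reported
    figures."""
    return math.floor(10000 * parameters_met / total_parameters) / 100
```

Truncation rather than rounding is the assumption that makes all three reported figures line up, since rounding 10/21 would give 47.62%.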
Metadata is information embedded in a file that describes the file's contents. When primary evidence is handled with a metadata-based approach, the search for correlated files needed to uncover various computer crime cases is still largely manual. When the correlated files sit in separate locations (folders) and are numerous, this becomes a formidable challenge for forensic investigators analyzing the evidence. In this study we build a prototype that uses a metadata-based approach to automatically analyze the correlation between the main evidence file and associated files deemed relevant to the investigation, based on the metadata parameters Author, Size, File Type, and Date. The analysis reads the metadata characteristics of files of type Jpg, Docx, Pdf, Mp3, and Mp4 and correlates the digital evidence using the specified parameters, which multiplies the evidential findings and facilitates analysis of the digital evidence. The correlation analysis found that using all four parameters (Author, Size, File Type, and Date) yields fewer correlated files, while omitting Size and File Type yields more correlated files, because extensions and file sizes vary.
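The correlation rule described above, two files match when they agree on every selected metadata parameter, can be sketched as follows. Note that `os.stat` supplies Size, File Type (via the extension), and Date, while Author requires a format-specific parser (e.g. for Docx or Pdf), so it is left as a placeholder in this illustration:

```python
import os
from datetime import date

def file_metadata(path):
    """Collect the four correlation parameters used in the study."""
    st = os.stat(path)
    return {
        "author": None,  # needs a format-specific parser (Docx/Pdf/Mp3 tags)
        "size": st.st_size,
        "file_type": os.path.splitext(path)[1].lower(),
        "date": date.fromtimestamp(st.st_mtime),
    }

def correlated(meta_a, meta_b, params=("size", "file_type", "date")):
    """Two files correlate when they agree on every selected parameter.

    Dropping "size" and "file_type" (the abstract's second run) loosens
    the match and therefore yields more correlated files.
    """
    return all(meta_a[p] == meta_b[p] for p in params)
```

With strict parameters the match set shrinks; restricting `params` to `("date",)` alone reproduces the looser matching the abstract reports.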
Semantic annotation, considered one of the applied aspects of the semantic web, has been adopted by researchers from different communities as a key means of improving the search and retrieval of information by enriching content. However, researchers face challenges concerning both the quality and the relevance of the semantic annotations attached to a document relative to its content and semantics, as well as challenges in the automation process that is supposed to ensure an optimal system for information indexing and retrieval. In this article, we introduce the concept of semantic annotation by presenting a state of the art that includes definitions, features, and a classification of annotation systems. Systems and approaches proposed in the field are cited, along with a study of some existing annotation tools. This study also pinpoints various problems and limitations of annotation in order to motivate solutions in our future work.
The Internet is a massive network of networks, a networking infrastructure. It connects millions of computers together globally, forming a network in which any computer can communicate with any other computer as long as they are both connected to the Internet. Information that travels over the Internet does so via a variety of languages known as protocols.
Classifying confidential data using SVM for efficient cloud query processing (TELKOMNIKA JOURNAL)
Nowadays, organizations widely use cloud database engines from cloud service providers. Privacy remains their main concern: every organization is looking for a more secure environment for its own data. Several studies have proposed different types of encryption methods to protect data in the cloud; however, the daily transactions, represented by queries against such databases, make encryption on its own an inefficient solution. Recent studies have therefore presented mechanisms for classifying the data before migrating it to the cloud, reducing the need for encryption and thus improving efficiency. Yet most classification methods in the literature are based on string matching, which suffers from requiring exact term matches: partial matches are not considered. This paper takes advantage of an N-gram representation together with Support Vector Machine classification. Real data is used in the experiment. After classification, the Advanced Encryption Standard algorithm is used to encrypt the confidential data. The results show that the proposed method outperforms the baseline encryption method, emphasizing the usefulness of machine learning techniques for classifying data by confidentiality.
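The advantage of N-grams over exact string matching can be illustrated with a small sketch. Only the representation step is shown here; in the paper these features would feed a Support Vector Machine, which this example omits:

```python
def char_ngrams(text: str, n: int = 3) -> set:
    """Character n-grams of a lowercased string."""
    text = text.lower()
    return {text[i:i + n] for i in range(len(text) - n + 1)}

def ngram_similarity(a: str, b: str, n: int = 3) -> float:
    """Jaccard overlap of n-gram sets.

    Nonzero for partial matches (e.g. misspellings) that an exact
    string comparison would score as a complete mismatch.
    """
    ga, gb = char_ngrams(a, n), char_ngrams(b, n)
    union = ga | gb
    return len(ga & gb) / len(union) if union else 0.0
```

A misspelled term like "confidental" still overlaps heavily with "confidential" under trigrams, even though `==` comparison fails outright.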
Virtual machines are among the most widely used virtualization technologies today, both for everyday work and for saving hardware resources, and also as tools for research on malware, network installations, and so on. The wide use of virtualization technology poses a new challenge for digital forensics experts: recovering evidence from deleted virtual machine images. This research investigates whether evidence of activity generated in a destroyed virtual machine remains, and how potential digital evidence can be found, using the Virtual Machine Forensic Analysis and Recovery method. The results show that a virtual machine that was merely removed from the VirtualBox library could be recovered and analyzed with the Autopsy and FTK tools: four deleted files inside the VMDK file were recovered and treated as digital evidence after their hashes and metadata were checked against the originals. However, Windows-based and Linux-based virtual machine images deleted with VirtualBox's destroy method could not be recovered with Autopsy or FTK; even VirtualBox log analysis, deleted-filesystem analysis, and registry analysis failed to recover backbox.vmdk and windows 7.vmdk, because this deletion uses a high-level removal method almost similar to wiping data from a hard drive.
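The hash check mentioned above, accepting a recovered file only when it matches the original, can be sketched with SHA-256 (the abstract does not name the hash algorithm, so that choice is an assumption of this example):

```python
import hashlib

def sha256_of(path: str) -> str:
    """Stream a file through SHA-256 so VMDK-sized images fit in memory."""
    h = hashlib.sha256()
    with open(path, "rb") as f:
        for chunk in iter(lambda: f.read(1 << 20), b""):
            h.update(chunk)
    return h.hexdigest()

def recovery_verified(original_path: str, recovered_path: str) -> bool:
    """A recovered file counts as evidence only if its hash matches."""
    return sha256_of(original_path) == sha256_of(recovered_path)
```

Metadata (timestamps, size) would be compared the same way in practice; the hash is the decisive integrity check.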
Decision Support for E-Governance: A Text Mining Approach (IJMIT JOURNAL)
Information and communication technology has the capability to improve the process by which governments involve citizens in formulating public policy and public projects. Even though many government regulations may now be in digital form (and are often available online), their complexity and diversity make identifying the ones relevant to a particular context a non-trivial task. Similarly, with the advent of numerous electronic online forums, social networking sites, and blogs, the opportunity to gather citizens' petitions and stakeholders' views on government policy and proposals has increased greatly, but the volume and complexity of the unstructured data make analysis difficult. Text mining, on the other hand, has come a long way from simple keyword search and has matured into a discipline capable of dealing with much more complex tasks. In this paper we discuss how text-mining techniques can help retrieve information and relationships from textual data sources, thereby assisting policy makers in discovering associations between policies and the citizens' opinions expressed in electronic public forums, blogs, and so on. We also present an integrated text-mining-based architecture for e-governance decision support, along with a discussion of the Indian scenario.
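One simple text-mining step toward the associations discussed above is term co-occurrence counting. The sketch below is our illustration, not the paper's architecture: it counts how often pairs of vocabulary terms appear in the same document, which hints at associations a single-term keyword search would miss:

```python
import re
from collections import Counter
from itertools import combinations

def cooccurrences(documents, vocabulary):
    """Count how often pairs of vocabulary terms appear in the same document.

    Frequent pairs suggest associations (e.g. a policy topic and a
    recurring citizen complaint) worth deeper analysis.
    """
    counts = Counter()
    vocab = {v.lower() for v in vocabulary}
    for doc in documents:
        present = sorted(vocab & set(re.findall(r"[a-z]+", doc.lower())))
        counts.update(combinations(present, 2))
    return counts
```

Real e-governance pipelines would add stemming, stop-word removal, and entity extraction before this counting step.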
The word forensic indicates detective work: searching for, and attempting to discover, information. The search is mainly carried out to collect evidence for investigations, whether criminal, civil, or corporate, and such investigations are conducted under applicable legal rules.
As criminals become smarter at committing crime, using data-hiding techniques such as encryption and steganography, forensic departments have become alert and introduced the concept of digital forensics, which handles sensitive and confidential data responsibly.
AN EMPIRICAL ANALYSIS OF EMAIL FORENSICS TOOLSIJNSA Journal
Emails are the most common service on the Internet for communication and sending documents. Email is used not only from computers but also from many other electronic devices such as tablets; smartphones, etc. Emails can also be used for criminal activities. Email forensic refers to the study of email detail and content as evidence to identify the actual sender and recipient of a message, date/time of transmission, detailed record of email transaction, intent of the sender, etc. Email forensics involves investigation of metadata, keyword, searching, port scanning and generating report based on investigators need. Many tools are available for any investigation that involves email forensics. Investigators should be very careful of not violating user’s privacy. To this end, investigators should run keyword searches to reveal only the relevant emails. Therefore, knowledge of the features of the tool and the search features is necessary for the tool selection. In this research, we experimentally compare the performance of several email forensics tools. Our aim is to help the investigators with the tool selection task. We evaluate the tools in terms of their keyword search, report generation, and other features such as, email format, size of the file accepted, whether they work online or offline, format of the reports, etc. We use Enron email dataset for our experiment.
A novel cloud storage system with support of sensitive data applicationijmnct
Most users are willing to store their data in the c
loud storage system and use many facilities of clou
d. But
their sensitive data applications faces with potent
ial serious security threats. In this paper, securi
ty
requirements of sensitive data application in the c
loud are analyzed and improved structure for the ty
pical
cloud storage system architecture is proposed. The
hardware USB-Key is used in the proposed architectu
re
for purpose of enhancing security of user identity
and interaction security between the users and the
cloud
storage system. Moreover, drawn from the idea of da
ta active protection, a data security container is
introduced in the system to enhance the security of
the data transmission process; by encapsulating th
e
encrypted data, increasing appropriate access contr
ol and data management functions. The static data
blocks are replaced with a dynamic executable data
security container. Then, an enhanced security
architecture for software of cloud storage terminal
is proposed for more adaptation with the user's sp
ecific
requirements, and its functions and components can
be customizable. Moreover, the proposed architectur
e
have capability of detecting whether the execution
environment is according with the pre-defined
environment requirements.
Forensic Tools Performance Analysis on Android-based Blackberry Messenger usi...IJECEIAES
Blackberry Messenger is one of the popularly used instant messaging applications on Android with user’s amount that increase significantly each year. The increase off Blackberry Messenger users might lead to application misuse, such as for commiting digital crimes. To conduct investigation involving smartphone devices, the investigators need to use forensic tools. Therefore, a research on current forensic tool’s performance in order to handle digital crime cases involving Android smartphones and Blackberry Messenger in particular need to be done. This research focuses on evaluating and comparing three forensic tools to obtain digital evidence from Blackberry Messenger on Android smartphones using parameter from National Institute of Standard Technology and Blackberry Messenger’s acquired digital evidences. The result shows that from comparative analysis conducted, Andriller gives 25% performance value, Oxygen Forensic Suite gives 100% performance value, and Autopsy 4.1.1 gives 0% performance value. Related to National Institute of Standard Technology parameter criterias, Andriller has performance value of 47.61%. Oxygen Forensic Suite has performance value of 61.90%. Autopsy 4.1.1 has performance value of 9.52%.
Metadata is the information that is embedded
in a file whose contents are the explanation of the file. In the
handling of the main evidence with a metadata-based approach
is still a lot of manually in search for correlation related files to
uncover various cases of computer crime. However, when
correlated files are in separate locations (folders) and the
number of files will certainly be a formidable challenge for
forensic investigators in analyzing the evidence. In this study,
we will build a prototype analysis using a metadata-based
approach to analyze the correlation of the main proof file with
the associated file or deemed relevant in the context of the
investigation automatically based on the metadata parameters
of Author, Size, File Type and Date. In this research, the
related analysis read the characteristics of metadata file that is
file type Jpg, Docx, Pdf, Mp3 and Mp4 and analysis of digital
evidence correlation by using specified parameters, so it can
multiply the findings of evidence and facilitate analysis of
digital evidence. In this research, the result of correlation
analysis of digital evidence found that using parameter of
Author, Size, File Type and Date found less correlated file
while using parameter without Size and File Type found more
correlated file because of various extension and file size.
Semantic annotation, which is considered one of the semantic web applicative aspects, has been adopted by researchers from different communities as a paramount solution that improves searching and retrieval of information by promoting the richness of the content. However, researchers are facing challenges concerning both the quality and the relevance of the semantic annotations attached to the annotated document against its content as well as its semantics, without ignoring those regarding automation process which is supposed to ensure an optimal system for information indexing and retrieval. In this article, we will introduce the semantic annotation concept by presenting a state of the art including definitions, features and a classification of annotation systems. Systems and proposed approaches in the field will be cited, as well as a study of some existing annotation tools. This study will also pinpoint various problems and limitations related to the annotation in order to offer solutions for our future work.
The Internet is a massive network of networks, a networking infrastructure. It connects millions of computers together globally, forming a network in which any computer can communicate with any other computer as long as they are both connected to the Internet. Information that travels over the Internet does so via a variety of languages known as protocols.
Classifying confidential data using SVM for efficient cloud query processingTELKOMNIKA JOURNAL
Nowadays, organizations are widely using a cloud database engine from the cloud service
providers. Privacy still is the main concern for these organizations where every organization is strictly
looking forward more secure environment for their own data. Several studies have proposed different types
of encryption methods to protect the data over the cloud. However, the daily transactions represented by
queries for such databases makes encryption is inefficient solution. Therefore, recent studies presented
a mechanism for classifying the data prior to migrate into the cloud. This would reduce the need of
encryption which enhances the efficiency. Yet, most of the classification methods used in the literature
were based on string-based matching approach. Such approach suffers of the exact match of terms where
the partial matching would not be considered. This paper aims to take the advantage of N-gram
representation along with Support Vector Machine classification. A real-time data will used in
the experiment. After conducting the classification, the Advanced Encryption Standard algorithm will be
used to encrypt the confidential data. Results showed that the proposed method outperformed the baseline
encryption method. This emphasizes the usefulness of using the machine learning techniques for
the process of classifying the data based on confidentiality.
Virtual machines are among the most widely used virtualization technologies today, both for everyday work and for conserving hardware resources, and also as a tool for research on malware, network installations, and similar topics. The wide use of virtualization is a new challenge for digital forensics experts, who must investigate how to recover evidence from deleted virtual machine images. This research examines whether evidence of activity generated in a destroyed virtual machine survives, and how to find potential digital evidence using the Virtual Machine Forensic Analysis and Recovery method. The results showed that a virtual machine removed from the VirtualBox library could be recovered and analyzed using the Autopsy and FTK tools: four deleted files in the VMDK file were recovered and analyzed as digital evidence, after verifying that their hashes and metadata matched the originals. However, Windows-based and Linux-based virtual machine images deleted with VirtualBox's destroy method could not be recovered with Autopsy or FTK; VirtualBox log analysis, deleted-filesystem analysis, and registry analysis also failed to recover backbox.vmdk and windows7.vmdk, because the deletion used a high-level removal method almost similar to wiping data from a hard drive.
Decision Support for E-Governance: A Text Mining Approach (IJMIT Journal)
Information and communication technology has the capability to improve the process by which governments involve citizens in formulating public policy and public projects. Even though much of government regulations may now be in digital form (and often available online), due to their complexity and diversity, identifying the ones relevant to a particular context is a non-trivial task. Similarly, with the advent of a number of electronic online forums, social networking sites and blogs, the opportunity of gathering citizens’ petitions and stakeholders’ views on government policy and proposals has increased greatly, but the volume and the complexity of analyzing unstructured data makes this difficult. On the other hand, text mining has come a long way from simple keyword search, and matured into a discipline capable of dealing with much more complex tasks. In this paper we discuss how text-mining techniques can help in retrieval of information and relationships from textual data sources, thereby assisting policy makers in discovering associations between policies and citizens’ opinions expressed in electronic public forums and blogs etc. We also present here, an integrated text mining based architecture for e-governance decision support along with a discussion on the Indian scenario.
Forensics is detective work: searching for and attempting to discover information, mainly to collect evidence useful in criminal, civil, or corporate investigations. An investigation applies in the presence of certain legal rules.
As criminals get smarter at committing crime, using data-hiding techniques such as encryption and steganography, the forensic community has become alert and introduced a new discipline called digital forensics, which handles sensitive and confidential data responsibly.
Design for A Network Centric Enterprise Forensic System (CSC Journals)
The increased value and exposure of enterprise information incite more attackers to attempt exploitation of enterprise networks while striving not to leave evidence. Although digital forensic analysis is maturing within modern criminology, the scope of network and computer forensics in large-scale commercial environments is still vague. Conventional forensic techniques, consisting largely of manual operations and isolated processes, are not adequate in a modern enterprise context: enterprise data volumes are usually overwhelming, and interference with business operations during an investigation is unwelcome. To detect and monitor these increasing and evolving cyber offences and criminals, forensic investigators are calling for a more comprehensive forensic methodology. To understand the current insufficiencies, this paper first probes the weaknesses of various preliminary forensic techniques. It then proposes an approach to designing an enhanced forensic system that integrates the network distributed system concept and information fusion theory as a remedy for the drawbacks of existing forensic techniques.
Proposed Effective Solution for Cybercrime Investigation in Myanmar (The IJES)
The rapid growth of ICT creates new attack surfaces for cybercrime forensics. Information has become a new challenge for security, privacy, and cybercrime in society. This paper proposes an applicable framework for cybercrime forensics investigation in Myanmar, known as CCFIM. Using standard cyber laws and policies, cybercrime forensics investigation can provide an ethical, secure, and monitored computing environment. The framework supports secure analysis of both logical and physical data extractions. Acceptable evidence can be obtained by examining sensible clues from digital devices such as computers, mobile smartphones, tablets, GPS units, and IoT devices, whether traditional or cloud-based. The most important part of a forensic investigation is gathering "relevant" and "acceptable" information to serve as cyber evidence in court; forensic investigators therefore need to understand how file-system timestamps work. This paper emphasizes comparative timestamps across various file systems and Windows operating systems.
Recently, modern technologies have made a positive contribution to most institutions in society. However, they have also raised serious data-security issues. In the educational sector, for example, most institutions hold huge amounts of data of various types, which require rigorous protection to ensure their security and integrity. This study reviews the most recent data-security methods, particularly the common existing encryption algorithms, and designs an electronic system based on efficient, trusted algorithms to protect academic and non-academic digital data in educational institutions. The study applies a satisfaction questionnaire as the instrument to evaluate the proposed system's efficiency. The questionnaire was presented to an evaluation team divided into two groups: experts and end users. The evaluation results indicate an encouraging satisfaction ratio: the overall satisfaction percentage is 96.25%, which demonstrates that the proposed system is an acceptable and suitable choice for various security applications.
The security and speed of data transmission are very important in data communications. One approach is to combine appropriate cryptographic and compression algorithms: in this case, the Data Encryption Standard (DES) and Lempel-Ziv-Welch (LZW) algorithms are combined to keep data safe while achieving good compression, so that transmission runs properly, safely, and quickly.
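As a rough illustration of the compression half of such a pipeline, here is a minimal LZW encoder (a sketch under simplified assumptions, not the paper's implementation); a DES step would then encrypt the resulting code stream.

```python
def lzw_compress(data):
    """LZW: emit a dictionary code for the longest already-seen substring."""
    table = {chr(i): i for i in range(256)}  # seed with single bytes
    current, out = "", []
    for ch in data:
        if current + ch in table:
            current += ch                    # extend the current match
        else:
            out.append(table[current])       # emit code, learn new entry
            table[current + ch] = len(table)
            current = ch
    if current:
        out.append(table[current])
    return out
```

Repeated substrings collapse into single codes: `lzw_compress("ABABAB")` emits codes for "A", "B", then twice for the learned entry "AB".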
Electric power quality concerns changes in the form of voltage, current, or frequency that can cause the failure of equipment, whether utility equipment or consumer property. Household equipment includes many nonlinear loads, one of which is the mixer; with a nonlinear load, the current and voltage waveforms are not sinusoidal. The use of household appliances such as mixers therefore causes harmonics problems that can damage electrical system equipment. This study analyzes the percentage of harmonics in a mixer and reduces them to within the standard. Before an LC passive filter was used, the total harmonic distortion of current (THDi) was 61.48%; after the filter was applied, THDi fell to 23.75%. The third-order harmonic current (IHDi) of 0.4185 A did not meet the standard; after the LC passive filter it fell to 0.088 A, in accordance with the desired standard, and the power factor improved from 0.75 to 0.98.
This paper examines the long-term simultaneous response between dividend policy and corporate value. The main problem studied is that dividend policy responds very slowly to the final goal of corporate value. The data were analyzed using Vector Autoregression (VAR). The discussion concludes that the simultaneous response between dividend policy and corporate value differs in each period: short-term, medium-term, and long-term. The strongest response to dividend changes comes from free cash flow, whereas the highest response of corporate value comes from market-to-book value.
WhatsApp is a social media application widely used across many circles because it is easy to use and reasonably secure. Security while communicating is very important, and WhatsApp's network transport is quite secure, but the local storage containing messages is not safe enough: messages in local storage are not protected by a dedicated algorithm, and with software such as WhatsApp Database Viewer the messages can be read. To improve message security in WhatsApp's local storage, this work proposes a security enhancement using the Modular Multiplication Block Cipher algorithm, so that WhatsApp messages are better protected and not easily read by unauthorized parties.
Consumers find it increasingly easy to access information resources and interact quickly with whatever they plan to spend on. The ease of use of technology has made consumer attitudes more discerning and has encouraged the rise of digital transactions, making it easy to transact through e-commerce shopping channels. Future e-commerce trends will lead to user-generated content, reflecting the tendency of users in Indonesia to compare shopping channels. The purpose of this study was to examine the direct and indirect effects of Perceived Ease of Use on Behavioral Intention to transact, with Perceived Usefulness as an intervening variable. The study used a descriptive exploratory method with causal-predictive analysis; the research sample was determined by purposive sampling, and an enumerator team assisted in distributing the questionnaires. The study found that the direct effect of perceived ease of use on behavioral intention to transact is smaller than the indirect effect mediated by the perceived usefulness variable.
Performance assessment compares algorithms, and speed and security are the qualities used to determine which algorithm is better. For determining the optimum route, two algorithms can be compared: the Genetic and Prim algorithms, both very popular for finding the optimum route in a graph. Prim can minimize a circuit to avoid a connected loop, determining the best route based on the active vertex; it is especially useful in minimum-spanning-tree cases. The Genetic algorithm works with probabilistic properties: it cannot guarantee the route with the maximum value, but it can find an overall optimum route given appropriate parameters. Either algorithm can be applied to shortest-path, minimum-spanning-tree, or traveling-salesman problems. Prim is superior to the Genetic algorithm in speed. The strength of the Genetic algorithm lies in the number of generations and the population generated, together with the selection, crossover, and mutation processes; its disadvantage is that it spends too much time reaching the desired result. Overall, the Prim algorithm performs better than the Genetic algorithm, especially for a large number of vertices.
Implementations of decision support systems for various purposes now help policy makers obtain the best alternative from a set of predefined criteria. One method used in decision support systems is VIKOR (Vise Kriterijumska Optimizacija I Kompromisno Resenje). In this research, the VIKOR method produced the best results with a computationally efficient and easily understood process, and it is hoped that these results help various parties develop solution models.
Edge detection is one of the most frequent processes in digital image processing. One application is detecting road damage from crack paths, which can be checked using the Canny algorithm. This paper proposes a mobile application that detects cracks in roads, with a customized threshold function to produce useful and accurate edge detection. The experimental results show that using a threshold function in the Canny algorithm detects road damage better.
The security and confidentiality of information are important factors in communication, and cryptography is a powerful way to secure information. IDEA (International Data Encryption Algorithm) and WAKE (Word Auto Key Encryption) are modern symmetric cryptographic algorithms whose encryption and decryption are much faster than those of asymmetric algorithms. By combining IDEA and WAKE, it is possible to produce highly secret ciphertext, and it is hoped that a cryptanalyst would need a very long time to decrypt the information without knowing the encryption key.
Employees are the backbone of corporate activities, and giving bonuses, job titles, and allowances to motivate them is very necessary. A company has many salesmen, and finding the best one cannot be done manually; this calls for implementing a decision support system that applies the TOPSIS method. With TOPSIS implemented, the results expected by top management can be fulfilled.
English is a language everyone should know in the digital era, when almost all information is in English, and it is studied from kindergarten to college. Elementary schools now also teach it, and to help introduce English this work presents a prototype application for the recognition of common English words. It can be updated dynamically, so that new English words and sentences can be introduced to students as information changes.
Selecting the best employees is a process of evaluating how well employees perform against standards set by the company, usually carried out by top management such as a General Manager or Director. In general, the selection is still performed manually across many criteria and alternatives, which makes it difficult for top management to make decisions and turns periodic selection into a long and complicated process. It is therefore necessary to build a decision support system that helps the decision maker determine the best choice based on standard criteria, faster and more objectively. In this research, the computational decision-making method used is the Technique for Order of Preference by Similarity to Ideal Solution (TOPSIS). The criteria used in selecting the best employees are: job responsibility, work discipline, work quality, and behaviour. The final global priority values of the best-employee candidates are used by top management as the decision-making tool for selecting the best employee.
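The TOPSIS computation described above can be sketched as follows. This is a minimal sketch, not the study's exact implementation: it assumes all criteria are benefit criteria (higher is better) and uses illustrative weights.

```python
import math

def topsis(matrix, weights):
    """Score alternatives (rows) by closeness to the ideal solution."""
    m, n = len(matrix), len(matrix[0])
    # Vector-normalize each criterion column, then apply weights.
    norms = [math.sqrt(sum(matrix[i][j] ** 2 for i in range(m))) for j in range(n)]
    v = [[matrix[i][j] / norms[j] * weights[j] for j in range(n)] for i in range(m)]
    # Ideal best/worst per criterion (benefit criteria assumed).
    best = [max(v[i][j] for i in range(m)) for j in range(n)]
    worst = [min(v[i][j] for i in range(m)) for j in range(n)]
    scores = []
    for row in v:
        d_best = math.sqrt(sum((x - b) ** 2 for x, b in zip(row, best)))
        d_worst = math.sqrt(sum((x - w) ** 2 for x, w in zip(row, worst)))
        scores.append(d_worst / (d_best + d_worst))
    return scores
```

The alternative with the highest closeness score is recommended; an employee who dominates on every criterion scores 1.0.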
The Rabin-Karp algorithm, invented by Michael O. Rabin and Richard M. Karp, searches for a substring pattern in a text using hashing; hash values computed over two documents are compared to determine their level of similarity. It is beneficial for matching words against many patterns, and one practical application is plagiarism detection. Rabin-Karp is not very good for single-pattern text search but is well suited to multiple-pattern search. The Levenshtein algorithm can replace the hash comparison in Rabin-Karp: whereas Rabin-Karp only counts the hashes that have the same value in both documents, calculating the Levenshtein distance between the documents yields better accuracy.
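The core of Rabin-Karp is a rolling hash that is updated in constant time as the window slides. A minimal sketch (illustrative base and modulus constants, not tied to any particular plagiarism tool):

```python
def rabin_karp(text, pattern, base=256, mod=101):
    """Return all start indices of `pattern` in `text` via a rolling hash."""
    n, m = len(text), len(pattern)
    if m == 0 or m > n:
        return []
    high = pow(base, m - 1, mod)      # weight of the leading character
    p_hash = t_hash = 0
    for i in range(m):
        p_hash = (p_hash * base + ord(pattern[i])) % mod
        t_hash = (t_hash * base + ord(text[i])) % mod
    hits = []
    for i in range(n - m + 1):
        # Equal hashes only suggest a match; verify to rule out collisions.
        if p_hash == t_hash and text[i:i + m] == pattern:
            hits.append(i)
        if i < n - m:                 # roll: drop text[i], add text[i+m]
            t_hash = ((t_hash - ord(text[i]) * high) * base + ord(text[i + m])) % mod
    return hits
```

Because only colliding windows are verified character by character, the expected running time stays linear even with many candidate positions.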
Cybercrime is digital crime committed to reap profit using the Internet as a medium. Any criminal activity that occurs in the digital world or through the Internet is referred to as Internet crime; cybercrime also refers to criminal activity on computers and computer networks. Such activity can take place in one location or span several countries, and includes credit card forgery, confidence fraud, dissemination of personal information, pornography, and so on. Formerly there was no strong law to combat cybercrime; since the enactment of electronic information and transaction laws, legal jurisdiction over computer crime has applied. Because computer networks are not confined to one local area but extend worldwide, cybercrime can occur freely between countries, an issue that requires universal jurisdiction. A country has the authority to combat crimes that threaten the international community, and this jurisdiction applies regardless of where the crime was committed or the citizenship of the offender; it was created in the absence of an international judicial body specifically for trying individual crimes. Cybercrime cannot be totally eradicated, but implementing international jurisdiction at least reduces the number of cybercrimes in the world.
Market competition is fierce, so companies must be smart in managing finances. In raising a product's selling point, marketing is the most important step to consider, and routine promotion is one marketing technique for increasing consumer appeal to marketed products. An important part of promotion is selecting the most appropriate promotional media, but the selection process often suffers from subjective decision making, and limited funds constrain improvements to the market strategy. So far, companies have selected promotional media manually using standing rules, which has many shortcomings regarding effectiveness, time efficiency, and limited funds. The Markov Chain helps a company analyze its development over a period: it can predict future market share so the company can optimize promotion costs at a given time. Implementing this method produces market-share percentages, so businesses can determine and choose the more appropriate way to improve their market strategy. Assessment is based on consumer criteria for a particular product; these criteria capture consumer interest in the product so that consumer behavior can be analyzed.
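The Markov-chain market-share projection amounts to repeatedly multiplying a share vector by a transition matrix. A minimal sketch with invented numbers (two competing promotional channels and an assumed transition matrix, for illustration only):

```python
def project_share(shares, transition, periods=1):
    """Project a market-share row vector forward `periods` steps.

    transition[i][j] is the assumed probability that a consumer of
    channel i switches to channel j in one period.
    """
    k = len(shares)
    for _ in range(periods):
        shares = [sum(shares[i] * transition[i][j] for i in range(k))
                  for j in range(k)]
    return shares
```

For example, with shares [0.6, 0.4] and transitions [[0.9, 0.1], [0.2, 0.8]], one period yields roughly [0.62, 0.38]: the first channel gains share.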
The transition from copper cable to fiber optics has accelerated technological development, since data can be transmitted quickly and accurately, and the change is visible everywhere. Fiber optic cable is expensive: if it is not installed optimally, it costs enormously, and the excess spent on unnecessary cable could instead support performance. Determining how much cable to use at installation time is difficult to do manually. Prim's algorithm can optimize this by computing the minimum spanning tree over the branches used for fiber optic installation, connecting all the listed points while minimizing total cable. Using this method helps save fiber optic construction costs.
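Prim's minimum spanning tree over candidate cable runs can be sketched as below (hypothetical node indices and cable lengths, for illustration):

```python
import heapq

def prim_mst(n, edges):
    """Total weight of a minimum spanning tree over nodes 0..n-1.

    `edges` are undirected (u, v, weight) tuples, e.g. cable runs
    between junction points with their lengths.
    """
    adj = [[] for _ in range(n)]
    for u, v, w in edges:
        adj[u].append((w, v))
        adj[v].append((w, u))
    visited = [False] * n
    heap = [(0, 0)]            # (cable length to reach node, node)
    total = 0
    while heap:
        w, u = heapq.heappop(heap)
        if visited[u]:
            continue
        visited[u] = True
        total += w             # cheapest edge joining u to the tree
        for edge in adj[u]:
            if not visited[edge[1]]:
                heapq.heappush(heap, edge)
    return total
```

With runs (0,1,1), (1,2,2), (0,2,4), (2,3,3), (0,3,10), the tree keeps the three cheapest connecting runs for a total length of 6, avoiding the expensive direct links.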
An image is a medium for conveying information, such as a particular event, experience, or moment. Many images resemble one another, but the degree of similarity is not easily detected by the human eye. Eigenface is a technique for computing the resemblance of an object: it works from the color intensities of the two images being compared. The stages used are normalization, eigenface computation, training, and testing. Eigenface computes pixel proximity between images, yielding feature values used for comparison; the image with the smallest feature distance is closest to the original. This method helps analysts predict the likeness of digital images and can also be used in steganography, digital forensics, face recognition, and so forth.
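A minimal eigenface-style sketch of the pipeline above, assuming images are already flattened to equal-length intensity vectors (toy data, not a full face-recognition system): mean-center the training set, take the top principal directions via SVD, project, and compare feature distances.

```python
import numpy as np

def eigenface_features(images, k=2):
    """Project flattened images onto the top-k eigenfaces of the set."""
    X = np.asarray(images, dtype=float)      # rows = flattened images
    mean = X.mean(axis=0)
    centered = X - mean
    # Right singular vectors of the centered data are the eigenfaces.
    _, _, vt = np.linalg.svd(centered, full_matrices=False)
    basis = vt[:k]
    return centered @ basis.T, mean, basis

def closest(query, images, k=2):
    """Index of the training image whose feature vector is nearest the query."""
    feats, mean, basis = eigenface_features(images, k)
    q = (np.asarray(query, dtype=float) - mean) @ basis.T
    return int(np.argmin(np.linalg.norm(feats - q, axis=1)))
```

The smallest distance in feature space identifies the most similar stored image, matching the "smallest feature value" rule described above.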
Compression reduces data to a smaller size than before. It arose from the lack of adequate storage capacity, and data compression is also needed to speed up data transmission between computer networks. Compression trades off speed against density: dense compression takes longer than compression that prioritizes speed. Elias Delta is a lossless compression technique that can compress characters. It is built from the frequency of each character in the document to be compressed, working by reducing the seven or eight bits per character: the most common characters receive the fewest bits, while the rarest characters receive the longest codes. Forming a character set eliminates duplicate characters when counting occurrences of each character and provides storage for the compression table. The method achieves a good ratio between the sizes before and after compression, and the speed of its compression and decompression processes is outstanding.
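The Elias delta code itself maps a positive integer to a self-delimiting bit string; in a frequency-based scheme, characters ranked by frequency receive the small integers and hence the short codes. A minimal encoder sketch (the ranking step is assumed, not shown):

```python
def elias_delta(n):
    """Elias delta code of a positive integer, as a bit string."""
    assert n >= 1
    binary = bin(n)[2:]                 # binary of n, length L
    L = len(binary)
    lb = bin(L)[2:]                     # gamma-code the length L:
    return "0" * (len(lb) - 1) + lb + binary[1:]   # zeros, L, then n's tail
```

The rank-1 (most frequent) character gets the 1-bit code "1", rank 2 gets "0100", and rank 9 gets "00100001", which is exactly the short-codes-for-common-characters behavior described above.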
Technological developments in computer networks increasingly demand security in the systems being built, along with flexibility, efficiency, and effectiveness. Exchanging information over the Internet is now commonplace, but it can invite data theft or cybercrime, causing losses for both parties; the theft rate is even higher on wireless networks, whose signals have no boundaries and can be intercepted, which is fatal if not anticipated. Filtering restricts incoming access from the Internet, aiming to keep out intruders who want to steal data. IP and MAC filtering is a way to protect a wireless network from being used and misused by just anyone, and the technique is very useful for securing data on a computer joined to a public network. By registering IP and MAC addresses on a router, information is kept from being misused or stolen: only the few computers whose IP and MAC addresses are listed can connect to the wireless hotspot.
Catfish is a type of freshwater fish with a good taste, but its cultivation faces many obstacles. Because it lives in dirty water, this fish is susceptible to disease, and many symptoms appear during cultivation, from skin disease to physical damage. Catfish farmers do not know how to diagnose diseases in their stock. Diagnosis serves to separate healthy catfish from sick ones so that the sale value stays high: diseased catfish are sold more cheaply as feed for other animals, while healthy fish are sold to market or exported to other countries. Diagnosis can be performed with an expert system; the certainty factor algorithm is a good way to determine the probability of a fish disease and is very helpful for farmers improving catfish farming.
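The certainty-factor combination rule such an expert system typically applies can be sketched as follows (illustrative values; the sketch assumes all certainty factors are positive):

```python
def rule_cf(cf_expert, cf_user):
    """CF of one rule: the expert's weight scaled by the user's confidence."""
    return cf_expert * cf_user

def combine_cf(cfs):
    """Combine positive certainty factors from several matching rules:
    CF_new = CF_old + CF_rule * (1 - CF_old)."""
    total = 0.0
    for cf in cfs:
        total = total + cf * (1 - total)
    return total
```

Two symptoms with rule CFs 0.6 and 0.5 combine to 0.8, i.e. the diagnosis grows more certain as independent evidence accumulates but never exceeds 1.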
International Journal of Scientific Research in Science and Technology (www.ijsrst.com)
As crimes in the computing field grow more complex, computer forensics must extend its study across more aspects of the discipline [4][5]. It is therefore useful to divide computer forensics into several concentrations, so that investigating a crime, and even restoring a system after damage, can be handled by the appropriate specialty:
- Disk Forensics
- System Forensics
- Network Forensics
- Internet Forensics
Disk forensics is the concentration that has developed furthest; it covers a wide variety of storage media and is well documented in the literature. Even general IT professionals can handle many disk forensics problems: recovering deleted files, repairing hard drive partitions, tracing bad sectors, restoring a Windows registry modified or hidden by a virus, and so forth. Many IT professionals, however, do not realize that these activities are themselves computer forensics actions.
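The deleted-file recovery mentioned above is often done by file carving: scanning raw disk bytes for known file signatures instead of relying on file system metadata. A minimal illustrative sketch in Python follows; the SOI/EOI byte patterns are the real JPEG markers, while the in-memory "disk image" is fabricated for the example.

```python
def carve_jpegs(raw: bytes) -> list:
    """Carve JPEG files out of a raw disk image by signature.

    JPEG files begin with the SOI marker FF D8 FF and end with the
    EOI marker FF D9; a carver scans the raw bytes for these patterns.
    """
    soi, eoi = b"\xff\xd8\xff", b"\xff\xd9"
    carved = []
    start = raw.find(soi)
    while start != -1:
        end = raw.find(eoi, start)
        if end == -1:
            break  # truncated file: no end marker on the image
        carved.append(raw[start:end + len(eoi)])
        start = raw.find(soi, end + len(eoi))
    return carved

# Fabricated "disk image": filesystem noise surrounding two deleted JPEGs.
disk = (b"\x00" * 16
        + b"\xff\xd8\xff\xe0JFIF-data\xff\xd9" + b"\x00" * 8
        + b"\xff\xd8\xff\xe1Exif-data\xff\xd9" + b"\x00" * 4)
print(len(carve_jpegs(disk)))  # -> 2
```

A real carver must additionally handle fragmented files and false-positive markers inside other data, which is why dedicated tools exist for this task.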
System forensics is the method closest to the operating system. It remains difficult to study in depth because of the many operating systems in use today, each with different characteristics and behaviors, such as different file systems; existing forensic methods are therefore hard to generalize. A further obstacle is that much of the supporting software used to dissect an operating system still targets only the Windows platform, which is why this branch of the science needs further development.
Network forensics is a method of capturing, storing, and analyzing network traffic to find the source of a system security breach or other information-security problem. This branch necessarily involves the OSI layers, which describe how computers communicate, and it covers not only LAN systems but larger network systems as well.
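Captured traffic is typically stored in pcap files, whose 24-byte global header a tool must parse before reading any packets. A minimal sketch, assuming the classic libpcap format (magic number 0xA1B2C3D4); the sample header bytes are constructed in memory for the example:

```python
import struct

def parse_pcap_header(header: bytes) -> dict:
    """Parse the 24-byte libpcap global header.

    Fields: magic number, major/minor version, timezone offset,
    timestamp accuracy, snapshot length, link-layer type.
    """
    if len(header) < 24:
        raise ValueError("pcap global header is 24 bytes")
    magic = struct.unpack("<I", header[:4])[0]
    # The magic number tells us which byte order the file was written in.
    endian = "<" if magic == 0xA1B2C3D4 else ">"
    magic, vmaj, vmin, tz, ts_acc, snaplen, network = struct.unpack(
        endian + "IHHiIII", header[:24])
    return {"version": (vmaj, vmin), "snaplen": snaplen, "linktype": network}

# Fabricated header: little-endian magic, v2.4, snaplen 65535, Ethernet (1).
hdr = struct.pack("<IHHiIII", 0xA1B2C3D4, 2, 4, 0, 0, 65535, 1)
print(parse_pcap_header(hdr))
```

Tools such as Wireshark perform this parsing internally before dissecting each captured packet up the OSI layers.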
Internet forensics is more complicated than the others because many computers are connected to one another and can be used concurrently regardless of distance, so this area requires more complex techniques. Through internet forensics, an analyst can track who sent an e-mail, when it was sent, and where the sender is located. This matters given the growing number of fraudulent e-mails that impersonate a particular company with a prize-draw scheme to harm recipients, as well as outright threatening e-mails. Internet forensics has therefore become a very promising science for revealing facts and gathering evidence.
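Tracing an e-mail in this way usually starts with the Received headers that each relaying mail server prepends to the message; reading them bottom-up approximates the route from sender to recipient. A minimal sketch using Python's standard email module; the raw message, hostnames, and addresses below are fabricated for illustration:

```python
from email import message_from_string

# Fabricated raw message with two relay hops (newest Received header on top).
raw = """\
Received: from mx.example.org (mx.example.org [203.0.113.7])
Received: from sender-pc (unknown [198.51.100.23])
From: alice@example.org
Subject: Prize draw notification
Date: Mon, 01 Jan 2018 10:00:00 +0000

You have won!
"""

msg = message_from_string(raw)
# Received headers are stored newest-first; reverse for sender -> recipient.
hops = list(reversed(msg.get_all("Received")))
for i, hop in enumerate(hops, 1):
    print(f"hop {i}: {hop.strip()}")
```

In a real investigation the earliest hop's IP address is then checked against WHOIS and geolocation data, bearing in mind that Received headers below the first trusted relay can be forged by the sender.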
2.2 Digital Evidence
Evidence here refers to information or data; the point of view is the same, but in computer forensics the subject is specifically digital evidence [6]. The more complex the context of the digital evidence, because of the media that embed the data, the harder it is to reveal the facts behind it. Format also affects how digital evidence is handled; documentary digital evidence, for example, is categorized into three parts:
1. Archival Files
2. Active Files
3. Residual Data
Archival files are those kept for an archiving function, including documents stored in a prescribed format and then retrieved and distributed for other needs, such as documents digitized and stored in TIFF format to preserve document quality.
Active files are files in use for purposes closely tied to current activities, such as image files and text documents. Residual data comprises files produced as a by-product of computer processes and user activity, e.g., records of internet usage, database logs, and various temporary files.
Digital evidence is scattered across different media and contexts, so forensic work takes more foresight than simply classifying data [7][8]. Note also that the more peripherals and devices are integrated into a computer system, the more complex it becomes and the more considerations are involved in acquiring digital evidence. This is one of the obstacles in accessing the files that will be used as evidence.
Obstacles that may arise in the field when retrieving data include:
1. Compressed files
2. Files deliberately or accidentally misnamed
3. Files given an incorrect format or extension, intentionally or not
4. Password-protected files
5. Hidden files
6. Encrypted files
7. Watermarking
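Several of these obstacles, compressed archives and misnamed files in particular, can be flagged automatically by comparing a file's leading magic bytes with its claimed extension. A minimal sketch covering a few well-known signatures (the file names and contents are fabricated examples):

```python
from typing import Optional

# Well-known magic bytes (file signatures).
SIGNATURES = {
    b"PK\x03\x04": "zip",
    b"\x1f\x8b": "gzip",
    b"%PDF": "pdf",
    b"\xff\xd8\xff": "jpeg",
}

def sniff(data: bytes) -> Optional[str]:
    """Return the file type implied by the leading magic bytes, if known."""
    for magic, kind in SIGNATURES.items():
        if data.startswith(magic):
            return kind
    return None

def mismatched(name: str, data: bytes) -> bool:
    """Flag a file whose extension disagrees with its actual content."""
    kind = sniff(data)
    ext = name.rsplit(".", 1)[-1].lower()
    # Accept common alias extensions (e.g. .jpg for JPEG content).
    allowed = (kind, {"jpeg": "jpg"}.get(kind, kind))
    return kind is not None and ext not in allowed

# A ZIP archive renamed to look like a plain text document.
print(mismatched("report.txt", b"PK\x03\x04rest-of-archive"))  # True
print(mismatched("photo.jpg", b"\xff\xd8\xffJFIF"))            # False
```

Forensic suites maintain far larger signature databases, but the principle is the same: trust the bytes, not the file name.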
III. RESULT AND DISCUSSION
3.1 Forensic Model
The forensic model is applicable in many fields. It involves three components that are assembled, empowered, and managed toward the ultimate goal with full feasibility and quality. The three components are:
- Human
- Equipment
- Protocol
The human component in computer forensics is the practitioner, who must have certain qualifications to achieve the desired quality. Learning computer forensics is easy, but becoming an expert is another story; it takes more than knowledge, and it is experience that makes someone an expert. There are three groups of computer forensics practitioners: the Collection Specialist, the Examiner, and the Investigator. The Collection Specialist's duty is to gather digital evidence; the Examiner tests media and extracts data; the Investigator operates at the expert level as an investigator proper. The equipment component must be used in such a way as to obtain quality evidence; much is required, involving specific software, various hardware, and various storage media for handling the data later.
Protocol is the most critical component of the computer forensics model: the rules for digging out, obtaining, analyzing, and finally presenting evidence in reports. The rules an expert must follow comprise four phases:
- Collection
- Testing
- Analysis
- Reports
Collection is the first step of the forensic process: identifying potential sources and determining how the data are to be collected. Collection involves increasingly complex processes and methods because of rapid technological development, multiple computers, a wide variety of storage media, and large computer networks with all their attached technologies; this complexity requires differentiated handling. After data collection, the next step is testing, which includes assessing and extracting relevant information from the collected data. Once the information is extracted, the examiner performs an analysis to formulate conclusions describing the data; this analysis requires a methodical approach to generate quality conclusions from the available data. Documentation and reporting are the final stage of computer forensics, in which the information resulting from the analysis is presented.
3.2 Computer Surgery
In performing computer surgery, we need to know which part should be operated on to collect information. The previous chapter discussed the four concentrations of computer forensics; the one operated on in this section is disk forensics.
3.2.1 Windows Registry
Accessing the Windows registry is also called computer surgery, because the registry is a substantial system configuration held in a single logical store. The registry is divided into three separate databases allocated to handle user, system, and network settings, and these sections hold precious information. To dismantle the registry, its structure must first be understood. The registry consists of seven root keys:
- HKEY_CLASSES_ROOT
- HKEY_CURRENT_USER
- HKEY_LOCAL_MACHINE
- HKEY_USERS
- HKEY_CURRENT_CONFIG
- HKEY_DYN_DATA
- HKEY_PERFORMANCE_DATA
The registry reveals whatever information is stored in it. For example, Figure 1 illustrates internet activity that can be read from the following registry key:
HKEY_CURRENT_USER\Software\Microsoft\Internet Explorer\TypedURLs
Figure 1. Internet Activity Registry
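On a live Windows system this key would be read with Python's standard winreg module; as a platform-independent illustration, the sketch below models the hive as nested dictionaries and walks the same TypedURLs path. The hive contents and URLs are fabricated.

```python
# Mock of the HKEY_CURRENT_USER hive as nested dicts (fabricated values);
# a live system would enumerate the real key with the winreg module.
HKEY_CURRENT_USER = {
    "Software": {"Microsoft": {"Internet Explorer": {"TypedURLs": {
        "url1": "http://example.com/",
        "url2": "http://lucky-draw.example/",
    }}}},
}

def read_key(hive: dict, path: str) -> dict:
    """Walk a backslash-separated registry path down to its values."""
    node = hive
    for part in path.split("\\"):
        node = node[part]
    return node

typed = read_key(HKEY_CURRENT_USER,
                 r"Software\Microsoft\Internet Explorer\TypedURLs")
for name, url in typed.items():
    print(name, url)
```

The same traversal works for the other root keys; forensically interesting values such as typed URLs persist even after the user clears the browser history view.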
3.2.2 Post-Damage Handling
Repairing a damaged system requires recovery software. Damage can be caused by human carelessness or by viruses infecting the computer, and the parts of the computer that must be restored also need to be known. This section gives a simple example: registry damage that modifies the Recycle Bin name.
In such a case, an analyst can usually determine the cause and the method of recovery. There are two causal factors: a virus, or deliberate action by the user. Note that when a virus is the cause, the virus writer's main target is usually the Windows registry, because through it the virus can disable the computer system by destroying, modifying, or hiding registry entries.
3.3 The Legal Role of Digital Forensics
Electronic information can be distinguished from electronic documents, but the two cannot be separated. Electronic information is information contained in a medium: news, sound recordings, pictures, videos, or anything else that refers to an event. An electronic document, meanwhile, is the form in which that information is stored or wrapped: a recorded conversation kept in MP3 or WAV format, for example, or secret information stored in an encrypted image file.
The scope of admissible evidence is set in the laws of each country, including laws on corporate documents, terrorism, corruption eradication, and money-laundering crime. Electronic law confirms that, in all applicable procedural laws, electronic information, electronic documents, and their printouts may be used as legal evidence in court. Thus e-mail, chat transcripts, and various other electronic documents can serve as valid evidence, and several court decisions address the position and acknowledgment of electronic evidence presented at trial.
Presentation of digital evidence is the trial process in which the digital evidence is verified and its linkage to the current case is established: the appointment of digital evidence related to the ongoing crime. Investigation takes a long time to reveal the truth and find the cause of a case, and the trial process takes a long time as well. Digital evidence is expected to remain original and unmodified when identified by the investigator at the time it is found.
The important concept investigators must understand to protect evidence is the chain of custody: preserving the evidence obtained at the time of the crime or case by minimizing the damage caused by investigation and analysis. Evidence must be genuine; when the investigator examines it, it must not be altered, physically or non-physically, so that the messages carried by the evidence are not lost or modified.
The goals of the chain of custody are:
- The evidence is original
- At the time of the trial, the evidence is still as it
was found.
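These goals are commonly enforced by hashing the evidence at seizure and logging every hand-off; re-hashing later proves the item is still as it was found. A minimal sketch using Python's standard hashlib; the custodian names and evidence bytes are fabricated for the example.

```python
import hashlib

def seize(evidence: bytes, custodian: str) -> dict:
    """Open a chain-of-custody record with the digest taken at seizure."""
    return {"sha256": hashlib.sha256(evidence).hexdigest(),
            "log": [("seized", custodian)]}

def transfer(record: dict, to: str) -> None:
    """Log a hand-off without touching the evidence itself."""
    record["log"].append(("transferred", to))

def still_original(record: dict, evidence: bytes) -> bool:
    """True if the evidence still hashes to the digest recorded at seizure."""
    return hashlib.sha256(evidence).hexdigest() == record["sha256"]

item = b"recovered e-mail fragment"
record = seize(item, "collection specialist")
transfer(record, "examiner")
print(still_original(record, item))            # True: still as found
print(still_original(record, item + b"\x00"))  # False: tampering detected
```

In practice the acquisition tool hashes a full disk image at collection time, and the same digest is re-computed before trial to demonstrate both goals above.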
IV. CONCLUSION
There are many more areas of computer forensics to explore in depth. The field has become a significant part of exposing computer crimes, not least because science advances rapidly while human conduct drifts ever further from moral and religious values. Monitoring is needed for human activities that concern the public interest, especially now that internet access is so easy that people can reach anything in the world even from a mobile phone. Another important point is that the existence of the law does not entitle everyone to act as a computer forensics practitioner. Rules already determine who may conduct computer forensics: only authorized officers such as the police or a prosecutor are entitled to investigate and obtain digital evidence from other persons, unless the competent authority has designated someone else. It is therefore advisable not to apply the techniques described in the previous chapters for personal benefit, but only for learning and for the development of science and technology.
V. REFERENCES
[1]. R. Kaur and A. Kaur, "Digital Forensics,"
International Journal of Computer Applications,
vol. 50, no. 5, pp. 5-9, 2012.
[2]. F. Jafari and R. S. Satti, "Comparative Analysis of
Digital Forensic Models," Journal of Advances in
Computer Networks, vol. 3, no. 1, pp. 82-86,
2015.
[3]. R. Rowlingson, "A Ten Step Process for Forensic
Readiness," International Journal of Digital
Evidence, vol. 2, no. 3, pp. 1-28, 2004.
[4]. S. Perumal, "Digital Forensic Model Based On
Malaysian Investigation Process," International
Journal of Computer Science and Network
Security, vol. 9, no. 8, pp. 38-44, 2009.
[5]. M. R. Chourasiya and A. P. Wadhe,
"Implementation of Video Forensics Frame Work
for Video Source Identification," in International
Conference on Science and Technology for
Sustainable Development, Kuala Lumpur, 2016.
[6]. H. K. Siburian, "A Study of Internet and Cyber
Crime," International Journal of Scientific
Research in Science and Technology, vol. 2, no. 6,
pp. 223-226, 2016.
[7]. Y. M. Saragih and A. P. U. Siahaan, "Cyber
Crime Prevention Strategy in Indonesia,"
International Journal of Humanities and Social
Science, vol. 3, no. 6, pp. 22-26, 2016.
[8]. H. K. Siburian, "Emerging Issue in Cyber Crime:
Case Study Cyber Crime in Indonesia,"
International Journal of Science and Research,
vol. 5, no. 11, pp. 511-514, 2016.