- This is my first article; it is for my Final Year Project for the Bachelor of Computer Science (Systems and Networking)
- It will also be uploaded to the CyberSecurity Malaysia E-Bulletin for 2017
Analysis of IT Monitoring Using Open Source Software Techniques: A Review – IJERD Editor
Network administrators usually rely on generic, built-in monitoring tools for network security. Ideally, the network infrastructure should have carefully designed strategies to scale up monitoring tools and techniques as the network grows over time. Without this, there can be network performance problems, downtime due to failures and, most importantly, penetration attacks. These can lead to monetary losses as well as loss of reputation. Thus, there is a need for best practices to monitor network infrastructure in an agile manner. Network security monitoring involves collecting network packet data, segregating it across the seven OSI layers, and applying intelligent algorithms to answer security-related questions. The purpose is to know in real time what is happening on the network at a detailed level, and to strengthen security by hardening processes, devices, appliances, software policies, and so on. The Multi Router Traffic Grapher (MRTG) is free software for monitoring and measuring the traffic load on network links. It lets the user view the traffic load on a network over time in graphical form.
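The core of what an MRTG-style poller does can be sketched in a few lines: sample a monotonically increasing interface octet counter at a fixed interval and convert the delta to a bit rate. The counter values below are hypothetical; real tools read them via SNMP (e.g. the IF-MIB ifInOctets object).

```python
# Sketch of the rate calculation an MRTG-style poller performs.
# 32-bit SNMP counters wrap around, so a negative delta means the
# counter rolled over between the two samples.

COUNTER_MAX = 2**32

def rate_bps(prev_octets, curr_octets, interval_s):
    """Traffic rate in bits/s between two counter samples."""
    delta = curr_octets - prev_octets
    if delta < 0:                    # counter wrapped between samples
        delta += COUNTER_MAX
    return delta * 8 / interval_s    # octets -> bits, per second

# Two samples taken 300 s apart (MRTG's default 5-minute interval):
print(rate_bps(1_000_000, 38_500_000, 300))   # 1000000.0 bps
```

Graphing is then just plotting these rates over successive intervals.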
SECURING BGP BY HANDLING DYNAMIC NETWORK BEHAVIOR AND UNBALANCED DATASETS – IJCNCJournal
The Border Gateway Protocol (BGP) provides crucial routing information for the Internet infrastructure. Abnormal routing behavior affects the stability and connectivity of the global Internet. The biggest hurdles in detecting BGP attacks are the extremely unbalanced class distribution of the datasets and the dynamic nature of the network, both of which degrade classifier performance. This paper proposes an efficient approach to managing these problems: it tackles unbalanced classification by turning the binary classification problem into a multiclass classification problem. This is achieved by splitting the majority-class samples evenly into multiple segments using Affinity Propagation, where the number of segments is chosen so that the number of samples in any segment closely matches that of the minority class. These segments, together with the minority class, are then treated as distinct classes and used to train an Extreme Learning Machine (ELM). The RIPE and BCNET datasets are used to evaluate the performance of the proposed technique. When no feature selection is used, the proposed technique improves the F1 score by 1.9% compared with state-of-the-art techniques. With the Fisher feature selection algorithm, it achieves the highest F1 score of 76.3%, a 1.7% improvement over the compared methods. Additionally, the MIQ feature selection technique improves accuracy by 3.5%. For the BCNET dataset, the proposed technique improves the F1 score by 1.8% with the Fisher feature selection technique. The experimental findings confirm a substantial performance improvement of the new technique over previous approaches.
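The rebalancing idea above can be sketched concretely: the majority class is split into k segments, each roughly the size of the minority class, and each segment becomes its own label in a multiclass problem. The paper clusters with Affinity Propagation; a plain round-robin split is used here purely to illustrate the relabeling step, and the sample values are hypothetical.

```python
# Turn an unbalanced binary problem into a multiclass one by splitting
# the majority class into k minority-sized segments (labels 0..k-1)
# and giving the minority class label k.

def rebalance(majority, minority):
    k = max(1, round(len(majority) / len(minority)))  # number of segments
    segments = [majority[i::k] for i in range(k)]     # even round-robin split
    X = [x for seg in segments for x in seg] + list(minority)
    y = [i for i, seg in enumerate(segments) for _ in seg] + [k] * len(minority)
    return X, y, k

maj = list(range(100))        # 100 hypothetical majority-class samples
mino = list(range(100, 110))  # 10 minority samples -> k = 10 segments
X, y, k = rebalance(maj, mino)
print(k, y.count(0), y.count(k))   # 10 10 10
```

A multiclass learner (an ELM in the paper) is then trained on (X, y); at prediction time, any of labels 0..k-1 maps back to the original majority class.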
International Journal of Engineering Research and Applications (IJERA) is an open access online peer reviewed international journal that publishes research and review articles in the fields of Computer Science, Neural Networks, Electrical Engineering, Software Engineering, Information Technology, Mechanical Engineering, Chemical Engineering, Plastic Engineering, Food Technology, Textile Engineering, Nano Technology & science, Power Electronics, Electronics & Communication Engineering, Computational mathematics, Image processing, Civil Engineering, Structural Engineering, Environmental Engineering, VLSI Testing & Low Power VLSI Design etc.
MEKDA: Multi-Level ECC based Key Distribution and Authentication in Internet ... – IJCNCJournal
The Internet of Things (IoT) is an extensive system of networks and connected devices with minimal human interaction and swift growth. System constraints and device limitations pose several challenges, including security; billions of devices must therefore be protected from attacks and compromise. The resource-constrained nature of IoT devices amplifies these security challenges, so standard data communication and security measures are inefficient in the IoT environment. The ubiquity of IoT devices and their deployment in sensitive applications mean that security breaches can put lives at risk. Hence, IoT-related security challenges are of great concern. Authentication addresses the vulnerability posed by a malicious device in the IoT environment. The proposed Multi-level Elliptic Curve Cryptography based Key Distribution and Authentication in IoT enhances security through multi-level authentication when devices enter or exit a cluster in an IoT system. Reduced computation time and energy consumption, achieved by generating and distributing keys using Elliptic Curve Cryptography, extend the availability of the IoT devices. The performance analysis shows an improvement over the Fast Authentication and Data Transfer method.
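The ECC primitive underlying such key-distribution schemes is elliptic-curve Diffie-Hellman key agreement. A minimal sketch over a deliberately tiny curve follows; the curve, base point, and private keys are illustrative only, and a real IoT deployment would use a standardized curve and a vetted library, not hand-rolled arithmetic.

```python
# Toy ECDH over y^2 = x^3 + 2x + 3 (mod 97) to illustrate ECC key
# agreement. Both parties derive the same shared point from their own
# private key and the peer's public key.

P, A = 97, 2                # field prime and curve coefficient a
G = (3, 6)                  # base point: 6^2 = 3^3 + 2*3 + 3 (mod 97)

def add(p1, p2):
    """Elliptic-curve point addition; None is the point at infinity."""
    if p1 is None: return p2
    if p2 is None: return p1
    (x1, y1), (x2, y2) = p1, p2
    if x1 == x2 and (y1 + y2) % P == 0:
        return None                                        # p1 == -p2
    if p1 == p2:
        lam = (3 * x1 * x1 + A) * pow(2 * y1, -1, P) % P   # tangent slope
    else:
        lam = (y2 - y1) * pow(x2 - x1, -1, P) % P          # chord slope
    x3 = (lam * lam - x1 - x2) % P
    return (x3, (lam * (x1 - x3) - y1) % P)

def mul(k, pt):
    """Scalar multiplication by double-and-add."""
    acc = None
    while k:
        if k & 1:
            acc = add(acc, pt)
        pt = add(pt, pt)
        k >>= 1
    return acc

a_priv, b_priv = 7, 11                    # hypothetical private keys
a_pub, b_pub = mul(a_priv, G), mul(b_priv, G)
# Both sides compute the same shared point: a*(b*G) == b*(a*G)
print(mul(a_priv, b_pub) == mul(b_priv, a_pub))   # True
```

The shared point (or a hash of it) then serves as the session key material that the paper's multi-level scheme distributes as devices join or leave a cluster.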
Distributed reflection denial of service attack: A critical review – IJECEIAES
As the world becomes increasingly connected, the number of users grows exponentially, and "things" go online, cyberspace has become a significant target for cybercriminals. Any host or device exposed on the internet is a prime target for cyberattacks. Denial-of-service (DoS) attacks account for the majority of these cyberattacks. Although researchers have proposed various solutions to mitigate this issue, cybercriminals continually adapt their attack approach to circumvent countermeasures. One modified DoS attack is the distributed reflection denial-of-service (DRDoS) attack. This type of attack is considered a more severe variant of the DoS attack and can be conducted over the transmission control protocol (TCP) and the user datagram protocol (UDP). However, the attack is not effective over TCP, because the three-way handshake prevents it from passing through the network layer to the upper layers in the network stack. UDP, on the other hand, is a connectionless protocol, so most DRDoS attacks are carried over UDP. This study aims to examine and identify the differences between TCP-based and UDP-based DRDoS attacks.
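The severity of a DRDoS attack is commonly summarized by its bandwidth amplification factor: the size of the reflector's response divided by the size of the spoofed request. The byte counts below are hypothetical illustrations, not measured values for any specific protocol.

```python
# Bandwidth amplification factor of a reflection attack: how many
# bytes the victim receives per byte the attacker sends.

def amplification_factor(request_bytes, response_bytes):
    return response_bytes / request_bytes

# A small spoofed UDP query eliciting a large reply from a reflector:
print(amplification_factor(64, 3200))   # 50.0
```

A factor well above 1 is what makes connectionless UDP services attractive reflectors, since the attacker's spoofed request costs far less bandwidth than the reply the victim absorbs.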
A COOPERATIVE LOCALIZATION METHOD BASED ON V2I COMMUNICATION AND DISTANCE INF... – IJCNCJournal
Relative positioning is a recent solution to the limited accuracy of GPS in urban environments. Vehicle positions obtained using V2I communication are more accurate because the known roadside unit (RSU) locations help predict measurement errors over time. The accuracy of vehicle positions depends largely on the number of RSUs; however, high installation costs limit this approach. It also depends on the nonlinear nature of the localization problem, which several previous studies neglected; in those studies, accumulated errors grew over time because the localization problem was treated as linear. In the present study, a cooperative localization method based on V2I communication and distance information in vehicular networks is proposed to improve the estimates of vehicles' initial positions. The method assumes that virtual RSUs, derived from mobility measurements, help reduce installation costs and facilitate handling of fault environments. The extended Kalman filter (EKF) algorithm is a well-known estimator for nonlinear problems, but it requires a good initial vehicle position vector and adaptive measurement noise. Using the proposed method, vehicles' initial positions can be estimated accurately. The experimental results confirm that the proposed method achieves better accuracy than existing methods, giving a root mean square error of approximately 1 m. In addition, it is shown that virtual RSUs can assist in estimating initial positions in fault environments.
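The EKF step at the heart of range-based localization can be sketched for the simplest case: a 2-D vehicle position updated from a single distance measurement to one RSU at a known location. The distance model is nonlinear, so the EKF linearizes it with a Jacobian at the current estimate. All numbers (prior, RSU position, noise levels) are illustrative, and this shows only the measurement update, not the paper's full cooperative method.

```python
# One EKF measurement update for range-based localization.
import math

def ekf_range_update(x, Pcov, rsu, z, r_var):
    """x: [px, py] estimate, Pcov: 2x2 covariance,
    rsu: known RSU position, z: measured distance, r_var: meas. variance."""
    dx, dy = x[0] - rsu[0], x[1] - rsu[1]
    d = math.hypot(dx, dy)                  # predicted distance h(x)
    H = [dx / d, dy / d]                    # Jacobian of h at x (1x2)
    PH = [Pcov[0][0] * H[0] + Pcov[0][1] * H[1],
          Pcov[1][0] * H[0] + Pcov[1][1] * H[1]]      # P H^T
    S = H[0] * PH[0] + H[1] * PH[1] + r_var           # innovation covariance
    K = [PH[0] / S, PH[1] / S]              # Kalman gain (2x1)
    y = z - d                               # innovation
    x_new = [x[0] + K[0] * y, x[1] + K[1] * y]
    # P_new = (I - K H) P
    P_new = [[(1 - K[0] * H[0]) * Pcov[0][0] - K[0] * H[1] * Pcov[1][0],
              (1 - K[0] * H[0]) * Pcov[0][1] - K[0] * H[1] * Pcov[1][1]],
             [-K[1] * H[0] * Pcov[0][0] + (1 - K[1] * H[1]) * Pcov[1][0],
              -K[1] * H[0] * Pcov[0][1] + (1 - K[1] * H[1]) * Pcov[1][1]]]
    return x_new, P_new

# Prior guess (10, 10) with large uncertainty; an RSU at the origin
# reports a measured range of 10 m.
x, Pc = ekf_range_update([10.0, 10.0], [[4.0, 0.0], [0.0, 4.0]],
                         (0.0, 0.0), 10.0, 1.0)
# The estimate moves toward the 10 m range circle around the RSU and
# the covariance shrinks along the measured direction.
```

Measurements from further RSUs (or the paper's virtual RSUs) are folded in by repeating this update, which is why a good initial position vector matters: the Jacobian is evaluated at the current estimate.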
Privacy Preserving Public Auditing and Data Integrity for Secure Cloud Storag... – INFOGAIN PUBLICATION
Using cloud services, anyone can remotely store their data and obtain on-demand, high-quality applications and services from a shared pool of computing resources, without the burden of local data storage and maintenance. The cloud is a common place for storing data as well as for sharing it. However, preserving the privacy and maintaining the integrity of data during public auditing remains an open challenge. In this paper, we introduce a third-party auditor (TPA) that keeps track of all files along with their integrity. The task of the TPA is to verify the data so that the user can be worry-free. Verification is performed on the aggregate authenticators sent by the user and the Cloud Service Provider (CSP). To this end, we propose a secure cloud storage system that supports privacy-preserving public auditing and blockless data verification over the cloud.
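The challenge/verify flow of such auditing can be sketched with per-block digests standing in for the paper's aggregate authenticators: the TPA holds only the digests, challenges random blocks, and compares what the cloud returns. Real schemes use homomorphic authenticators so the TPA never sees block contents; plain hashes are used here only to illustrate the protocol shape, and all file contents are hypothetical.

```python
# Sketch of a third-party audit: the TPA holds digests, not data, and
# spot-checks random blocks held by the cloud provider.
import hashlib, random

def digest(block: bytes) -> str:
    return hashlib.sha256(block).hexdigest()

# Client side: split the file into blocks and hand digests to the TPA.
blocks = [b"block-%d" % i for i in range(8)]          # hypothetical file
tpa_tags = {i: digest(b) for i, b in enumerate(blocks)}

# Cloud side: the provider stores the blocks (here, one is corrupted).
cloud = dict(enumerate(blocks))
cloud[5] = b"tampered"

def audit(tpa_tags, cloud, n_challenges=4, seed=None):
    """TPA challenges random block indices; returns indices that fail."""
    rng = random.Random(seed)
    challenged = rng.sample(sorted(tpa_tags), n_challenges)
    return [i for i in challenged if digest(cloud[i]) != tpa_tags[i]]

print(audit(tpa_tags, cloud, n_challenges=8))   # [5]: the corrupted block
```

Challenging a random subset rather than every block is what keeps public auditing cheap: a provider that has lost or altered even a small fraction of blocks is caught with high probability after a few rounds.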
Post-AlphaGo Deep Learning Innovation Status!
Patents are a good information resource for gaining insight into the state of the art in deep reinforcement learning (Deep RL) technology innovation.
I. Deep RL Technology Innovation Status
Patents that specifically describe the major Deep RL technologies are a good indicator of the Deep RL innovations at a specific innovating entity. To determine the deep learning technology innovation status of Deep RL, patent applications in the USPTO as of May 31, 2020 that specifically describe the major Deep RL technologies were searched and reviewed. 260 published patent applications related to key Deep RL technology innovations were selected for detailed analysis.
II. Deep RL Technology Innovation Details
Patent information can provide many valuable insights for developing and implementing new technologies. Patents can also be used to identify new product and service development opportunities.
Autonomous Driving Vehicle (ADV)/Actor-Critic Algorithm
Telecommunications/Deep Q-Network (DQN) Learning
FinTech/Deep Q-Network (DQN) Learning
Google Deepmind’s Innovation for Challenging Deep RL Issues
DESIGN AND IMPLEMENTATION OF THE ADVANCED CLOUD PRIVACY THREAT MODELING – IJNSA Journal
Privacy-preservation for sensitive data has become a challenging issue in cloud computing. Threat
modeling as a part of requirements engineering in secure software development provides a structured
approach for identifying attacks and proposing countermeasures against the exploitation of vulnerabilities
in a system. This paper describes an extension of Cloud Privacy Threat Modeling (CPTM) methodology for
privacy threat modeling in relation to processing sensitive data in cloud computing environments. It
describes the modeling methodology, which involved applying Method Engineering to specify the characteristics of a cloud privacy threat modeling methodology, the different steps in the proposed methodology, and the corresponding products. In addition, a case study has been implemented as a proof of concept to
demonstrate the usability of the proposed methodology. We believe that the extended methodology
facilitates the application of a privacy-preserving cloud software development approach from requirements
engineering to design.
Abstract: Cloud computing is a recent trend and a hot topic in today's global world, in which resources are provided to users on demand, typically over the internet. Mobile cloud computing is cloud computing in which at least some of the participating devices are mobile or wireless. This paper surveys multiple procedures and approaches for mobile cloud computing, covering both general mobile cloud computing solutions and application-specific solutions. It also considers cloud computing scenarios in which mobile phones are used to browse the web, write e-mails, watch videos, and so on. Mobile phones are becoming the universal interface to online services, and cloud computing applications generally run locally on mobile phones.
The Impact on Security due to the Vulnerabilities Existing in the network a S... – IJAEMSJORNAL
Software Defined Networking (SDN), an emerging technology, is taking the networking sector in a new direction. A sector once focused entirely on hardware infrastructure is now moving towards software programming. Due to the exponential growth in the number of users and the amount of information over the wires, the existing IP network architecture faces great risk. Software Defined Networking provides a feasible solution to this problem through virtualization, offering a viable path to virtualizing and managing network resources in an on-demand manner. This study focuses on the drawbacks of the existing technology and gives a fine-grained introduction to Software Defined Networking. It also reviews the current steps taken in industry to implement Software Defined Networking, and walks through SDN's security features, its advantages and limitations, and the further scope for identifying loopholes in its security.
The Standard Audit File (Jednolity Plik Kontrolny) – new obligations for taxpayers keeping accounting books ... – Rekord SI sp. z o.o.
The obligation to submit tax books, at the request of the tax authority, by means of electronic communication was introduced by the act amending the Tax Ordinance Act (Dz.U. z 2015 poz. 1649). As a rule, this obligation applies to all taxpayers, but at the same time the legislator used, in the transitional provisions, the notion of "entrepreneurs" and their categories (a direct reference to the Act on Economic Activity) to differentiate the dates on which the provisions of the act take effect.
REKORD.ERP, combined with the KAIZEN philosophy, enables the continuous improvement and streamlining of a company's operations. The system's functionality is continually extended, and its modular architecture allows it to be adapted to an organization's needs and investment capabilities. This improves quality and processes and, in turn, helps achieve an advantage over the competition.
SEAMLESS AUTOMATION AND INTEGRATION OF MACHINE LEARNING CAPABILITIES FOR BIG ... – ijdpsjournal
The paper proposes a solution for designing and developing seamless automation and integration of machine learning capabilities for Big Data, with the following requirements: 1) the ability to seamlessly handle and scale very large amounts of unstructured and structured data from diversified and heterogeneous sources; 2) the ability to systematically determine the steps and procedures needed for analyzing Big Data datasets based on data characteristics, domain expert inputs, and a data pre-processing component; 3) the ability to automatically select the most appropriate libraries and tools to compute and accelerate the machine learning computations; and 4) the ability to perform Big Data analytics with high learning performance but with minimal human intervention and supervision. The focus is on providing a seamless, automated, and integrated solution that can be effectively used to analyze Big Data with high-frequency and high-dimensional features, across different types of data characteristics and different application problem domains, with high accuracy, robustness, and scalability. This paper highlights the research methodologies and research activities that we propose the Big Data researchers and practitioners conduct in order to develop and support seamless automation and integration of machine learning capabilities for Big Data analytics.
SEAMLESS AUTOMATION AND INTEGRATION OF MACHINE LEARNING CAPABILITIES FOR BIG ...ijdpsjournal
The paper aims at proposing a solution for designing and developing a seamless automation and integration of machine learning capabilities for Big Data with the following requirements: 1) the ability to seamlessly handle and scale very large amount of unstructured and structured data from diversified and heterogeneous sources; 2) the ability to systematically determine the steps and procedures needed for
analyzing Big Data datasets based on data characteristics, domain expert inputs, and data pre-processing component; 3) the ability to automatically select the most appropriate libraries and tools to compute and accelerate the machine learning computations; and 4) the ability to perform Big Data analytics with high learning performance, but with minimal human intervention and supervision. The whole focus is to provide
a seamless automated and integrated solution which can be effectively used to analyze Big Data with highfrequency
and high-dimensional features from different types of data characteristics and different application problem domains, with high accuracy, robustness, and scalability. This paper highlights the research methodologies and research activities that we propose to be conducted by the Big Data researchers and practitioners in order to develop and support seamless automation and integration of machine learning capabilities for Big Data analytics.
Decision Making Framework in e-Business Cloud Environment Using Software Metr...ijitjournal
Cloud computing technology is most important one in IT industry by enabling them to offer access to their
system and application services on payment type. As a result, more than a few enterprises with Facebook,
Microsoft, Google, and amazon have started offer to their clients. Quality software is most important one in
market competition in this paper presents a hybrid framework based on the goal/question/metric paradigm
to evaluate the quality and effectiveness of previous software goods in project, product and organizations
in a cloud computing environment. In our approach it support decision making in the area of project,
product and organization levels using Neural networks and three angular metrics i.e., project metrics,
product metrics, and organization metrics
Case Study—PART 1—Jurisdictional Declaration CriteriaLevels .docxketurahhazelhurst
Case Study—PART 1—Jurisdictional Declaration
Criteria
Levels of Achievement
Content
(70%)
Advanced
92-100%
Proficient
84-91%
Developing
1-83%
Not Present
Total
Economic Development Location
18.5 to 20 points:
Location is clearly delineated, fully meeting assignment standards.
16.5 to 18 points:
Meets most of the assignment standards.
1 to 16 points:
Location needs further specification before the student may proceed.
0 points
Not present
Action Research Statement of Work Understood and Signed
13.5 to 15 points:
Template completed.
12.5 to 13 points:
Template partially completed.
1 to 12 points:
Student Modified Template
0 points
Not present
Structure (30%)
Advanced
92-100%
Proficient
84-91%
Developing
1-83%
Not present
Total
Formatting, Spelling, and Grammar
13.5 to 15 points:
No spelling or grammar errors
12.5 to 13 points:
1-2 spelling and/or
grammar errors
1 to 12 points:
3-4 spelling and/or
grammar errors
0 points
Not present
Professor Comments:
Total:
/50
Running head: NETWORK DESCRIPTION 1
NETWORK DESCRIPTION 6
NETWORK DESCRIPTION
Institution Affiliation
Student Name
Date
HEALTH-COP COMPANY
Network and Workflow Description
Data mining is a complex process that involves several activities undertaken sequentially for the entire process to be successful. As such, there are specific protocols that must be followed in data mining. The desired goals and objectives are the guiding principles upon which the type of data to be analyzed is identified. The main goal for Health-Cop is to establish links between diet composition and health issues. More specifically, the company will focus on analysis of data from various health facilities, websites, databases and health journals. The analysis is intended to provide new forms of data that can be interpreted to give meaningful patterns. To facilitate the process of data mining, there are several aspects that must be considered such as: statistics, clustering of data, rules of association, data classification, visualization and the decision tree.
Network Description
Health Cop company will set up is network system using both the windows and Linux based operating system. The company will have 10 desktop computers and 5 portable computers. The 10 desktop computers will be connected together via a metered Wi-Fi service. The desktops will be the main engine of the company. All the desktops will be configured with an algorithm that constantly searches for specific keywords from various databases. The portables computers will be connected to the internet via modems. A modem is much safer since it limits the connectivity to only the device being used. Internet connectivity via modem is facilitated through local area networks (LAN), through to the service providers, (Cui, et.al., 2016). Multiple firewalls are set up within the company networks to sort out undesired data traffic from the local network on the computer devices.
The most suitable firewall for the network w ...
Shape based plagiarism detection for flowchart figures in textsijcsit
Plagiarism detection is well known phenomenon in the academic arena. Copying other people is
considered as serious offence that needs to be checked. There are many plagiarism detection systems such
as turn-it-in that has been developed to provide this checks. Most, if not all, discard the figures and charts
before checking for plagiarism. Discarding the figures and charts results in look holes that people can take
advantage. That means people can plagiarized figures and charts easily without the current plagiarism
systems detecting it. There are very few papers which talks about flowcharts plagiarism detection.
Therefore, there is a need to develop a system that will detect plagiarism in figures and charts. This paper
presents a method for detecting flow chart figure plagiarism based on shape-based image processing and
multimedia retrieval. The method managed to retrieve flowcharts with ranked similarity according to
different matching sets.
Similar to Visualization of Computer Forensics Analysis on Digital Evidence (20)
Show drafts
volume_up
Empowering the Data Analytics Ecosystem: A Laser Focus on Value
The data analytics ecosystem thrives when every component functions at its peak, unlocking the true potential of data. Here's a laser focus on key areas for an empowered ecosystem:
1. Democratize Access, Not Data:
Granular Access Controls: Provide users with self-service tools tailored to their specific needs, preventing data overload and misuse.
Data Catalogs: Implement robust data catalogs for easy discovery and understanding of available data sources.
2. Foster Collaboration with Clear Roles:
Data Mesh Architecture: Break down data silos by creating a distributed data ownership model with clear ownership and responsibilities.
Collaborative Workspaces: Utilize interactive platforms where data scientists, analysts, and domain experts can work seamlessly together.
3. Leverage Advanced Analytics Strategically:
AI-powered Automation: Automate repetitive tasks like data cleaning and feature engineering, freeing up data talent for higher-level analysis.
Right-Tool Selection: Strategically choose the most effective advanced analytics techniques (e.g., AI, ML) based on specific business problems.
4. Prioritize Data Quality with Automation:
Automated Data Validation: Implement automated data quality checks to identify and rectify errors at the source, minimizing downstream issues.
Data Lineage Tracking: Track the flow of data throughout the ecosystem, ensuring transparency and facilitating root cause analysis for errors.
5. Cultivate a Data-Driven Mindset:
Metrics-Driven Performance Management: Align KPIs and performance metrics with data-driven insights to ensure actionable decision making.
Data Storytelling Workshops: Equip stakeholders with the skills to translate complex data findings into compelling narratives that drive action.
Benefits of a Precise Ecosystem:
Sharpened Focus: Precise access and clear roles ensure everyone works with the most relevant data, maximizing efficiency.
Actionable Insights: Strategic analytics and automated quality checks lead to more reliable and actionable data insights.
Continuous Improvement: Data-driven performance management fosters a culture of learning and continuous improvement.
Sustainable Growth: Empowered by data, organizations can make informed decisions to drive sustainable growth and innovation.
By focusing on these precise actions, organizations can create an empowered data analytics ecosystem that delivers real value by driving data-driven decisions and maximizing the return on their data investment.
Opendatabay - Open Data Marketplace.pptxOpendatabay
Opendatabay.com unlocks the power of data for everyone. Open Data Marketplace fosters a collaborative hub for data enthusiasts to explore, share, and contribute to a vast collection of datasets.
First ever open hub for data enthusiasts to collaborate and innovate. A platform to explore, share, and contribute to a vast collection of datasets. Through robust quality control and innovative technologies like blockchain verification, opendatabay ensures the authenticity and reliability of datasets, empowering users to make data-driven decisions with confidence. Leverage cutting-edge AI technologies to enhance the data exploration, analysis, and discovery experience.
From intelligent search and recommendations to automated data productisation and quotation, Opendatabay AI-driven features streamline the data workflow. Finding the data you need shouldn't be a complex. Opendatabay simplifies the data acquisition process with an intuitive interface and robust search tools. Effortlessly explore, discover, and access the data you need, allowing you to focus on extracting valuable insights. Opendatabay breaks new ground with a dedicated, AI-generated, synthetic datasets.
Leverage these privacy-preserving datasets for training and testing AI models without compromising sensitive information. Opendatabay prioritizes transparency by providing detailed metadata, provenance information, and usage guidelines for each dataset, ensuring users have a comprehensive understanding of the data they're working with. By leveraging a powerful combination of distributed ledger technology and rigorous third-party audits Opendatabay ensures the authenticity and reliability of every dataset. Security is at the core of Opendatabay. Marketplace implements stringent security measures, including encryption, access controls, and regular vulnerability assessments, to safeguard your data and protect your privacy.
Explore our comprehensive data analysis project presentation on predicting product ad campaign performance. Learn how data-driven insights can optimize your marketing strategies and enhance campaign effectiveness. Perfect for professionals and students looking to understand the power of data analysis in advertising. for more details visit: https://bostoninstituteofanalytics.org/data-science-and-artificial-intelligence/
Adjusting primitives for graph : SHORT REPORT / NOTESSubhajit Sahu
Graph algorithms, like PageRank Compressed Sparse Row (CSR) is an adjacency-list based graph representation that is
Multiply with different modes (map)
1. Performance of sequential execution based vs OpenMP based vector multiply.
2. Comparing various launch configs for CUDA based vector multiply.
Sum with different storage types (reduce)
1. Performance of vector element sum using float vs bfloat16 as the storage type.
Sum with different modes (reduce)
1. Performance of sequential execution based vs OpenMP based vector element sum.
2. Performance of memcpy vs in-place based CUDA based vector element sum.
3. Comparing various launch configs for CUDA based vector element sum (memcpy).
4. Comparing various launch configs for CUDA based vector element sum (in-place).
Sum with in-place strategies of CUDA mode (reduce)
1. Comparing various launch configs for CUDA based vector element sum (in-place).
StarCompliance is a leading firm specializing in the recovery of stolen cryptocurrency. Our comprehensive services are designed to assist individuals and organizations in navigating the complex process of fraud reporting, investigation, and fund recovery. We combine cutting-edge technology with expert legal support to provide a robust solution for victims of crypto theft.
Our Services Include:
Reporting to Tracking Authorities:
We immediately notify all relevant centralized exchanges (CEX), decentralized exchanges (DEX), and wallet providers about the stolen cryptocurrency. This ensures that the stolen assets are flagged as scam transactions, making it impossible for the thief to use them.
Assistance with Filing Police Reports:
We guide you through the process of filing a valid police report. Our support team provides detailed instructions on which police department to contact and helps you complete the necessary paperwork within the critical 72-hour window.
Launching the Refund Process:
Our team of experienced lawyers can initiate lawsuits on your behalf and represent you in various jurisdictions around the world. They work diligently to recover your stolen funds and ensure that justice is served.
At StarCompliance, we understand the urgency and stress involved in dealing with cryptocurrency theft. Our dedicated team works quickly and efficiently to provide you with the support and expertise needed to recover your assets. Trust us to be your partner in navigating the complexities of the crypto world and safeguarding your investments.
As Europe's leading economic powerhouse and the fourth-largest hashtag#economy globally, Germany stands at the forefront of innovation and industrial might. Renowned for its precision engineering and high-tech sectors, Germany's economic structure is heavily supported by a robust service industry, accounting for approximately 68% of its GDP. This economic clout and strategic geopolitical stance position Germany as a focal point in the global cyber threat landscape.
In the face of escalating global tensions, particularly those emanating from geopolitical disputes with nations like hashtag#Russia and hashtag#China, hashtag#Germany has witnessed a significant uptick in targeted cyber operations. Our analysis indicates a marked increase in hashtag#cyberattack sophistication aimed at critical infrastructure and key industrial sectors. These attacks range from ransomware campaigns to hashtag#AdvancedPersistentThreats (hashtag#APTs), threatening national security and business integrity.
🔑 Key findings include:
🔍 Increased frequency and complexity of cyber threats.
🔍 Escalation of state-sponsored and criminally motivated cyber operations.
🔍 Active dark web exchanges of malicious tools and tactics.
Our comprehensive report delves into these challenges, using a blend of open-source and proprietary data collection techniques. By monitoring activity on critical networks and analyzing attack patterns, our team provides a detailed overview of the threats facing German entities.
This report aims to equip stakeholders across public and private sectors with the knowledge to enhance their defensive strategies, reduce exposure to cyber risks, and reinforce Germany's resilience against cyber threats.
Chatty Kathy - UNC Bootcamp Final Project Presentation - Final Version - 5.23...John Andrews
SlideShare Description for "Chatty Kathy - UNC Bootcamp Final Project Presentation"
Title: Chatty Kathy: Enhancing Physical Activity Among Older Adults
Description:
Discover how Chatty Kathy, an innovative project developed at the UNC Bootcamp, aims to tackle the challenge of low physical activity among older adults. Our AI-driven solution uses peer interaction to boost and sustain exercise levels, significantly improving health outcomes. This presentation covers our problem statement, the rationale behind Chatty Kathy, synthetic data and persona creation, model performance metrics, a visual demonstration of the project, and potential future developments. Join us for an insightful Q&A session to explore the potential of this groundbreaking project.
Project Team: Jay Requarth, Jana Avery, John Andrews, Dr. Dick Davis II, Nee Buntoum, Nam Yeongjin & Mat Nicholas
Chatty Kathy - UNC Bootcamp Final Project Presentation - Final Version - 5.23...
VISUALIZATION OF COMPUTER FORENSICS ANALYSIS ON DIGITAL EVIDENCE
Muhd Mu’izuddin b. Hj. Muhsinon,
Nazri b. Ahmad Zamani
Universiti Tenaga Nasional,
CyberSecurity Malaysia
muiz_din94@rocketmail.com
Abstract
This project explores the use of data science methods to further analyze computer forensics results. In computer forensics, the analysis is carried out with forensic tools such as EnCase, FTK, and XRY. These tools have powerful engines for zooming in on digital evidence and finding information pertinent to an investigation. What they lack are statistical, machine learning, and visualization features that can be crucial for examining the evidence in its entirety. This project explores methods to profile and visualize computer forensics findings using Python and Jupyter Notebook. EnCase CSV files from a real-life case analysis will be loaded and analyzed with Python's scikit-learn statistical and pattern recognition engine, and the results will be plotted with Python visualization tools such as Matplotlib, Seaborn, and Pandas.
I. Introduction
Computer technology is an integral part of everyday human life, and it is growing rapidly, as are computer crimes such as financial fraud, unauthorized intrusion, identity theft and intellectual property theft. To counteract these computer-related crimes, computer forensics plays a very important role. Computer forensics involves obtaining and analysing digital information for use as evidence in civil, criminal or administrative cases [1].
A computer forensic investigation generally examines data taken from computer hard disks or other storage devices, with adherence to standard operating policies and procedures, to determine whether those devices have been compromised by unauthorized access [2]. Computer forensics investigators work as a team to investigate an incident and conduct the forensic analysis using various methodologies (e.g. static and dynamic analysis) and tools (e.g. the EnCase CSV files of a real-life case), all to ensure that an organization's computer network system is secure. A successful computer forensic investigator must be familiar with the laws and regulations related to computer crimes in their country (e.g. the Malaysian Computer Crimes Act, CCA 1997) and with various computer operating systems (e.g. Windows, Linux) and network operating systems (e.g. Windows NT). This report analyzes the method and visualizes the computer forensics analysis results using Python and Jupyter Notebook, plotting the results so that they are easier to reference and improve upon [2].
Digital investigations are constantly changing as new technologies are used to create, store or transfer vital data [3]. Augmenting existing forensic platforms with innovative methods of acquiring, processing, reasoning about and presenting actionable evidence is vital. Integrating open-source Python scripts with leading-edge forensic platforms such as EnCase provides great versatility and can speed new investigative methods and processing algorithms for these emerging technologies.
In Malaysia, law enforcement agencies are now faced with the task of enforcing the law in a cyberspace that transcends borders and raises issues of jurisdiction. Cybercrime has surpassed drug trafficking as the most lucrative crime. Almost anybody who is an active computer or online user may have been a cybercrime victim, and in many cases a perpetrator as well; cybercriminals typically cheat, harass and disseminate false information for their own gain. This project aims to improve how investigation results are presented by visualizing the computer forensics analysis results with Python and Jupyter Notebook, turning raw data into something more easily understood as a whole, so that people can see an overview of the results with greater accuracy.
II. Problem Statement
During the analysis phase of a computer forensics investigation, the analyst may confront numerous obstacles to obtaining a precise result: the tools return raw information that is far less clear than a well-designed visualization would be. One of the problems is:
1. Computer forensics systems lack statistical and visualization tools.
There is a key point that needs to be considered when examining the digital evidence:
1. Evidence profiling is crucial for understanding the relationship between the digital evidence activity timeline and the case investigation timeline.
III. Workflow
Figure 3.1: Flowchart
Figure 3.2: Current Situation
Figure 3.3: Overview
The security analysts face a lack of statistical and visualization tools for obtaining accurate results: they must manually compare all the raw information from the digital evidence instead of visualizing it, and the CSV file may contain a large amount of data from various sources. A system is needed to save analysts from time-consuming work on unclear views of the CSV data. Using Jupyter Notebook with Python can help an analyst gather information and speed up the collection of proof, and visualizations provide a more understandable and clearer view of the data in the CSV file.
Given the problems stated above, the system provides a solution for producing the visualizations. The .csv data is loaded into Jupyter Notebook with Python 2.7, and the user then chooses the type of analysis to include. This part decides the building blocks of the code, such as variables, data types, functions, conditionals and loops, and answers the question of which type of analysis has been chosen. In the models and algorithms part, the user chooses which models to produce based on the code. The requested visualization is then rendered. In the reporting part, the user may choose to export the data science results and code base to PDF, Microsoft Word or the web (HTML).
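The loading step above can be sketched as follows. This is a minimal illustration assuming a hypothetical EnCase export with "Name" and "File Created" columns; the real export schema may differ, and the sketch is written for Python 3 and current pandas rather than the Python 2.7 mentioned above.

```python
# Minimal sketch of loading an EnCase CSV export into a DataFrame.
# The column names below are assumptions, not the actual EnCase schema.
import io
import pandas as pd

# In a real run this would be: df = pd.read_csv("encase_export.csv")
sample = io.StringIO(
    "Name,File Created\n"
    "IMG_001.jpg,2006-04-12\n"
    "report.xls,2013-05-03\n"
)
df = pd.read_csv(sample, parse_dates=["File Created"])
print(df.shape)  # (rows, columns) loaded from the export
```

For the reporting step, the finished notebook can be exported with `jupyter nbconvert --to html notebook.ipynb`; PDF and other formats are also supported.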
IV. Data Specimen
In computing, a comma-separated values (CSV) file stores tabular data (numbers and text) in plain text. Each line of the file is a data record, and each record consists of one or more fields separated by commas. The use of the comma as a field separator is the source of the name for this file format.
The CSV file format is not standardized. The basic idea of separating fields with a comma is simple, but it becomes complicated when the field data may itself contain commas or even embedded line breaks. Some CSV implementations cannot handle such field data, while others surround the field with quotes. Because CSV data contains many data types and fields, it needs to be cleaned to get a better view of the data; Jupyter Notebook with Python provides the csvkit library for cleaning the data, which can be set up during the coding part of the system.
Figure 1.: Data .csv
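The quoting issue described above can be shown with a tiny example: a comma inside a field must be protected by quotes, which Python's standard csv module handles correctly.

```python
# A field containing a comma is wrapped in quotes; csv.reader keeps the
# embedded comma inside a single field instead of splitting on it.
import csv
import io

raw = 'Name,Comment\nplan.doc,"meeting notes, draft"\n'
rows = list(csv.reader(io.StringIO(raw)))
print(rows[1])  # ['plan.doc', 'meeting notes, draft']
```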
V. Methodology
The methodology used in this project is the Security Data Visualization Process.
Figure 5.1: Security Data Visualization Process
1. Visualization Goals
This step starts with an overview of the current situation, followed by gathering requirements from the security analyst, who is the main user. The requirements determine the visualization goals for the specific case. To fulfill these requirements, the program is developed to achieve the visualization goals set by the security analyst, which may cover the kinds of information and questions the analyst requires.
2. Data Preparation
This step begins with acquiring the data and preparing it for analysis. The next step is to explore the data with the right questions, then visualize it to generate insights and act on them. The most essential step before beginning visualization is data cleaning, or making the data available in a usable format. For example, with EnCase data from a CSV file, the system searches through the various kinds of files found on an external hard drive and represents them with visualization methods.
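A hedged sketch of this cleaning step using pandas: normalise the records, drop rows missing a timestamp, and derive a lower-cased extension column. The column names and values are illustrative assumptions, not the actual EnCase export.

```python
# Cleaning sketch: drop records with no timestamp and derive an "ext"
# column from the file name. Column names are assumptions.
import io
import pandas as pd

raw = io.StringIO(
    "Name,File Created\n"
    "IMG_001.JPG,2006-04-12\n"
    "notes.doc,\n"
    "budget.xls,2013-02-07\n"
)
df = pd.read_csv(raw, parse_dates=["File Created"])
df = df.dropna(subset=["File Created"])  # discard records with no timestamp
df["ext"] = df["Name"].str.split(".").str[-1].str.lower()
print(df[["Name", "ext"]])
```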
3. Explore
Asking the right question prompts further investigation and representation using statistical and probabilistic models and algorithms, and leads to useful insights and decisions. The statistical methods that are suitable are decided in this step. The explore stage looks at the analytic activities that enable security teams to ask the right questions and examine the data to see how their objectives can be achieved.
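The abstract mentions scikit-learn's pattern recognition engine; as one illustrative possibility (not the method used in the case), KMeans can separate files into groups by a feature such as size. The sizes below are made up.

```python
# Illustrative only: cluster files into two groups by a synthetic size
# feature, standing in for whatever features a real case would justify.
import numpy as np
from sklearn.cluster import KMeans

sizes_kb = np.array([[12.0], [15.0], [14.0], [2048.0], [1990.0], [2100.0]])
labels = KMeans(n_clusters=2, n_init=10, random_state=0).fit_predict(sizes_kb)
# Small documents fall into one cluster, large images into the other.
print(labels)
```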
4. Visualize
There are two aspects to visualization theory, one of which is style. There is literature on how to use colour, tone, density and other properties to make visually pleasing pictures for the target audience; many design guidelines appear in the Graphics Press book [4], and there is a dedicated chapter in [5]. Samples of such visualizations, with some explanation, are presented below.
5. Feedback
This step involves continuous improvement with feedback from the stakeholders and the availability of new data. In the reporting part, the data science results can be represented in many ways.
VI. Results
For this visualization, CyberSecurity Malaysia provided the data: metadata from EnCase results in real forensic cases, in .csv format. The metadata was supplied by the Digital Forensics Department of CyberSecurity Malaysia and was extracted from an external hard drive. Where the raw data gives only a first impression, visualization can turn the data into something more easily understood as a whole, so that people can see an overview of the results with greater accuracy. In this way, analysts can reason about the material evidence and more easily identify the suspect.
Figure 6.1: Overall Data Pie Chart
This pie chart shows the percentage of each data type in the metadata file. From the chart, .jpg is the data type most often produced or kept by the suspect, followed by .xls, .pdf, .doc and lastly .pptx. The suspect showed a deep interest in the .jpg data type, to the extent that more than 50% of the data is .jpg files. Nonetheless, the .xls data type is also notable, representing 22% of the total data. The suspect likely has an overpowering interest in collecting images, but was also diligent in collecting data for calculation and analysis.
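A chart like Figure 6.1 can be produced with Matplotlib along these lines. The counts below are illustrative percentages echoing the proportions described above, not the actual case data.

```python
# Sketch of an overall data-type pie chart; counts are illustrative.
import matplotlib
matplotlib.use("Agg")  # headless backend; the figure is written to a file
import matplotlib.pyplot as plt

counts = {"jpg": 52, "xls": 22, "pdf": 12, "doc": 9, "pptx": 5}
plt.pie(list(counts.values()), labels=list(counts.keys()), autopct="%1.0f%%")
plt.title("File types in the evidence metadata")
plt.savefig("overall_pie.png")
```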
Figure 6.2: Data Compared by Month
From the graph, the total amount of metadata shows that April was the suspect's most active period for producing or keeping data; it can be predicted that April is the busiest month of each year for the suspect. It is followed by May, July and November, each of which shows a relatively high count, so the suspect was probably most active in the middle of the year, while January shows the lowest count in each year. As seen in the graph, the January and December counts differ sharply from the other months; the assumption can be made that in these two months the suspect took time off and was less interested in generating any data.
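A monthly comparison like Figure 6.2 can be sketched by grouping records on the month of their creation timestamp. The timestamps below are synthetic stand-ins for the cleaned EnCase metadata.

```python
# Sketch of a per-month count of records; timestamps are synthetic.
import matplotlib
matplotlib.use("Agg")  # headless backend
import matplotlib.pyplot as plt
import pandas as pd

dates = pd.to_datetime(["2006-04-02", "2006-04-19", "2007-04-07",
                        "2007-05-11", "2008-07-23", "2009-11-30"])
by_month = pd.Series(1, index=dates).groupby(dates.month).sum()
by_month.plot(kind="bar")
plt.xlabel("Month")
plt.ylabel("Files produced/kept")
plt.savefig("by_month.png")
```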
Figure 6.3: Data Compared by Years
This graph compares the data types for each year. It confirms that .jpg was the file type most often produced or kept by the suspect, and that 2006 and 2013 show the highest amounts of data in the visualization. From this we can infer the suspect's behavior. The .jpg format is used for digital photos and other digital graphics, so we can conclude that the suspect loves pictures; from 1998 until 2011 the suspect kept producing or keeping this kind of data. The suspect likely takes many pictures and uses the appeal of images to get what he wants, so one conclusion that can be made is that the suspect is a photographer. In 2013, however, the .xls format was no less prominent. It is the file extension for a spreadsheet format created by Microsoft for use with Microsoft Excel, a well-organized platform that lets users write data on grids and worksheets, organized and formatted as they prefer; it is also widely used in business and finance. The suspect may be someone who loves to write and to do things his own way, so another conclusion that can be made is that the suspect is an analyst.
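The year-by-type comparison behind Figure 6.3 amounts to a cross-tabulation of year against file extension; a sketch with a synthetic frame in place of the real EnCase metadata:

```python
# Cross-tabulate year against file type; the frame is synthetic.
import pandas as pd

df = pd.DataFrame({
    "year": [2006, 2006, 2006, 2013, 2013, 2013],
    "ext":  ["jpg", "jpg", "xls", "xls", "xls", "jpg"],
})
table = pd.crosstab(df["year"], df["ext"])
print(table)  # rows: years, columns: file types, cells: counts
```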
VII. Conclusion and Way Forward
1. Suggestions can be made for future work: visualization results can be improved with additional information and the right techniques.
2. Numerical data can improve the quality of the visualization, making the graphs more attractive and easier to understand.
3. If Jupyter Notebook could import more data libraries, more attractive forms of graphs could be produced.
In conclusion, the phases involved throughout the development of this system, from the idea through requirement gathering, analysis, design, coding and testing to the final presentation, were a precious journey of learning, failures, successes and persistence. This journey has opened my mind about how I view programming and has built my interest in it. Even though there are still many enhancements to be made in the future, the current system manages to fulfill the minimum requirements and solve the problems stated.
VIII. References
1. Nelson, B., et al., "Computer Forensics Investigation", 2008.
2. "Case Studies", http://resources.infosecinstitute.com/, 2016.
3. Noblett, M. G., Pollitt, M. M., Presley, L. A., "Computer Forensics", https://en.wikipedia.org/wiki/Computer_forensics, October 2000.
4. Tufte, E., "The Visual Display of Quantitative Information", Cheshire, Conn.: Graphics Press, 1983.
5. Marty, R., "Applied Security Visualization", Upper Saddle River, NJ: Addison-Wesley, 2009.