Sensitive data may be stored in many forms, and it attracts not only its legal owners but also malicious actors. Exposing valuable data leads to severe consequences: customers, organizations, and companies lose money and reputation through data breaches. Data leakage has many causes; internal threats such as human error and external threats such as DDoS attacks are two main sources of data loss. In general, data falls into three categories: data in use, data at rest, and data in motion. Data Loss Prevention (DLP) tools are effective at identifying important data: they analyze data content and give administrators feedback for decisions such as filtering, deletion, or encryption. DLP tools are not a final answer to data breaches, but they are considered sound security tools for limiting malicious activity and protecting sensitive information. Among the many kinds of DLP techniques is approximate matching, and mrsh-v2 is one approximate-matching algorithm. It is implemented and evaluated here using the TS dataset and a confusion matrix. Mrsh-v2 achieves a high true-positive rate and sensitivity, and a low false-negative rate.
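The confusion-matrix evaluation mentioned above reduces to a few standard formulas. The sketch below computes sensitivity (true-positive rate) and related metrics; the counts passed in are hypothetical illustrations, not the paper's actual results.

```python
def confusion_metrics(tp, fp, tn, fn):
    """Metrics commonly derived from a confusion matrix when
    evaluating a detection tool such as mrsh-v2."""
    sensitivity = tp / (tp + fn)           # true positive rate (recall)
    precision = tp / (tp + fp)             # how many reported matches were correct
    fpr = fp / (fp + tn)                   # false positive rate
    accuracy = (tp + tn) / (tp + fp + tn + fn)
    return {"sensitivity": sensitivity, "precision": precision,
            "false_positive_rate": fpr, "accuracy": accuracy}

# Hypothetical counts for illustration only (not from the paper):
print(confusion_metrics(tp=95, fp=3, tn=97, fn=5))
```

A high sensitivity with a low false-negative count, as reported for mrsh-v2, corresponds directly to the first formula: few sensitive items slip past the detector.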
This document discusses achieving compliance with the Health Insurance Portability and Accountability Act (HIPAA). It notes that technology alone cannot ensure compliance, but tools like electronic medical record software can help practices achieve compliance by aiding the important factors of people, processes, and policies. The document provides a checklist of steps needed for compliance, including education, policies, technology standards, documentation, and periodic audits. It also discusses specific controls, documentation, data destruction procedures, and documents typically requested during investigations of HIPAA compliance.
This document provides information about a 2-day training course on "Risk Based IT Auditing for Non-IT Auditors". The training will cover topics such as understanding information systems risks and controls, auditing key controls, systems development lifecycles, corporate governance and compliance, and using computer-assisted audit techniques. The instructor, Thilakpathirage, has over 35 years of experience in banking, information security, and risk management. The course is intended for internal auditors and others who need a basic understanding of IT risk-based auditing practices.
Fusion of data from multiple sources is generating new information from existing data. Now users
can access any information from inside or outside of the organization very easily. It helps to increase
the user productivity and knowledge shared within the organization. But this leads to a new area of
network security threat, “Inside Threat”. Now users can share critical information of organization to
outside the organization if he/she has access to the information. The current network security tool
cannot prevent the new threat. In this paper, we address this issue by “Building real time anomaly
detection system based on users’ current behavior and previous behavior”
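The idea of comparing a user's current behavior against their previous behavior can be sketched as a simple baseline model. The z-score threshold, the activity metric (files downloaded per day), and the sample numbers below are assumptions for illustration; the paper's actual detection model may differ.

```python
import statistics

def is_anomalous(history, current, threshold=3.0):
    """Flag current activity as anomalous when it deviates from the
    user's own historical baseline by more than `threshold` standard
    deviations (a minimal z-score sketch)."""
    mean = statistics.mean(history)
    stdev = statistics.pstdev(history) or 1.0   # avoid division by zero
    z = (current - mean) / stdev
    return abs(z) > threshold

# e.g. files downloaded per day over the past two weeks
baseline = [4, 6, 5, 7, 5, 6, 4, 5, 6, 5, 7, 6, 5, 4]
print(is_anomalous(baseline, 60))   # sudden bulk download -> True
print(is_anomalous(baseline, 6))    # normal activity -> False
```

A real deployment would track many such per-user features and update the baseline continuously, but the core comparison of "current versus previous behavior" is this same test.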
Protect customer's personal information eng 191018 (sang yoo)
Let's take a look at mcloudoc's personal-information protection features!
First, by unifying the points where personal information is managed, all the information scattered across personal PCs is managed in one place, reducing management cost.
In addition, personal-information documents can be controlled because document-handling authority can be granted according to the role of the employee who manages them.
Personal information hidden in centralized documents can be detected, and the activity history of users working with personal-information documents can be tracked, which also helps detect malicious document leaks.
Now, how about realizing the protection of personal-information documents with mcloudoc?
Start with mcloudoc!
Mindtree distributed agile journey and guiding principles (Mindtree Ltd.)
Agile is all about delivering business value in short iterations at a sustainable pace, adapting to changing business needs. Agile software development focuses on early delivery of working software and considers working software as the primary measure of progress. It creates an environment that responds to change by being flexible and nimble. It discourages creation of extensive documents that do not add any value.
This document provides an overview of Microsoft Office 365 Data Loss Prevention (DLP). It defines DLP as a practice to protect sensitive data from loss or sharing. It describes how Office 365 DLP works by detecting sensitive content, defining policies to monitor activities, and taking protective actions. It also outlines how to configure DLP policies by specifying monitoring targets, locations, conditions, and actions. Finally, it discusses best practices for planning, preparing, testing and deploying DLP policies within an organization.
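The detect-content / match-policy / take-action flow described above can be illustrated in miniature. The patterns, the policy dictionary, and the `evaluate` helper below are hypothetical simplifications for illustration, not Microsoft's actual rule syntax or API.

```python
import re

# Illustrative sensitive-content patterns (not Office 365's real detectors).
PATTERNS = {
    "credit_card": re.compile(r"\b(?:\d[ -]?){13,16}\b"),
    "us_ssn": re.compile(r"\b\d{3}-\d{2}-\d{4}\b"),
}

# A DLP policy names a location to monitor, a condition, and an action.
POLICY = {"location": "email", "condition": "us_ssn", "action": "block"}

def evaluate(message, location):
    """Return the protective action the policy would take, or None."""
    if location != POLICY["location"]:
        return None
    if PATTERNS[POLICY["condition"]].search(message):
        return POLICY["action"]
    return None

print(evaluate("SSN is 123-45-6789", "email"))   # -> block
print(evaluate("Lunch at noon?", "email"))       # -> None
```

The best practice of testing before deploying maps naturally onto this sketch: run the policy in a report-only mode (return the action without enforcing it) until false positives are tuned out.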
Let us understand some of the infrastructure and security challenges that every organization faces today before delving into the concept of securing the cloud data lake platform. Though data lakes provide scalability, agility, and cost-effectiveness, they pose unique infrastructure and security challenges.
An agile based software development approach offers many advantages of an iterative and fast-paced process. However, customers often find themselves at crossroads when it comes to choosing a specific adoption path. Organizational culture and mindset are critical to the success of distributed agile projects. Enterprises need the right partner who can address all of these and deliver projects efficiently.
This document discusses disaster recovery. It describes different types of disasters including natural disasters like tsunamis and earthquakes that are difficult to recover from, and man-made disasters like theft that are easier to recover from. It emphasizes the importance of data backup and restoration, and outlines disaster recovery measures organizations can take including identifying critical systems, backing up data daily including transaction logs, and having system redundancy. It also recommends technologies for disaster recovery planning like regular backups to off-site locations and data replication using SAN storage.
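The daily backup of data and transaction logs recommended above can be sketched as a small script. The paths, folder layout, and `daily_backup` helper are illustrative assumptions; a production setup would ship the copy to a genuinely off-site location.

```python
import shutil
import pathlib
from datetime import date

def daily_backup(data_dir, txlog, offsite_root):
    """Minimal sketch of a daily backup: copy the data set and its
    transaction log into a date-stamped off-site folder."""
    dest = pathlib.Path(offsite_root) / date.today().isoformat()
    dest.mkdir(parents=True, exist_ok=True)
    shutil.copytree(data_dir, dest / "data", dirs_exist_ok=True)
    shutil.copy2(txlog, dest / "transactions.log")
    return dest
```

Keeping the transaction log alongside the nightly copy is what allows point-in-time restoration between backups, which is why the document calls it out separately from the data itself.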
2017-10-05 Mitigating Cybersecurity and Cyber Fraud risk in Your Organization (Raffa Learning Community)
An examination of ever-growing cyber threats, whose actors continue to develop and successfully execute cyber attacks and fraud scams that cost businesses billions of dollars globally. This session steps through current and emerging cyber attack and cyber fraud scenarios, then discusses how basic but effective security controls can significantly reduce the risks.
This is a workshop, not a training, on security threats and how to address them. The presenters introduce themselves and their backgrounds. They discuss how security threats have evolved from viruses in the early internet era to today's more sophisticated targeted attacks. Microsoft's approach to security focuses on technology, processes, and people to manage complexity, protect information, and advance the business with IT solutions. Specific solutions discussed include Windows Firewall, BitLocker, and Network Access Protection.
The document discusses the major security concerns organizations have regarding cloud environments. The top concerns include: data loss/leakage due to the ease of sharing data in the cloud (69% of organizations), data privacy and confidentiality (66%), accidental exposure of cloud credentials (44%), difficulty performing effective incident response in the cloud (44%), and legal/regulatory compliance challenges (42%). Other concerns include data sovereignty/residence/control, as organizations may not know where their data is physically stored.
The document discusses the Digital Trust Framework (DTF), which will use the TMForum's Open Digital Architecture (ODA) as a foundation. The DTF is being developed for 4IR environments and will provide a blueprint for modular, cloud-based, open digital platforms that can be orchestrated using AI. It will integrate ODA with other frameworks to ensure an overall digital trust approach for continuously evolving systems.
This document describes Scalar's managed security services. It notes that cyber attacks are increasing in frequency and severity, posing a major challenge for organizations. While security has become a top priority, many companies lack the in-house expertise to effectively manage their security. Scalar's managed security services allow companies to leverage their specialized skills and expertise through three tiers - Insight, Monitoring, and Management - to address security issues for a predictable monthly cost. This reduces the need for companies to invest in recruiting and training their own security staff.
Cyber Security - Maintaining Operational Control of Critical Services (Dave Reeves)
This document has been developed to assist organisations with some of the considerations when building and operating critical services from an ICS cyber security perspective. The next whitepaper in the series will focus on securing critical services and the interdependencies between cyber and physical security.
Aujas Cyber Security is a global cyber security services company consistently recognized by NASSCOM, Deloitte and Gartner for its unique cyber security capabilities. With a growing workforce of 400+ security experts, Aujas Networks has served more than 1500 clients across the globe.
Technology Overview - Symantec Data Loss Prevention (DLP) (Iftikhar Ali Iqbal)
The presentation provides the following:
- Symantec Corporate Overview
- Solution Portfolio of Symantec
- Symantec Data Loss Prevention - Introduction
- Symantec Data Loss Prevention - Components
- Symantec Data Loss Prevention - Features & Use Cases
- Symantec Data Loss Prevention - System Requirements
- Symantec Data Loss Prevention - Appendix (extra information)
This provides a brief overview of Symantec Data Loss Prevention (DLP). Please note that all information predates May 2016 and the full integration of Blue Coat Systems' solutions.
Forcepoint offers a Data Loss Prevention (DLP) solution that takes a human-centric approach to data security. It focuses on gaining visibility into user interactions with data across endpoints, cloud applications, and networks in order to apply appropriate controls based on the user's risk level and the sensitivity of the data. The solution aims to accelerate compliance with regulations, empower users to protect data, provide advanced detection of potential data loss through machine learning and fingerprinting techniques, and prioritize security incidents by risk level. It combines DLP capabilities across endpoints, cloud applications, and the network from a single point of control.
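The fingerprinting technique mentioned above is commonly implemented by hashing overlapping pieces of a protected document and checking outbound content against those hashes. The sketch below illustrates that general idea; Forcepoint's actual algorithm is proprietary, and the n-gram size and `leak_score` helper here are assumptions.

```python
import hashlib

def fingerprint(text, n=8):
    """Hash overlapping word n-grams of a sensitive document into a
    set of fingerprints (a simplified sketch of the technique)."""
    words = text.lower().split()
    grams = (" ".join(words[i:i + n]) for i in range(len(words) - n + 1))
    return {hashlib.sha256(g.encode()).hexdigest() for g in grams}

def leak_score(outbound, protected_prints, n=8):
    """Fraction of the outbound text's n-grams that match a
    protected document's fingerprints."""
    out = fingerprint(outbound, n)
    if not out:
        return 0.0
    return len(out & protected_prints) / len(out)
```

Because only hashes are stored, the DLP system never needs a plaintext copy of the protected document at every enforcement point, and partial copies (a pasted paragraph, not the whole file) still produce a nonzero score.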
This document discusses data leakage prevention (DLP) and outlines best practices for implementing a DLP project. It defines DLP, explains how DLP technology works to monitor data in motion, at rest, and in use. The document recommends a multi-step DLP project that includes analyzing business environments and threats, classifying sensitive data, mapping data storage and business processes, assessing leakage channels, and selecting DLP tools. It also stresses the importance of organizational culture and policies to complement technical solutions and prevent data leakage.
There is an increasing trend witnessed in the cloud computing technology which has led to a lot of risks in preserving the Confidentiality, Integrity and Availability of data. The Cloud is now facing a lot of compliance requirements due to the sensitivity of the data that is being stored. View this presentation to understand the Cloud Compliance Requirements, Risks, Audit Processes and Methodologies involved in providing assurance.
This presentation was given by CA Anand Prakash Jangid at the Conference on Cloud Computing conducted by the Committee on Information Technology of the Institute of Chartered Accountants of India on 11th January 2014.
Cybersecurity frameworks globally and Saudi Arabia (Faysal Ghauri)
My second paper on cybersecurity frameworks and how Saudi Arabia's framework is taking shape. The paper was published by the International Journal of Computer Science and Information Security (IJCSIS) in April 2021, Vol. 19, No. 4.
The document discusses API security best practices. It describes how APIs can be secured at different layers including authentication, authorization, perimeter defense, and the service/API layer. It also discusses how a blended API gateway and data loss prevention deployment can help control access to APIs and sensitive data. The presentation included examples of securing mobile access to enterprise services and controlling use of cloud infrastructure through an API gateway.
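The layered checks described above (authentication, then authorization, then a DLP scan at the gateway) can be sketched as a single pipeline. The token store, scope names, and redaction rule below are hypothetical illustrations, not any specific product's API.

```python
import re

VALID_TOKENS = {"t-123": {"scopes": {"orders:read"}}}   # demo token store
SSN = re.compile(r"\b\d{3}-\d{2}-\d{4}\b")               # demo DLP rule

def gateway(token, scope_needed, response_body):
    """Authenticate, authorize, then run a DLP scan on the response
    before it leaves the gateway (illustrative flow only)."""
    user = VALID_TOKENS.get(token)
    if user is None:
        return 401, "unauthenticated"
    if scope_needed not in user["scopes"]:
        return 403, "forbidden"
    if SSN.search(response_body):
        # Sensitive data found: redact rather than leak it downstream.
        return 200, SSN.sub("***-**-****", response_body)
    return 200, response_body

print(gateway("t-123", "orders:read", "order ok, customer SSN 123-45-6789"))
```

Placing the DLP check at the gateway, as the blended deployment in the document suggests, means every API behind it gets the same outbound protection without per-service changes.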
Harald Leitenmüller | DSGVO - globaler, zeitgemäßer Datenschutzstandard für M... (Microsoft Österreich)
The document discusses Microsoft's approach to data protection and compliance with the GDPR. It provides an overview of Microsoft's security operations, including its cyber defense operations center and intelligence security graph. It also describes Microsoft's Next Generation Privacy framework for inventorying and standardizing how customer data is treated. Tools like SecureScore and the Compliance Manager are introduced for assessing compliance and managing tasks. Additional resources on GDPR, security reference architectures and blogs are listed in an appendix.
Cloud Computing
Categories of Cloud Computing
SaaS
PaaS
IaaS
Threats of Cloud Computing
Insurance Challenges
Cloud Solutions
Security of the Insurance Industry
Cloud Solutions
Security in the Insurance Industry with respect to the Indian market
IT security teams often lack visibility into cloud file sharing services used by employees. Analysis of over 100 million files shared across many industries revealed several risks to enterprise data and compliance. The majority of broadly shared files and exposure risks were concentrated among a small number of users. While passwords and encryption aim to protect data, inadvertent or deliberate data exposure still commonly occurs. New technologies embedded in the cloud are needed to provide visibility and control over shadow data and file sharing activities.
Data Leak Protection Using Text Mining and Social Network Analysis (IJERD Editor)
Data leak prevention is a research field which deals with the study of potential security threats to organizational data and strategies to prevent such threats. Data leaks involve the release of sensitive information to an untrusted third party, intentionally or otherwise, while data loss, on the other hand, is the disappearance or damage of data, in which a correct copy of the data is no longer available to the organization. These correspond to a compromise of data integrity or availability. Data leak/loss has led to huge revenue losses in affected organisations and threatens their continued existence. All organisations using electronic data storage are vulnerable to this attack. This research work is targeted at organisations with sensitive data such as banks, manufacturing industries, GSM operators, research centres, the military, higher educational institutions, and so on. The authors analyse the possible threats to organisational data and the parties involved in such threats, the impact of a successful attack on an organisation, and current approaches to DLP. The authors also design a DLP model using "text mining" and "social network analysis", and suggest further research into both techniques for effective future solutions to DLP problems. In conclusion, implementing this design, with adherence to the good data security practices and proactive strategies suggested in this paper, will significantly reduce the risk of such security threats.
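The two techniques named in this abstract can be illustrated in miniature: a toy text-mining score over sensitive keywords, and degree centrality over a communication graph. The term list, edge list, and scoring functions below are hypothetical simplifications, not the authors' actual model.

```python
from collections import defaultdict

SENSITIVE_TERMS = {"salary", "merger", "password", "account"}  # illustrative

def sensitivity(text):
    """Toy text-mining score: fraction of words that are sensitive terms."""
    words = text.lower().split()
    return sum(w in SENSITIVE_TERMS for w in words) / max(len(words), 1)

def degree_centrality(edges):
    """Degree centrality over a communication graph: users who exchange
    messages with many parties rank higher as potential leak channels."""
    deg = defaultdict(int)
    for a, b in edges:
        deg[a] += 1
        deg[b] += 1
    n = len(deg)
    return {u: d / (n - 1) for u, d in deg.items()}

emails = [("alice", "bob"), ("alice", "ext1"), ("alice", "ext2"), ("bob", "ext1")]
print(sensitivity("the merger password is secret"))   # -> 0.4
print(degree_centrality(emails))
```

Combining the two signals, content sensitivity from text mining and communication position from network analysis, is the core of the DLP model the abstract proposes: a highly connected user sending highly sensitive text is the riskiest case.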
Dealing with Data Breaches Amidst Changes In Technology (CSCJournals)
The document discusses data breaches and cybersecurity measures to prevent them. It begins by defining a data breach and describing major causes from cases at companies like Adobe, eBay, Facebook, and Myspace. It then discusses types of data breaches like ransomware, denial of service attacks, phishing, malware, insider threats, physical theft, and employee errors. Finally, it proposes cybersecurity measures organized into technical practices, organizational practices, and policies/standards to help prevent future breaches.
Running head ORGANIZATIONAL SECURITY1ORGANIZATIONAL SECURITY.docx – todd581
CDU International College
MQP 008
Report on Security Issues in the Fugle Company
Marufa Binte Muztaba
Date: 22nd April 2020
Student ID: S33821
Length: 1500 words (+/-100)
Introduction
Every modern business faces security issues, which means we need to look into how to build secure systems. Information security is the practice of preventing unauthorized access to data. The information does not need to be electronic to be secured; even physical information is taken into consideration. The purpose of this paper is to discuss the Fugle Company by describing its information systems, outlining the main risks those systems might be exposed to, and the ethical issues that need to be considered in order to maintain the security of information in Fugle (Trend Micro, 2015). For this company to succeed, information security has to be tight. This technology company has developed an application that lets you pay using your fingerprint. It has drawn a lot of attention, which has raised questions about how secure the application is (Dooley, 2017). With the launch date approaching, the company is under a lot of pressure: it does not want to launch before addressing all the security issues within its budget, but at the same time it does not have much time. The security issues addressed here apply to the HRM, product development, accounting, and marketing information systems.
Information Systems and their Assets
There are four main information systems in Fugle. By an information system we basically mean the software a company uses to analyze and organize its data; it converts raw data into information that can be understood and used for effective decision making. Each of the four systems has been assigned key assets to protect. We can define an asset as something useful to the company that brings it profit. It is very important to know how to handle threats to these assets, because they can have a major impact on the company's future and viability. In Fugle, the main responsibility of the marketing information system is to make sure that the company's marketing information is not breached. The company's major assets are its customer intelligence and information concerning the product. This is seen when Dave is called and told that there was an attempt to hack data concerning the company's clients (Lowry, Dinev, and Willison, 2017). Such a breach of confidentiality would mean the clients would no longer trust the company. Also, when journalists come to look at the product, they are given a controlled presentation because the product is still considered vulnerable to attacks. Information about the .
This document is a learner's module for an Information Assurance and Security course. It provides an overview of key topics that will be covered during the course, including information systems security, the internet of things, malicious attacks and vulnerabilities, risk management, and securing information systems. The module includes details on the course instructor, unit timelines, learning objectives, chapter summaries, and recent examples of major data breaches in the United States.
This document identifies and describes 14 major repositories of data breach information. It finds that while no single comprehensive source exists, the main repositories include Attrition.org, CERT, Databreaches.net, DataLossDB, Identity Theft Resource Centre, InfosecurityAnalysis.com, MyID.com, NAID, PHI Privacy, PogoWasRight.org, Privacy Rights Clearinghouse, US-CERT, US Securities and Exchange Commission, and Verizon. The document evaluates Privacy Rights Clearinghouse, Identity Theft Resource Centre, and DatalossDB.org as being relatively open, detailed repositories of breach data.
IRJET – An Approach Towards Data Security in Organizations by Avoiding Data Br... – IRJET Journal
This document discusses data leakage prevention (DLP) systems and approaches to avoid data breaches in organizations. It begins with an abstract that outlines how sensitive data can be lost through unauthorized access or transfer. The introduction then discusses the need for DLP to control and monitor data access and usage. Key challenges for DLP implementations are also reviewed, such as protecting information, reducing unauthorized data transfers, and identifying internal and external threats. The document concludes with recommendations for future research on DLP, including using deep learning techniques to improve insider threat detection and monitoring encrypted communication channels.
The document discusses information security and analyzes its importance. It describes key aspects of information security like confidentiality, integrity and availability. It also outlines some common threats to information security such as computer viruses, theft, sabotage and vandalism. The document then analyzes some challenges to effective information security, including employees being fooled by scams, issues with authentication, and the growing threat of phishing. It emphasizes the importance of addressing security concerns to build trust with customers and gain a competitive advantage.
Article 1 currently, smartphone, web, and social networking techno... – honey690131
The document discusses several articles and papers related to ethics and privacy issues with new technologies. It covers concerns around obtaining private patient data online and vulnerabilities of electronic health records. It also discusses increased government surveillance impacting human rights and national security. Additional topics covered include privacy of patient information shared with healthcare providers and ethical challenges of data security, anonymity, and intellectual property with information technology.
Identity Theft Response: You have successfully presented an expa... – LizbethQuinonez813
The CEO has tasked you with developing an identity theft response plan for your financial organization. This plan will outline procedures for responding to potential cyberattacks involving theft or compromise of customers' personally identifiable information (PII). You will need to consider responses to both internal incidents, like a rogue employee accessing records, and external incidents, such as a hacker breaching systems. The plan will need to address regulatory compliance, communication with leadership and authorities, and recovery of operations should PII be stolen. It will also help the organization avoid damages to its reputation and legal liability in the event of an identity theft incident.
This document summarizes security breaches of personal health information that were reported to the U.S. Department of Health and Human Services in 2009 and 2010. It provides two tables listing over 50 security breaches in the United States during these years, with the state, approximate number of individuals affected, and date of each breach. The greatest losses from security breaches result from unauthorized access, modification or theft of confidential information, as well as lack of data availability. Protecting the confidentiality, integrity and availability of data is important for database security.
Protection and defense against sensitive data leakage problem within organiza... – Alexander Decker
This document discusses sensitive data leakage problems within organizations and proposes a Data Leakage Prevention (DLP) solution. The DLP solution involves identifying, monitoring, and protecting an organization's sensitive data that is at rest, in use, and in motion. It proposes implementing procedures like monitoring user actions and properly protecting sensitive data to prevent intentional or accidental leaks. The DLP solution is considered one of the most critical security approaches for organizations to effectively protect sensitive data from being accessed or leaked to unauthorized individuals.
Protection and defense against sensitive data leakage problem within organiza... – Alexander Decker
This document summarizes a paper that proposes a Data Leakage Prevention (DLP) solution to help organizations prevent intentional or accidental leakage of sensitive data. The proposed solution involves identifying, monitoring, and protecting three types of organizational data: data at rest (stored data), data in use (data currently being processed), and data in motion (data being transmitted). It describes sensitive data that organizations need to protect, such as personal information, financial records, and research data. The solution aims to classify data protection levels and help organizations enforce policies regarding appropriate data access and transmission to reduce risks from data leakage.
The amount of data in the world keeps increasing, and computers make it easy to save it. Companies offer data storage by providing cloud services, and the amount of data stored on these servers is growing rapidly. In data mining, the data is stored electronically and the search is automated, or at least augmented, by computer. As the volume of data increases, inexorably, the proportion of it that people understand decreases alarmingly. This paper presents the data leakage problem that arises because services like Facebook and Google store all your data unencrypted on their servers, making it easy for them, or for governments and hackers, to monitor it.
iStart feature: Protect and serve – how safe is your personal data? – Hayden McCall
The revelations of the Heartbleed vulnerability in April and the recent implementation of Australia’s new privacy regime in March have put data breaches firmly back in the limelight. Clare Coulson finds out more...
A research paper published by the International Bar Association draws attention to the fact that law firms have particular data security vulnerabilities and are attractive targets for hackers due to the sensitive information they possess on clients and third parties. The authors of the paper observe that lawyers need greater engagement with data security issues to protect client and firm interests and comply with ethical obligations, as law firms currently have weaker security than clients.
Fundamentals of Information Systems Security (PDF Drive), chapter 1 – newbie2019
This document discusses the growth of the internet and increased connectivity of devices beyond just computers. It notes that as internet usage has increased, issues of privacy, data security, and protecting sensitive information have become more important for both personal and business use. The document provides an overview of common security concepts and terms to help understand how to prevent cyberattacks and secure sensitive data. It also includes a table summarizing several high-profile data breaches between 2013-2015 at companies like Target, Anthem, and Sony Pictures that compromised personal and financial information for millions of customers.
Redefining brain tumor segmentation: a cutting-edge convolutional neural netw... – IJECEIAES
Medical image analysis has witnessed significant advancements with deep learning techniques. In the domain of brain tumor segmentation, the ability to precisely delineate tumor boundaries from magnetic resonance imaging (MRI) scans holds profound implications for diagnosis. This study presents an ensemble convolutional neural network (CNN) with transfer learning, integrating the state-of-the-art Deeplabv3+ architecture with the ResNet18 backbone. The model is rigorously trained and evaluated, exhibiting remarkable performance metrics, including an impressive global accuracy of 99.286%, a high class accuracy of 82.191%, a mean intersection over union (IoU) of 79.900%, a weighted IoU of 98.620%, and a Boundary F1 (BF) score of 83.303%. Notably, a detailed comparative analysis with existing methods showcases the superiority of the proposed model. These findings underscore the model's competence in precise brain tumor localization, underscoring its potential to revolutionize medical image analysis and enhance healthcare outcomes. This research paves the way for future exploration and optimization of advanced CNN models in medical imaging, emphasizing addressing false positives and resource efficiency.
Embedded machine learning-based road conditions and driving behavior monitoring – IJECEIAES
Car accident rates have increased in recent years, resulting in losses in human lives, properties, and other financial costs. An embedded machine learning-based system is developed to address this critical issue. The system can monitor road conditions, detect driving patterns, and identify aggressive driving behaviors. The system is based on neural networks trained on a comprehensive dataset of driving events, driving styles, and road conditions. The system effectively detects potential risks and helps mitigate the frequency and impact of accidents. The primary goal is to ensure the safety of drivers and vehicles. Collecting data involved gathering information on three key road events: normal street and normal drive, speed bumps, circular yellow speed bumps, and three aggressive driving actions: sudden start, sudden stop, and sudden entry. The gathered data is processed and analyzed using a machine learning system designed for limited power and memory devices. The developed system resulted in 91.9% accuracy, 93.6% precision, and 92% recall. The achieved inference time on an Arduino Nano 33 BLE Sense with a 32-bit CPU running at 64 MHz is 34 ms and requires 2.6 kB peak RAM and 139.9 kB program flash memory, making it suitable for resource-constrained embedded systems.
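The accuracy, precision, and recall figures quoted in the abstract above all derive from the same binary confusion-matrix counts. As a quick reference, a sketch of those formulas, using hypothetical counts rather than the paper's data:

```python
def classification_metrics(tp, fp, fn, tn):
    """Standard metrics from binary confusion-matrix counts."""
    accuracy = (tp + tn) / (tp + fp + fn + tn)
    precision = tp / (tp + fp)
    recall = tp / (tp + fn)  # also called sensitivity
    return accuracy, precision, recall

# Hypothetical counts chosen only to illustrate the formulas.
acc, prec, rec = classification_metrics(tp=92, fp=6, fn=8, tn=94)
print(f"accuracy={acc:.3f} precision={prec:.3f} recall={rec:.3f}")
```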
Advanced control scheme of doubly fed induction generator for wind turbine us... – IJECEIAES
This paper describes a speed control device for generating electrical energy on an electricity network based on the doubly fed induction generator (DFIG) used for wind power conversion systems. At first, a double-fed induction generator model was constructed. A control law is formulated to govern the flow of energy between the stator of a DFIG and the energy network using three types of controllers: proportional integral (PI), sliding mode controller (SMC) and second order sliding mode controller (SOSMC). Their different results in terms of power reference tracking, reaction to unexpected speed fluctuations, sensitivity to perturbations, and resilience against machine parameter alterations are compared. MATLAB/Simulink was used to conduct the simulations for the preceding study. Multiple simulations have shown very satisfying results, and the investigations demonstrate the efficacy and power-enhancing capabilities of the suggested control system.
Neural network optimizer of proportional-integral-differential controller par... – IJECEIAES
Wide application of proportional-integral-differential (PID)-regulator in industry requires constant improvement of methods of its parameters adjustment. The paper deals with the issues of optimization of PID-regulator parameters with the use of neural network technology methods. A methodology for choosing the architecture (structure) of neural network optimizer is proposed, which consists in determining the number of layers, the number of neurons in each layer, as well as the form and type of activation function. Algorithms of neural network training based on the application of the method of minimizing the mismatch between the regulated value and the target value are developed. The method of back propagation of gradients is proposed to select the optimal training rate of neurons of the neural network. The neural network optimizer, which is a superstructure of the linear PID controller, allows increasing the regulation accuracy from 0.23 to 0.09, thus reducing the power consumption from 65% to 53%. The results of the conducted experiments allow us to conclude that the created neural superstructure may well become a prototype of an automatic voltage regulator (AVR)-type industrial controller for tuning the parameters of the PID controller.
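For readers unfamiliar with the controller being tuned, a minimal discrete PID regulator looks like the following. The gains, time step, and toy first-order plant are illustrative assumptions, not the paper's setup; the paper's contribution is a neural superstructure that adjusts the parameters of such a controller.

```python
class PID:
    """Minimal discrete PID regulator; the gains here are illustrative only."""
    def __init__(self, kp, ki, kd, dt):
        self.kp, self.ki, self.kd, self.dt = kp, ki, kd, dt
        self.integral = 0.0
        self.prev_error = 0.0

    def step(self, setpoint, measured):
        error = setpoint - measured
        self.integral += error * self.dt
        derivative = (error - self.prev_error) / self.dt
        self.prev_error = error
        return self.kp * error + self.ki * self.integral + self.kd * derivative

# Drive a toy first-order plant toward a setpoint of 1.0.
pid = PID(kp=2.0, ki=1.0, kd=0.05, dt=0.1)
y = 0.0
for _ in range(200):
    u = pid.step(1.0, y)
    y += (u - y) * 0.1  # toy plant dynamics, not a real process model
print(round(y, 3))
```

A neural optimizer of the kind the paper describes would observe the mismatch between the regulated value and the target and adjust kp, ki, and kd online instead of leaving them fixed.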
An improved modulation technique suitable for a three level flying capacitor ... – IJECEIAES
This research paper introduces an innovative modulation technique for controlling a 3-level flying capacitor multilevel inverter (FCMLI), aiming to streamline the modulation process in contrast to conventional methods. The proposed simplified modulation technique paves the way for more straightforward and efficient control of multilevel inverters, enabling their widespread adoption and integration into modern power electronic systems. Through the amalgamation of sinusoidal pulse width modulation (SPWM) with a high-frequency square wave pulse, this controlling technique attains energy equilibrium across the coupling capacitor. The modulation scheme incorporates a simplified switching pattern and a decreased count of voltage references, thereby simplifying the control algorithm.
A review on features and methods of potential fishing zone – IJECEIAES
This review focuses on the importance of identifying potential fishing zones in seawater for sustainable fishing practices. It explores features like sea surface temperature (SST) and sea surface height (SSH), along with the classification methods used to classify the data. This study underscores the importance of examining potential fishing zones using advanced analytical techniques. It thoroughly explores the methodologies employed by researchers, covering both past and current approaches. The examination centers on data characteristics and the application of classification algorithms for classifying potential fishing zones. Furthermore, the prediction of potential fishing zones relies significantly on the effectiveness of classification algorithms. Previous research has assessed the performance of models like support vector machines, naïve Bayes, and artificial neural networks (ANN); in one reported result, a support vector machine (SVM) classified fisheries test data with 97.6% accuracy compared to naïve Bayes's 94.2%. By considering recent works in this area, several recommendations for future work are presented to further improve the performance of potential fishing zone models, which is important to the fisheries community.
Electrical signal interference minimization using appropriate core material f... – IJECEIAES
As demand for smaller, quicker, and more powerful devices rises, Moore's law is strictly followed. The industry has worked hard to make little devices that boost productivity. The goal is to optimize device density. Scientists are reducing connection delays to improve circuit performance. This helped them understand three-dimensional integrated circuit (3D IC) concepts, which stack active devices and create vertical connections to diminish latency and lower interconnects. Electrical interference is a big worry with 3D integrated circuits. Researchers have developed and tested through-silicon vias (TSVs) and substrates to decrease electrical wave interference. This study illustrates a novel noise-coupling reduction method using several electrical interference models. A 22% drop in electrical interference from wave-carrying to victim TSVs introduces this new paradigm and improves system performance even at higher THz frequencies.
Electric vehicle and photovoltaic advanced roles in enhancing the financial p... – IJECEIAES
Climate change's impact on the planet forced the United Nations and governments to promote green energies and electric transportation. The deployments of photovoltaic (PV) and electric vehicle (EV) systems gained stronger momentum due to their numerous advantages over fossil fuel types. The advantages go beyond sustainability to reach financial support and stability. The work in this paper introduces a hybrid system between PV and EV to support industrial and commercial plants. This paper covers the theoretical framework of the proposed hybrid system, including the equations required to complete the cost analysis when PV and EV are present. In addition, the proposed design diagram, which sets the priorities and requirements of the system, is presented. The proposed approach allows setups to improve their power stability, especially during power outages. The presented information supports researchers and plant owners in completing the necessary analysis while promoting the deployment of clean energy. The result of a case study representing a dairy milk farm supports the theoretical work and highlights its benefits to existing plants. The short return on investment of the proposed approach supports the paper's novel approach to a sustainable electrical system. In addition, the proposed system allows for an isolated power setup without the need for a transmission line, which enhances the safety of the electrical network.
Bibliometric analysis highlighting the role of women in addressing climate ch... – IJECEIAES
Fossil fuel consumption increased quickly, contributing to climate change that is evident in unusual flooding and droughts, and global warming. Over the past ten years, women's involvement in society has grown dramatically, and they have succeeded in playing a noticeable role in reducing climate change. A bibliometric analysis of data from the last ten years has been carried out to examine the role of women in addressing climate change. The analysis's findings are discussed in relation to the sustainable development goals (SDGs), particularly SDG 7 and SDG 13. The results considered contributions made by women in various sectors while taking geographic dispersion into account. The bibliometric analysis delves into topics including women's leadership in environmental groups, their involvement in policymaking, their contributions to sustainable development projects, and the influence of gender diversity on attempts to mitigate climate change. This study's results highlight how women have influenced policies and actions related to climate change, point out areas of research deficiency, and give recommendations on how to increase the role of women in addressing climate change and achieving sustainability. To achieve more successful results, this initiative aims to highlight the significance of gender equality and encourage inclusivity in climate change decision-making processes.
Voltage and frequency control of microgrid in presence of micro-turbine inter... – IJECEIAES
The active and reactive load changes have a significant impact on voltage and frequency. In this paper, in order to stabilize the microgrid (MG) against load variations in islanding mode, the active and reactive power of all distributed generators (DGs), including energy storage (battery), diesel generator, and micro-turbine, are controlled. The micro-turbine generator is connected to the MG through a three-phase to three-phase matrix converter, and the droop control method is applied for controlling the voltage and frequency of the MG. In addition, a method is introduced for voltage and frequency control of micro-turbines in the transition from grid-connected mode to islanding mode. A novel switching strategy of the matrix converter is used for converting the high-frequency output voltage of the micro-turbine to the grid-side frequency of the utility system. Moreover, using the switching strategy, low-order harmonics in the output current and voltage are not produced, and consequently, the size of the output filter would be reduced. In fact, the suggested control strategy is load-independent and has no frequency conversion restrictions. The proposed approach for voltage and frequency regulation demonstrates exceptional performance and favorable response across various load alteration scenarios. The suggested strategy is examined in several scenarios in the MG test systems, and the simulation results are addressed.
Enhancing battery system identification: nonlinear autoregressive modeling fo... – IJECEIAES
Precisely characterizing Li-ion batteries is essential for optimizing their performance, enhancing safety, and prolonging their lifespan across various applications, such as electric vehicles and renewable energy systems. This article introduces an innovative nonlinear methodology for system identification of a Li-ion battery, employing a nonlinear autoregressive with exogenous inputs (NARX) model. The proposed approach integrates the benefits of nonlinear modeling with the adaptability of the NARX structure, facilitating a more comprehensive representation of the intricate electrochemical processes within the battery. Experimental data collected from a Li-ion battery operating under diverse scenarios are employed to validate the effectiveness of the proposed methodology. The identified NARX model exhibits superior accuracy in predicting the battery's behavior compared to traditional linear models. This study underscores the importance of accounting for nonlinearities in battery modeling, providing insights into the intricate relationships between state-of-charge, voltage, and current under dynamic conditions.
Smart grid deployment: from a bibliometric analysis to a survey – IJECEIAES
Smart grids are one of the last decades' innovations in electrical energy. They bring relevant advantages compared to the traditional grid and attract significant interest from the research community. Assessing the field's evolution is essential to propose guidelines for facing new and future smart grid challenges. In addition, knowing the main technologies involved in the deployment of smart grids (SGs) is important to highlight possible shortcomings that can be mitigated by developing new tools. This paper contributes to the research trends mentioned above by focusing on two objectives. First, a bibliometric analysis is presented to give an overview of the current research level on smart grid deployment. Second, a survey of the main technological approaches used for smart grid implementation and their contributions is presented. To that effect, we searched the Web of Science (WoS) and Scopus databases. We obtained 5,663 documents from WoS and 7,215 from Scopus on smart grid implementation or deployment. With the extraction limitation in the Scopus database, 5,872 of the 7,215 documents were extracted using a multi-step process. These two datasets have been analyzed using a bibliometric tool called bibliometrix. The main outputs are presented with some recommendations for future research.
Use of analytical hierarchy process for selecting and prioritizing islanding ...IJECEIAES
One of the problems that are associated to power systems is islanding
condition, which must be rapidly and properly detected to prevent any
negative consequences on the system's protection, stability, and security.
This paper offers a thorough overview of several islanding detection
strategies, which are divided into two categories: classic approaches,
including local and remote approaches, and modern techniques, including
techniques based on signal processing and computational intelligence.
Additionally, each approach is compared and assessed based on several
factors, including implementation costs, non-detected zones, declining
power quality, and response times using the analytical hierarchy process
(AHP). The multi-criteria decision-making analysis shows that the overall
weight of passive methods (24.7%), active methods (7.8%), hybrid methods
(5.6%), remote methods (14.5%), signal processing-based methods (26.6%),
and computational intelligent-based methods (20.8%) based on the
comparison of all criteria together. Thus, it can be seen from the total weight
that hybrid approaches are the least suitable to be chosen, while signal
processing-based methods are the most appropriate islanding detection
method to be selected and implemented in power system with respect to the
aforementioned factors. Using Expert Choice software, the proposed
hierarchy model is studied and examined.
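AHP priority weights of the kind reported above are typically derived from pairwise-comparison matrices. A minimal sketch using the row geometric-mean approximation of the priority vector, with a hypothetical 3x3 comparison matrix on the Saaty scale rather than the paper's actual judgments:

```python
import math

# Hypothetical pairwise comparisons of three criteria (e.g. cost,
# non-detected zone, response time); values are illustrative only.
M = [
    [1.0, 3.0, 5.0],
    [1 / 3, 1.0, 2.0],
    [1 / 5, 1 / 2, 1.0],
]

def ahp_weights(matrix):
    """Row geometric-mean approximation of the AHP priority vector."""
    gm = [math.prod(row) ** (1 / len(row)) for row in matrix]
    total = sum(gm)
    return [g / total for g in gm]

w = ahp_weights(M)
print([round(x, 3) for x in w])
```

Method-level weights like the 26.6% for signal processing-based techniques are then obtained by combining such criterion weights with each method's scores under every criterion.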
Enhancing of single-stage grid-connected photovoltaic system using fuzzy logi... – IJECEIAES
The power generated by photovoltaic (PV) systems is influenced by environmental factors. This variability hampers the control and utilization of solar cells' peak output. In this study, a single-stage grid-connected PV system is designed to enhance power quality. Our approach employs fuzzy logic in the direct power control (DPC) of a three-phase voltage source inverter (VSI), enabling seamless integration of the PV connected to the grid. Additionally, a fuzzy logic-based maximum power point tracking (MPPT) controller is adopted, which outperforms traditional methods like incremental conductance (INC) in enhancing solar cell efficiency and minimizing the response time. Moreover, the inverter's real-time active and reactive power is directly managed to achieve a unity power factor (UPF). The system's performance is assessed through MATLAB/Simulink implementation, showing marked improvement over conventional methods, particularly in steady-state and varying weather conditions. For solar irradiances of 500 and 1,000 W/m², the results show that the proposed method reduces the total harmonic distortion (THD) of the injected current to the grid by approximately 46% and 38% compared to conventional methods, respectively. Furthermore, we compare the simulation results with IEEE standards to evaluate the system's grid compatibility.
Enhancing photovoltaic system maximum power point tracking with fuzzy logic-b...IJECEIAES
Photovoltaic systems have emerged as a promising energy resource that
caters to the future needs of society, owing to their renewable, inexhaustible,
and cost-free nature. The power output of these systems relies on solar cell
radiation and temperature. In order to mitigate the dependence on
atmospheric conditions and enhance power tracking, a conventional
approach has been improved by integrating various methods. To optimize
the generation of electricity from solar systems, the maximum power point
tracking (MPPT) technique is employed. To overcome limitations such as
steady-state voltage oscillations and improve transient response, two
traditional MPPT methods, namely fuzzy logic controller (FLC) and perturb
and observe (P&O), have been modified. This research paper aims to
simulate and validate the step size of the proposed modified P&O and FLC
techniques within the MPPT algorithm using MATLAB/Simulink for
efficient power tracking in photovoltaic systems.
Adaptive synchronous sliding control for a robot manipulator based on neural ...IJECEIAES
Robot manipulators have become important equipment in production lines, medical fields, and transportation. Improving the quality of trajectory tracking for
robot hands is always an attractive topic in the research community. This is a
challenging problem because robot manipulators are complex nonlinear systems
and are often subject to fluctuations in loads and external disturbances. This
article proposes an adaptive synchronous sliding control scheme to improve trajectory tracking performance for a robot manipulator. The proposed controller
ensures that the positions of the joints track the desired trajectory, synchronize
the errors, and significantly reduces chattering. First, the synchronous tracking
errors and synchronous sliding surfaces are presented. Second, the synchronous
tracking error dynamics are determined. Third, a robust adaptive control law is
designed,the unknown components of the model are estimated online by the neural network, and the parameters of the switching elements are selected by fuzzy
logic. The built algorithm ensures that the tracking and approximation errors
are ultimately uniformly bounded (UUB). Finally, the effectiveness of the constructed algorithm is demonstrated through simulation and experimental results.
Simulation and experimental results show that the proposed controller is effective with small synchronous tracking errors, and the chattering phenomenon is
significantly reduced.
Remote field-programmable gate array laboratory for signal acquisition and de...IJECEIAES
A remote laboratory utilizing field-programmable gate array (FPGA) technologies enhances students’ learning experience anywhere and anytime in embedded system design. Existing remote laboratories prioritize hardware access and visual feedback for observing board behavior after programming, neglecting comprehensive debugging tools to resolve errors that require internal signal acquisition. This paper proposes a novel remote embeddedsystem design approach targeting FPGA technologies that are fully interactive via a web-based platform. Our solution provides FPGA board access and debugging capabilities beyond the visual feedback provided by existing remote laboratories. We implemented a lab module that allows users to seamlessly incorporate into their FPGA design. The module minimizes hardware resource utilization while enabling the acquisition of a large number of data samples from the signal during the experiments by adaptively compressing the signal prior to data transmission. The results demonstrate an average compression ratio of 2.90 across three benchmark signals, indicating efficient signal acquisition and effective debugging and analysis. This method allows users to acquire more data samples than conventional methods. The proposed lab allows students to remotely test and debug their designs, bridging the gap between theory and practice in embedded system design.
Detecting and resolving feature envy through automated machine learning and m...IJECEIAES
Efficiently identifying and resolving code smells enhances software project quality. This paper presents a novel solution, utilizing automated machine learning (AutoML) techniques, to detect code smells and apply move method refactoring. By evaluating code metrics before and after refactoring, we assessed its impact on coupling, complexity, and cohesion. Key contributions of this research include a unique dataset for code smell classification and the development of models using AutoGluon for optimal performance. Furthermore, the study identifies the top 20 influential features in classifying feature envy, a well-known code smell, stemming from excessive reliance on external classes. We also explored how move method refactoring addresses feature envy, revealing reduced coupling and complexity, and improved cohesion, ultimately enhancing code quality. In summary, this research offers an empirical, data-driven approach, integrating AutoML and move method refactoring to optimize software project quality. Insights gained shed light on the benefits of refactoring on code quality and the significance of specific features in detecting feature envy. Future research can expand to explore additional refactoring techniques and a broader range of code metrics, advancing software engineering practices and standards.
Smart monitoring technique for solar cell systems using internet of things ba...IJECEIAES
Rapidly and remotely monitoring and receiving the solar cell systems status parameters, solar irradiance, temperature, and humidity, are critical issues in enhancement their efficiency. Hence, in the present article an improved smart prototype of internet of things (IoT) technique based on embedded system through NodeMCU ESP8266 (ESP-12E) was carried out experimentally. Three different regions at Egypt; Luxor, Cairo, and El-Beheira cities were chosen to study their solar irradiance profile, temperature, and humidity by the proposed IoT system. The monitoring data of solar irradiance, temperature, and humidity were live visualized directly by Ubidots through hypertext transfer protocol (HTTP) protocol. The measured solar power radiation in Luxor, Cairo, and El-Beheira ranged between 216-1000, 245-958, and 187-692 W/m 2 respectively during the solar day. The accuracy and rapidity of obtaining monitoring results using the proposed IoT system made it a strong candidate for application in monitoring solar cell systems. On the other hand, the obtained solar power radiation results of the three considered regions strongly candidate Luxor and Cairo as suitable places to build up a solar cells system station rather than El-Beheira.
An efficient security framework for intrusion detection and prevention in int...IJECEIAES
Over the past few years, the internet of things (IoT) has advanced to connect billions of smart devices to improve quality of life. However, anomalies or malicious intrusions pose several security loopholes, leading to performance degradation and threat to data security in IoT operations. Thereby, IoT security systems must keep an eye on and restrict unwanted events from occurring in the IoT network. Recently, various technical solutions based on machine learning (ML) models have been derived towards identifying and restricting unwanted events in IoT. However, most ML-based approaches are prone to miss-classification due to inappropriate feature selection. Additionally, most ML approaches applied to intrusion detection and prevention consider supervised learning, which requires a large amount of labeled data to be trained. Consequently, such complex datasets are impossible to source in a large network like IoT. To address this problem, this proposed study introduces an efficient learning mechanism to strengthen the IoT security aspects. The proposed algorithm incorporates supervised and unsupervised approaches to improve the learning models for intrusion detection and mitigation. Compared with the related works, the experimental outcome shows that the model performs well in a benchmark dataset. It accomplishes an improved detection accuracy of approximately 99.21%.
Digital Twins Computer Networking Paper Presentation.pptxaryanpankaj78
A Digital Twin in computer networking is a virtual representation of a physical network, used to simulate, analyze, and optimize network performance and reliability. It leverages real-time data to enhance network management, predict issues, and improve decision-making processes.
Software Engineering and Project Management - Introduction, Modeling Concepts...Prakhyath Rai
Introduction, Modeling Concepts and Class Modeling: What is Object orientation? What is OO development? OO Themes; Evidence for usefulness of OO development; OO modeling history. Modeling
as Design technique: Modeling, abstraction, The Three models. Class Modeling: Object and Class Concept, Link and associations concepts, Generalization and Inheritance, A sample class model, Navigation of class models, and UML diagrams
Building the Analysis Models: Requirement Analysis, Analysis Model Approaches, Data modeling Concepts, Object Oriented Analysis, Scenario-Based Modeling, Flow-Oriented Modeling, class Based Modeling, Creating a Behavioral Model.
Use PyCharm for remote debugging of WSL on a Windo cf5c162d672e4e58b4dde5d797...shadow0702a
This document serves as a comprehensive step-by-step guide on how to effectively use PyCharm for remote debugging of the Windows Subsystem for Linux (WSL) on a local Windows machine. It meticulously outlines several critical steps in the process, starting with the crucial task of enabling permissions, followed by the installation and configuration of WSL.
The guide then proceeds to explain how to set up the SSH service within the WSL environment, an integral part of the process. Alongside this, it also provides detailed instructions on how to modify the inbound rules of the Windows firewall to facilitate the process, ensuring that there are no connectivity issues that could potentially hinder the debugging process.
The document further emphasizes on the importance of checking the connection between the Windows and WSL environments, providing instructions on how to ensure that the connection is optimal and ready for remote debugging.
It also offers an in-depth guide on how to configure the WSL interpreter and files within the PyCharm environment. This is essential for ensuring that the debugging process is set up correctly and that the program can be run effectively within the WSL terminal.
Additionally, the document provides guidance on how to set up breakpoints for debugging, a fundamental aspect of the debugging process which allows the developer to stop the execution of their code at certain points and inspect their program at those stages.
Finally, the document concludes by providing a link to a reference blog. This blog offers additional information and guidance on configuring the remote Python interpreter in PyCharm, providing the reader with a well-rounded understanding of the process.
Mechatronics is a multidisciplinary field that refers to the skill sets needed in the contemporary, advanced automated manufacturing industry. At the intersection of mechanics, electronics, and computing, mechatronics specialists create simpler, smarter systems. Mechatronics is an essential foundation for the expected growth in automation and manufacturing.
Mechatronics deals with robotics, control systems, and electro-mechanical systems.
Accident detection system project report.pdfKamal Acharya
The Rapid growth of technology and infrastructure has made our lives easier. The
advent of technology has also increased the traffic hazards and the road accidents take place
frequently which causes huge loss of life and property because of the poor emergency facilities.
Many lives could have been saved if emergency service could get accident information and
reach in time. Our project will provide an optimum solution to this draw back. A piezo electric
sensor can be used as a crash or rollover detector of the vehicle during and after a crash. With
signals from a piezo electric sensor, a severe accident can be recognized. According to this
project when a vehicle meets with an accident immediately piezo electric sensor will detect the
signal or if a car rolls over. Then with the help of GSM module and GPS module, the location
will be sent to the emergency contact. Then after conforming the location necessary action will
be taken. If the person meets with a small accident or if there is no serious threat to anyone’s
life, then the alert message can be terminated by the driver by a switch provided in order to
avoid wasting the valuable time of the medical rescue team.
Tools & Techniques for Commissioning and Maintaining PV Systems W-Animations ...Transcat
Join us for this solutions-based webinar on the tools and techniques for commissioning and maintaining PV Systems. In this session, we'll review the process of building and maintaining a solar array, starting with installation and commissioning, then reviewing operations and maintenance of the system. This course will review insulation resistance testing, I-V curve testing, earth-bond continuity, ground resistance testing, performance tests, visual inspections, ground and arc fault testing procedures, and power quality analysis.
Fluke Solar Application Specialist Will White is presenting on this engaging topic:
Will has worked in the renewable energy industry since 2005, first as an installer for a small east coast solar integrator before adding sales, design, and project management to his skillset. In 2022, Will joined Fluke as a solar application specialist, where he supports their renewable energy testing equipment like IV-curve tracers, electrical meters, and thermal imaging cameras. Experienced in wind power, solar thermal, energy storage, and all scales of PV, Will has primarily focused on residential and small commercial systems. He is passionate about implementing high-quality, code-compliant installation techniques.
DEEP LEARNING FOR SMART GRID INTRUSION DETECTION: A HYBRID CNN-LSTM-BASED MODELijaia
As digital technology becomes more deeply embedded in power systems, protecting the communication
networks of Smart Grids (SG) has emerged as a critical concern. Distributed Network Protocol 3 (DNP3)
represents a multi-tiered application layer protocol extensively utilized in Supervisory Control and Data
Acquisition (SCADA)-based smart grids to facilitate real-time data gathering and control functionalities.
Robust Intrusion Detection Systems (IDS) are necessary for early threat detection and mitigation because
of the interconnection of these networks, which makes them vulnerable to a variety of cyberattacks. To
solve this issue, this paper develops a hybrid Deep Learning (DL) model specifically designed for intrusion
detection in smart grids. The proposed approach is a combination of the Convolutional Neural Network
(CNN) and the Long-Short-Term Memory algorithms (LSTM). We employed a recent intrusion detection
dataset (DNP3), which focuses on unauthorized commands and Denial of Service (DoS) cyberattacks, to
train and test our model. The results of our experiments show that our CNN-LSTM method is much better
at finding smart grid intrusions than other deep learning algorithms used for classification. In addition,
our proposed approach improves accuracy, precision, recall, and F1 score, achieving a high detection
accuracy rate of 99.50%.
Generative AI Use cases applications solutions and implementation.pdfmahaffeycheryld
Generative AI solutions encompass a range of capabilities from content creation to complex problem-solving across industries. Implementing generative AI involves identifying specific business needs, developing tailored AI models using techniques like GANs and VAEs, and integrating these models into existing workflows. Data quality and continuous model refinement are crucial for effective implementation. Businesses must also consider ethical implications and ensure transparency in AI decision-making. Generative AI's implementation aims to enhance efficiency, creativity, and innovation by leveraging autonomous generation and sophisticated learning algorithms to meet diverse business challenges.
https://www.leewayhertz.com/generative-ai-use-cases-and-applications/
Int J Elec & Comp Eng, ISSN: 2088-8708, Vol. 10, No. 4, August 2020: 3615-3622
Data loss prevention by using MRSH-v2 algorithm (Basheer Husham Ali)
reported before 2013. According to the Open Security Foundation (OSF), more than one thousand
data-loss incidents occurred during 2013 [1]. Moreover, more than two hundred million login
credentials were compromised before 2012 according to a Symantec report [2]. In addition, more
than 165 million documents were stolen during 2011 according to Verizon [3]. Finally, more than
60 million sensitive files were breached at Sony, and more than 5 million login records of
LinkedIn users were exposed during 2012 [1]. Because of these dangers, researchers have designed
many useful tools against data breaches. Data Loss Prevention (DLP) is one kind of data breach
prevention: it is used to identify sensitive data and prevent it from falling into the wrong
hands [4]. This tool can protect sensitive data at rest, which is data stored on an end-user
system such as a hard disk or removable disk. It can also identify, monitor, and protect data in
use, which is data being actively processed, and protecting data in motion, the data transmitted
over the network, is a further goal of DLP [5].
There are other security tools besides DLP that can be used in this regard, such as
Intrusion Prevention Systems (IPS) and Intrusion Detection Systems (IDS). Although all of them
are security tools, there is a key difference: DLP is responsible only for capturing and
identifying sensitive information, whereas IPS and IDS address all kinds of dangers or threats
that data may face. DLP has two main phases for identifying important or sensitive information.
The first is generating fingerprints or predefined patterns by extracting certain features from
known files. The second is comparing the fingerprints of new files with the existing fingerprints
derived in the first stage. Finally, if the comparison is positive, the detected file may be
encrypted, blocked, removed, or transferred to a safe place [6, 7].
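The two-phase workflow above can be sketched with a toy similarity fingerprint: chunk a file at content-defined boundaries and hash the chunks (phase one), then score a new file against the stored hashes (phase two). This is a deliberately simplified illustration, not the actual mrsh-v2 algorithm; the chunking parameters, helper names, and sample data are all invented for the example.

```python
import hashlib

# Toy parameters (illustrative only; mrsh-v2 itself uses a rolling hash
# together with Bloom filters, not this simplified scheme).
WINDOW = 7      # rolling window size in bytes
MODULUS = 64    # controls the average chunk size

def chunks(data: bytes):
    """Split data into content-defined chunks using a simple rolling sum."""
    start = 0
    for i in range(len(data)):
        # A chunk boundary triggers when the window sum hits a magic value.
        if i > start and sum(data[max(0, i - WINDOW):i + 1]) % MODULUS == 0:
            yield data[start:i + 1]
            start = i + 1
    if start < len(data):
        yield data[start:]

def fingerprint(data: bytes) -> set:
    """Phase 1: the fingerprint is the set of (truncated) chunk hashes."""
    return {hashlib.sha1(c).hexdigest()[:8] for c in chunks(data)}

def similarity(fp_a: set, fp_b: set) -> float:
    """Phase 2: Jaccard score between two fingerprints (1.0 = identical sets)."""
    if not fp_a or not fp_b:
        return 0.0
    return len(fp_a & fp_b) / len(fp_a | fp_b)

# A stored fingerprint of a known sensitive file vs. a slightly edited copy.
known = fingerprint(b"confidential customer ledger 2020 " * 40)
probe = fingerprint(b"confidential customer ledger 2020 " * 40 + b"appended note")
print(similarity(known, probe))  # similar files share most chunk hashes
```

A positive (high) score would then trigger one of the actions above, such as blocking or encryption.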
Furthermore, DLP tools were first used in 2006. They are not a final solution for data breaches,
but they are considered good security tools for eliminating malicious activities and protecting
sensitive information [8]. In the market, DLP may appear under other names, such as information
monitoring and prevention, information loss protection, data analyzing and prevention, and/or
data leakage prevention [5]. Many well-known companies have developed DLP tools. For example,
Palo Alto Networks produced a security tool that protects data in motion by analyzing,
monitoring, and detecting sensitive information transmitted over wired or wireless links.
AmXecure released a security design called PrivacyID to identify data leakage [7]. RSA and
McAfee offer some of the best DLP tools for identifying information leakage, as stated
in [9, 10]. Finally, Websense and McAfee designed DLP tools with three stages: data control,
data endpoint, and data identification [11, 12]. The rest of the paper is organized as follows:
section 2 explains the popular reasons that lead to data leakage; section 3 presents the
categories in which sensitive data can be stored; section 4 presents the solution methods in
detail; sections 5 and 6 cover the implementation and evaluation of mrsh-v2, respectively; and
the conclusion is presented in the last section.
2. POPULAR REASONS FOR DATA LOSS
Well-known organizations and companies lose their reputations and billions of dollars due to
breaches of sensitive information. There are two primary causes of data breaches: internal and
external threats.
2.1. Internal threats
First of all, internal threats are a primary cause of data leakage, and human mistakes are at
the top of this category. For instance, negligent employees who leave their computers, mobile
phones, tablets, or other devices in public transportation or in public places such as
restaurants, markets, or stores cost their companies a great deal of money if these devices fall
into the wrong hands. Removable devices such as flash drives or disk drives left in an internet
café belong to the same kind of threat. Stolen computers accounted for more than forty percent
of data-loss incidents, and, as reported in [13], such problems cause almost more than
forty-five percent of data leakage in the healthcare sector [14]. According to statistics, human
error was the main cause of information leakage during 2014 [15].
According to a recent survey by the Group of Healthcare Organizations and Ethics
Association, which represents a collection of expert researchers in this field, almost forty
percent of data-breach incidents happened because of misplaced documents such as important
files, and almost thirty percent were caused by lost removable devices [16]. As also stated
in [16], the public counsel office of the state of Massachusetts fined Goldthwait, a healthcare
billing company, more than one hundred and thirty thousand dollars after police officers found,
in a public trash bin, a USB disk belonging to one of the company's employees that contained
more than 65,000 sensitive patient records.
In addition, text messages and e-mails are another channel of information leakage. Staff who
work in places such as hospitals, schools, universities, companies, and organizations may send
important documents containing sensitive data outside the boundaries of their workplace using
e-mail services such as Gmail, Yahoo, or Hotmail, or other communication applications such as
Viber, Facebook, WhatsApp, Telegram, or Instagram. They may also send such documents to the
wrong destination; these errors are classed as miscellaneous mistakes [14]. Either way, this can
lead to data breaches. Furthermore, staff may leave a job and start a new one with another
organization, taking sensitive data from the previous employer with them and exposing it to
others or to the new employer. According to statistics, almost fifty percent of staff take
confidential data from their previous job when they leave [5]. Atul Malhotra, for example, was
charged with passing important data from his previous employer, IBM, to his new one, HP [5].
Staff of certain companies believe that their important documents are destroyed when they delete
them. In fact, these files can be recovered with certain programs and used against their
companies unless they are removed permanently. For instance, devices such as computers,
scanners, printers, and phones all contain internal memory, and the data in that memory can be
recovered even after users delete it. Finally, using weak algorithms helps employees with bad
intentions to take data, and incorrect configuration of an organization's systems invites
attackers to perform malicious activities such as unauthorized access to the system, according
to a Verizon report from 2008 [17, 18].
2.2. External threats
Not only internal threats but also external threats have a large effect on data breaches.
Attackers want data so that they can exploit its owners and induce them to pay for it.
Therefore, they have developed sophisticated applications to deceive users of large
organizations such as healthcare companies, financial institutions, and other businesses.
Point-of-Sale (POS) intrusions are one type of external threat.
A POS attack is a malicious activity that captures and gathers people's debit and credit
card data at a market checkout. Sensitive data such as card numbers, expiration dates, and
passwords can be collected into one file by attackers. According to published statistics, almost
more than 10 gigabytes of data were breached at the checkouts of Target, one of the largest
store chains in the USA: almost forty million credit and debit card records were stolen as
customers swiped their cards at checkout, and the names, dates of birth, and addresses of
seventy million people were also taken in the same attack [14, 19]. Finally, not only Target but
also Neiman Marcus was hit by POS attacks, in which two thousand credit card records were
breached [19].
Moreover, another kind of external danger is crimeware. Attackers may deceive users into
installing malicious applications on their electronic devices without their knowledge in order
to obtain their data. They can do this by sending poisoned e-mails containing malicious
applications to victims, or victims may visit compromised websites by mistake. As soon as a
victim downloads and installs the malicious application, attackers can spy on, monitor, or phish
their data [20]. Finally, there are many other external threats that lead to data breaches. One
of them is the Distributed Denial of Service (DDoS) attack, which statistics show has increased
over the last decade [14]. DDoS attacks overload servers with a very large number of packets in
order to shut down services to legitimate users. Based on a report of the Worldwide
Infrastructure Association, seventy-five percent of bank data leakage was caused by DDoS
attacks [21].
3. PROTECTION OF IMPORTANT INFO
Data is very important not only to legitimate owners but also to attackers. Many DLP vendors
have documented the threats that face data wherever it is stored; Vonto is one of them. It
stated that almost two out of every eight hundred texts transmitted over the air may comprise
sensitive data, and two out of every hundred network messages carry private data. Vonto also
mentioned in its report that eight out of ten companies lost data stored on their laptops, and
two out of four companies lost information stored on removable drives [8]. Important data can be
stored in three different categories: data in motion, data at rest, and data in use. In the next
three subsections, these categories are presented in detail.
3.1. Data in motion
Data in motion is information transmitted over the network, whether wireless or wired. This data
may be electronic books, Word documents, Excel spreadsheets, pictures, videos, audio, or
PowerPoint slides. Attackers may intercept communications between two legitimate users in order
to steal data, either by deploying malicious applications or by exploiting vulnerabilities in
network algorithms, applications, or protocols. DLP solutions are therefore very important for
monitoring the packets transferred over the channel. DLP provides feedback or a report to the
network administrator if there is any malicious activity, so administrators can take actions
such as blocking, filtering, or encrypting the data.
3.1.1. Network monitor
DLP can perform many tasks, as mentioned earlier; one of them is network monitoring. In general,
there are two kinds of network monitor: passive and active. A DLP monitor is specifically a
passive one. An active monitor can be deployed on both the client and server sides; the server
side can produce a full report about the transmission medium, including the number of packets,
packet loss, throughput, bandwidth, protocol type, delay time, and the source and destination
IP addresses and port numbers. A passive monitor, on the other hand, is a device deployed on
only one side (either the client side or the server side) [22]. It is used as a packet sniffer
that gathers information in order to analyze and examine packets or flows and identify malicious
activities, which can increase the confidentiality, performance, and efficiency of the network.
DLP resembles an Intrusion Detection System (IDS) in that it identifies malicious acts and
notifies the person in charge of the network; IDS, however, is not designed to prevent data
leakage [23]. Finally, the network monitor of a DLP can be deployed near the router or switch
that connects all devices together, in order to control all incoming and outgoing packets [24].
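The per-flow statistics that such a passive monitor gathers can be illustrated with a small aggregation over captured packet metadata. The packet tuples below are invented sample data; a real sniffer would supply these fields from live traffic.

```python
from collections import defaultdict

# Each captured packet is modelled as (src_ip, dst_ip, src_port, dst_port,
# proto, size_in_bytes). A real passive monitor would obtain these fields
# from a packet-capture library; here they are hard-coded sample values.
packets = [
    ("10.0.0.5", "192.168.1.9", 51514, 443, "TCP", 1500),
    ("10.0.0.5", "192.168.1.9", 51514, 443, "TCP", 900),
    ("10.0.0.7", "192.168.1.9", 40000, 80, "TCP", 600),
]

# Aggregate packets into flow records keyed by the classic 5-tuple.
flows = defaultdict(lambda: {"packets": 0, "bytes": 0})
for src, dst, sport, dport, proto, size in packets:
    key = (src, dst, sport, dport, proto)
    flows[key]["packets"] += 1
    flows[key]["bytes"] += size

for key, stats in flows.items():
    print(key, stats)   # per-flow counters for later analysis by the DLP
```

Flow records like these are what the DLP then inspects for signs of sensitive data leaving the network.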
3.1.2. FTP protocol and E-mail
E-mail is the most common way for people around the world to send data over the internet. Users
can transfer many different kinds of files, which puts e-mail at the top of the data-leakage
priority list. Users may receive poisoned links, photos, or other documents, or executable
programs such as bots. As soon as users download these files onto their devices, their important
data may be in danger. As a result, DLP solutions protect e-mail content in several ways.
One way is that some DLP tools force e-mail applications to accept only small attachments
instead of large ones, which guarantees that an employee cannot send a large volume of important
data to external parties. Furthermore, DLP tools can notify the people in charge of the network
when attacks happen. Finally, data can be encrypted, or even blocked, if there is any suspicion.
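As a rough illustration of these e-mail controls, the sketch below combines a size policy with a crude regular-expression check for card-like numbers. The size limit, the pattern, and the function name are all hypothetical; production DLP engines use far more robust detectors than a single regex.

```python
import re

MAX_ATTACHMENT_BYTES = 1_000_000                       # illustrative size policy
CARD_PATTERN = re.compile(r"\b(?:\d[ -]?){13,16}\b")   # crude card-number pattern

def inspect_outgoing(attachment: bytes) -> str:
    """Return a DLP action for an outgoing e-mail attachment (toy logic)."""
    if len(attachment) > MAX_ATTACHMENT_BYTES:
        return "block: attachment exceeds size policy"
    if CARD_PATTERN.search(attachment.decode(errors="ignore")):
        return "quarantine: possible card number, notify administrator"
    return "allow"

print(inspect_outgoing(b"quarterly notes, nothing sensitive"))
print(inspect_outgoing(b"card 4111 1111 1111 1111 exp 12/24"))
```

The three return values correspond to the DLP actions described above: blocking oversized transfers, alerting the administrator on suspicion, and letting clean traffic through.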
Important documents can also be sent using another method, the file transfer protocol (FTP).
This protocol faces security challenges: data may be changed or breached on the server side when
malicious actors compromise it. For instance, important data about the American army in Iraq was
exposed by the Associated Press because of the lack of security in the FTP protocol [25], and
almost eight thousand records belonging to SAIC were exposed in another FTP incident, as stated
in [25]. DLP tools address FTP problems, but they are not enough to eliminate them on their own.
Therefore, Managed File Transfer (MFT), which can be used to transfer documents safely, may work
alongside DLP to reduce the dangers of FTP [26].
3.1.3. Filtering, bridge, and blocking solutions
Another action that can be taken after DLP detects a potential leak is blocking the data from
being breached. Packets carrying data that is not identified as sensitive are allowed to pass
through the DLP, while packets containing important information can be blocked. Blocking,
filtering, and installing a bridge are all techniques that can be used in this regard.
First of all, a bridge is a network device that connects computer devices to form a network; it can also connect an internal network with outside networks. A bridge can perform deep content analysis on the packets passing through it and stop forwarding a packet if it finds important data [5]. The device groups incoming packets into a flow table, where each flow is a group of packets sharing the same characteristics, such as source IP address, destination IP address, source MAC address, and destination MAC address. The bridge depends mainly on recording and storing the incoming source and destination MAC addresses in its table. After a while, it has enough information to decide which traffic should be blocked based on MAC addresses, which raises the security level [18].
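The flow-table behavior described above can be sketched as follows; the key tuple, the b"CONFIDENTIAL" content marker, and the forward/drop verdicts are illustrative assumptions rather than any real bridge's interface.

```python
from collections import defaultdict

# Toy flow table in the spirit of the bridge described above: flows are
# keyed by (src MAC, dst MAC, src IP, dst IP), and once a flow is seen
# carrying sensitive content, all its subsequent packets are dropped.
flow_stats = defaultdict(lambda: {"packets": 0, "blocked": False})

def handle_packet(src_mac, dst_mac, src_ip, dst_ip, payload: bytes):
    key = (src_mac, dst_mac, src_ip, dst_ip)
    entry = flow_stats[key]
    entry["packets"] += 1
    if b"CONFIDENTIAL" in payload:   # stand-in for deep content analysis
        entry["blocked"] = True
    return "drop" if entry["blocked"] else "forward"

print(handle_packet("aa:1", "bb:2", "10.0.0.5", "203.0.113.9",
                    b"quarterly CONFIDENTIAL figures"))  # drop
```

Keeping the verdict per flow, rather than per packet, is what lets the bridge block the remainder of a transfer after the first sensitive packet is spotted.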
A proxy server is a device with high-end characteristics such as a fast microprocessor, large random access memory (RAM), and a large amount of storage space. It may be deployed between the internal network and the external network to perform deep inspection. DLP can obtain the packets passing through the proxy server for analysis: the Internet Content Adaptation Protocol (ICAP) running in the proxy can send a copy of the packet flow to the DLP to identify sensitive information. Finally, the DLP can fully inspect and analyze incoming packets as mentioned earlier, and take action when sensitive data is detected, such as blocking, filtering, or encryption. The tool may also break the communication between the two sides, for example by sending a TCP Reset (RST) packet to terminate the connection [5, 27].
3.2. Data at rest
Data at rest is inactive data that is not being used in the system at a given time. It may be stored in many forms, such as Word documents, Excel spreadsheets, electronic books (PDF), PowerPoint slides (PPT), videos, audio, images, or other file types. These data can be stored on different kinds of devices, such as laptops, desktop computers, database servers, workstations, internal or external hard drives, tape drives, cloud systems, phones, iPads, or iPods [4].
5. Int J Elec & Comp Eng ISSN: 2088-8708
Data loss prevention by using MRSH-v2 algorithm (Basheer Husham Ali)
3619
3.3. Data in use
The best description of data in use is any data that is accessed while the system is active; in other words, all data that users deal with when they use their devices. For instance, data held in random access memory (RAM) is considered data in use: RAM is empty when the system is off, but as soon as users start the system and run programs, data is loaded into it. As another example, the central processing unit (CPU) contains a few kinds of small registers, each with special tasks such as storing temporary results, pointers, memory addresses, and increment or decrement counters; these are data in use that are active while the system works. Data stored on offline media that are used while the system is active, such as DVDs, CDs, and Blu-ray discs, also counts as data in use, as does data on floppy disks, internal or external hard drives, and removable disks if they are used during system execution. Data open in office applications such as Word, Excel, PowerPoint, or Outlook is data in use, and data written to a terminal by running programs in Java, C, C++, Python, or MATLAB may also be of this type [4, 5]. DLP is a good solution for this kind of data. For example, one DLP measure is applying constraints on machines to prevent data loss, such as limiting the use of programs that let users transmit important information outside their devices. Finally, DLP tools can restrict employees to limited access to data content in order to eliminate data leakage [5].
4. TECHNIQUES FOR DLP
DLP tools may use various methods to analyze data content. According to the SANS Institute in [28], seven types of methods can be used to implement DLP tools. First of all, the most popular method is called regular expressions, or rule-based matching. It is based on looking for specific sensitive patterns such as social security numbers, user names, e-mail addresses, debit or credit card numbers, or phone numbers. This technique is suitable for identifying patients' records, employees' records, electricity bills, phone bills, hospital bills, or bank statements. However, it is not appropriate for detecting images, videos, or audio, and its false positive rate is high: a large share of the data it flags does not actually match the original sensitive data.
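A minimal sketch of the regular-expression approach, with deliberately simplified patterns (real rule sets are far stricter than these):

```python
import re

# Hypothetical rule set: each rule maps a label to a pattern for a common
# sensitive-data format. These patterns are simplified for illustration.
RULES = {
    "ssn": re.compile(r"\b\d{3}-\d{2}-\d{4}\b"),
    "email": re.compile(r"\b[\w.+-]+@[\w-]+\.[\w.]+\b"),
    "card": re.compile(r"\b(?:\d{4}[ -]?){3}\d{4}\b"),
}

def scan(text):
    """Return the labels of every rule that matches the text."""
    return [label for label, pattern in RULES.items() if pattern.search(text)]

print(scan("Contact alice@example.com, SSN 123-45-6789"))  # ['ssn', 'email']
```

A DLP engine built this way would act on the returned labels (block, encrypt, or alert); the high false positive rate comes from patterns like the card rule, which matches any 16-digit run whether or not it is a real card number.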
In addition, another method for identifying important data is database fingerprinting. It looks for a combination of important fields matching data that is available in the database. The combination might be any set of fields, such as debit card numbers and full names, full names and phone numbers, or e-mail addresses and card numbers. This method is well suited to identifying a specific selection of sensitive data, and the share of flagged data that does not match the original sensitive data is small; in other words, it produces a low false positive rate. However, like the previous approach, it is not suitable for identifying unstructured data such as videos, images, or audio.
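Database fingerprinting can be sketched as hashing selected field combinations and checking outgoing content against that index; the records and field choices below are invented for illustration.

```python
import hashlib

# Hypothetical customer records; a real deployment would fingerprint
# selected column combinations from the production database.
records = [("Alice Smith", "4111111111111111"),
           ("Bob Jones", "5500005555555559")]

def fingerprint(fields):
    """Hash a normalized combination of fields (e.g. name + card number)."""
    joined = "|".join(f.lower().strip() for f in fields)
    return hashlib.sha256(joined.encode()).hexdigest()

index = {fingerprint(r) for r in records}

def leaks_record(name, card):
    """True only if this exact combination exists in the fingerprint index."""
    return fingerprint((name, card)) in index

print(leaks_record("Alice Smith", "4111111111111111"))  # True
print(leaks_record("Alice Smith", "0000000000000000"))  # False
```

Because only hashes of the exact combinations are flagged, a name or a card number appearing alone does not trigger a hit, which is what keeps the false positive rate low.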
Moreover, another approach used for DLP is exact file matching. It computes fingerprint signatures for documents that contain important data and then compares these signatures with those of new files to find matches. It can be used with all kinds of documents and has a low false negative rate; the Mrsh-v1, ssdeep, and sdhash algorithms can be used to implement it. Furthermore, partial document matching is another technique, which searches for complete or incomplete matches with sensitive data; the rolling hash method is an example of this approach. It is effective in detecting text data, but it is not suitable for videos, photos, or audio.
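The rolling hash idea behind partial document matching can be sketched as follows; the window size, base, and modulus are illustrative choices, not taken from any particular product.

```python
# Minimal polynomial rolling hash over a sliding window, the kind of
# primitive that partial-document matching builds on.
BASE, MOD, WIN = 257, 1_000_000_007, 8

def window_hashes(data: bytes):
    """Yield (offset, hash) for every WIN-byte window, updating in O(1)."""
    if len(data) < WIN:
        return
    h = 0
    top = pow(BASE, WIN - 1, MOD)      # weight of the byte leaving the window
    for i, b in enumerate(data):
        if i >= WIN:                    # slide: drop data[i-WIN], add b
            h = (h - data[i - WIN] * top) % MOD
        h = (h * BASE + b) % MOD
        if i >= WIN - 1:
            yield i - WIN + 1, h

secret = b"confidential salary table"
known = {h for _, h in window_hashes(secret)}
outgoing = b"see attached confidential salary figures"
hits = [off for off, h in window_hashes(outgoing) if h in known]
print(hits)  # offsets where an 8-byte window of the secret reappears
```

Even though the outgoing text is not an exact copy of the secret document, the shared windows are found, which is exactly the partial match this technique is after.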
In addition, another technique is the statistical method, which is based on mathematical equations and statistics; a Bayesian approach can be used to implement it. This method is suitable for very large datasets, but it generates high false positive and false negative rates, and it requires a massive dataset to produce accurate results.
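A toy Bayesian classifier of the kind this method relies on can be sketched as follows; the training snippets and the smoothing constant are invented purely for illustration.

```python
import math
from collections import Counter

# Tiny invented corpora standing in for labeled training data.
SENSITIVE = ["salary report confidential", "patient record number"]
NORMAL = ["lunch menu today", "team meeting agenda"]

def model(docs):
    """Return a smoothed log-probability function over the class's words."""
    counts = Counter(w for d in docs for w in d.split())
    total = sum(counts.values())
    def logp(word):  # Laplace-style smoothing so unseen words score nonzero
        return math.log((counts[word] + 1) / (total + 1000))
    return logp

p_sens, p_norm = model(SENSITIVE), model(NORMAL)

def is_sensitive(text):
    """Compare summed log-likelihoods under each class (equal priors)."""
    words = text.split()
    return sum(p_sens(w) for w in words) > sum(p_norm(w) for w in words)

print(is_sensitive("confidential salary data"))  # True
```

With only a handful of training documents the decision boundary is crude, which mirrors the text's point that this method needs a massive dataset before its error rates become acceptable.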
The conceptual/lexicon method is another technique that can be used to implement DLP. It is a collection of rules and dictionaries used to find suspicious behavior and detect important data. It is appropriate for detecting sexual harassment, private trading through a work account, and illegal stock exchange practices, but it generates a high false positive rate. Finally, the last method is categories: a combination of the previous methods used to implement a DLP model, for example looking for a specific e-mail address, a specific username, and one full name. According to [28], exact file matching is the best approach. Thus, mrsh-v2 is the algorithm implemented in the next section to show the capabilities of this method.
6. ISSN: 2088-8708
Int J Elec & Comp Eng, Vol. 10, No. 4, August 2020 : 3615 - 3622
3620
5. IMPLEMENTATION
As mentioned earlier, approximate file matching is one technique that can be used to identify data breaches. It works by identifying leaked files such as PDF or Word documents. The technique has two phases. The first is generating fingerprints, or signatures, by extracting certain features from known files. The second is comparing the fingerprints of new files with the existing fingerprints derived in the first stage. The comparison result falls within the range 0 to 100: when the similarity score of two files is high, the files are similar to each other, and when it is low, they are not. This paper implements and evaluates Mrsh-v2 (multi-resolution similarity hashing), an algorithm based on the approximate matching technique [6, 29].
The first stage of the algorithm works as follows. The input, a sequence of bytes (b1, b2, ..., bn), is processed through sliding windows of 7 bytes each. These windows (w1, w2, ..., wm) are generated using the idea of the rolling hash algorithm, which simply removes the first byte of the old window and appends a new byte to form the next one. The end of a chunk is determined by evaluating a pseudo-random function PRF over each window against the chunk size parameter c: if PRF(window) == -1 mod c, the current chunk ends; otherwise, bytes keep being added to it. Each chunk is then hashed using the FNV algorithm. The design goal is an average chunk size of 160 bytes and a fingerprint of about 0.5% of the input size [6]. The second phase is implemented using a Bloom filter, a method for answering set-membership queries that depends on a set of input values and independent hash functions. Imagine a bit array E of n elements, all initially set to false, and a family of independent hash functions Hf = (Hf1, Hf2, ..., Hfk), each returning a number between 0 and n-1. For each element e inserted into the set, each result of Hf(e) is used as an index into E, and the bit at that position is set to true.
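As a rough illustration of this first stage, the sketch below uses a rolling hash as a stand-in PRF, cuts a chunk when its value is congruent to -1 mod c, and hashes each chunk into a Bloom-filter fingerprint. The constants and the SHA-256 chunk hash are illustrative substitutes, not MRSH-v2's actual FNV-based parameters.

```python
import hashlib

# Illustrative parameters: 7-byte window as in the text; c, the filter
# width, and the per-chunk bit count are NOT MRSH-v2's real constants.
WIN, C, MOD = 7, 64, 1 << 32
NBITS, HF = 2048, 4

def chunks(data: bytes):
    """Split data at content-defined boundaries (PRF == -1 mod c)."""
    out, start, h = [], 0, 0
    top = pow(31, WIN - 1, MOD)              # weight of the byte leaving the window
    for i, b in enumerate(data):
        if i >= WIN:                         # slide the 7-byte window
            h = (h - data[i - WIN] * top) % MOD
        h = (h * 31 + b) % MOD
        if i >= WIN - 1 and h % C == C - 1:  # boundary test
            out.append(data[start:i + 1])
            start = i + 1
    if start < len(data):
        out.append(data[start:])             # trailing chunk
    return out

def make_fingerprint(data: bytes):
    """Bloom filter of NBITS slots, all initially false; each chunk sets HF bits."""
    bits = [False] * NBITS
    for ch in chunks(data):
        digest = hashlib.sha256(ch).digest()  # stand-in for the FNV chunk hash
        for k in range(HF):
            pos = int.from_bytes(digest[4*k:4*k + 4], "big") % NBITS
            bits[pos] = True
    return bits

doc = b"".join(hashlib.sha256(bytes([i])).digest() for i in range(64))  # 2 KiB sample
print(len(chunks(doc)), sum(make_fingerprint(doc)))  # chunk count, bits set
```

Because boundaries depend on content rather than fixed offsets, editing a file disturbs only the chunks around the edit, so most of the fingerprint survives, which is what makes similarity comparison possible.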
The second stage compares the fingerprints of new files with the existing fingerprints. To implement it, two Bloom filters E1 and E2 are used. Let e1 and e2 be the numbers of bits set to true in E1 and E2, respectively, and let k be the number of bits set to true in both. To compare E1 and E2, k is measured against a score threshold S: if k is larger than S, the similarity score is high; otherwise it is 0. S is computed from the maximum (Max) and minimum (Min) numbers of bits expected to overlap by chance between E1 and E2. Therefore,
𝑆 = 𝛽 ∗ ( 𝑀𝑎𝑥 − 𝑀𝑖𝑛) + 𝑀𝑖𝑛 (1)
where β = 0.3, a value determined experimentally. Max is defined as in (2):
𝑀𝑎𝑥 = 𝑚𝑖𝑛 (𝑒1, 𝑒2) (2)
However, Min can be calculated as in (3).
Min = n ∗ (1 − q^(Hf∗e1$) − q^(Hf∗e2$) + q^(Hf∗(e1$+e2$))) (3)
where n is the number of bits of the Bloom filter and Hf is the number of hash functions, as mentioned earlier in the description of the Bloom filter; e1$ is the number of elements inserted into Bloom filter E1, and e2$ is the number of elements inserted into E2. Finally, q is the probability that a given bit remains false (zero) in the Bloom filter when one new element is hashed once, defined as in (4).
𝑞 = (1 − 1/𝑛) (4)
Therefore, we can use (5) to find the similarity ratio (Sim) between two Bloom filters:
Sim = 0 if k ≤ S, and Sim = 100 ∗ (k − S) / (Max − S) otherwise (5)
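The scoring above can be transcribed directly into code. The sketch below assumes the symbols as defined in the text (n bits, Hf hash functions, the element counts e1$/e2$ written e1s/e2s, and the set-bit counts e1, e2, k); the numbers in the example calls are purely illustrative.

```python
# Direct transcription of Eqs. (1)-(5) for the Bloom-filter comparison.
def similarity(n, Hf, e1s, e2s, e1, e2, k, beta=0.3):
    q = 1 - 1 / n                                                    # Eq. (4)
    Min = n * (1 - q**(Hf*e1s) - q**(Hf*e2s) + q**(Hf*(e1s + e2s)))  # Eq. (3)
    Max = min(e1, e2)                                                # Eq. (2)
    S = beta * (Max - Min) + Min                                     # Eq. (1)
    return 0 if k <= S else 100 * (k - S) / (Max - S)                # Eq. (5)

# Identical fingerprints (k equals both bit counts) score 100;
# an overlap no bigger than chance scores 0.
print(similarity(n=2048, Hf=5, e1s=100, e2s=100, e1=443, e2=443, k=443))
print(similarity(n=2048, Hf=5, e1s=100, e2s=100, e1=443, e2=443, k=96))
```

Note how S interpolates between the chance overlap Min and the best possible overlap Max, so the score only rises once the shared bits exceed what two unrelated files would share by accident.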
6. EVALUATION
The Mrsh-v2 algorithm was evaluated using a confusion matrix [29, 30]. A confusion matrix offers many metrics that can be used in the evaluation: true positive (TP), false positive (FP), false negative
(FN), true positive rate (TPR), also called sensitivity or recall, false positive rate (FPR), also called fall-out or probability of false alarm, and false negative rate (FNR), or miss rate, are the metrics used here. TP counts files identified correctly by the algorithm; in other words, a detected file really is similar to one stored in the database. Sensitivity, or recall, is the rate of files in the database that the algorithm successfully detects. FP counts files that mrsh-v2 detects even though they do not exist in the database, and fall-out, or probability of false alarm, is the corresponding rate of such spurious detections. FN counts files that mrsh-v2 fails to detect although they are available in the database, and the miss rate is the corresponding rate of such missed files.
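Under these standard definitions, the rates can be computed as follows. Note that a true negative (TN) count, which the text does not report, is needed for the FPR denominator; the numbers below are illustrative, not the paper's measurements.

```python
# Standard confusion-matrix rates as defined above.
def rates(tp, fp, fn, tn):
    tpr = tp / (tp + fn)   # sensitivity / recall
    fpr = fp / (fp + tn)   # fall-out / probability of false alarm
    fnr = fn / (fn + tp)   # miss rate (equals 1 - TPR)
    return tpr, fpr, fnr

print(rates(tp=85, fp=14, fn=15, tn=86))  # (0.85, 0.14, 0.15)
```

Because FNR is simply 1 − TPR, a high sensitivity automatically implies a low miss rate, which is the pattern reported for mrsh-v2 below.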
Mrsh-v2 was evaluated in a network environment using the TS dataset, which is publicly available online. The dataset contains several kinds of files, such as pdf, exe, doc, gif, xls, ppt, and txt. Approximately three thousand files from the TS dataset were transferred over the network, generating almost two hundred and ninety thousand packets. As shown in Table 1, 249670 out of 290314 packets carrying files from the dataset were detected correctly by the mrsh-v2 algorithm. However, 40643 packets that transferred files were flagged by mrsh-v2 even though those files are not available in the dataset. On the other hand, 29031 packets carrying files were not discovered by mrsh-v2 although those files are available in the dataset. The rest of Table 1 details the packet counts for each file type in terms of FP, TP, and FN.
Finally, the TPR, or sensitivity, was 0.85 out of 1, as shown in Figure 1, meaning that a large share of the different kinds of files was detected correctly by mrsh-v2. The FPR and FNR were 0.14 and 0.1 out of 1, respectively, meaning that mrsh-v2 produces a low rate of errors overall for different file types. In particular, the algorithm detects jpg, gif, and pdf files well, since their TPR is high and their FNR and FPR are low, as shown in Figure 1.
Table 1. Values of FP, TP, and FN
File Type    FP       TP       FN
jpg          31934    278701   8709
gif          29031    281604   5806
doc          40643    214832   72578
xls          52256    235154   52256
ppt          40643    267088   20321
pdf          34837    278701   11612
txt          40643    252573   34837
exe          52256    243863   43547
Total        40643    249670   29031
Figure 1. FPR, TPR, and FNR values by using Mrsh-v2
7. CONCLUSION
It is very important to protect data wherever it is stored. Data exists in three main kinds: data in motion, data at rest, and data in use. Data is valuable not only to its legitimate owners but also to attackers. Data Loss Prevention (DLP) tools are a good way to identify sensitive data. DLP can analyze
data content and send feedback to administrators so they can make decisions such as filtering, deleting, or encrypting. There are many kinds of DLP techniques, and one of the best is approximate file matching; the Mrsh-v2 algorithm is an example of this approach. It was implemented and evaluated using the publicly available TS dataset. The confusion matrix results showed that the algorithm has high TP and TPR, while its FP, FN, FPR, and FNR are low.
REFERENCES
[1] "2019 Data Breach Investigations Report," verizon, 2019. [Online]. Available:
https://enterprise.verizon.com/resources/reports/2019-data-breach-investigations-report.pdf.
[2] D. Antoniades, et al., "Accurate Traffic Categorization," Proceedings of IST Broadband Europe, pp. 1-6, 2006.
[3] W. Ashford, "DDoS is the Most Common Method of Cyber-Attack on Financial Institutions," 2016. [Online]. Available: https://ComputerWeekly.com.
[4] F. Breitinger, and I. Baggili, "File Detection on Network Traffic Using Approximation Matching," Journal of
Digital Forensics, Security & Law, vol. 9, no. 2, pp. 23-35, 2014.
[5] J. Beeskow, "Reducing Security Risk using Data Loss Prevention Technology," Journal of The Healthcare
Financial Management Association, pp. 108-112, 2015.
[6] F. Breitinger and H. Baier, "Similarity Preserving Hashing: Eligible Properties and a New Algorithm MRSH-v2,"
International ICST Conference on Digital Forensics and Cyber Crime, pp. 167-182, 2012.
[7] B. Blevins, "Best of Data Loss Prevention," Information Security, pp. 13-15, 2014.
[8] A. Burroughs, "Data Breaches Cause Worry," Smart Business Orange County, 2015.
[9] A. Cecil, "A Summary of Network Traffic Monitoring and Analysis Techniques," Computer Systems Analysis,
pp. 4-7, 2006.
[10] S. Gumaste, et al., "Proxy Server Experiment and the Behavior of the Web," International Journal of Advanced
Research in Computer Science, vol. 4, no. 1, pp. 84-87, 2013.
[11] "Data Loss Prevention Keeping your Sensitive out of the Public Domain," EY, 2011. [Online]. Available:
http://www.ey.com/Publication/vwLUAssets/EY_Data_Loss_Prevention/$FILE/EY_Data_Loss_Prevention.pdf
[12] L. Grandia, "Nine Key Cyber Threats Identified in Verizon Data Breach Report," Health Management Technology,
vol. 35, no. 6, 2014.
[13] "The Practical Executive’s Guide to Data Loss Prevention," Whitepaper, pp. 2-17, 2019. [Online]. Available:
https://cdw-prod.adobecqms.net/content/dam/cdw/on-domaincdw/brands/forcepoint/whitepaper-practical-
executives-guide-data-loss-prevention-en.pdf,
[14] "Internet security threat report," 2019trends, Symantec, Inc., vol. 24, 2019. [Online]. Available:
https://www.symantec.com/content/dam/symantec/docs/reports/istr-24-2019-en.pdf.
[15] J. Jaeger, "Human Error, Not Hackers, Cause Most Data Breaches," Compliance Week, vol. 10, no. 110,
pp. 56-57, 2013.
[16] R. Hiesh, "Improving HIPAA Enforcement and Protecting Patient Privacy in a Digital Healthcare Environment,"
Loyola University Chicago Law Journal, vol. 46, no. 1, pp. 175-223, 2014.
[17] "McAfee Total Protection for Data Loss Prevention," Solution brief, McAfee, 2019. [Online]. Available:
https://www.mcafee.com/enterprise/en-us/assets/solution-briefs/sb-total-protection-for-dlp.pdf.
[18] H. Tuttle, "Hacking Away at the Bottom Line," Risk Management, 2014.
[19] L. Musthaler, "The True Cause of Data Breaches," NetworkWorld Asia, p. 6, 2008.
[20] S. Naidu, "Data in Motion-Securing Businesses on the Go," SDA Asia Magazine, pp. 46-48, 2009.
[21] N. Wynne and B. Reed, "Magic Quadrant for Content-Aware Data Loss Prevention," Gartner Research, 2013.
[22] A. Papadogiannakis, et al., "Improving the Performance of Passive Network Monitoring Applications using
Locality Buffering," IEEE Xplore, pp. 151-157, 2007.
[23] K. Košt’ál, et al., "Management and Monitoring of IoT Devices Using Blockchain," The Association of Digital
Forensics, Security and Law (ADFSL), pp. 1-12, 2019.
[24] L. Strauss, "Data Breach Study: Criminal Attacks now Leading Cause," Journal of Health Care Compliance, pp. 61-63, 2015.
[25] "Survey Cites Human Error as Biggest Cause of Data Breaches," Magazine Article-Information Management,
2015, [online]. Available: https://www.questia.com/magazine/1G1-436228015/survey-cites-human-error-as-
biggest-cause-of-data.
[26] "The GoAnywhere book of secure file transfer project examples," GoAnywhere manag. file transf., pp. 1-33, 2019,
[online]. Available: https://www.infosecurityeurope.com/__novadocuments/585230?v=636906728355170000.
[27] J. Wu, et al., "Keystroke and Mouse Movement Profiling for Data Loss Prevention," Journal of Information
Science and Engineering, pp. 23-42, 2015.
[28] R. Mogull, Securosis, LLC, "Understanding and Selecting a Data Loss Prevention Solution," Technical report, SANS Institute, 2007.
[29] V. Gupta, "File detection in network traffic using approximate matching," MS Thesis. Institutt for Telematikk, 2013.
[30] K. Ting, "Confusion Matrix," Encyclopedia of Machine Learning and Data Mining, Springer, Boston, 2017.