How To Tackle Big Data From A Security Point Of View
Securing Your Big Data
A growing number of companies are using big data technology to
store and analyse petabytes of data, including web logs,
clickstream data and social media content, to gain better insights
about their customers and their business.
As a result, information classification becomes even more
critical, and information ownership must be addressed to
facilitate any reasonable classification.
Deploying Big Data for Security
The deployment of Big Data for fraud detection, and in place
of security information and event management (SIEM) systems, is
attractive to many organisations. The overheads of managing
the output of traditional SIEM and logging systems are
proving too much for most IT departments, and Big Data is
seen as a potential saviour. There are commercial
replacements available for existing log management systems,
or the technology can be deployed to provide a single data
store for security event management and enrichment.
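To make the "single data store with enrichment" idea more concrete, the minimal Python sketch below shows one way events from a log source might be normalised into a common schema, enriched with local context, and appended to a shared store. The field names, the asset-owner lookup and the file-based sink are illustrative assumptions, not details from the source article or any particular product.

```python
# Hypothetical sketch: normalise raw log lines into one schema, enrich them
# with local context, then append them to a single shared event store.
# The schema, the asset-owner table and the JSON-lines sink are assumptions.
import json
from datetime import datetime, timezone

def normalise_syslog(line):
    """Turn a raw 'host facility message' syslog line into a common event dict."""
    host, facility, message = line.split(" ", 2)
    return {
        "timestamp": datetime.now(timezone.utc).isoformat(),
        "source": "syslog",
        "host": host,
        "facility": facility,
        "message": message,
    }

def enrich(event, asset_owners):
    """Add context (who owns the asset) so analysts can triage events faster."""
    event["owner"] = asset_owners.get(event["host"], "unknown")
    return event

def write_to_store(event, path="security_events.jsonl"):
    """Append the enriched event to the shared store (a local file in this sketch)."""
    with open(path, "a") as sink:
        sink.write(json.dumps(event) + "\n")

asset_owners = {"web01": "e-commerce team"}
raw_line = "web01 auth failed password for admin from 203.0.113.7"
write_to_store(enrich(normalise_syslog(raw_line), asset_owners))
```

In practice the sink would be a distributed store such as HDFS or a similar platform rather than a local file, but the flow of normalise, enrich, then persist to one place is the same.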
Big Data Technologies and Risks
If you research the term Big Data, you will invariably
encounter Hadoop. Traditional data warehouses and relational
databases process structured data and can store massive
amounts of data, but the requirement for structure restricts the
type of data that can be processed. Hadoop is designed to
process large amounts of data, regardless of its structure.
The core of Hadoop is the MapReduce framework, which was
created at Google in response to the problem of creating web
search indexes. MapReduce distributes a computation over
multiple nodes, thus solving the problem of data that is too
large to fit onto a single machine.
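As a concrete illustration of the pattern, here is a minimal single-machine sketch of the MapReduce idea in plain Python, not the Hadoop API itself: a map step emits (word, document) pairs, a shuffle step groups them by word, and a reduce step merges each group into an inverted-index entry, which is essentially the web-indexing problem MapReduce was created to solve. Function and variable names are illustrative.

```python
# Minimal single-process illustration of the MapReduce pattern (not Hadoop's
# own API): map emits key/value pairs, shuffle groups them by key, and reduce
# merges each group. On a real cluster these phases run across many nodes.
from collections import defaultdict

def map_phase(doc_id, text):
    """Map: emit a (word, doc_id) pair for every word in the document."""
    for word in text.lower().split():
        yield word, doc_id

def shuffle(pairs):
    """Shuffle: group all emitted values by key, as the framework does."""
    grouped = defaultdict(list)
    for key, value in pairs:
        grouped[key].append(value)
    return grouped

def reduce_phase(word, doc_ids):
    """Reduce: merge the postings for one word into an index entry."""
    return word, sorted(set(doc_ids))

documents = {"doc1": "big data security", "doc2": "big data analytics"}
pairs = [pair for doc_id, text in documents.items()
         for pair in map_phase(doc_id, text)]
index = dict(reduce_phase(word, ids) for word, ids in shuffle(pairs).items())
print(index)  # e.g. {'big': ['doc1', 'doc2'], 'data': ['doc1', 'doc2'], ...}
```

Hadoop runs the same three phases across a cluster, moving the computation to wherever the data blocks are stored, which is why data too large for a single machine can still be processed.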
Specialist Skills
In reality, Big Data is more about the processing techniques
and outputs than the size of the data set itself, so specific skills
are required to use Big Data effectively. There is a general
shortage of specialist skills for Big Data analysis, in particular
when it comes to using some of the less mature technologies.
The growing use of Hadoop and related technologies is
driving demand for staff with very specific skills. People with
backgrounds in multivariate statistical analysis, data mining,
predictive modelling, natural language processing, content
analysis, text analysis and social network analysis are all in
demand. These analysts and scientists work with structured
and unstructured data to deliver new insights and intelligence
to the business.
Stay Tuned With Us for More Information
https://www.linkedin.com/company/tyronesystems
https://twitter.com/tyronesystems
https://www.facebook.com/tyronesystems
Source:
http://www.computerweekly.com/feature/How-to-tackle-big-data-from-a-security-point-of-view
