2. WHAT IS BIG DATA?
• Big Data is a phrase used to describe a massive volume of both structured and unstructured data that is so large it is difficult to process using traditional database and software techniques. In most enterprise scenarios the volume of data is too big, it moves too fast, or it exceeds current processing capacity.
• Big Data has the potential to help companies improve operations and make faster, more intelligent decisions. The data is collected from a number of sources, including emails, mobile devices, applications, databases, and servers. When captured, formatted, manipulated, stored, and then analyzed, this data can help a company gain useful insight, increase revenues, win or retain customers, and improve operations.
3. Is Big Data a Volume or a Technology?
• While the term may seem to reference the volume of data, that isn't always the case. The term Big Data, especially when used by vendors, may refer to the technology (including tools and processes) that an organization requires to handle large amounts of data and storage. The term is believed to have originated with web search companies that needed to query very large, distributed aggregations of loosely structured data.
• An Example of Big Data
• An example of Big Data might be petabytes (1,024 terabytes) or exabytes (1,024 petabytes) of data consisting of billions to trillions of records of millions of people, all from different sources (e.g. Web, sales, customer contact center, social media, mobile data, and so on). The data is typically loosely structured and often incomplete and inaccessible.
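The unit arithmetic above is easy to verify; a minimal check in Python:

```python
# Storage units scale by a factor of 1,024 per step (binary prefixes).
TB_PER_PB = 1024                  # 1 petabyte = 1,024 terabytes
PB_PER_EB = 1024                  # 1 exabyte  = 1,024 petabytes
TB_PER_EB = TB_PER_PB * PB_PER_EB

print(TB_PER_EB)  # 1048576 terabytes in one exabyte
```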
4. Business Datasets
• When dealing with larger datasets, organizations face difficulties in manipulating and managing big data. Big Data is a particular problem in business analytics because standard tools and procedures are not designed to search and analyze massive datasets.
• As research by QuinStreet demonstrates, big data initiatives are poised for explosive growth. QuinStreet surveyed 540 enterprise decision-makers involved in big data and found that the datasets of interest to many businesses today include traditional structured databases of inventories, orders, and customer information, as well as unstructured data from the Web, social networking sites, and intelligent devices.
• Big Data may also be referred to as enterprise Big Data.
5. BIG EXAMPLES OF “BIG-DATA”
• The New York Stock Exchange generates about one terabyte of new trade data per day.
6. Social Media Impact
• Statistics show that 500+ terabytes of new data are ingested into the databases of the social media site Facebook every day. This data is mainly generated through photo and video uploads, message exchanges, comments, etc.
7. • A single jet engine can generate 10+ terabytes of data in 30 minutes of flight time. With many thousands of flights per day, data generation reaches many petabytes.
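The jet-engine figure scales to petabytes quickly. A back-of-the-envelope sketch, where the daily flight count is an illustrative assumption rather than a figure from the text:

```python
tb_per_half_hour = 10          # stated: 10+ TB per 30 minutes of flight
flights_per_day = 25_000       # assumed: "many thousand flights per day"

# Assume just one 30-minute window of data per flight.
tb_per_day = tb_per_half_hour * flights_per_day
pb_per_day = tb_per_day / 1024  # 1 PB = 1,024 TB

print(round(pb_per_day))  # about 244 petabytes per day
```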
8. TOOLS FOR “BIG-DATA”
• 1. Cassandra
• This tool is widely used today because it provides effective management of large amounts of data. It is a database that offers high availability and scalability without compromising performance on commodity hardware and cloud infrastructure. Among the main advantages of Cassandra highlighted by its developers are fault tolerance, performance, decentralization, professional support, durability, elasticity, and scalability. Users such as eBay and Netflix attest to these strengths.
2. Hadoop
Another great product from Apache that has been used by many large corporations. Among the most important features of this advanced software library is superior processing of voluminous data sets across clusters of computers using effective programming models. Corporations choose Hadoop for its great processing capabilities, and its developers provide regular updates and improvements to the product.
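The "effective programming model" at Hadoop's core is MapReduce. A minimal sketch of the idea in plain Python, illustrating the map/shuffle/reduce pattern only, not Hadoop's actual Java API:

```python
from collections import defaultdict

def map_phase(lines):
    # Map step: emit a (word, 1) pair for every word in every input line.
    for line in lines:
        for word in line.split():
            yield word.lower(), 1

def reduce_phase(pairs):
    # Shuffle + reduce step: group pairs by key, then sum each group.
    groups = defaultdict(int)
    for word, count in pairs:
        groups[word] += count
    return dict(groups)

lines = ["big data is big", "data moves fast"]
counts = reduce_phase(map_phase(lines))
print(counts["big"], counts["data"])  # 2 2
```

In real Hadoop the map and reduce steps run in parallel across a cluster, with the framework handling the shuffle between them.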
9. • 3. Plotly
• Successful big data analysts use Plotly to create great dynamic visualizations, even when the company does not have sufficient time or skills to meet big data needs. It makes the process of creating stunning and informative graphics very easy using online tools. The platform also enables sharing findings by exporting the results into different convenient formats.
• 4. Bokeh
• Like Plotly, this tool is great for creating easy and informative visualizations. It is used by big data analytics experts to quickly create interactive data applications, dashboards, and plots. Check out the gallery of example works done with Bokeh using big data. Many experts also say Bokeh is the most advanced visual data representation tool.
10. • 5. Neo4j
• The official website of the tool claims that it is the world's leading graph database. Indeed, it takes big data business to the next level: it helps to work with the connections between data points. These connections drive modern intelligent applications, and Neo4j is the tool that transforms them into competitive advantage. If you are looking for additional information about gaining a competitive advantage by utilizing a graph database, the project's documentation is a good place to start.
• 6. Cloudera
• Businesses today use this tool to create a data repository that can be accessed by all corporate users who need the data for different purposes. Founded in 2008, it is still the most popular provider and supporter of Apache Hadoop. This combination is known to transform businesses and reduce business risk, giving them a competitive advantage.
11. • 7. OpenRefine
• Need to explore voluminous data sets with ease? This tool allows businesses to prepare everything for data analysis. Simply put, OpenRefine helps organize data in a database that was nothing but a mess. As a result, users can begin to process the data with the computer.
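The kind of cleanup OpenRefine performs on a whole column at once can be illustrated in plain Python; the messy city names below are invented for the example:

```python
def clean(record):
    # Normalize whitespace and casing, the sort of fix OpenRefine
    # applies across an entire column in one operation.
    return " ".join(record.strip().split()).title()

messy = ["  new   york ", "NEW YORK", "new york  "]
cleaned = {clean(r) for r in messy}
print(cleaned)  # all three variants collapse to a single value
```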
• 8. Storm
• This tool makes the list because of its superior real-time streaming data processing capabilities. It also integrates with many other tools, such as Apache Slider, to manage and secure the data. Use cases for Storm include data monetization, real-time customer management, cybersecurity analytics, operational dashboards, and threat detection. These functions provide awesome business opportunities.
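Storm's core idea, tuples flowing from sources through processing steps in real time, can be sketched with Python generators. This mimics the concept only, not Storm's actual spout/bolt API:

```python
def source():
    # Stand-in for a live event stream (Storm calls this a "spout").
    for event in ["login", "click", "click", "login", "purchase"]:
        yield event

def running_counts(stream):
    # Stand-in for a processing step (Storm calls this a "bolt"):
    # maintain per-event-type counts as tuples arrive, one at a time.
    counts = {}
    for event in stream:
        counts[event] = counts.get(event, 0) + 1
        yield event, counts[event]

for event, count in running_counts(source()):
    print(event, count)
```

Because each event is processed as it arrives rather than in batches, dashboards and alerts can react immediately, which is the point of stream processing.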
12. • 9. Wolfram Alpha
• Want to calculate or learn something new?
• Wolfram Alpha is an awesome tool for looking up information about just about everything. Doug Smith from Proessaywriting says that his company uses this platform for advanced research in financial, historical, social, and other professional areas. For example, if you type “Microsoft,” you receive input interpretation, fundamentals and financials, latest trade, price history, performance comparisons, data return analysis, a correlation matrix, and much other information.
• 10. Rapidminer
• A big data specialist needs this open-source data science platform, which works through visual programming. It allows users to manipulate and analyze data, create models, and integrate the data into business processes.
13. DETAILED TOOLS WITH PRICING
• Sisense
• Sisense is a business intelligence software product that makes it easy for users to prepare, analyze, and visualize complex data. Sisense provides an end-to-end solution for tackling growing data sets from multiple sources that comes out of the box with the ability to crunch terabytes of data and support thousands of users, all on a single commodity server. Sisense has already won over some of the world's leading, most data-intensive companies, including eBay, Henry Schein, and NASA.
• They provide a trial version and a demo; if you are interested, they will sell the software tailored to your needs.
• For example, they will ask about your project type, number of users, data volumes, and project timelines, and give an estimated price based on this information.
14. SPSS
• Your organization has more data than ever, but spreadsheets and basic statistical analysis tools limit its usefulness. IBM SPSS Statistics software can help you find new relationships in the data and predict what will likely happen next. Watch IBM's free statistics video demo to learn how to easily access, manage, and analyze data sets without previous statistics experience; virtually eliminate time-consuming data prep; and quickly create, manipulate, and distribute insights for decision making.
• A trial version is also provided.
• Paid plans:
• 1. IBM SPSS Statistics Standard
• 2. IBM SPSS Statistics Premium
• 3. IBM SPSS Statistics Professional
• All plans start at USD 99.00, increasing based on your needs.
15. ENVISION
• Envision is a cloud data analytics platform for embedded, Big Data, and IoT applications. It provides secure and rapid free-form data visualization and exploration in real time, scalable throughout the enterprise.
• Starting price: $10.00/month/user
• A free trial is also provided.
16. DATAPLAY
• By using DataPlay you can significantly cut the time spent on analysis and presentation of data. DataPlay Suite automates MS PowerPoint presentation generation through a set of data analysis, data visualization, and data storage solutions. DataPlay applications boost the productivity of researchers by empowering them with intuitive tools for automated SPSS data visualization. All DataPlay applications are securely stored in DataPlay Cloud, which enables users to securely share information.
• Starting price: $140.00/month/user
• It also has a free trial and a free version.
17. IMAGES & VIDEOS: WITH BIG DATA
• The human brain simultaneously processes millions of images, movements, sounds, and other esoteric information from multiple sources. The brain is exceptionally efficient and effective in its capacity to prescribe and direct a course of action, and it eclipses any computing power available today. Smartphones now record and share images, audio, and video at an ever-increasing rate, forcing our brains to process more.
• Technology is catching up to the brain. Google's image recognition in “Self-taught Software” is working to replicate the brain's capacity to learn through experience. In parallel, prescriptive analytics is becoming far more intelligent and capable than predictive analytics. Like the brain, prescriptive analytics learns and adapts as it processes images, videos, audio, text, and numbers to prescribe a course of action.
18. IMAGE ANALYTICS: TECHNOLOGY PROCESS
• Image analytics is the automatic, algorithmic extraction and logical analysis of information found in image data using digital image processing techniques. Bar codes and QR codes are simple examples, while facial recognition and position-and-movement analysis are far more complex ones.
• Today, images and image sequences (videos) make up about 80 percent of all corporate and public unstructured big data. As the growth of unstructured data increases, analytical systems must assimilate and interpret images and videos as well as they interpret structured data such as text and numbers.
19. • An image is a set of signals sensed by the human eye and processed by the visual cortex in the brain, creating a vivid experience of a scene that is instantly associated with concepts and objects previously perceived and recorded in one's memory. To a computer, images are either raster images or vector images. Simply put, raster images are a sequence of pixels with discrete numerical values for color; vector images are a set of color-annotated polygons. To perform analytics on images or videos, the geometric encoding must be transformed into constructs depicting the physical features, objects, and movement represented by the image or video. These constructs can then be logically analyzed by a computer.
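The raster representation described above is concrete enough to sketch; the tiny 2×2 grayscale "image" is invented for illustration:

```python
# A raster image is just a grid of discrete numeric pixel values.
# Here: a 2x2 grayscale image, where 0 = black and 255 = white.
image = [
    [0, 255],
    [255, 0],
]

# The simplest possible "analytics": reduce the pixel grid to a
# physical feature, in this case average brightness.
pixels = [p for row in image for p in row]
brightness = sum(pixels) / len(pixels)
print(brightness)  # 127.5
```

Real image analytics builds on exactly this kind of numeric grid, extracting progressively higher-level features (edges, shapes, faces) from the raw pixel values.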
21. PENTAHO
• Within a single platform, this solution provides big data tools to extract, prepare, and blend your data, plus the visualizations and analytics that will change the way you run your business. From Hadoop and Spark to NoSQL, Pentaho allows you to turn big data into big insights.
• It is freeware, but you should email them for a demo and they will provide you a link.
22. AZURE
• Azure HDInsight offers fully managed and supported 100% Apache Hadoop®, Spark, HBase, and Storm clusters. You can get up and running quickly on any of these workloads with a few clicks and within a few minutes, without buying hardware or hiring the specialised operations teams typically associated with big data infrastructure.
• Starting price: ₹9,899 to ₹1,32,380.75/month
23. IMPORT.IO
• Import.io is the number one tool for data extraction. Import.io enables users to convert websites into structured, machine-readable data with no coding required. Using a simple point-and-click UI, it takes a webpage and transforms it into an easy-to-use spreadsheet that you can then analyze, visualize, and use to make data-driven decisions. Features include authenticated extractions behind a login, flexible scheduling, and fully documented public APIs. Customers use the data for machine learning, market and academic research, lead generation, app development, and price monitoring.
• Essential: $299 (queries expire after 1 month)
• Professional: $1,999
• Enterprise: $4,999
24. THANK YOU.
• For more information visit www.mithileshjoshi.blogspot.com