Coartha Technosolutions
Presented by
Coartha Team
Rebot Project
Contents:
1. Big Data (Hortonworks HDP 2.0)
2. Machine Learning (Python 2.7.6)
3. Cloud (Amazon Free Tier)
4. Dev, Staging and Production
5. Database (MongoDB)
6. Linux Systems (CentOS 6.4)
Brief Explanation
• Big Data (Hortonworks HDP 2.0)
Big Data refers to collections of data sets, largely unstructured, that grow too large to manage with conventional tools. Hadoop 2.0 is used to run MapReduce jobs over this data and consists of three core components: MapReduce, YARN and HDFS.
Continued…
• HDFS:
• Hadoop Distributed File System (HDFS).
• Handles large data sets with streaming data access.
• Runs on top of the native file system.
• Uses blocks to store files.
• MapReduce:
• Framework for performing computations on data in HDFS.
• Map and Reduce functions (see the sketch after this slide).
• YARN:
• Distributed data processing.
• Resource manager and scheduler.
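The Map and Reduce functions are easiest to see in code. Below is a minimal word-count sketch written with the mrjob module listed on the Machine Learning slide; the class name and input file are hypothetical, and the same script can run locally or be submitted to the Hadoop cluster.

```python
# Minimal MapReduce sketch using mrjob (listed on the modules slide).
# The mapper emits (word, 1) for every word in an input line;
# the reducer sums the counts for each word.
from mrjob.job import MRJob


class MRWordCount(MRJob):

    def mapper(self, _, line):
        # One input line at a time; the input key is ignored.
        for word in line.split():
            yield word.lower(), 1

    def reducer(self, word, counts):
        # All counts for one word arrive together.
        yield word, sum(counts)


if __name__ == '__main__':
    MRWordCount.run()
```

Run it locally with `python wordcount.py input.txt`, or against the cluster with mrjob's `-r hadoop` runner; the file names here are placeholders.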
Machine Learning Language
• Python 2.7.6 modules on Dev, Staging and Production:
PIP 1.5.4
NLTK 2.0.4
Setuptools 3.3
Easy Install 2.7
NumPy 1.8.1
PyYAML 3.11
mrjob 0.4.2
Cloud
• Cloud components for Staging & Production:
CentOS 6.4 instance.
Bucket for storage of files.
WordPress blog, version 3.8.1.
jQuery on WordPress.
Visualization on the instance.
.pem file for connecting to the cloud from the local machine.
.ppk file for moving data from the local system to the cloud through FileZilla.
Public IP (Elastic IP for the instance).
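The storage bucket above can also be reached from Python. This is a minimal sketch using the boto library, which is not on the modules slide and is therefore an assumption; the bucket name, object key and file path are placeholders.

```python
# Minimal sketch of uploading a local file to the storage bucket with boto
# (an assumed dependency; it is not on the modules slide). AWS credentials
# are read from the environment or ~/.boto; all names below are placeholders.
import boto

conn = boto.connect_s3()                          # uses AWS keys from the environment
bucket = conn.get_bucket('rebot-staging-bucket')  # hypothetical bucket name
key = bucket.new_key('uploads/report.txt')        # hypothetical object key
key.set_contents_from_filename('report.txt')      # local file to upload
```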
Dev, Staging and Production
• Maintain the same versions across all three stages of the project.
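One simple way to confirm that Dev, Staging and Production carry the same module versions is to print them with pkg_resources from Setuptools (already on the modules list) and compare the output from each machine; the distribution names below are the usual PyPI package names and are an assumption.

```python
# Minimal sketch: print installed versions of the modules from the earlier
# slides so the output can be diffed across Dev, Staging and Production.
# Distribution names are assumed to be the usual PyPI package names.
import pkg_resources

PACKAGES = ['pip', 'nltk', 'setuptools', 'numpy', 'PyYAML', 'mrjob']

for name in PACKAGES:
    try:
        version = pkg_resources.get_distribution(name).version
        print('%s==%s' % (name, version))
    except pkg_resources.DistributionNotFound:
        print('%s is not installed' % name)
```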
MongoDB (Database)
• Description of the database:
Handles structured, unstructured and polymorphic data.
NoSQL.
Scales up with Big Data.
MongoHQ for the MongoDB server.
Backup & restore of data from the DB.
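A minimal sketch of talking to the database from Python, assuming the PyMongo driver (not on the modules slide) and a placeholder MongoHQ-style connection URI; the collection and document are made up.

```python
# Minimal PyMongo sketch (an assumed dependency, PyMongo 3+ API): connect to
# a MongoDB server, insert one schema-free document and read it back.
# The connection URI is a placeholder for the real MongoHQ URI.
from pymongo import MongoClient

client = MongoClient('mongodb://user:password@host:27017/rebot')  # placeholder URI
db = client['rebot']

# Documents need not share a schema, which is what "polymorphic" means here.
db.articles.insert_one({'title': 'sample', 'tags': ['bigdata', 'nltk']})
print(db.articles.find_one({'title': 'sample'}))
```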
Thank You!