BigQuery - Women Techmakers (Ukraine - March 2014)




  1. BigQuery Basics
  2. Who? Why? BigQuery Basics. Ido Green, Developer Advocate
  3. Topics we cover
     ● BigQuery Overview
     ● Typical Uses
     ● Project Hierarchy
       ○ Access Control and Security
       ○ Datasets and Tables
     ● Tools
     ● Demos
  4. How does BigQuery fit in the analytics landscape?
     ● MapReduce-based analysis can be slow for ad-hoc queries
     ● Managing data centers and tuning software takes time and money
     ● Analytics tools should be services
  5. Why BigQuery?
     ● Generating big data reports used to require expensive servers and skilled database administrators
     ● Interacting with big data has been expensive, slow, and inefficient
     ● BigQuery changes all that
       ○ It reduces the time and expense of querying data
  6. What's BigQuery?
     ● A service for interactive analysis of massive datasets (TBs)
       ○ Query billions of rows: seconds to write, seconds to return
       ○ Uses a SQL-style query syntax
       ○ It's a service, accessed through a RESTful API
     ● Reliable and secure
       ○ Replicated across multiple sites
       ○ Secured through Access Control Lists
     ● Scalable
       ○ Store hundreds of terabytes
       ○ Pay only for what you use
     ● Fast (really)
       ○ Run ad-hoc queries on multi-terabyte datasets in seconds
  7. Analyzing Large Amounts of Data at High Speed
  8. Uses
  9. Typical Uses
     ● Analyzing query results with a visualization library such as the Google Charts Tools API
  10. Typical Uses
      ● Another way to analyze query results: Google Spreadsheets
  11. BigQuery Use Cases
      ● Log Analysis - making sense of computer-generated records
      ● Retail - using data to forecast product sales
      ● Ads Targeting - targeting the proper customer segments
      ● Sensor Data - collecting and visualizing ambient data
      ● Data Mashup - querying terabytes of heterogeneous data
  12. Some Customer Case Studies
      ● Uses BigQuery to hone ad targeting and gain insights into their business
      ● Builds dashboards on BigQuery to analyze booking and inventory data
      ● Uses BigQuery to give their customers ways to expand game engagement and find new channels for monetization
      ● Used BigQuery, App Engine, and the Visualization API to build a business intelligence solution
  13. Basic Technical Details
  14. Project Hierarchy
      ● Project
        ○ All data in BigQuery belongs inside a project
        ○ A set of users, APIs, authentication, billing information, and ACLs
      ● Dataset
        ○ Holds one or more tables
        ○ The lowest unit of access control (to which ACLs are applied)
      ● Table
        ○ A row-column structure that contains the actual data
      ● Job
        ○ Used to start potentially long-running queries
  15. Datasets and Tables
      Table names are written as follows:
      ● Current project: <dataset>.<table name>
      ● Different project: <project>:<dataset>.<table>, e.g. publicdata:samples.wikipedia
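The naming convention above can be made concrete with a small helper that splits a qualified table reference into its parts (a sketch; `parse_table_name` is a hypothetical function, not part of any BigQuery client library):

```python
def parse_table_name(name):
    """Split a BigQuery table reference into (project, dataset, table).

    Accepts both forms from the slide:
      <dataset>.<table>            -> project is None (current project)
      <project>:<dataset>.<table>  -> explicit project
    """
    project = None
    if ":" in name:
        project, name = name.split(":", 1)
    dataset, table = name.split(".", 1)
    return project, dataset, table

print(parse_table_name("publicdata:samples.wikipedia"))
# ('publicdata', 'samples', 'wikipedia')
```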
  16. Schema Example
      ● Demographic data about the occurrence of names
      ● Table schema: name:string,gender:string,count:integer
  17. Data Types
      ● String
        ○ UTF-8 encoded, < 64 kB
      ● Integer
        ○ 64-bit signed
      ● Float
      ● Boolean
        ○ "true" or "false", case insensitive
      ● Timestamp
        ○ String format: YYYY-MM-DD HH:MM:SS[.ssssss] [+/-HH:MM]
        ○ Numeric format (seconds since the UNIX epoch): 1234567890, 1.234567890123456E9
      (*) Max row size: 64 kB. There is no separate Date type; dates are stored as timestamps.
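The two timestamp notations describe the same instant, which can be checked with Python's standard library (a sketch of the formats listed above, not BigQuery client code):

```python
from datetime import datetime, timezone

# String form: YYYY-MM-DD HH:MM:SS (fractional seconds and offset omitted here)
s = "2009-02-13 23:31:30"
dt = datetime.strptime(s, "%Y-%m-%d %H:%M:%S").replace(tzinfo=timezone.utc)

# Numeric form: seconds since the UNIX epoch - the same instant as the string
epoch_seconds = 1234567890

assert dt.timestamp() == epoch_seconds
```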
  18. Data Format
      BigQuery supports the following formats for loading data:
      1. Comma-Separated Values (CSV)
      2. JSON (newline-delimited)
         a. BigQuery can load JSON faster than CSV when field values contain embedded newlines
         b. JSON supports nested/repeated data fields
  19. Repeated and Nested Fields
      ● Loading data with repeated and nested fields is supported by the JSON format only
      ● Schema example:
        [
          {
            "name": "location",
            "type": "record",
            "mode": "repeated",
            "fields": [
              { "name": "country", "type": "string", "mode": "nullable" },
              { "name": "city",    "type": "string", "mode": "nullable" }
            ]
          },
          ...
        ]
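A row matching a schema with a repeated, nested `location` record could be produced as newline-delimited JSON like this (a standard-library sketch; the field values are made up):

```python
import json

# One row with a repeated, nested "location" field, matching the schema above.
row = {
    "location": [
        {"country": "Ukraine", "city": "Kyiv"},
        {"country": "Ukraine", "city": "Lviv"},
    ]
}

# Newline-delimited JSON: one JSON object per line, no surrounding array.
line = json.dumps(row)
print(line)
```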
  20. Accessing BigQuery
      ● BigQuery web browser interface
        ○ Imports/exports data, runs queries
      ● bq command-line tool
        ○ Performs operations from the command line
      ● Service API
        ○ RESTful API to access BigQuery programmatically
        ○ Requires authorization via OAuth2
        ○ Google client libraries for Python, Java, JavaScript, PHP, ...
  21. Third-party Tools
      ● Visualization and business intelligence
      ● ETL tools for loading data into BigQuery
  22. Example of Visualization Tools
      ● Using commercial visualization tools to graph query results
  23. Loading Data Using the Web Browser
      ● Upload from local disk or from Cloud Storage
      ● Start the web browser interface
      ● Select a dataset
      ● Create a table and follow the wizard steps
  24. Loading Data Using the bq Tool
      ● "bq load" syntax:
        bq load [--source_format=NEWLINE_DELIMITED_JSON|CSV] destination_table data_source_uri table_schema
      ● If not specified, the default source format is CSV (comma-separated values)
      ● Files can also use newline-delimited JSON format
      ● Schema
        ○ Either a filename or a comma-separated list of column_name:datatype pairs describing the file format
      ● The data source may be on the local machine or in Cloud Storage
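As an illustration, a `bq load` invocation for the names table from the schema example could be assembled like this (the dataset, table, and Cloud Storage bucket names are hypothetical; the command itself must be run in a shell with the bq tool installed):

```python
# Assemble a "bq load" command line for the names table.
# Destination table, URI, and bucket below are made-up examples.
schema = "name:string,gender:string,count:integer"
cmd = [
    "bq", "load",
    "--source_format=CSV",
    "mydataset.names",           # destination_table
    "gs://my-bucket/names.csv",  # data_source_uri
    schema,                      # table_schema
]
print(" ".join(cmd))
```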
  25. Load Limitations
      ● 1,000 import jobs per table per day
      ● 10,000 import jobs per project per day
      ● File size (for both CSV and JSON)
        ○ 1 GB for a compressed file
        ○ 1 TB uncompressed
          ■ 4 GB for uncompressed CSV with newlines in strings
      ● 10,000 files per import job
      ● 1 TB per import job
  26. Best Practices
      ● CSV/JSON must be split into chunks of less than 1 TB
        ○ Use the "split" command with the --line-bytes option
      ● Split into smaller files
        ○ Easier error recovery
        ○ Smaller data units (day or month instead of year)
      ● Uploading to Cloud Storage first is recommended (Cloud Storage, then BigQuery)
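The line-boundary chunking that `split --line-bytes` performs can be sketched in Python (a toy byte limit for illustration; real chunks would be far larger):

```python
def split_line_bytes(lines, max_bytes):
    """Group lines into chunks of at most max_bytes each, never
    breaking a line in two (like `split --line-bytes`)."""
    chunks, current, size = [], [], 0
    for line in lines:
        n = len(line.encode("utf-8"))
        if current and size + n > max_bytes:
            chunks.append(current)   # flush the full chunk
            current, size = [], 0
        current.append(line)
        size += n
    if current:
        chunks.append(current)
    return chunks

rows = ["a,1\n", "bb,2\n", "ccc,3\n", "d,4\n"]
print(split_line_bytes(rows, 10))
# [['a,1\n', 'bb,2\n'], ['ccc,3\n', 'd,4\n']]
```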
  27. Best Practices (continued)
      ● Split tables by date
        ○ Minimizes the cost of data scanned
        ○ Minimizes query time
      ● Upload multiple files to Cloud Storage
        ○ Allows parallel loading into BigQuery
      ● Denormalize your data
  28. Demo: Google I/O Data Sensing
      ● Start the BigQuery web browser interface
      ● Click Display Project in the project chooser dialog
      ● Enter data-sensing-lab when prompted
      ● In the dataset data-sensing-lab:io_sensor_data, select the table moscone_io13
      ● In the New Query box, enter the following query:
        SELECT * FROM [data-sensing-lab:io_sensor_data.moscone_io13] LIMIT 10
      ● Click the Run Query button
      ● Scroll to see the results
  29. Data Structure
      ● Define the table schema when creating the table
      ● Data is stored in a per-column structure
      ● Each column is handled separately and combined only when necessary
      Advantages of this data structure:
      ● No need to define indexes in advance
      ● Only the relevant columns are loaded
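The column-oriented layout can be illustrated with plain Python lists: a query touching one column reads only that column's storage (a toy model of the idea, not BigQuery's actual engine; the data is made up):

```python
# Toy columnar store: each column is held separately.
columns = {
    "name":   ["Anna", "Olena", "Ivan"],
    "gender": ["F", "F", "M"],
    "count":  [120, 85, 200],
}

# A query like SELECT SUM(count) touches only the "count" column;
# the "name" and "gender" columns are never read.
total = sum(columns["count"])
print(total)  # 405
```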
  30. Questions? Thank you!