2. ABOUT ME
• Experience
Data Engineer at Vpon
TWM, Keywear, Nielsen
• Bryan’s notes for data analysis
http://bryannotes.blogspot.tw
• Spark.TW
• LinkedIn
https://tw.linkedin.com/pub/bryan-yang/7b/763/a79
13. spark-shell
• Besides sc, spark-shell also starts a SQL context:
• Spark context available as sc.
• 15/03/22 02:09:11 INFO SparkILoop: Created sql context
(with Hive support)..
• SQL context available as sqlContext.
14. DF from RDD
• First load the data as an RDD:
scala> val data = sc.textFile("hdfs://localhost:54310/user/hadoop/ml-
100k/u.data")
• Define a case class:
case class Rating(userId: Int, itemId: Int, rating: Int, timestamp: String)
• Convert to a DataFrame (split on the tab character, "\t"):
scala> import sqlContext.implicits._
scala> val ratings = data.map(_.split("\t")).map(p => Rating(p(0).trim.toInt,
p(1).trim.toInt, p(2).trim.toInt, p(3))).toDF()
ratings: org.apache.spark.sql.DataFrame = [userId: int, itemId: int, rating: int,
timestamp: string]
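The parse step above can be sketched in plain Python without a Spark cluster, assuming the MovieLens u.data layout of tab-separated userId, itemId, rating, timestamp (the sample line below is hypothetical):

```python
from typing import NamedTuple

# Mirrors the Scala case class: one record per tab-separated u.data line.
class Rating(NamedTuple):
    userId: int
    itemId: int
    rating: int
    timestamp: str

def parse_rating(line: str) -> Rating:
    # Split on the tab character -- "\t", not the literal letter "t".
    p = line.split("\t")
    return Rating(int(p[0].strip()), int(p[1].strip()), int(p[2].strip()), p[3])

sample = "196\t242\t3\t881250949"  # hypothetical u.data line
print(parse_rating(sample))
```

This is the same shape Spark applies per element with `map`, only here on a single string.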
15. DF from json
• Format (one JSON object per line):
{"movieID":242,"name":"test1"}
{"movieID":307,"name":"test2"}
• Can be read directly:
scala> val movie =
sqlContext.jsonFile("hdfs://localhost:54310/user/hadoop/ml-100k/movies.json")
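The expected file layout is JSON Lines: one complete JSON object per line, not a single top-level array. A minimal plain-Python sketch of that reading convention, using an in-memory stream in place of the HDFS file:

```python
import io
import json

# Stand-in for the movies.json file: one JSON object per line.
raw = io.StringIO(
    '{"movieID":242,"name":"test1"}\n'
    '{"movieID":307,"name":"test2"}\n'
)

# Parse each non-empty line independently, as jsonFile does per record.
movies = [json.loads(line) for line in raw if line.strip()]
print(movies[0]["name"])
```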
36. User Defined Function
• from pyspark.sql.functions import udf
• from pyspark.sql.types import *
• sqlContext.registerFunction("hash", lambda x:
hash(x), LongType())
• sqlContext.sql("select hash(item) from ratings")
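The registered UDF body is just an ordinary Python callable; `LongType()` only declares the return type to Spark SQL. Row by row, the query behaves like this plain-Python sketch (the sample rows are hypothetical):

```python
# Same callable as passed to registerFunction("hash", ...).
udf_hash = lambda x: hash(x)

# Hypothetical "ratings" rows; select hash(item) from ratings:
rows = [{"item": "a"}, {"item": "b"}]
hashed = [udf_hash(r["item"]) for r in rows]

# Python's hash returns an int, which fits the declared LongType.
assert all(isinstance(h, int) for h in hashed)
print(hashed)
```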
37. DataType
Numeric types
String type
Binary type
Boolean type
Datetime type
TimestampType: values comprising the fields year, month, day, hour, minute, and second.
DateType: values comprising the fields year, month, and day.
Complex types
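The two datetime types map naturally onto Python's standard library: `datetime.datetime` carries fields down to seconds (like TimestampType), while `datetime.date` carries only year, month, and day (like DateType):

```python
from datetime import date, datetime

# TimestampType analogue: year, month, day, hour, minute, second.
ts = datetime(2015, 3, 22, 2, 9, 11)

# DateType analogue: year, month, day only -- no time-of-day fields.
d = date(2015, 3, 22)

print(ts.second)            # the timestamp keeps seconds
print(hasattr(d, "hour"))   # a date has no hour field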