Introducing zio-dynamodb
a new Scala library for DynamoDB
1 / 42
contributors
- Avinder Bahra
used Scala in production for the last 5 years
used ZIO in production for the last 2 years
been programming professionally for the last 30 years
- Adam Johnson
2 / 42
Some context
Relational Databases
pros
transactional (ACID), rich query language, tooling
3 / 42
Some context
Relational Databases
pros
transactional (ACID), rich query language, tooling
cons
difficult to scale horizontally
4 / 42
Some context
DynamoDB
"fully managed, serverless, key-value NoSQL database designed to
run high-performance applications at any scale"
5 / 42
Some context
DynamoDB
"fully managed, serverless, key-value NoSQL database designed to
run high-performance applications at any scale"
great scalability and throughput, with features like auto-scaling
and global table replication
6 / 42
Some context
DynamoDB
fully managed, serverless, key-value NoSQL database designed to run
high-performance applications at any scale
great scalability and throughput, with features like auto-scaling
and global table replication
downside of NoSQL DBs
they do less work than relational databases for tasks such as
querying and maintaining consistency/integrity, and they offer
less tooling
hence the code burden on the developer is significantly
increased
7 / 42
problem: using DDB Java SDK is painful
we are going to walk you through the process of creating a production
grade Scala application using the Java SDK.
Let's see how many lines of code it takes to perform simple write and read
operations.
Here are the steps we are going to follow:
create a type-safe model (case class)
serialise/deserialise the Scala model to and from the API's model
create API requests for writing and reading the model to the database
execute the requests and process the results
improve performance by batching requests
as this is production grade we are going to use ZIO to manage effects,
concurrency, Java interop ...
8 / 42
problem: using DDB Java SDK is painful
final case class Student(email: String, // partition key
subject: String, // sort key
enrollmentDate: Option[Instant], payment: Payment)
sealed trait Payment
object Payment {
final case object DebitCard extends Payment
final case object CreditCard extends Payment
final case object PayPal extends Payment
}
9 / 42
problem: using DDB Java SDK is painful
we have to build a PutItemRequest
import scala.jdk.CollectionConverters._
final case class Student(email: String, subject: String,
enrollmentDate: Option[Instant], payment: Payment)
sealed trait Payment
object Payment {
final case object DebitCard extends Payment
final case object CreditCard extends Payment
final case object PayPal extends Payment
}
def putItemRequest(student: Student): PutItemRequest =
PutItemRequest.builder
.tableName("student")
.item(toAttributeValueMap(student).asJava) //Item: Map[String, AttributeValue]
.build
10 / 42
problem: using DDB Java SDK is painful
we have to serialise the Student case class to AttributeValue's
final case class Student(email: String, subject: String,
enrollmentDate: Option[Instant], payment: Payment)
sealed trait Payment
object Payment {
final case object DebitCard extends Payment
final case object CreditCard extends Payment
final case object PayPal extends Payment
}
def putItemRequest(student: Student): PutItemRequest =
PutItemRequest.builder
.tableName("student")
.item(toAttributeValueMap(student).asJava)
.build
def toAttributeValueMap(student: Student): Map[String, AttributeValue] = {
Map(
"email" -> AttributeValue.builder.s(student.email).build,
"subject" -> AttributeValue.builder.s(student.subject).build
)
}
11 / 42
problem: using DDB Java SDK is painful
we have to deal with optional and sum types
def toAttributeValueMap(student: Student): Map[String, AttributeValue] = {
val mandatoryFields = Map(
"email" -> AttributeValue.builder.s(student.email).build,
"subject" -> AttributeValue.builder.s(student.subject).build,
"payment" -> AttributeValue.builder.s { // serialise sum type
student.payment match {
case DebitCard => "DebitCard"
case CreditCard => "CreditCard"
case PayPal => "PayPal"
}
}.build
)
val nonEmptyOptionalFields: Map[String, AttributeValue] = Map(
"enrollmentDate" -> student.enrollmentDate.map(instant =>
AttributeValue.builder.s(instant.toString).build)
).filter(_._2.nonEmpty).view.mapValues(_.get).toMap
mandatoryFields ++ nonEmptyOptionalFields
}
12 / 42
problem: using DDB Java SDK is painful
Now that we have saved a Student as an Item in the DynamoDB database we
next need to create a GetItemRequest to retrieve it.
GetItemRequest.builder
.tableName("student")
.key(
Map(
"email" -> AttributeValue.builder.s("avi@gmail.com").build,
"subject" -> AttributeValue.builder.s("maths").build
).asJava
)
.build()
13 / 42
problem: using DDB Java SDK is painful
Next we have to de-serialise a GetItemResponse - a Map[String, AttributeValue]
- and deal with concerns such as:
mandatory fields - if not present we want an error
optional fields
conversion to standard primitive types e.g. Instant
Helper functions that return an Either[String, _] for managing errors
def getString(map: Map[String, AttributeValue],
name: String): Either[String, String] =
map.get(name).toRight(s"mandatory field $name not found").map(_.s)
def getStringOpt(map: Map[String, AttributeValue],
name: String): Either[Nothing, Option[String]] =
Right(map.get(name).map(_.s))
def parseInstant(s: String): Either[String, Instant] =
Try(Instant.parse(s)).toEither.left.map(_.getMessage)
14 / 42
problem: using DDB Java SDK is painful
we create a deserialise function that uses the previous helpers
note we have to de-serialise the Payment sum type
def deserialise(item: Map[String, AttributeValue]): Either[String, Student] =
for {
email <- getString(item, "email")
subject <- getString(item, "subject")
maybeEnrollmentDateAV <- getStringOpt(item, "enrollmentDate")
maybeEnrollmentDate <-
maybeEnrollmentDateAV.fold[Either[String, Option[Instant]]](Right(None))(s =>
parseInstant(s).map(i => Some(i))
)
payment <- getString(item, "payment")
paymentType <- payment match {
case "DebitCard" => Right(DebitCard)
case "CreditCard" => Right(CreditCard)
case "PayPal" => Right(PayPal)
case other => Left(s"invalid payment type '$other'")
}
} yield Student(email, subject, maybeEnrollmentDate, paymentType)
15 / 42
problem: using DDB Java SDK is painful
stitching together all the functions using ZIO
val program =
for {
client <- ZIO.service[DynamoDbAsyncClient]
student = Student("avi@gmail.com", "maths", Some(Instant.now),
Payment.DebitCard)
putRequest = putItemRequest(student)
_ <- ZIO.fromCompletionStage(client.putItem(putRequest))
getRequest = getItemRequest(student)
getItemResponse <- ZIO.fromCompletionStage(client.getItem(getRequest))
studentItem = getItemResponse.item.asScala.toMap
foundStudent = deserialise(studentItem)
} yield foundStudent
16 / 42
problem: using DDB Java SDK is painful
But that's not fast enough - we want to use DDB batching for our Puts and Gets
So we have to first create BatchWriteItemRequest
def batchWriteItemRequest(students: List[Student]): BatchWriteItemRequest = {
val putRequests = students.map { student =>
val request = PutRequest
.builder()
.item(toAttributeValueMap(student).asJava)
.build()
WriteRequest.builder().putRequest(request).build()
}
BatchWriteItemRequest
.builder()
.requestItems(Map("student" -> putRequests.asJava).asJava)
.build()
}
the input is a List of Student which we map to WriteRequests
for each student we use our toAttributeValueMap function to serialise it
to an Item
finally we create a BatchWriteItemRequest
17 / 42
problem: using DDB Java SDK is painful
We then have to execute the batch and process the response
def batchWriteAndRetryUnprocessed(
batchRequest: BatchWriteItemRequest
): ZIO[Has[DynamoDbAsyncClient], Throwable, BatchWriteItemResponse] = {
val result = for {
client <- ZIO.service[DynamoDbAsyncClient]
response <- ZIO.fromCompletionStage(client.batchWriteItem(batchRequest))
} yield response
result.flatMap {
case response if response.unprocessedItems().isEmpty => ZIO.succeed(response)
case response =>
// very simple recursive retry of unprocessed requests
// in Production we would use exponential back-off and a timeout
batchWriteAndRetryUnprocessed(batchRequest =
BatchWriteItemRequest
.builder()
.requestItems(response.unprocessedItems())
.build()
)
}
}
18 / 42
problem: using DDB Java SDK is painful
We then have to create a BatchGetItemRequest
def batchGetItemReq(studentPks: Seq[(String, String)]): BatchGetItemRequest = {
val keysAndAttributes = KeysAndAttributes.builder
.keys(
studentPks.map { // for all students we extract the partition and sort keys
case (email, subject) =>
Map(
"email" -> AttributeValue.builder().s(email).build(),
"subject" -> AttributeValue.builder().s(subject).build()
).asJava
}.asJava
)
.build()
BatchGetItemRequest.builder
.requestItems(Map("student" -> keysAndAttributes).asJava)
.build()
}
19 / 42
problem: using DDB Java SDK is painful
We then have to execute the batch and process the response
def batchGetItemAndRetryUnprocessed(batchRequest: BatchGetItemRequest)
: ZIO[Has[DynamoDbAsyncClient], Throwable, BatchGetItemResponse] = {
val result = for {
client <- ZIO.service[DynamoDbAsyncClient]
response <- ZIO.fromCompletionStage(client.batchGetItem(batchRequest))
} yield response
result.flatMap {
case response if response.unprocessedKeys.isEmpty => ZIO.succeed(response)
case response =>
// very simple recursive retry of failed requests
// in Production we would use exponential back-off and a timeout
batchGetItemAndRetryUnprocessed(batchRequest =
BatchGetItemRequest.builder
.requestItems(response.unprocessedKeys)
.build
)
}
}
20 / 42
problem: using DDB Java SDK is painful
putting it all together - full batching program
val program =
for {
client <- ZIO.service[DynamoDbAsyncClient]
avi = Student("avi@gmail.com", ...)
adam = Student("adam@gmail.com", ...)
students = List(avi, adam)
batchPutRequest = batchWriteItemRequest(students)
_ <- batchWriteAndRetryUnprocessed(batchPutRequest)
batchGetItemResponse <-
batchGetItemAndRetryUnprocessed(batchGetItemReq(
students.map(st => (st.email, st.subject))))
responseMap = batchGetItemResponse.responses.asScala
listOfErrorOrStudent =
responseMap.get("student")
.fold[List[Either[String, Student]]](List.empty) {
javaList =>
javaList.asScala.map(m =>
deserialise(m.asScala.toMap)).toList
}
// foreach is a local helper that sequences the Eithers (a traverse);
// its definition is omitted here
errorOrStudents = foreach(listOfErrorOrStudent)(identity)
} yield errorOrStudents
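The foreach used in the last step sequences a List of Eithers into an Either of a List (a traverse). Its definition is not shown in the talk; a minimal stdlib-only version might look like this (a sketch, not necessarily the actual helper):

```scala
// Sequence a List[Either[E, B]] by applying f to each element,
// short-circuiting on the first Left (a traverse, stdlib only).
def foreach[A, B, E](values: List[A])(f: A => Either[E, B]): Either[E, List[B]] =
  values.foldRight(Right(Nil): Either[E, List[B]]) { (a, acc) =>
    for {
      b  <- f(a)   // fail fast on the first Left
      bs <- acc
    } yield b :: bs
  }
```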
21 / 42
problem: using DDB Java SDK is painful
That's not all - we also want to speed up our updates
however there is no batching support in DDB for updates
so we parallelise our updates instead
def updateItemRequest(student: Student): UpdateItemRequest = {
val values: Map[String, AttributeValue] =
Map(":paymentType" ->
AttributeValue.builder.s(student.payment.toString).build)
UpdateItemRequest.builder
.tableName("student")
.key(
Map(
"email" -> AttributeValue.builder.s(student.email).build,
"subject" -> AttributeValue.builder.s(student.subject).build
).asJava
)
.updateExpression("set payment = :paymentType")
.expressionAttributeValues(values.asJava)
.build
}
...
// we execute the two updates in parallel using ZIO's zipPar
ZIO.fromCompletionStage(client.updateItem(updateItemRequest(updatedAvi))) zipPar
ZIO.fromCompletionStage(client.updateItem(updateItemRequest(updatedAdam)))
22 / 42
problem: using DDB Java SDK is painful
That's a lot of boilerplate!
...and this is just the tip of the iceberg - we have not covered
scanning and queries with key condition and filter expressions
pagination of scan and query results
complex projection expressions
error handling
23 / 42
solution: zio-dynamodb
Simple, type-safe, and efficient access to DynamoDB
We are now going to write the equivalent application using zio-dynamodb and
show you how much less boilerplate code there is.
24 / 42
solution: zio-dynamodb
val avi = Student("avi@gmail.com", "maths", ...)
val adam = Student("adam@gmail.com", "english", ...)
val program = (for {
_ <- (DynamoDBQuery.put("student", avi) zip
DynamoDBQuery.put("student", adam)).execute
listErrorOrStudent <- DynamoDBQuery
.forEach(List(avi, adam)) { st =>
DynamoDBQuery.get[Student](
"student",
PrimaryKey("email" -> st.email, "subject" -> st.subject)
)
}
.execute
} yield EitherUtil.collectAll(listErrorOrStudent))
.provideCustomLayer(DynamoDBExecutor.live)
25 / 42
solution: zio-dynamodb
offers a type-safe API with automatic serialisation
automatic batching and parallelisation of queries
testable using a fake in-memory DB
26 / 42
zio-dynamodb API 101
DynamoDBQuery
its type and its combinators
auto batching and parallelisation
how to create and execute a query
Serialisation
low level API
built in type classes
Expressions
usages
mutations operations
query operations
Serialisation
high level type safe API
note there is a 1:1 correspondence to the AWS API to aid discoverability
27 / 42
DynamoDBQuery
sealed trait DynamoDBQuery[+A] {
def zip[B](that: DynamoDBQuery[B]): DynamoDBQuery[(A, B)] = ???
def zipLeft[B](that: DynamoDBQuery[B]): DynamoDBQuery[A] = ???
def zipRight[B](that: DynamoDBQuery[B]): DynamoDBQuery[B] = ???
def execute: ZIO[Has[DynamoDBExecutor], Exception, A] = ???
}
object DynamoDBQuery {
def forEach[A, B](values: Iterable[A])(body: A => DynamoDBQuery[B])
: DynamoDBQuery[List[B]] = ???
// plus a whole bunch of query constructors
}
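As a rough mental model only (not the library's actual implementation), zip simply records both queries as data so that an interpreter can later pair, batch, or parallelise them:

```scala
// Toy model of the zip combinator: queries are data, and an
// interpreter (run) walks the tree. Hypothetical names throughout.
sealed trait Query[+A] { self =>
  def zip[B](that: Query[B]): Query[(A, B)] = Zip(self, that)
}
final case class Constant[A](value: A)                     extends Query[A]
final case class Zip[A, B](left: Query[A], right: Query[B]) extends Query[(A, B)]

// A trivial interpreter; a real one could batch compatible nodes.
def run[A](q: Query[A]): A = q match {
  case Constant(a)  => a
  case z: Zip[a, b] => (run(z.left), run(z.right)).asInstanceOf[A]
}
```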
Next let's create and execute some basic queries
28 / 42
Simple Put for multiple tables
val program = (for {
_ <- (DynamoDBQuery.put("student", avi) zip
DynamoDBQuery.put("student", adam) zip
DynamoDBQuery.put("course", french) zip
DynamoDBQuery.put("course", art)).execute
} yield ())
.provideCustomLayer(DynamoDBExecutor.live)
puts for multiple tables are batched together, grouped by table in the request
29 / 42
Serialisation - Low level API
AttributeValue
sealed trait AttributeValue
final case class Binary(value: Iterable[Byte]) extends AttributeValue
final case class BinarySet(value: Iterable[Iterable[Byte]]) extends AttributeValue
final case class Bool(value: Boolean) extends AttributeValue
// ... etc etc
final case class String(value: ScalaString) extends AttributeValue
final case class Number(value: BigDecimal) extends AttributeValue
This corresponds 1:1 with the AWS API AttributeValue
30 / 42
Serialisation - Low level API
AttrMap
Top level container type for an Item with type aliases
final case class AttrMap(map: Map[String, AttributeValue])
type Item = AttrMap
type PrimaryKey = AttrMap
Internal type classes take care of AttributeValue conversions
val aviItem = Item("email" -> "avi@gmail.com", "age" -> 21)
is equivalent to
val aviItem = AttrMap(Map("email" -> AttributeValue.String("avi@gmail.com"),
"age" -> AttributeValue.Number(BigDecimal(21))))
31 / 42
Projection Expression Parser
$("cost") // simple
$("address.line1") // map
$("addresses[1]") // list
$("addresses[work].line1") // list with map
The $ projection expression parser function takes a string field expression and
turns it into the internal representation
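As an illustration only (not zio-dynamodb's internals), a simplified parser covering just the field and numeric-index cases might tokenise a path like this:

```scala
// Hypothetical, simplified internal representation of a field path.
sealed trait PathElem
final case class Field(name: String) extends PathElem
final case class Index(i: Int)       extends PathElem

// Handles "a.b" and "a[1].b"; map-style keys like "a[work]" are omitted.
def parsePath(path: String): List[PathElem] = {
  val Indexed = """(\w+)\[(\d+)\]""".r
  path.split('.').toList.flatMap {
    case Indexed(name, i) => List(Field(name), Index(i.toInt))
    case plain            => List(Field(plain))
  }
}
```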
32 / 42
updates
updateItem("course", PrimaryKey("name" -> "art"))(
// UpdateExpression
$("cost").set(500.0) + $("code").set("123")
)
we can specify ProjectionExpression's
$("cost") ... $("code")
an UpdateExpression may have many update actions, which can be combined
using +
$("count").set(1)
$("field1").set($("field2")) // replaces field1 with field2
$("count").setIfNotExists($("two"), 42)
$("numberList").appendList(List("1"))
$("numberList").prependList(List("1"))
$("count").add(1) // updating Numbers and Sets
$("count").remove // Removes this field from an item
$("numberSet").deleteFromSet(1)
$("person.address").set(Item("line1" -> "1 high street"))
33 / 42
delete
deleteItem("course", PrimaryKey("name" -> "art"))
Both Update and Delete can have ConditionExpressions that must be met for
the operation to succeed. For this we use the where method.
where $("code") > 1 && $("code") < 5
applied to the queries
deleteItem("course", PrimaryKey("name" -> "art")) where
$("code") > 1 && $("code") < 5
updateItem("course", PrimaryKey("name" -> "art")) {
// UpdateExpression
$("cost").set(500.0) + $("code").set("123")
} where $("code") > 1 && $("code") < 5
34 / 42
queryAll
val zio: ZIO[Has[DynamoDBExecutor], Exception, Stream[Exception, Person]] =
queryAll("person", $("name"), $("address[1].line1"))
.whereKey(
PartitionKey("partitionKey1") === "x" && SortKey("sortKey1") > 10
)
.execute
Note that we use a list of ProjectionExpression's again
$("name"), $("address[1].line1")
The whereKey method specifies a KeyConditionExpression
.whereKey(
PartitionKey("partitionKey1") === "x" && SortKey("sortKey1") > 10
)
... and we get back a ZStream that the library lazily paginates for us
val zio: ZIO[Has[DynamoDBExecutor], Exception, Stream[Exception, Person]]
35 / 42
querySome
val zio =
querySomeItem("person", limit = 5, $("name"), $("address[1].line1"))
.whereKey(PartitionKey("partitionKey1") === "x" && SortKey("sortKey1") > 10)
.execute
... and we get back a ZIO of (Chunk[Person], LastEvaluatedKey)
type LastEvaluatedKey = Option[PrimaryKey]
val q: ZIO[Has[DynamoDBExecutor], Exception, (Chunk[Item], LastEvaluatedKey)]
and we use the startKey method to feed back the LastEvaluatedKey to get the
next page of data
val zio =
querySomeItem("person", limit = 5, $("name"), $("address[1].line1"))
.whereKey(PartitionKey("partitionKey1") === "x" && SortKey("sortKey1") > 10)
.startKey(startKey)
.execute
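The startKey feedback loop generalises to a simple recursion; here is a stdlib-only sketch with a stubbed fetch function standing in for the real query (hypothetical names):

```scala
// Drain all pages by feeding each LastEvaluatedKey back as the next
// start key; None signals there are no more pages. Stdlib-only sketch.
def fetchAll[K, A](fetch: Option[K] => (List[A], Option[K])): List[A] = {
  @annotation.tailrec
  def loop(startKey: Option[K], acc: List[A]): List[A] =
    fetch(startKey) match {
      case (page, None)    => acc ++ page   // last page: stop
      case (page, nextKey) => loop(nextKey, acc ++ page)
    }
  loop(None, Nil) // None also means "start from the first page"
}
```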
36 / 42
scanAll, scanSome
scanAll
These are similar to the previous queryAll/querySome queries
val zio: ZIO[Has[DynamoDBExecutor], Exception, Stream[Exception, Person]] =
scanAll("person", $("name"), $("address[1].line1")).execute
scanSome
val zio =
scanSome("person", limit = 5, $("name"), $("address[1].line1")).execute
...
val zio =
scanSome("person", limit = 5, $("name"), $("address[1].line1"))
.startKey(startKey)
.execute
37 / 42
Condition Expressions
$("field1").exists
$("field1").notExists
$("field1").beginsWith("1")
$("field1").contains("1")
$("field1").size > 1
$("field1").isNumber
condition1 && condition2
condition1 || condition2
!condition1
$("field1") > 1.0
$("field1") > $("col2")
$("field1") === $("col2")
$("field1") === "2"
$("field1").between("1", "2")
$("field1").in(Set("1", "2"))
$("field1").in("1", "2")
38 / 42
Condition Expressions cont.
Apply to
updateItem
deleteItem
queryXXXX/scanXXXX
39 / 42
ConditionExpressions for query and scan
40 / 42
Serialisation - High level API
High level API uses zio-schema to create codecs
41 / 42
summary
Java SDK is painful
ZIO DynamoDB is a joy
Learning more
Thank you
42 / 42

Harnessing the Power of NLP and Knowledge Graphs for Opioid Research
Neo4j
 
Mutation Testing for Task-Oriented Chatbots
Mutation Testing for Task-Oriented ChatbotsMutation Testing for Task-Oriented Chatbots
Mutation Testing for Task-Oriented Chatbots
Pablo Gómez Abajo
 

Recently uploaded (20)

"Choosing proper type of scaling", Olena Syrota
"Choosing proper type of scaling", Olena Syrota"Choosing proper type of scaling", Olena Syrota
"Choosing proper type of scaling", Olena Syrota
 
Northern Engraving | Nameplate Manufacturing Process - 2024
Northern Engraving | Nameplate Manufacturing Process - 2024Northern Engraving | Nameplate Manufacturing Process - 2024
Northern Engraving | Nameplate Manufacturing Process - 2024
 
9 CEO's who hit $100m ARR Share Their Top Growth Tactics Nathan Latka, Founde...
9 CEO's who hit $100m ARR Share Their Top Growth Tactics Nathan Latka, Founde...9 CEO's who hit $100m ARR Share Their Top Growth Tactics Nathan Latka, Founde...
9 CEO's who hit $100m ARR Share Their Top Growth Tactics Nathan Latka, Founde...
 
Energy Efficient Video Encoding for Cloud and Edge Computing Instances
Energy Efficient Video Encoding for Cloud and Edge Computing InstancesEnergy Efficient Video Encoding for Cloud and Edge Computing Instances
Energy Efficient Video Encoding for Cloud and Edge Computing Instances
 
"Frontline Battles with DDoS: Best practices and Lessons Learned", Igor Ivaniuk
"Frontline Battles with DDoS: Best practices and Lessons Learned",  Igor Ivaniuk"Frontline Battles with DDoS: Best practices and Lessons Learned",  Igor Ivaniuk
"Frontline Battles with DDoS: Best practices and Lessons Learned", Igor Ivaniuk
 
The Microsoft 365 Migration Tutorial For Beginner.pptx
The Microsoft 365 Migration Tutorial For Beginner.pptxThe Microsoft 365 Migration Tutorial For Beginner.pptx
The Microsoft 365 Migration Tutorial For Beginner.pptx
 
Overcoming the PLG Trap: Lessons from Canva's Head of Sales & Head of EMEA Da...
Overcoming the PLG Trap: Lessons from Canva's Head of Sales & Head of EMEA Da...Overcoming the PLG Trap: Lessons from Canva's Head of Sales & Head of EMEA Da...
Overcoming the PLG Trap: Lessons from Canva's Head of Sales & Head of EMEA Da...
 
How to Interpret Trends in the Kalyan Rajdhani Mix Chart.pdf
How to Interpret Trends in the Kalyan Rajdhani Mix Chart.pdfHow to Interpret Trends in the Kalyan Rajdhani Mix Chart.pdf
How to Interpret Trends in the Kalyan Rajdhani Mix Chart.pdf
 
Principle of conventional tomography-Bibash Shahi ppt..pptx
Principle of conventional tomography-Bibash Shahi ppt..pptxPrinciple of conventional tomography-Bibash Shahi ppt..pptx
Principle of conventional tomography-Bibash Shahi ppt..pptx
 
Programming Foundation Models with DSPy - Meetup Slides
Programming Foundation Models with DSPy - Meetup SlidesProgramming Foundation Models with DSPy - Meetup Slides
Programming Foundation Models with DSPy - Meetup Slides
 
[OReilly Superstream] Occupy the Space: A grassroots guide to engineering (an...
[OReilly Superstream] Occupy the Space: A grassroots guide to engineering (an...[OReilly Superstream] Occupy the Space: A grassroots guide to engineering (an...
[OReilly Superstream] Occupy the Space: A grassroots guide to engineering (an...
 
Generating privacy-protected synthetic data using Secludy and Milvus
Generating privacy-protected synthetic data using Secludy and MilvusGenerating privacy-protected synthetic data using Secludy and Milvus
Generating privacy-protected synthetic data using Secludy and Milvus
 
Astute Business Solutions | Oracle Cloud Partner |
Astute Business Solutions | Oracle Cloud Partner |Astute Business Solutions | Oracle Cloud Partner |
Astute Business Solutions | Oracle Cloud Partner |
 
JavaLand 2024: Application Development Green Masterplan
JavaLand 2024: Application Development Green MasterplanJavaLand 2024: Application Development Green Masterplan
JavaLand 2024: Application Development Green Masterplan
 
Deep Dive: AI-Powered Marketing to Get More Leads and Customers with HyperGro...
Deep Dive: AI-Powered Marketing to Get More Leads and Customers with HyperGro...Deep Dive: AI-Powered Marketing to Get More Leads and Customers with HyperGro...
Deep Dive: AI-Powered Marketing to Get More Leads and Customers with HyperGro...
 
AppSec PNW: Android and iOS Application Security with MobSF
AppSec PNW: Android and iOS Application Security with MobSFAppSec PNW: Android and iOS Application Security with MobSF
AppSec PNW: Android and iOS Application Security with MobSF
 
Taking AI to the Next Level in Manufacturing.pdf
Taking AI to the Next Level in Manufacturing.pdfTaking AI to the Next Level in Manufacturing.pdf
Taking AI to the Next Level in Manufacturing.pdf
 
GNSS spoofing via SDR (Criptored Talks 2024)
GNSS spoofing via SDR (Criptored Talks 2024)GNSS spoofing via SDR (Criptored Talks 2024)
GNSS spoofing via SDR (Criptored Talks 2024)
 
Harnessing the Power of NLP and Knowledge Graphs for Opioid Research
Harnessing the Power of NLP and Knowledge Graphs for Opioid ResearchHarnessing the Power of NLP and Knowledge Graphs for Opioid Research
Harnessing the Power of NLP and Knowledge Graphs for Opioid Research
 
Mutation Testing for Task-Oriented Chatbots
Mutation Testing for Task-Oriented ChatbotsMutation Testing for Task-Oriented Chatbots
Mutation Testing for Task-Oriented Chatbots
 

Introducing Zio DynamoDB - a new Scala library for Simple, type-safe, and efficient access to DynamoDB

problem: using DDB Java SDK is painful

final case class Student(email: String,   // partition key
                         subject: String, // sort key
                         enrollmentDate: Option[Instant],
                         payment: Payment)

sealed trait Payment
object Payment {
  final case object DebitCard  extends Payment
  final case object CreditCard extends Payment
  final case object PayPal     extends Payment
}

9 / 42
problem: using DDB Java SDK is painful

we have to build a PutItemRequest

import scala.jdk.CollectionConverters._

final case class Student(email: String, subject: String,
                         enrollmentDate: Option[Instant], payment: Payment)

sealed trait Payment
object Payment {
  final case object DebitCard  extends Payment
  final case object CreditCard extends Payment
  final case object PayPal     extends Payment
}

def putItemRequest(student: Student): PutItemRequest =
  PutItemRequest.builder
    .tableName("student")
    .item(toAttributeValueMap(student).asJava) // Item: Map[String, AttributeValue]
    .build

10 / 42
problem: using DDB Java SDK is painful

we have to serialise the Student case class to AttributeValues

final case class Student(email: String, subject: String,
                         enrollmentDate: Option[Instant], payment: Payment)

sealed trait Payment
object Payment {
  final case object DebitCard  extends Payment
  final case object CreditCard extends Payment
  final case object PayPal     extends Payment
}

def putItemRequest(student: Student): PutItemRequest =
  PutItemRequest.builder
    .tableName("student")
    .item(toAttributeValueMap(student).asJava)
    .build

def toAttributeValueMap(student: Student): Map[String, AttributeValue] =
  Map(
    "email"   -> AttributeValue.builder.s(student.email).build,
    "subject" -> AttributeValue.builder.s(student.subject).build
  )

11 / 42
problem: using DDB Java SDK is painful

we have to deal with optional and sum types

def toAttributeValueMap(student: Student): Map[String, AttributeValue] = {
  val mandatoryFields = Map(
    "email"   -> AttributeValue.builder.s(student.email).build,
    "subject" -> AttributeValue.builder.s(student.subject).build,
    "payment" -> AttributeValue.builder.s { // serialise sum type
      student.payment match {
        case DebitCard  => "DebitCard"
        case CreditCard => "CreditCard"
        case PayPal     => "PayPal"
      }
    }.build
  )
  val nonEmptyOptionalFields: Map[String, AttributeValue] = Map(
    "enrollmentDate" -> student.enrollmentDate.map(instant =>
      AttributeValue.builder.s(instant.toString).build)
  ).filter(_._2.nonEmpty).view.mapValues(_.get).toMap
  mandatoryFields ++ nonEmptyOptionalFields
}

12 / 42
problem: using DDB Java SDK is painful

Now that we have saved a Student as an Item in the DynamoDB database we next
need to create a GetItemRequest to retrieve it.

GetItemRequest.builder
  .tableName("student")
  .key(
    Map(
      "email"   -> AttributeValue.builder.s("avi@gmail.com").build,
      "subject" -> AttributeValue.builder.s("maths").build
    ).asJava
  )
  .build()

13 / 42
problem: using DDB Java SDK is painful

Next we have to de-serialise a GetItemResponse - a Map[String, AttributeValue] -
and deal with concerns such as:

mandatory fields - if not present we want an error
optional fields
conversion to standard types e.g. Instant

Helper functions that return an Either[String, _] for managing errors

def getString(map: Map[String, AttributeValue], name: String): Either[String, String] =
  map.get(name).toRight(s"mandatory field $name not found").map(_.s)

def getStringOpt(map: Map[String, AttributeValue], name: String): Either[Nothing, Option[String]] =
  Right(map.get(name).map(_.s))

def parseInstant(s: String): Either[String, Instant] =
  Try(Instant.parse(s)).toEither.left.map(_.getMessage)

14 / 42
problem: using DDB Java SDK is painful

we create a deserialise function that uses the previous helpers
note we have to de-serialise the Payment sum type

def deserialise(item: Map[String, AttributeValue]): Either[String, Student] =
  for {
    email                 <- getString(item, "email")
    subject               <- getString(item, "subject")
    maybeEnrollmentDateAV <- getStringOpt(item, "enrollmentDate")
    maybeEnrollmentDate   <- maybeEnrollmentDateAV.fold[Either[String, Option[Instant]]](Right(None))(s =>
                               parseInstant(s).map(i => Some(i))
                             )
    payment               <- getString(item, "payment")
    paymentType            = payment match {
                               case "DebitCard"  => DebitCard
                               case "CreditCard" => CreditCard
                               case "PayPal"     => PayPal
                             }
  } yield Student(email, subject, maybeEnrollmentDate, paymentType)

15 / 42
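An aside: the match on payment above is partial - a string the code does not
expect would throw a MatchError at runtime. A self-contained sketch (not
library code) of a total decoder that surfaces the failure as an Either:

```scala
sealed trait Payment
object Payment {
  case object DebitCard  extends Payment
  case object CreditCard extends Payment
  case object PayPal     extends Payment

  // total decoder: an unknown string becomes a Left instead of a MatchError
  def fromString(s: String): Either[String, Payment] = s match {
    case "DebitCard"  => Right(DebitCard)
    case "CreditCard" => Right(CreditCard)
    case "PayPal"     => Right(PayPal)
    case other        => Left(s"unknown payment type: $other")
  }
}
```

The Left then composes with the other Either-returning helpers in the same
for comprehension.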
problem: using DDB Java SDK is painful

stitching together all the functions using ZIO

val program = for {
  client          <- ZIO.service[DynamoDbAsyncClient]
  student          = Student("avi@gmail.com", "maths", Some(Instant.now), Payment.DebitCard)
  putRequest       = putItemRequest(student)
  _               <- ZIO.fromCompletionStage(client.putItem(putRequest))
  getRequest       = getItemRequest(student)
  getItemResponse <- ZIO.fromCompletionStage(client.getItem(getRequest))
  studentItem      = getItemResponse.item.asScala.toMap
  foundStudent     = deserialise(studentItem)
} yield foundStudent

16 / 42
problem: using DDB Java SDK is painful

But that's not fast enough - we want to use DDB batching for our Puts and Gets
So we first have to create a BatchWriteItemRequest

def batchWriteItemRequest(students: List[Student]): BatchWriteItemRequest = {
  val putRequests = students.map { student =>
    val request = PutRequest
      .builder()
      .item(toAttributeValueMap(student).asJava)
      .build()
    WriteRequest.builder().putRequest(request).build()
  }
  BatchWriteItemRequest
    .builder()
    .requestItems(Map("student" -> putRequests.asJava).asJava)
    .build()
}

input is a List of Student which we map to WriteRequests
for each student we use our toAttributeValueMap function to serialise to an Item
finally we create a BatchWriteItemRequest

17 / 42
problem: using DDB Java SDK is painful

We then have to execute the batch and process the response

def batchWriteAndRetryUnprocessed(
  batchRequest: BatchWriteItemRequest
): ZIO[Has[DynamoDbAsyncClient], Throwable, BatchWriteItemResponse] = {
  val result = for {
    client   <- ZIO.service[DynamoDbAsyncClient]
    response <- ZIO.fromCompletionStage(client.batchWriteItem(batchRequest))
  } yield response
  result.flatMap {
    case response if response.unprocessedItems().isEmpty => ZIO.succeed(response)
    case response =>
      // very simple recursive retry of unprocessed requests
      // in production we would have exponential back-offs and a timeout
      batchWriteAndRetryUnprocessed(batchRequest =
        BatchWriteItemRequest
          .builder()
          .requestItems(response.unprocessedItems())
          .build()
      )
  }
}

18 / 42
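The comment notes that production code would add exponential back-offs and a
timeout (in ZIO this is typically expressed with a Schedule such as
Schedule.exponential). As a self-contained sketch - the helper name is
hypothetical, not from the SDK or ZIO - the delay schedule such a retry loop
might follow can be computed like this:

```scala
import scala.concurrent.duration._

// hypothetical helper: delays for retry attempts 0, 1, 2, ...,
// doubling from `base` and capped at `max`
def backoffDelays(base: FiniteDuration, max: FiniteDuration, attempts: Int): List[FiniteDuration] =
  (0 until attempts).map(n => (base * math.pow(2, n).toLong).min(max)).toList
```

A total timeout then bounds the sum of these delays plus the in-flight
request times.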
problem: using DDB Java SDK is painful

We then have to create a BatchGetItemRequest

def batchGetItemReq(studentPks: Seq[(String, String)]): BatchGetItemRequest = {
  val keysAndAttributes = KeysAndAttributes.builder
    .keys(
      studentPks.map { // for all students we extract the partition and sort keys
        case (email, subject) =>
          Map(
            "email"   -> AttributeValue.builder().s(email).build(),
            "subject" -> AttributeValue.builder().s(subject).build()
          ).asJava
      }.asJava
    )
    .build()
  BatchGetItemRequest.builder
    .requestItems(Map("student" -> keysAndAttributes).asJava)
    .build()
}

19 / 42
problem: using DDB Java SDK is painful

We then have to execute the batch and process the response

def batchGetItemAndRetryUnprocessed(
  batchRequest: BatchGetItemRequest
): ZIO[Has[DynamoDbAsyncClient], Throwable, BatchGetItemResponse] = {
  val result = for {
    client   <- ZIO.service[DynamoDbAsyncClient]
    response <- ZIO.fromCompletionStage(client.batchGetItem(batchRequest))
  } yield response
  result.flatMap {
    case response if response.unprocessedKeys.isEmpty => ZIO.succeed(response)
    case response =>
      // very simple recursive retry of failed requests
      // in production we would have exponential back-offs and a timeout
      batchGetItemAndRetryUnprocessed(batchRequest =
        BatchGetItemRequest.builder
          .requestItems(response.unprocessedKeys)
          .build
      )
  }
}

20 / 42
problem: using DDB Java SDK is painful

putting it all together - full batching program

val program = for {
  client               <- ZIO.service[DynamoDbAsyncClient]
  avi                   = Student("avi@gmail.com", ...)
  adam                  = Student("adam@gmail.com", ...)
  students              = List(avi, adam)
  batchPutRequest       = batchWriteItemRequest(students)
  _                    <- batchWriteAndRetryUnprocessed(batchPutRequest)
  batchGetItemResponse <- batchGetItemAndRetryUnprocessed(batchGetItemReq(
                            students.map(st => (st.email, st.subject))))
  responseMap           = batchGetItemResponse.responses.asScala
  listOfErrorOrStudent  = responseMap.get("student")
                            .fold[List[Either[String, Student]]](List.empty) { javaList =>
                              javaList.asScala.map(m => deserialise(m.asScala.toMap)).toList
                            }
  // traverse!
  errorOrStudents       = foreach(listOfErrorOrStudent)(identity)
} yield errorOrStudents

21 / 42
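The "traverse!" step turns a List[Either[String, Student]] inside out into an
Either[String, List[Student]], failing on the first error. A minimal
self-contained version of that step (the helper name is illustrative, not the
library's):

```scala
// fail on the first Left, otherwise collect all Right values in order
def collectAll[E, A](xs: List[Either[E, A]]): Either[E, List[A]] =
  xs.foldRight(Right(Nil): Either[E, List[A]]) { (e, acc) =>
    for { a <- e; as <- acc } yield a :: as
  }
```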
problem: using DDB Java SDK is painful

That's not all - we want to speed up our updates, however there is no batching
support in DDB for updates
So we want to parallelise our updates

def updateItemRequest(student: Student): UpdateItemRequest = {
  val values: Map[String, AttributeValue] =
    Map(":paymentType" -> AttributeValue.builder.s(student.payment.toString).build)
  UpdateItemRequest.builder
    .tableName("student")
    .key(
      Map(
        "email"   -> AttributeValue.builder.s(student.email).build,
        "subject" -> AttributeValue.builder.s(student.subject).build
      ).asJava
    )
    .updateExpression("set payment = :paymentType")
    .expressionAttributeValues(values.asJava)
    .build
}

...

// we execute the two updates in parallel using ZIO's zipPar
ZIO.fromCompletionStage(client.updateItem(updateItemRequest(updatedAvi))) zipPar
  ZIO.fromCompletionStage(client.updateItem(updateItemRequest(updatedAdam)))

22 / 42
problem: using DDB Java SDK is painful

That's a lot of boilerplate!

...and this is just the tip of the iceberg - we have not covered

scanning and queries with key condition and filter expressions
pagination of scan and query results
complex projection expressions
error handling

23 / 42
solution: zio-dynamodb

Simple, type-safe, and efficient access to DynamoDB

We are now going to write the equivalent application using zio-dynamodb and
show you how much less boilerplate code there is.

24 / 42
solution: zio-dynamodb

val avi  = Student("avi@gmail.com", "maths", ...)
val adam = Student("adam@gmail.com", "english", ...)

val program = (for {
  _                  <- (DynamoDBQuery.put("student", avi) zip
                          DynamoDBQuery.put("student", adam)).execute
  listErrorOrStudent <- DynamoDBQuery
                          .forEach(List(avi, adam)) { st =>
                            DynamoDBQuery.get[Student](
                              "student",
                              PrimaryKey("email" -> st.email, "subject" -> st.subject)
                            )
                          }
                          .execute
} yield EitherUtil.collectAll(listErrorOrStudent))
  .provideCustomLayer(DynamoDBExecutor.live)

25 / 42
solution: zio-dynamodb

offers
a type-safe API with auto serialisation
auto batching and parallelisation of queries
testable using a fake in-memory DB

26 / 42
zio-dynamodb API 101

DynamoDBQuery
  its type and its combinators
  auto batching and parallelisation
  how to create and execute a query
Serialisation - low level API
  built-in type classes
Expressions
  usage in mutation operations
  usage in query operations
Serialisation - high level type safe API

note there is a 1:1 correspondence to the AWS API to aid discoverability

27 / 42
DynamoDBQuery

sealed trait DynamoDBQuery[+A] {
  def zip[B](that: DynamoDBQuery[B]): DynamoDBQuery[(A, B)] = ???
  def zipLeft[B](that: DynamoDBQuery[B]): DynamoDBQuery[A]  = ???
  def zipRight[B](that: DynamoDBQuery[B]): DynamoDBQuery[B] = ???
  def execute: ZIO[Has[DynamoDBExecutor], Exception, A]     = ???
}

object DynamoDBQuery {
  def forEach[A, B](values: Iterable[A])(body: A => DynamoDBQuery[B]): DynamoDBQuery[List[B]] = ???
  // whole bunch of query constructors
}

Next let's create and execute some basic queries

28 / 42
Simple Put for multiple tables

val program = (for {
  _ <- (DynamoDBQuery.put("student", avi) zip
         DynamoDBQuery.put("student", adam) zip
         DynamoDBQuery.put("course", french) zip
         DynamoDBQuery.put("course", art)).execute
} yield ())
  .provideCustomLayer(DynamoDBExecutor.live)

puts for multiple tables are batched together, grouped by table in the request

29 / 42
Serialisation - Low level API

AttributeValue

sealed trait AttributeValue
final case class Binary(value: Iterable[Byte])              extends AttributeValue
final case class BinarySet(value: Iterable[Iterable[Byte]]) extends AttributeValue
final case class Bool(value: Boolean)                       extends AttributeValue
// ... etc etc
final case class String(value: ScalaString)                 extends AttributeValue
final case class Number(value: BigDecimal)                  extends AttributeValue

This corresponds 1:1 with the AWS API AttributeValue

30 / 42
Serialisation - Low level API

AttrMap

Top level container type for an Item, with type aliases

final case class AttrMap(map: Map[String, AttributeValue])

type Item       = AttrMap
type PrimaryKey = AttrMap

Internal type classes take care of AttributeValue conversions

val aviItem = Item("email" -> "avi@gmail.com", "age" -> 21)

is equivalent to

val aviItem = AttrMap(Map(
  "email" -> AttributeValue.String("avi@gmail.com"),
  "age"   -> AttributeValue.Number(BigDecimal(21))
))

31 / 42
Projection Expression Parser

$("cost")                  // simple
$("address.line1")         // map
$("addresses[1]")          // list
$("addresses[work].line1") // list with map

The $ projection expression parser function takes a string field expression
and turns it into the internal representation

32 / 42
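To make "the internal representation" concrete, here is a rough, hypothetical
sketch of the kind of structure such a parser produces - the Segment types and
parsePath below are made up for illustration and are not the library's API:

```scala
// hypothetical path segments for a projection expression
sealed trait Segment
final case class Field(name: String)             extends Segment
final case class Index(field: String, i: Int)    extends Segment // list element
final case class Key(field: String, key: String) extends Segment // map entry

// split a path like "address[1].line1" into its segments
def parsePath(path: String): List[Segment] =
  path.split('.').toList.map {
    case s if s.endsWith("]") =>
      val field = s.takeWhile(_ != '[')
      val inner = s.drop(field.length + 1).dropRight(1)
      inner.toIntOption match {
        case Some(i) => Index(field, i)   // e.g. addresses[1]
        case None    => Key(field, inner) // e.g. addresses[work]
      }
    case s => Field(s)
  }
```

The real $ macro/parser additionally validates the expression against
DynamoDB's naming rules.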
updates

updateItem("course", PrimaryKey("name" -> "art"))(
  // UpdateExpression
  $("cost").set(500.0) + $("code").set("123")
)

we can specify ProjectionExpressions $("cost") ... $("code")
a ProjectionExpression has many update actions, which can be combined using +

$("count").set(1)
$("field1").set($("field2"))             // replaces field1 with field2
$("count").setIfNotExists($("two"), 42)
$("numberList").appendList(List("1"))
$("numberList").prependList(List("1"))
$("count").add(1)                        // updating Numbers and Sets
$("count").remove                        // removes this field from an item
$("numberSet").deleteFromSet(1)
$("person.address").set(Item("line1" -> "1 high street"))

33 / 42
delete

deleteItem("course", PrimaryKey("name" -> "art"))

Both Update and Delete can have ConditionExpressions that must be met for the
operation to succeed. For this we use the where method.

where $("code") > 1 && $("code") < 5

applied to the queries

deleteItem("course", PrimaryKey("name" -> "art")) where $("code") > 1 && $("code") < 5

updateItem("course", PrimaryKey("name" -> "art")) {
  // UpdateExpression
  $("cost").set(500.0) + $("code").set("123")
} where $("code") > 1 && $("code") < 5

34 / 42
queryAll

val zio: ZIO[Has[DynamoDBExecutor], Exception, Stream[Exception, Person]] =
  queryAll("person", $("name"), $("address[1].line1"))
    .whereKey(
      PartitionKey("partitionKey1") === "x" && SortKey("sortKey1") > 10
    )
    .execute

Note that we use a list of ProjectionExpressions again

$("name"), $("address[1].line1")

The whereKey method specifies a KeyConditionExpression

.whereKey(
  PartitionKey("partitionKey1") === "x" && SortKey("sortKey1") > 10
)

... and we get back a ZStream that the library lazily paginates for us

val zio: ZIO[Has[DynamoDBExecutor], Exception, Stream[Exception, Person]]

35 / 42
querySome

val zio = querySomeItem("person", limit = 5, $("name"), $("address[1].line1"))
  .whereKey(PartitionKey("partitionKey1") === "x" && SortKey("sortKey1") > 10)
  .execute

... and we get back a ZIO of (Chunk[Item], LastEvaluatedKey)

type LastEvaluatedKey = Option[PrimaryKey]

val q: ZIO[Has[DynamoDBExecutor], Exception, (Chunk[Item], LastEvaluatedKey)]

and we use the startKey method to feed back the LastEvaluatedKey to get the
next page of data

val zio = querySomeItem("person", limit = 5, $("name"), $("address[1].line1"))
  .whereKey(PartitionKey("partitionKey1") === "x" && SortKey("sortKey1") > 10)
  .startKey(startKey)
  .execute

36 / 42
scanAll, scanSome

These are similar to the previous queryAll/querySome queries

scanAll

val zio: ZIO[Has[DynamoDBExecutor], Exception, Stream[Exception, Person]] =
  scanAll("person", $("name"), $("address[1].line1")).execute

scanSome

val zio = scanSome("person", limit = 5, $("name"), $("address[1].line1")).execute

...

val zio = scanSome("person", limit = 5, $("name"), $("address[1].line1"))
  .startKey(startKey)
  .execute

37 / 42
Condition Expressions

$("field1").exists
$("field1").notExists
$("field1").beginsWith("1")
$("field1").contains("1")
$("field1").size > 1
$("field1").isNumber

condition1 && condition2
condition1 || condition2
!condition1

$("field1") > 1.0
$("field1") > $("col2")
$("field1") === $("col2")
$("field1") === "2"
$("field1").between("1", "2")
$("field1").in(Set("1", "2"))
$("field1").in("1", "2")

38 / 42
Condition Expressions cont.

Apply to
updateItem
delete
queryXXXX/scanXXXX

39 / 42
ConditionExpressions for query and scan

40 / 42
Serialisation - High level API

The high level API uses zio-schema to create codecs

41 / 42
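zio-schema can derive a Schema for a case class (e.g. via DeriveSchema.gen),
and the high level API builds AttributeValue codecs from that schema. The
sketch below is not the zio-schema API - just a minimal, self-contained
illustration of what a codec is: an encode/decode pair that round-trips.

```scala
// illustrative only: a codec pairs an encoder with a decoder that can fail
final case class Codec[A, B](encode: A => B, decode: B => Either[String, A])

final case class Student(email: String, subject: String)

// hand-written codec to Map[String, String]; the real high level API
// derives the equivalent to AttributeValue maps from a Schema[Student]
val studentCodec: Codec[Student, Map[String, String]] = Codec(
  s => Map("email" -> s.email, "subject" -> s.subject),
  m =>
    for {
      e <- m.get("email").toRight("missing email")
      s <- m.get("subject").toRight("missing subject")
    } yield Student(e, s)
)
```

The point of schema derivation is that this boilerplate disappears: one
derived Schema replaces every hand-written encode/decode pair.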
summary

Java SDK is painful
ZIO DynamoDB is a joy

Learning more

Thank you

42 / 42