Application optimization is one of the key things we must handle to deliver successful projects and pleased stakeholders. Should we use in-memory caching, shared resources, or an external cache? Join this session on a journey where we discover how to interlace Azure Functions and Azure Redis Cache to optimize our applications with minimal impact on development effort and technical debt.
3. In an O'Reilly survey of 1,500 IT professionals in 2019, 40% of respondents worked at organizations that had adopted serverless architecture.
A 2020 Datadog survey indicated that over 50% of AWS users are now using the serverless AWS Lambda Function as a Service (FaaS).
https://www.infoq.com/news/2020/07/future-serverless-architecture/
6. Languages: C#, Java, JavaScript, PowerShell, or Python
Custom handlers (e.g. Go, Rust), Docker
Hosting: Consumption Plan, Premium Plan, and App Service Plan
Kubernetes with KEDA
Automated deployment
7. Default timeout: 5’–30’
Max timeout: 10’–∞* (*guaranteed up to 60’)
Request size: 100 MB
Query string length: 4,096 characters
Max memory: 1.5–14 GB (shared at the Function App level)
GB-s = (number of executions) × (execution duration in seconds) × (amount of RAM used in GB)
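The Consumption Plan billing formula above can be checked with a quick calculation. This is a direct transcription of the slide's formula; the example numbers are illustrative, not from the deck:

```python
def gb_seconds(executions: int, duration_s: float, memory_gb: float) -> float:
    """Consumption Plan compute billing: GB-s = executions x duration x memory.

    Azure also rounds memory up to the nearest 128 MB and duration to the
    nearest millisecond; that rounding is omitted to keep the slide's
    formula exact.
    """
    return executions * duration_s * memory_gb

# Example: 1,000,000 executions of 0.5 s each, using 0.5 GB of RAM
print(gb_seconds(1_000_000, 0.5, 0.5))  # 250000.0 GB-s
```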
12. Durable Functions
Stateful functions in a serverless environment
Behind the scenes: an extension that manages state, checkpoints, and restarts
(1) Orchestrator functions
(2) Stateful entities (entity functions)
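The checkpoint-and-restart idea behind the extension can be sketched without Azure: an orchestrator is a generator that is re-run from the top after every awaited activity, with already-completed results replayed from a history log. All names here are illustrative; the real Durable Functions extension persists this history in storage:

```python
def orchestrator():
    # Orchestrator code must be deterministic: on every replay it
    # re-executes from the top and consumes recorded results.
    x = yield ("activity", "double", 5)
    y = yield ("activity", "double", x)
    return y

def run_with_replay(orchestrator_fn, activities, history):
    """Replay recorded results, then schedule the first unrecorded activity."""
    gen = orchestrator_fn()
    value = None
    for i in range(len(history) + 1):
        try:
            task = gen.send(value if i else None)
        except StopIteration as done:
            return ("done", done.value)
        if i < len(history):
            value = history[i]                 # replayed from the checkpoint log
        else:
            _, name, arg = task
            history.append(activities[name](arg))  # run activity, checkpoint it
            return ("checkpointed", history)

activities = {"double": lambda n: n * 2}
history = []
while True:
    status, payload = run_with_replay(orchestrator, activities, history)
    if status == "done":
        print(payload)  # 20
        break
```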
16. In-memory caching using a static dictionary
Simple to use for static content, error-prone for refreshed content
Impact:
Consumption Plan: value refreshed because of instance running time
Elastic Premium Plan: pre-warmed instances keep the old value, newly scaled instances get the refreshed value
App Service Plan: similar to Elastic Premium, but with more control
18. Memory cache with the capability to specify an expiration time
Refresh timing for content can be managed
The risk of reading an old value still exists, but it is smaller
Impact:
Consumption Plan: value refreshed because of instance running time
Elastic Premium Plan: pre-warmed instances keep the old value, newly scaled instances get the refreshed value
App Service Plan: similar to Elastic Premium, but with more control
20. Azure Redis Cache
Full control over content
Ability to manage expiration and consistency
Easy to scale and manage
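With Redis the expiration and invalidation live in the cache itself, shared by every Function instance. A sketch of the calls involved, written against the shape of the redis-py client (`get`, `setex`, `delete` are real redis-py methods); the in-memory stub stands in for a server so the sketch runs anywhere:

```python
import time

class FakeRedis:
    """Tiny stand-in for redis.Redis: get / setex / delete only."""
    def __init__(self):
        self._data = {}
    def setex(self, key, ttl_seconds, value):
        self._data[key] = (time.monotonic() + ttl_seconds, value)
    def get(self, key):
        entry = self._data.get(key)
        if entry is None or entry[0] <= time.monotonic():
            return None
        return entry[1]
    def delete(self, key):
        self._data.pop(key, None)

def cached_query(client, key, ttl_seconds, query):
    value = client.get(key)
    if value is None:                  # miss: run the query, cache with a TTL
        value = query()
        client.setex(key, ttl_seconds, value)
    return value

client = FakeRedis()  # in production: redis.Redis(host=..., ssl=True)
assert cached_query(client, "user:1", 60, lambda: "Alice") == "Alice"
client.delete("user:1")                # explicit invalidation for consistency
assert cached_query(client, "user:1", 60, lambda: "Alice v2") == "Alice v2"
```

Because the cache is external and shared, an expiration or an explicit `delete` is seen by all instances at once, which is what gives the control over consistency claimed above.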
21. Azure Redis Cache
During high load you can get time-out exceptions.
Causes:
Plan limits exceeded
Object size vs. Redis's single-threaded processing
Synchronous access to large objects
Noisy neighbors (on the C0 plan)
Explanation (quoting the .NET Lazy<T> documentation):
The Lazy instance is not thread-safe; if the instance is accessed from multiple threads, its behavior is undefined. Use this mode only when high performance is crucial and the Lazy instance is guaranteed never to be initialized from more than one thread. If you use a Lazy constructor that specifies an initialization method (valueFactory parameter), and that initialization method throws an exception (or fails to handle an exception) the first time you call the Value property, then the exception is cached and thrown again on subsequent calls to the Value property.
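The guidance above is about making connection creation both lazy and thread-safe; in C# the usual fix is a `Lazy<ConnectionMultiplexer>` with a thread-safe mode. The same idea in Python is double-checked locking, sketched here with a hypothetical `create_connection` standing in for the Redis connection setup:

```python
import threading

class Lazy:
    """Thread-safe lazy initializer: the factory runs at most once,
    like Lazy<T> with LazyThreadSafetyMode.ExecutionAndPublication."""
    def __init__(self, factory):
        self._factory = factory
        self._lock = threading.Lock()
        self._value = None
        self._created = False

    @property
    def value(self):
        if not self._created:            # fast path: no lock once created
            with self._lock:             # slow path: one thread initializes
                if not self._created:
                    self._value = self._factory()
                    self._created = True
        return self._value

calls = 0
def create_connection():
    global calls
    calls += 1
    return object()      # stand-in for an expensive Redis connection

conn = Lazy(create_connection)
threads = [threading.Thread(target=lambda: conn.value) for _ in range(8)]
for t in threads: t.start()
for t in threads: t.join()
assert calls == 1        # every thread shared the single connection
```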
28. Configuration Data
An Azure Function lazy-loads configuration data from Azure Table storage into an in-memory cache. When the data changes, a change event is published through Event Grid (event publisher → event subscriptions), and an event-handler Azure Function clears the cache so the next invocation reloads fresh data.
29. Expensive Queries / API Calls
Multiple Azure Functions share Azure Redis Cache in front of the query / API (on the order of 100B cache operations a day).
30. The same pattern, noting the trade-off: each cache read adds a small latency (an external call).
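The pattern on these slides is cache-aside: every Function invocation pays only the small latency of one cache call, and the expensive query runs once per key. In this sketch a dict stands in for the shared Redis cache, and a counter makes the saving visible:

```python
expensive_calls = 0

def expensive_query(key: str) -> str:
    # Stand-in for the costly database query or external API call.
    global expensive_calls
    expensive_calls += 1
    return f"result-for-{key}"

cache: dict[str, str] = {}  # stand-in for the shared Azure Redis Cache
                            # (in reality this lookup is a network hop)

def function_invocation(key: str) -> str:
    # Cache-aside: check the shared cache first, fall back to the query,
    # then store the result for every other instance to reuse.
    if key not in cache:
        cache[key] = expensive_query(key)
    return cache[key]

# 1,000 invocations across the Function fleet, one expensive call:
results = {function_invocation("report") for _ in range(1000)}
assert results == {"result-for-report"}
assert expensive_calls == 1
```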