
Layered caching in OpenResty (OpenResty Con 2018)


A 40-minute version of my previous talk on layered caching in OpenResty, given at OpenResty Con 2018 in Hangzhou, China.

In this talk, we dive into the new "mlcache" library, which aims to provide a simple abstraction for layered caching in OpenResty. We explore several practical use cases for it that will help you achieve high-performance goals for your applications.



  1. Layered Caching in OpenResty. Thibault Charbonnier, November 18th, 2018, OpenResty Con 2018
  2. Thibault Charbonnier. Principal Engineer @ Kong Inc. OpenResty contributor. GitHub: thibaultcha. Twitter: @thibaultcha
  3. "Could OpenResty get a better caching abstraction?"
  4. What type of caching? Any Lua state fetched from I/O: dynamic configuration, dynamic logic (injected code), client sessions... Eventually, all OpenResty applications have a need for Lua-land caching.
  5. The challenges of Lua-land caching: LuaJIT VM limitations, lua_shared_dict limitations, split nginx workers.
  7. Example: Caching in Kong
  8. Example: Caching in Kong

         -- Retrieve a value from the cache, or fetch it
         -- if it's a miss
         function _M.cache_get_and_set(key, cb)
             local val = _M.cache_get(key)
             if not val then
                 val = cb()
                 if val then
                     local succ, err = _M.cache_set(key, val)
                     if not succ and ngx then
                         ngx.log(ngx.ERR, err)
                     end
                 end
             end
             return val
         end

     Caching in Kong in 2015. Not ideal. Let's switch to a library?
  9. Existing caching libraries: mtourne/ngx.shcache, lloydzhou/lua-resty-cache, hamishforbes/lua-resty-tlc. Impractical APIs, lack of flexibility, or sometimes too opinionated... We need a new solution. Let's make our own!
  10. Caching primitives in OpenResty. OpenResty offers a few off-the-shelf options for caching data in Lua-land: the Lua VM itself, lua-resty-lrucache, lua_shared_dict.
  13. lua-resty-mlcache
  14. lua-resty-mlcache: the Swiss Army knife of OpenResty caching
  15. Methods: mlcache:get(), mlcache:set(), mlcache:delete(), mlcache:peek(), mlcache:purge(), mlcache:update()
  16. Usage

          http {
              lua_shared_dict cache_shm 128m;

              server {
                  ...
              }
          }

  17. Usage

          local mlcache = require "resty.mlcache"

          local cache, err = mlcache.new("cache_name", "cache_shm", {
              lru_size = 1000, -- 1000 items in L1 cache (VM)
              ttl      = 3600, -- cache for 1h
              neg_ttl  = 60,   -- cache nil values for 60s
          })

  18. Layered Caching Architecture
  19. lua_shared_dict serialization
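      For context on this slide: a lua_shared_dict can only store scalar values (strings, numbers, booleans), never Lua tables, so every L2 write and read implies a serialization round-trip. A minimal sketch of what this means in practice, assuming an OpenResty context with the bundled cjson library (the key and table below are illustrative):

      ```lua
      -- lua_shared_dict only stores scalars, so tables must be encoded
      -- to strings on write and decoded back on every read.
      local cjson = require "cjson.safe"

      local shm = ngx.shared.cache_shm

      local user = { id = 123, name = "alice" }

      -- shm:set("users:123", user) would fail: table values are unsupported.
      local ok, err = shm:set("users:123", cjson.encode(user))

      -- Each read returns a string and pays the cost of a fresh decode:
      local cached = cjson.decode(shm:get("users:123"))
      ```

      mlcache performs this (de)serialization transparently, and its L1 lrucache layer keeps the decoded table around so hot keys skip the decode entirely.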
  20. Layered caching architecture. Built-in cache-stampede prevention. Efficient design (JIT). ...
  21. Practical Examples
  22. Example: Database caching

          local function fetch_user(id)
              return db:query_user(id) -- row or nil
          end

          local id = 123

          local user, err = cache:get(id, nil, fetch_user, id)
          if err then
              ngx.log(ngx.ERR, "failed to fetch user: ", err)
              return
          end

          if user then
              print(user.id) -- 123
          else
              -- miss is cached
          end

  23. Layered caching architecture. Built-in cache-stampede prevention. Efficient design (JIT). Caching of Lua tables + negative hits. ...
  24. Example: DNS records caching

          local resolver = require "resty.dns.resolver"

          local r = resolver:new({ nameservers = { "1.1.1.1" } })

          local function resolve(name)
              local answers = r:query(name)
              local ip  = answers[1].address
              local ttl = answers[1].ttl

              return ip, nil, ttl -- provide TTL from result
          end

          local host = "openresty.org"

          local answers, err = cache:get(host, nil, resolve, host) -- callback arg
  25. Example: Injected logic

          local function compile_code(row)
              row.f = loadstring(row.code) -- load only once
              return row
          end

          local user, err = cache:get(user_id, {
              l1_serializer = compile_code,
          }, fetch_code) -- fetch Lua code from DB

          user.f() -- now a valid function

  26. l1_serializer contributed by Cloudflare
  27. Layered caching architecture. Built-in cache-stampede prevention. Efficient design (JIT). Caching of Lua tables + negative hits. Custom serializers. ...
  28. Bad example: cache churning scenario. Never trust user input.

          local key = ngx.var.http_some_header -- req header

          local cache_key = "items:" .. key

          local value, err = cache:get(cache_key, nil, db.select, key)

      A malicious client could send high-cardinality header values (aaaa, aaab, aaac, aaad, ...)
  29. Solution: Split hit/miss caches
  30. lua_shared_dict multi-tenancy
  31. Split hit/miss + multi-tenancy
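      One way to implement the split, as a hedged sketch: mlcache offers an `shm_miss` option (verify the exact name against the library's docs) that stores negative results in a dedicated shared dict, so churned misses cannot evict legitimate hits:

      ```lua
      -- Assumes both dicts are declared in nginx.conf:
      --   lua_shared_dict cache_shm      128m;
      --   lua_shared_dict cache_shm_miss 12m;
      local mlcache = require "resty.mlcache"

      local cache, err = mlcache.new("cache_name", "cache_shm", {
          shm_miss = "cache_shm_miss", -- nil (miss) values stored separately
          ttl      = 3600,
          neg_ttl  = 60,
      })
      ```

      With this in place, the churning attack from the previous slide can at worst fill the smaller miss dict, leaving the hit dict untouched.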
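      A sketch of what multi-tenancy can look like: several mlcache instances may share a single shared dict, since keys are namespaced by the name passed to new() (treat the exact namespacing behavior as an assumption to verify against the docs; the "users" and "dns" tenants below are hypothetical):

      ```lua
      local mlcache = require "resty.mlcache"

      -- Two independent caches, one shm: each instance prefixes its keys
      -- with its own name, so identical keys do not collide across tenants.
      local users_cache = mlcache.new("users", "cache_shm", { ttl = 3600 })
      local dns_cache   = mlcache.new("dns",   "cache_shm", { ttl = 60 })
      ```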
  32. Layered caching architecture. Built-in cache-stampede prevention. Efficient design (JIT). Caching of Lua tables + negative hits. Custom serializers. Flexible deployment capabilities. ...
  33. Cache Invalidation. Worker 0 invalidates the item; worker 1 polls and re-fetches it.

          -- t0, worker 0
          cache:delete("users:123")

          -- t1, worker 1
          cache:update()
          cache:get("users:123", nil, fetch_user)
  34. Layered caching architecture. Built-in cache-stampede prevention. Efficient design (JIT). Caching of Lua tables + negative hits. Custom serializers. Flexible deployment capabilities. Cache invalidation, with the built-in IPC or custom ones: lua-resty-worker-events, slact/ngx_lua_ipc. ...
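      A hedged configuration sketch for the built-in IPC mentioned above: mlcache can use an extra shared dict (`ipc_shm`) to broadcast invalidation events that other workers pick up in update(); an `ipc` option accepts custom hooks for libraries like lua-resty-worker-events (check the exact option names against the mlcache docs):

      ```lua
      -- Assumes an extra dict in nginx.conf: lua_shared_dict ipc_shm 1m;
      local mlcache = require "resty.mlcache"

      local cache, err = mlcache.new("cache_name", "cache_shm", {
          ipc_shm = "ipc_shm", -- delete()/purge() events reach other workers
      })
      ```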
  35. Resiliency: Serving Stale Data

          local opts = { resurrect_ttl = 30 }

          local value, err = cache:get(key, opts, db.fetch)

      If the L3 lookup fails (e.g. a timeout), value may be served as stale data, ensuring resiliency.
  36. Layered caching architecture. Built-in cache-stampede prevention. Efficient design (JIT). Caching of Lua tables + negative hits. Custom serializers. Flexible deployment capabilities. Cache invalidation, with the built-in IPC or custom ones: lua-resty-worker-events, slact/ngx_lua_ipc. Stale data serving. ...
  37. Observability. OK, caching is great. But how do you ensure it is properly configured? lua-resty-mlcache takes the next step as a caching library by providing caching metrics.
  38. Observability

          local value, err, hit_level = cache:get(key, nil, callback, ...)

          print(hit_level) -- 1, 2, 3, or 4

      By keeping track of the hit level, we can compute hit/miss ratios for each layer, as well as track stale data servings.
  39. Observability

          local ttl, err, value = cache:peek(key)
  40. Observability. Plans for providing metrics do not stop here. In the future, we will: provide more information than the hit level on lookups; use the new shm:ttl() and shm:free_space() APIs.
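      For reference, a sketch of those shm APIs as they exist in recent OpenResty releases (free_space() requires a recent lua-nginx-module; the key below is illustrative):

      ```lua
      local shm = ngx.shared.cache_shm

      -- Remaining TTL, in seconds, of one cached item:
      local ttl, err = shm:ttl("some_key")

      -- Remaining free memory, in bytes, in the shared dict:
      local bytes = shm:free_space()
      ngx.log(ngx.NOTICE, "cache_shm free space: ", bytes, " bytes")
      ```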
  41. A complete solution: layered caching architecture; built-in cache-stampede prevention; efficient design (JIT); caching of Lua tables + negative hits; custom serializers; flexible deployment capabilities; cache invalidation, with the built-in IPC or custom ones (lua-resty-worker-events, slact/ngx_lua_ipc); stale data serving; observability.
  42. In the wild. Used in Kong for over a year. Contributions from Cloudflare and Kong Inc. Extensive test suite. Supports OpenResty 1.11.2.2 to 1.13.6.2 (current).
  43. OpenResty contributions & improvements. User flags support: openresty/lua-resty-lrucache#35. lua_shared_dict R/W lock: openresty/lua-nginx-module#1287. TODO: custom L2 serializer. TODO: Lua-land R/W lock. TODO: a native IPC solution for OpenResty.
  44. Q & A
  45. Can you replace your caching logic with mlcache? https://github.com/thibaultcha/lua-resty-mlcache. Contributions welcome!
