
How Ask.fm was built

Story from Ask.fm team

Published in: Technology

  1. How ask.fm was built, Valery Vishnyakov (ask.fm/balepc)
  2. Covered topics • Scaling Database • Scaling Application • Scaling Infrastructure
  3. Started as a side project
  4. 2010 • 0 users • 1 server • 3 tech guys
  5. Is ask.fm big?
                       Users   Servers   Tech guys
     Ask.fm            135M    800       15
     Twitter*          554M    ~3500     1800
     StackOverflow**   4M      25        24
     * https://about.twitter.com/company
     ** http://bit.ly/1rEl8sX
  6. No white spots
  7. Initial stack
  8. Scaling UP the application
  9. Not a sponsor slide
  10. Scaling OUT the application
  11. User-uploaded content
  12. 2011 • 97k users • 9 servers • 3 tech guys
  13. Scaling database
  14. Caching
  15. Redis
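A Redis cache in front of MySQL typically follows the cache-aside pattern: read from the cache, fall back to the database, then repopulate. A minimal sketch, with a tiny in-memory stand-in for the Redis client (a real deployment would use e.g. redis-py's `get`/`setex`); the key format and TTL are assumptions, not ask.fm's actual scheme:

```python
import json

class FakeRedis:
    """In-memory stand-in for a Redis client (get/setex subset)."""
    def __init__(self):
        self.store = {}
    def get(self, key):
        return self.store.get(key)
    def setex(self, key, ttl, value):
        self.store[key] = value  # TTL is ignored in this sketch

cache = FakeRedis()

def load_questions_from_db(user_id):
    # Placeholder for the real MySQL query.
    return [{"user_id": user_id, "text": "example question"}]

def questions_by_user(user_id, ttl=300):
    """Cache-aside read: try the cache, fall back to the DB, repopulate."""
    key = f"questions_by_user:{user_id}"  # hypothetical key format
    cached = cache.get(key)
    if cached is not None:
        return json.loads(cached)
    rows = load_questions_from_db(user_id)
    cache.setex(key, ttl, json.dumps(rows))
    return rows
```

The first call for a user hits the database and fills the cache; subsequent calls are served from Redis, which is what makes "cache on all layers" pay off for read-heavy Q&A pages.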
  16. Read scalability
  17. Decision time: middleware vs. logical partitioning
  18. App-level logical partitioning
  19. Preparation • Choose key + IDs • Avoid JOINs • Remove DB constraints
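The "choose key + IDs" step matters because once JOINs and DB constraints are gone, the application must be able to find a row's shard from the ID alone. One common approach (an illustration, not necessarily ask.fm's actual scheme) is to pack the shard number into every generated ID:

```python
import itertools
import time

SHARD_BITS = 8   # room for 256 shards, matching the user_id % 256 scheme later in the deck
SEQ_BITS = 12    # per-process sequence to disambiguate IDs within one second

_seq = itertools.count()

def make_id(shard):
    """Pack a timestamp, shard number, and local sequence into one integer,
    so the owning shard can be recovered from the ID alone (no JOINs needed)."""
    ts = int(time.time())
    seq = next(_seq) & ((1 << SEQ_BITS) - 1)
    return (ts << (SHARD_BITS + SEQ_BITS)) | (shard << SEQ_BITS) | seq

def shard_of(id_):
    """Extract the shard number embedded in an ID."""
    return (id_ >> SEQ_BITS) & ((1 << SHARD_BITS) - 1)
```

With IDs like this, any service holding a question ID can route straight to the right MySQL node without a lookup table.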
  20. Sharding schema
      • node0
        • shard0 → questions_by_user
        • shard1 → questions_by_user
      • node1
        • shard2 → questions_by_user
        • shard3 → questions_by_user
      • …
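The schema above can be sketched as a routing function: the sharding key picks a logical shard, and a static layout places shards on physical nodes. Shard counts here are scaled down for readability, and the names mirror the slide rather than any real configuration:

```python
# Route a user to a table on a physical node:
# shard = logical partition, node = physical MySQL server.
NUM_SHARDS = 4        # the deck actually uses 256; 4 keeps the sketch small
SHARDS_PER_NODE = 2   # node0 holds shard0/shard1, node1 holds shard2/shard3

def route(user_id):
    """Return (node, shard, table) for a given user."""
    shard = user_id % NUM_SHARDS
    node = shard // SHARDS_PER_NODE
    return f"node{node}", f"shard{shard}", "questions_by_user"
```

For example, `route(7)` lands on `node1`/`shard3`, exactly as the slide's layout implies. Adding servers then means moving whole shards between nodes, not re-splitting users.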
  21. Maneuver
  22. Adding more servers
  23. DB scaling takeaways • Divide and conquer • MySQL is good (schema changes can be painful) • Cache on all layers • Scale out, not just up • Denormalize
  24. When is 256 not enough? (user_id % 256)
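The "% 256" ceiling is real: changing the modulus remaps users wholesale, so growing past the initial shard count means mass data migration. A quick sketch (a hypothetical helper, not from the deck) counting how many users change shards when the modulus doubles; this is why keeping a fixed set of logical shards and remapping shards to nodes scales more gracefully:

```python
def moved_fraction(old_n, new_n, sample=51_200):
    """Fraction of users whose shard assignment changes when the
    sharding modulus changes from old_n to new_n."""
    moved = sum(uid % old_n != uid % new_n for uid in range(sample))
    return moved / sample

# Doubling the modulus relocates exactly half of all users:
# moved_fraction(256, 512) == 0.5
```

Half the dataset moving at once is the worst case for a live MySQL fleet, which is the pain this slide's question points at.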
  25. 2012 • 2M users • 46 servers • 3 tech guys
  26. More on scaling application
  27. Middleware
  28. End of AWS era
  29. New data center
  30. Bare metal
  31. Infrastructure evolution: Public Cloud => Bare metal => Private Cloud
  32. SPAM
  33. SPAM-fighting evolution ‣ IP addresses ‣ Captcha ‣ Links ‣ Patterns ‣ Users ‣ Beneficiary ‣ …
  34. Moderation: how to moderate 4M photos every day?
  35. Philippines
  36. 2013 • 31M users • 440 servers • 6 tech guys
  37. Exponential growth
  38. Functionality switches • Media content • Likes • Wall
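Functionality switches like these are typically feature flags with a percentage rollout, so a new feature can be enabled for a slice of users first and ramped up. A minimal sketch; the flag names, percentages, and hashing scheme are assumptions for illustration:

```python
import hashlib

FLAGS = {
    "media_content": 100,  # percent of users who see the feature
    "likes": 50,
    "wall": 0,             # built but switched off
}

def is_enabled(flag, user_id):
    """Deterministic percentage rollout: hashing (flag, user) into a
    0-99 bucket means the same user always gets the same answer, and a
    feature ramps up smoothly as its percentage grows."""
    percent = FLAGS.get(flag, 0)
    digest = hashlib.md5(f"{flag}:{user_id}".encode()).hexdigest()
    bucket = int(digest, 16) % 100
    return bucket < percent
```

Deterministic bucketing also makes the "incremental rollout" lesson on the next slides practical: a bad feature can be dialed back to 0% without a deploy.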
  39. TDD = Trend Driven Development
  40. Lessons learned • Monitoring • Incremental rollout • Use battle-proven techniques
  41. Android & iOS
  42. 2014 • 105M users • ~700 servers • 12 tech guys
  43. Current challenges • DC high availability • Fault tolerance • Scaling teams
  44. Global takeaways • Don't over-engineer • Stay small • Avoid SPOFs • Performance as a feature • Plan for scaling out
  45. What could we do differently? • More SOA • Plan hiring ahead • Don't f**k up tests
  46. Questions?
