
How we deal with legacy code in TrekkSoft


  1. How we deal with legacy code in TrekkSoft. Anton Serdyuk, 17th February 2017, PHP User Group Minsk. TrekkSoft AG | info@trekksoft.com | www.trekksoft.com
  2. Introduction: About TrekkSoft
     ● A 6-year-old Zend Framework 1 codebase
     ● People:
       ○ 13 developers (12 in Minsk, 1 in Switzerland)
       ○ 5 Product Managers (in Switzerland)
       ○ 0 QA
  3. Introduction: About This Talk
     ● Processes
       ○ CI/CD (or Error log Driven Development)
       ○ BDD
     ● Technical stuff
       ○ TDD/refactoring
       ○ Static analysis (or Scrutinizer Driven Development)
       ○ Monitoring
  4. CI/CD (or Error log Driven Development)
     ● We try to avoid long-running feature branches
       ○ Soft releases for complicated stuff
       ○ Enable features for internal accounts first, then beta testers, then all other clients
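A hedged illustration of the rollout described on slide 4 (all class and method names are invented, not taken from the TrekkSoft codebase): a feature toggle with per-account rollout stages.

```php
<?php
// Hypothetical feature toggle for the gradual rollout above: internal
// accounts first, then beta testers, then all other clients.

class Account
{
    private $internal;
    private $betaTester;

    public function __construct(bool $internal, bool $betaTester)
    {
        $this->internal = $internal;
        $this->betaTester = $betaTester;
    }

    public function isInternal(): bool   { return $this->internal; }
    public function isBetaTester(): bool { return $this->betaTester; }
}

class FeatureToggle
{
    /** @var array<string,string> feature name => rollout stage ('internal', 'beta', 'all') */
    private $stages;

    public function __construct(array $stages)
    {
        $this->stages = $stages;
    }

    public function isEnabled(string $feature, Account $account): bool
    {
        switch ($this->stages[$feature] ?? 'off') {
            case 'internal': return $account->isInternal();
            case 'beta':     return $account->isInternal() || $account->isBetaTester();
            case 'all':      return true;
            default:         return false;
        }
    }
}

// The same code path ships to everyone in every release; only the stage changes.
$toggle = new FeatureToggle(['new_checkout' => 'beta']);
var_dump($toggle->isEnabled('new_checkout', new Account(false, true))); // bool(true)
```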
  5. CI/CD (or Error log Driven Development)
     ● Release often
       ○ Releasing two days' worth of work at once is uncommon for us, and we consider it dangerous
       ○ We usually do 1 or 2 releases per day
       ○ The release process includes checking the error logs and the main application metrics
  6. CI/CD (or Error log Driven Development)
     ● Error log cleanup
       ○ We constantly fix errors from the error logs, so new errors stay visible
  7. BDD
     ● Domain knowledge is super important for developers
       ○ We often refuse to add a feature to the sprint if we do not understand how a real customer would use it
  8. BDD
     ● Our BDD process is emerging right now
       ○ The developer, a Product Manager and Technical Support talk through the feature and its usage scenarios, documenting them in Google Docs in real time over a shared screen
       ○ The developer converts those usage scenarios into Behat scenarios, and everyone reviews them
       ○ Only then are they implemented
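A sketch of how one of those documented usage scenarios can end up as Behat step definitions. The scenario wording, the BookingContext class and the BookingService it talks to are invented for illustration; slide 14 covers where the service connection comes from.

```php
<?php
// Illustrative Behat 3 context for a scenario agreed with the Product Manager,
// e.g. "Given a tour with 5 available seats, when a customer books 2 seats,
// then 3 seats remain". BookingService and its methods are hypothetical.

use Behat\Behat\Context\Context;
use PHPUnit\Framework\Assert;

class BookingContext implements Context
{
    private $bookingService;
    private $tourId;

    public function __construct()
    {
        // Hypothetical shared "application connection" code (see slide 14).
        $this->bookingService = ApplicationConnection::boot()->get('booking');
    }

    /** @Given a tour with :seats available seats */
    public function aTourWithAvailableSeats($seats)
    {
        $this->tourId = $this->bookingService->createTour((int) $seats);
    }

    /** @When a customer books :seats seats */
    public function aCustomerBooksSeats($seats)
    {
        $this->bookingService->book($this->tourId, (int) $seats);
    }

    /** @Then :seats seats remain */
    public function seatsRemain($seats)
    {
        Assert::assertSame((int) $seats, $this->bookingService->availableSeats($this->tourId));
    }
}
```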
  9. BDD
  10. BDD
  11. BDD
  12. BDD
     ● Behat 3 usage
       ○ 1 suite, 1 context per feature / set of features
       ○ Reuse application connection code, not Behat steps/contexts
       ○ Do not write them through the UI: we connect to the layer behind the UI. Usually services, if we are lucky. Sometimes controllers :(
  13. BDD: 1 suite, 1 context per feature / set of features
     ● behat.yml.dist (see the sketch below)
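A minimal sketch of what that behat.yml.dist layout can look like with Behat 3; the suite and context names are invented here.

```yaml
# Hypothetical behat.yml.dist: one suite and one context per feature / set of
# features, instead of one shared context for the whole application.
default:
    suites:
        booking:
            paths:    [ "%paths.base%/features/booking" ]
            contexts: [ BookingContext ]
        refunds:
            paths:    [ "%paths.base%/features/refunds" ]
            contexts: [ RefundContext ]
        reporting:
            paths:    [ "%paths.base%/features/reporting" ]
            contexts: [ ReportingContext ]
```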
  14. BDD: Reuse application connection code, not Behat steps/contexts
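One hedged way to read "reuse application connection code" (class names invented): every context composes a small helper that boots the application and exposes its services, rather than reusing another context's step definitions.

```php
<?php
// Hypothetical shared connection helper: it bootstraps the application once
// and hands out services, so every Behat context reuses this class instead of
// inheriting from, or re-importing, another context's steps.

class ApplicationConnection
{
    /** @var self|null */
    private static $instance;

    /** @var array<string, object> */
    private $services = [];

    public static function boot(): self
    {
        if (self::$instance === null) {
            self::$instance = new self();
            // The real code would bootstrap the Zend Framework 1 application
            // here and register its services; stubbed out in this sketch.
        }

        return self::$instance;
    }

    public function register(string $name, $service): void
    {
        $this->services[$name] = $service;
    }

    public function get(string $name)
    {
        return $this->services[$name];
    }
}

// In a context:  $this->bookingService = ApplicationConnection::boot()->get('booking');
```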
  15. BDD: Do not write them through the UI
     (Diagram: feature contexts talk to a shared connection API. In the lucky case the connection API targets the service layer and entities directly; in the not-so-lucky case it has to go through controllers and views.)
  16. TDD/Refactoring
     ● We do not require a TDD process and unit tests for everything yet; our main focus right now is Behat
     ● We decouple and do TDD where it is possible
     ● We try to avoid: moving some functionality into a class, mocking a huge amount of dependencies and writing “unit tests”
     ● We try to do: decoupling a part of the functionality completely by Dependency Inversion and mocking/stubbing a limited number of interfaces
  17. TDD/Refactoring
     ● We try to avoid: moving some functionality into a class, mocking a huge amount of dependencies and writing “unit tests”
     (Diagram: the class under test sits in a component with four dependencies, and the “unit test” has to mock all of them.)
  18. TDD/Refactoring
     ● We try to do: decoupling a part of the functionality completely by Dependency Inversion and mocking/stubbing a limited number of interfaces
     (Diagram: the class depends only on a narrow interface, and the unit test mocks just that interface.)
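A minimal sketch of that shape (interface, class and figures invented): the class under test depends on one narrow interface, so its unit test stubs only that interface.

```php
<?php
// Hypothetical example of decoupling by Dependency Inversion: PriceConverter
// depends on a narrow interface, so the test stubs one interface instead of
// mocking a pile of concrete dependencies.

use PHPUnit\Framework\TestCase;

interface ExchangeRateProvider
{
    public function rate(string $from, string $to): float;
}

class PriceConverter
{
    private $rates;

    public function __construct(ExchangeRateProvider $rates)
    {
        $this->rates = $rates;
    }

    public function convert(float $amount, string $from, string $to): float
    {
        return round($amount * $this->rates->rate($from, $to), 2);
    }
}

class PriceConverterTest extends TestCase
{
    public function testConvertsUsingTheProvidedRate()
    {
        $rates = $this->createMock(ExchangeRateProvider::class);
        $rates->method('rate')->with('CHF', 'EUR')->willReturn(0.93);

        $converter = new PriceConverter($rates);

        $this->assertSame(93.0, $converter->convert(100.0, 'CHF', 'EUR'));
    }
}
```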
  19. TDD/Refactoring
     ● We try to avoid: mocking every dependency even when it belongs to the same component
     (Diagram: a unit test mocking the class’s sibling class inside the same component.)
  20. TDD/Refactoring
     ● We try to do: it is OK to use real dependencies in unit tests if they belong to the same component
     (Diagram: the unit test uses the sibling class from the same component directly, with no mock.)
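And the counterpart for slide 20 (again with invented names): when the collaborator lives in the same component, the unit test just uses the real object.

```php
<?php
// Hypothetical counterpart: Basket and PriceCalculator belong to the same
// component, so the unit test uses the real PriceCalculator instead of a mock.

use PHPUnit\Framework\TestCase;

class PriceCalculator
{
    public function withVat(float $net, float $vatRate): float
    {
        return round($net * (1 + $vatRate), 2);
    }
}

class Basket
{
    private $calculator;
    private $netPrices = [];

    public function __construct(PriceCalculator $calculator)
    {
        $this->calculator = $calculator;
    }

    public function add(float $netPrice): void
    {
        $this->netPrices[] = $netPrice;
    }

    public function totalWithVat(float $vatRate): float
    {
        $total = 0.0;
        foreach ($this->netPrices as $net) {
            $total += $this->calculator->withVat($net, $vatRate);
        }

        return round($total, 2);
    }
}

class BasketTest extends TestCase
{
    public function testTotalIncludesVat()
    {
        $basket = new Basket(new PriceCalculator()); // real dependency, no mock

        $basket->add(100.0);
        $basket->add(50.0);

        $this->assertSame(162.0, $basket->totalWithVat(0.08));
    }
}
```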
  21. Static analysis (or Scrutinizer Driven Development)
     ● Scrutinizer code metrics work surprisingly well for us and guide our refactoring efforts
  22. Static analysis (or Scrutinizer Driven Development)
  23. Static analysis (or Scrutinizer Driven Development)
  24. Logs/Monitoring: Kibana
     ● Access logs
     ● Application logs
     ● Filter by IP, filter by request ID
     ● Links from the Slack channel directly to filtered results
  25. Logs/Monitoring: Kibana
     ● Links from the Slack channel directly to filtered results
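One hedged sketch of the application side of "filter by request ID". Monolog is assumed here; the slides do not say which logging library is used. The idea is to tag every log record with a per-request ID so Kibana can filter on it.

```php
<?php
// Assumption: Monolog is used for application logs (not confirmed by the
// slides). A processor attaches a per-request ID to every record so access
// and application logs can be filtered together in Kibana.

require __DIR__ . '/vendor/autoload.php';

use Monolog\Handler\StreamHandler;
use Monolog\Logger;

$requestId = bin2hex(random_bytes(8)); // or reuse an ID set by the load balancer

$logger = new Logger('app');
$logger->pushHandler(new StreamHandler('php://stdout'));
$logger->pushProcessor(function (array $record) use ($requestId) {
    $record['extra']['request_id'] = $requestId;
    return $record;
});

$logger->info('Booking created', ['booking_id' => 42]);
// In Kibana, filtering on extra.request_id shows every line from this request.
```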
  26. Logs/Monitoring: New Relic
     ● Servers (obviously)
     ● APM, traces
     ● Insights
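A small hedged sketch of feeding extra context into APM and Insights from PHP. The newrelic_* functions come from the standard New Relic PHP agent extension; the transaction and attribute names are invented examples.

```php
<?php
// The newrelic_* functions below are provided by the New Relic PHP agent
// extension; the transaction and attribute names are invented examples.

if (extension_loaded('newrelic')) {
    // Name the transaction after the business action instead of the raw ZF1 route.
    newrelic_name_transaction('booking/checkout');

    // Custom attributes become queryable dimensions in APM traces and Insights.
    newrelic_add_custom_parameter('account_id', 12345);
    newrelic_add_custom_parameter('feature_stage', 'beta');
}
```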
  27. Logs/Monitoring: New Relic APM
  28. Logs/Monitoring: New Relic APM
  29. Logs/Monitoring: New Relic Insights
  30. Literature
     ● Kent Beck, Test-Driven Development: By Example
     ● Robert C. Martin, the Clean Code series
     ● Gojko Adzic, Bridging the Communication Gap
     ● Gojko Adzic, Specification by Example
  31. Thanks
