One reason I really like the serverless approach is quite selfish: you only have to care about what matters - the code. Well, I know code isn't everything, but as a developer I simply have more fun coding than scripting infrastructure in YAML or the like. For people like me, the serverless model is a dream come true. But how do we do serverless without turning the dream into a nightmare?
Distributing a task among several threads means horizontal scaling: the more computing resources (processors), the less time the task takes. Sharing data among the threads, however, brings the need to synchronize, which kills the scalability of the computation. How should we proceed when shared data is needed?
Binary data shouldn't be part of the codebase - this is a pretty well-known practice. But how should we proceed when we do need binaries in our codebase, for instance as test data?
Write your tests once and run them twice - as both unit and integration tests. Sounds like a good deal, so let's take a look at this practice.
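The idea can be sketched roughly like this (all names here are hypothetical, not from the article): the test body is written once against an abstraction, and only the factory that builds the system under test decides whether it runs as a fast unit test or a slow integration test.

```python
class InMemoryRepo:
    """Fake used in unit mode - no external services needed."""
    def __init__(self):
        self._db = {}
    def save(self, key, value):
        self._db[key] = value
    def load(self, key):
        return self._db[key]

def check_roundtrip(repo):
    # One test body, reused for both modes: it only talks to the
    # repo's public interface, never to a concrete implementation.
    repo.save("id-1", "hello")
    assert repo.load("id-1") == "hello"

# Unit mode: in-memory fake.
check_roundtrip(InMemoryRepo())
# Integration mode would pass a real adapter instead, e.g.:
# check_roundtrip(SqlRepo(connection))  # SqlRepo is hypothetical
```

The payoff is that the behavioural contract is specified exactly once, and any adapter claiming to implement it can be verified with the same suite.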
Black-box testing means testing a component via its API without any knowledge of its implementation details. Its opposite is white-box testing. So that must be about testing the implementation, right? Well, no...
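To make the black-box side concrete, here is a tiny sketch of my own (a stack is just a convenient example): the test exercises only the public API and asserts on observable behaviour, never on how the values are stored internally.

```python
class Stack:
    def __init__(self):
        self._items = []  # implementation detail - the test never looks here

    def push(self, item):
        self._items.append(item)

    def pop(self):
        return self._items.pop()

def test_last_pushed_is_first_popped():
    # Black-box: only push/pop are used, only behaviour is asserted.
    s = Stack()
    s.push(1)
    s.push(2)
    assert s.pop() == 2  # LIFO order is the observable contract
    assert s.pop() == 1

test_last_pushed_is_first_popped()
```

Swapping the list for a linked list would not break this test - which is exactly the decoupling the black-box approach buys us.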
Nothing new, but I keep seeing this bad practice again and again... Let's look at why it is incorrect:
Software architecture is about tradeoffs. Even when the theory is good, the implementation details can break it. In this article I try to take the best from two architectural approaches: Package by Component and Clean Architecture (a variant of Ports and Adapters).
After some time developing serverless systems (especially on AWS), I take a look back and try to summarize what I have learned so far.
Why shouldn't we test the implementation? How do we decouple our tests from the code? What is a good reason to add a new test? Why is mocking a code smell? In this article I try to find answers to these questions.