Binary data shouldn't be part of a codebase; this is well-established practice. But how should we proceed when we really do need binaries in our codebase, for instance as test data?
Write your tests once and run them twice, as both unit and integration tests: that sounds like a good deal. Let's take a look at this practice.
After some time spent developing serverless systems (especially on AWS), I look back and try to summarize what I have learned so far.
Black-box testing means testing a component via its API, without any knowledge of its implementation details. Its opposite is white-box testing, so that must be about testing the implementation, right? Well, no...
Continuous delivery (CD) brings a lot of ideas essential for deploying a modern software product. In this article we discuss how to follow CD principles by building CD pipelines, with an example in AWS.
Nothing new, but I keep seeing this bad practice again and again... Let's explain why it is incorrect:
Automated infrastructure (Infrastructure as Code) is essential to succeed, not only in the cloud. AWS provides its own service for managing resource stacks: AWS CloudFormation. What are the options for managing dependencies between stacks, how do we use them, and what are their pros and cons?
I already blogged about serverless blue-green deployment some time ago. I have used it a lot in practice, with very promising results. But there are challenges as well.
Transducers are composable reducers: a transducer takes a reducer and returns another reducer. Transducers compose via plain function composition. There is, however, a tiny difference between function and transducer composition: functions compose bottom-to-top, while transducers run top-to-bottom.
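The composition order mentioned above can be seen in a minimal sketch (the `mapping`, `filtering`, and `compose` helpers here are illustrative names, not from any particular library):

```python
from functools import reduce

def mapping(f):
    """Transducer: transforms each item with f before passing it on."""
    def transducer(reducer):
        return lambda acc, x: reducer(acc, f(x))
    return transducer

def filtering(pred):
    """Transducer: passes on only items satisfying pred."""
    def transducer(reducer):
        return lambda acc, x: reducer(acc, x) if pred(x) else acc
    return transducer

def compose(*fns):
    """Ordinary function composition: compose(f, g)(x) == f(g(x))."""
    return reduce(lambda f, g: lambda x: f(g(x)), fns)

append = lambda acc, x: acc + [x]  # a plain list-building reducer

# compose applies right-to-left as functions, yet the data flows
# top-to-bottom: each item is incremented FIRST, then filtered.
xform = compose(mapping(lambda x: x + 1),
                filtering(lambda x: x % 2 == 0))

result = reduce(xform(append), [1, 2, 3, 4, 5], [])
# [1..5] incremented -> [2, 3, 4, 5, 6], evens kept -> [2, 4, 6]
```

The trick is that `mapping(...)` wraps the reducer produced by `filtering(...)`, so its transformation runs before the inner reducer sees the item, reversing the apparent order of composition.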
Software architecture is about tradeoffs. Even when the theory is sound, implementation details can break it. In this article I try to take the best from two architectural approaches: package by component and Clean Architecture (a variant of Ports and Adapters).