We have a serverless framework project with different microservices in a monorepo. The structure of the project looks like:
project-root/
|- services/
|  |- service1/
|  |  |- handler.py
|  |  |- serverless.yml
|  |- service2/
|  |  |- ...
|  |- serviceN/
|  |  |- ...
|- bitbucket-pipelines.yml
As you can see, each service has its own deployment file (serverless.yml).
We want to deploy only the services that have been modified by a push/merge, and, since we are using Bitbucket Pipelines, we would prefer to use its native features to achieve this.
After doing a little research, we found the changesets property, which can condition a step on the files changed inside a directory. So, for each service, we could add something like this to our bitbucket-pipelines.yml:
- step:
    name: step1
    script:
      - echo "Deploy service 1"
      - ./deploy service1
    condition:
      changesets:
        includePaths:
          # any file under the service1 directory
          - "services/service1/**"
This seems like a perfect match, but the problem with this approach is that we must write a step for every service in the repo; whenever we add a new service, we have to add another step, which hurts maintainability and readability.
Is there any way to write a for loop with a parametrized step, where the input parameter is the name of the service?
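For reference, the closest thing we have found so far is YAML anchors, which deduplicate the shared parts of the step but still require one entry per service, each with its own changesets path (a sketch; the anchor name deploy-step is our own):

```yaml
definitions:
  steps:
    - step: &deploy-step
        name: Deploy a service
        script:
          - echo "override me"

pipelines:
  default:
    - step:
        <<: *deploy-step
        name: Deploy service1
        condition:
          changesets:
            includePaths:
              - "services/service1/**"
        script:
          - ./deploy service1
    - step:
        <<: *deploy-step
        name: Deploy service2
        condition:
          changesets:
            includePaths:
              - "services/service2/**"
        script:
          - ./deploy service2
```

This is not a real loop, so the per-service boilerplate problem remains.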
On the other hand, we could write a custom script that detects the changes itself and handles both the condition check and the deployment. Even though we would prefer the Bitbucket-native approach, we are open to other options: what is the best way to do this in a script?
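To make that second option concrete, here is a minimal sketch of such a script, assuming it runs from the repo root inside a pipeline step. The changed_services helper and the ./deploy entry point are our own naming; BITBUCKET_COMMIT is a real Bitbucket Pipelines variable pointing at the pushed commit, and diffing against its first parent is a simplification that ignores edge cases such as the repo's very first commit.

```shell
#!/usr/bin/env bash
# Sketch: deploy only the services touched by the last push/merge.
set -euo pipefail

# Read changed file paths on stdin and print each affected service name
# (the directory level directly under services/) exactly once.
# The "|| true" keeps the pipeline alive when no service files changed.
changed_services() {
  { grep '^services/' || true; } | cut -d/ -f2 | sort -u
}

# Inside a pipeline step, BITBUCKET_COMMIT is the pushed commit; diffing
# against its first parent lists the files changed by the push/merge.
if [ -n "${BITBUCKET_COMMIT:-}" ]; then
  git diff --name-only "${BITBUCKET_COMMIT}~1" "$BITBUCKET_COMMIT" \
    | changed_services \
    | while read -r service; do
        echo "Deploying $service"
        ./deploy "$service"   # hypothetical per-service deploy entry point
      done
fi
```

The helper is separated from the git call so the path-to-service mapping can be tested on its own, and so the script is a no-op outside the pipeline environment.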