
Trigger Bitbucket Pipeline Upon S3 Changes


A tiny guide to help you finish faster

Trigger Bitbucket Pipelines Using AWS

Bitbucket Pipelines have made creating CI/CD pipelines a quick, easy job. Using Bitbucket, you can either schedule your pipelines or select specific branches that trigger a pipeline as soon as any code changes are detected.

For a recent project, I needed to trigger a Bitbucket pipeline every time a new file landed in a specific folder of my AWS S3 bucket. It was easier than I thought!

Using an AWS Lambda function (Python 3.9), we can easily send a POST request to Bitbucket’s API. For storing the user credentials, it is recommended to use AWS Secrets Manager. The code can look like this:
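Here is a minimal sketch of such a function. The workspace, repository slug, secret name, and branch below are placeholders, and I am assuming the secret stores a JSON object with `username` and `app_password` keys:

```python
import base64
import json
import urllib.request

import boto3

# Placeholder values; replace with your own workspace, repository, secret name, and branch.
SECRET_NAME = "bitbucket/pipeline-credentials"
WORKSPACE = "my-workspace"
REPO_SLUG = "my-repository"
BRANCH = "main"


def get_secret():
    """Fetch the Bitbucket credentials from AWS Secrets Manager.

    Assumes the secret is a JSON object with "username" and "app_password" keys.
    """
    client = boto3.client("secretsmanager")
    response = client.get_secret_value(SecretId=SECRET_NAME)
    return json.loads(response["SecretString"])


def run_pipeline():
    """Trigger a pipeline run by sending a POST request to the Bitbucket API."""
    secret = get_secret()
    url = f"https://api.bitbucket.org/2.0/repositories/{WORKSPACE}/{REPO_SLUG}/pipelines/"
    payload = json.dumps({
        "target": {
            "ref_type": "branch",
            "type": "pipeline_ref_target",
            "ref_name": BRANCH,
        }
    }).encode("utf-8")
    credentials = f"{secret['username']}:{secret['app_password']}".encode("utf-8")
    request = urllib.request.Request(
        url,
        data=payload,
        headers={
            "Content-Type": "application/json",
            "Authorization": "Basic " + base64.b64encode(credentials).decode("utf-8"),
        },
        method="POST",
    )
    with urllib.request.urlopen(request) as response:
        print(f"Bitbucket API responded with status {response.status}")
        return response.status


def lambda_handler(event, context):
    """Entry point invoked by the S3 event notification."""
    return run_pipeline()
```

The standard library’s urllib is used here so the function runs on the default Lambda runtime without packaging extra dependencies; if you prefer the requests library, you would need to bundle it with the function or add it as a layer.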

Here, the function run_pipeline() sends a POST request to the Bitbucket API. Every time this Lambda function runs, it sends that POST request, which automatically runs the relevant pipeline. If your Bitbucket pipeline was triggered successfully, you will see a 2xx status code in the CloudWatch logs.

Meanwhile, the get_secret() function retrieves the user credentials from the AWS Secrets Manager. If you are not concerned with security-related matters at the moment, you may also hard-code the credentials while testing.

Once you have a Lambda function set up, you need to set up an event notification in your S3 bucket. This will ensure that your Lambda function runs every time there’s a change (for example a new file) in your bucket.

If you head to the Properties tab of your S3 bucket, you can set up an Event Notification for all object “create” events (or also “delete” events, depending on your requirements). For the destination, you can select the Lambda function you created in the first step.

If you only want specific file types to trigger the Lambda function, you can specify the relevant file types under Filters. You can also add a prefix to your event notification settings, for instance, if you only want to run your Lambda when files are uploaded to a specific folder within that S3 bucket.
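If you prefer to configure the same notification programmatically instead of through the console, a rough boto3 sketch could look like this; the bucket name, Lambda ARN, prefix, and suffix are placeholders:

```python
import boto3

s3 = boto3.client("s3")

# Placeholder values; replace with your own bucket, Lambda ARN, folder prefix, and file suffix.
s3.put_bucket_notification_configuration(
    Bucket="my-bucket",
    NotificationConfiguration={
        "LambdaFunctionConfigurations": [
            {
                "LambdaFunctionArn": "arn:aws:lambda:eu-central-1:123456789012:function:trigger-bitbucket-pipeline",
                "Events": ["s3:ObjectCreated:*"],
                "Filter": {
                    "Key": {
                        "FilterRules": [
                            {"Name": "prefix", "Value": "incoming/"},
                            {"Name": "suffix", "Value": ".csv"},
                        ]
                    }
                },
            }
        ]
    },
)
```

One thing to keep in mind: when you configure the notification outside the console, S3 also needs permission to invoke the function (for example, via lambda add_permission); the console sets this up for you automatically.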

In order for your Lambda function to work smoothly, you will need to give the AWS role it uses certain permissions. For example, the role would need policies granting access to S3, CloudWatch, and Secrets Manager.
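As a rough example, an inline policy along these lines attached to the Lambda execution role would cover the calls used above. The role name, ARNs, and bucket below are placeholders, and you should narrow the resources and actions down to what you actually need:

```python
import json

import boto3

iam = boto3.client("iam")

# Placeholder names and ARNs; scope these down to your own secret, bucket, and role.
policy_document = {
    "Version": "2012-10-17",
    "Statement": [
        {
            # Read the Bitbucket credentials from Secrets Manager.
            "Effect": "Allow",
            "Action": ["secretsmanager:GetSecretValue"],
            "Resource": "arn:aws:secretsmanager:eu-central-1:123456789012:secret:bitbucket/pipeline-credentials-*",
        },
        {
            # Write function logs to CloudWatch.
            "Effect": "Allow",
            "Action": ["logs:CreateLogGroup", "logs:CreateLogStream", "logs:PutLogEvents"],
            "Resource": "*",
        },
        {
            # Read objects from the bucket that triggers the function.
            "Effect": "Allow",
            "Action": ["s3:GetObject"],
            "Resource": "arn:aws:s3:::my-bucket/*",
        },
    ],
}

iam.put_role_policy(
    RoleName="trigger-bitbucket-pipeline-role",
    PolicyName="trigger-bitbucket-pipeline-permissions",
    PolicyDocument=json.dumps(policy_document),
)
```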

I will soon be writing another article describing how to create the same architecture using Terraform. Stay tuned for that!


