23

I want to integrate my code from Bitbucket into AWS CodePipeline, but I am unable to find proper examples for this. My source code is in .NET. Can someone please guide me? Thanks.

Nigel Fds
  • The only source options I see available so far are GitHub, CodeCommit, and S3. It might be possible to use a Bitbucket pipeline to push code to S3 or upstream to CodeCommit and then use that source. – Dave Maple Jan 16 '17 at 23:43
  • Yeah, that's what I've also come across. But at this stage I can't change my code source to CodeCommit. – Nigel Fds Jan 17 '17 at 01:52
  • Everybody here is assuming you're talking about Cloud Bitbucket. Can you update the question to clarify it is not the self-hosted version you're talking about? – eco Sep 10 '19 at 01:34
  • @einarc I just checked; there is a different tag for Bitbucket Server. – Nigel Fds Sep 10 '19 at 23:48
  • @NigelFds I can't see it: https://github.com/aws-quickstart/quickstart-git2s3/tree/master – eco Sep 11 '19 at 00:36
  • @einarc I'm pretty sure the self-hosted version of Bitbucket - previously Atlassian Stash - was rebranded Bitbucket Server well after the OP posted this question. And NigelFds is referring to Stack Overflow tags on this question, not the AWS docs. – OllyTheNinja Sep 11 '19 at 05:01
  • @OllyTheNinja Got it, thanks! Then the correct answer is: Bitbucket Cloud is now supported by CodePipeline and CodeBuild. – eco Sep 12 '19 at 01:29

8 Answers

15

You can integrate Bitbucket with AWS CodePipeline by using webhooks that call an AWS API Gateway endpoint, which invokes a Lambda function (which calls into CodePipeline). There is an AWS blog that walks you through this: Integrating Git with AWS CodePipeline.
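For illustration, here is a minimal sketch of the Lambda at the end of that chain; this is not the blog post's actual code (the blog's version also syncs the repository contents to S3). The pipeline name my-dotnet-pipeline is a placeholder, and the payload handling assumes a Bitbucket Cloud push webhook delivered through an API Gateway proxy integration:

import json
import boto3

codepipeline = boto3.client('codepipeline')

def lambda_handler(event, context):
    # With a proxy integration, API Gateway passes the webhook payload as a JSON string
    body = json.loads(event.get('body') or '{}')

    # Bitbucket Cloud push events list the branches that changed
    changes = body.get('push', {}).get('changes', [])
    branches = [(change.get('new') or {}).get('name') for change in changes]

    if 'master' not in branches:
        return {'statusCode': 200, 'body': 'Ignored: no push to master'}

    # Kick off the pipeline run; the pipeline name here is hypothetical
    response = codepipeline.start_pipeline_execution(name='my-dotnet-pipeline')
    return {
        'statusCode': 200,
        'body': json.dumps({'executionId': response['pipelineExecutionId']})
    }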

Kirkaiya
  • I've followed this guide to set up Bitbucket integration via S3 into CodePipeline. Works well. The only issue I have is that I can't set up separate branches to run through the pipe; 'develop' to a test environment, 'master' to a production environment. – fakataha Apr 04 '17 at 21:51
  • OK, I'm having a couple of issues with CodePipeline since I'm using .NET C# and have a lot of configurations. – Nigel Fds Apr 19 '17 at 05:04
  • That tutorial's code has been updated to process Bitbucket webhook events, but only for the cloud version. The OP did not specify whether he's on a private server or using Bitbucket Cloud. – eco Sep 10 '19 at 01:18
  • We tried this on Bitbucket Server. You lose a lot of the advantages of git. If you're on Server, you might as well use Bamboo to build and push to ECR and then use ECR as a source. That is, of course, only if you're using microservices, which is what we're looking at. – eco Dec 20 '19 at 21:45
11

Bitbucket has a service called Pipelines which can deploy code to AWS services. Use Pipelines to package and push updates from your master branch to an S3 bucket that is hooked up to CodePipeline.

Note:

  • You must enable Pipelines in your repository

  • Pipelines expects a file named bitbucket-pipelines.yml, which must be placed in the root of your project

  • Ensure you set your account's AWS_ACCESS_KEY_ID and AWS_SECRET_ACCESS_KEY in the Bitbucket Pipelines UI. These variables come with an option to encrypt, so all is safe and secure

Here is an example bitbucket-pipelines.yml which zips the repository contents into DynamoDb.zip and uploads it to an S3 bucket:

pipelines:
  branches:
    master:
      - step:
          script:
            - apt-get update # required to install zip
            - apt-get install -y zip # required if you want to zip repository objects
            - zip -r DynamoDb.zip .
            - apt-get install -y python-pip
            - pip install boto3==1.3.0 # required for s3_upload.py
            # the first argument is the name of the existing S3 bucket to upload the artefact to
            # the second argument is the artefact to be uploaded
            # the third argument is the bucket key
            - python s3_upload.py LandingBucketName DynamoDb.zip DynamoDb.zip # run the deployment script

Here is a working example of a Python upload script which should be deployed alongside the bitbucket-pipelines.yml file in your project. Above I have named my Python script s3_upload.py:

from __future__ import print_function
import os
import sys
import argparse
import boto3
from botocore.exceptions import ClientError

def upload_to_s3(bucket, artefact, bucket_key):
    """
    Uploads an artefact to Amazon S3
    """
    try:
        client = boto3.client('s3')
    except ClientError as err:
        print("Failed to create boto3 client.\n" + str(err))
        return False
    try:
        client.put_object(
            Body=open(artefact, 'rb'),
            Bucket=bucket,
            Key=bucket_key
        )
    except ClientError as err:
        print("Failed to upload artefact to S3.\n" + str(err))
        return False
    except IOError as err:
        print("Failed to access artefact in this directory.\n" + str(err))
        return False
    return True


def main():

    parser = argparse.ArgumentParser()
    parser.add_argument("bucket", help="Name of the existing S3 bucket")
    parser.add_argument("artefact", help="Name of the artefact to be uploaded to S3")
    parser.add_argument("bucket_key", help="Name of the S3 Bucket key")
    args = parser.parse_args()

    if not upload_to_s3(args.bucket, args.artefact, args.bucket_key):
        sys.exit(1)

if __name__ == "__main__":
    main()

Here is an example CodePipeline with only one Source stage (you may want to add more):

Pipeline:
  Type: "AWS::CodePipeline::Pipeline"
  Properties:
    ArtifactStore:
      # Where codepipeline copies and unpacks the uploaded artifact
      # Must be versioned
      Location: !Ref "StagingBucket"
      Type: "S3"
    DisableInboundStageTransitions: []
    RoleArn:
      !GetAtt "CodePipelineRole.Arn"
    Stages:
      - Name: "Source"
        Actions:
          - Name: "SourceTemplate"
            ActionTypeId:
              Category: "Source"
              Owner: "AWS"
              Provider: "S3"
              Version: "1"
            Configuration:
              # Where PipeLines uploads the artifact
              # Must be versioned
              S3Bucket: !Ref "LandingBucket"
              S3ObjectKey: "DynamoDb.zip" # Zip file that is uploaded
            OutputArtifacts:
              - Name: "DynamoDbArtifactSource"
            RunOrder: "1"

LandingBucket:
  Type: "AWS::S3::Bucket"
  Properties:
    AccessControl: "Private"
    VersioningConfiguration:
      Status: "Enabled"
StagingBucket:
  Type: "AWS::S3::Bucket"
  Properties:
    AccessControl: "Private"
    VersioningConfiguration:
      Status: "Enabled"

A reference to this Python code along with other examples can be found here: https://bitbucket.org/account/user/awslabs/projects/BP

ohsufresh
  • Thank you for the contribution. Why not use the AWS CLI and install it with `pip install awscli`, which gives you `aws s3 sync` among other tools that integrate well with AWS? – Felipe Alvarez Dec 19 '18 at 02:56
  • Note - Bitbucket Pipelines is only supported by Bitbucket Cloud, not Bitbucket Server (self-hosted). – SeanFromIT Mar 02 '19 at 00:23
  • Bitbucket integration is now fully supported by CodePipeline: https://aws.amazon.com/about-aws/whats-new/2019/12/aws-codepipeline-now-supports-atlassian-bitbucket-cloud/ – ohsufresh Dec 19 '19 at 00:46
10

Follow-up for anyone finding this now:

AWS CodeBuild now supports Atlassian Bitbucket Cloud as a Source Type, making it the fourth alongside the existing supported sources: AWS CodeCommit, Amazon S3, and GitHub.

This means you no longer need to implement a Lambda function as suggested in @Kirkaiya's link to integrate with Bitbucket - though that is still a valid solution depending on your use case, or if you're integrating with the non-cloud version of Bitbucket.

Posted on the AWS blog Aug 10, 2017 - https://aws.amazon.com/about-aws/whats-new/2017/08/aws-codebuild-now-supports-atlassian-bitbucket-cloud-as-a-source-type/

And to clarify for the commenters, this link talks about integrating with CodeBuild, not CodePipeline: you still need to find a way to trigger the pipeline, but when it is triggered, CodeBuild will pull the code from Bitbucket rather than having to copy the code to S3 or AWS CodeCommit before triggering the pipeline.
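For illustration, here is a CloudFormation-style sketch (matching the template style used elsewhere in this thread) of a CodeBuild project that uses Bitbucket as its source. The project name, repository URL, and CodeBuildRole are placeholders, and the OAuth connection to Bitbucket must be authorized separately (for example, in the console):

BitbucketBuild:
  Type: "AWS::CodeBuild::Project"
  Properties:
    Name: "bitbucket-dotnet-build" # hypothetical project name
    ServiceRole: !GetAtt "CodeBuildRole.Arn"
    Source:
      Type: "BITBUCKET"
      Location: "https://bitbucket.org/your-team/your-repo.git"
    Triggers:
      Webhook: true # rebuild on every push
    Artifacts:
      Type: "NO_ARTIFACTS"
    Environment:
      Type: "LINUX_CONTAINER"
      ComputeType: "BUILD_GENERAL1_SMALL"
      Image: "aws/codebuild/standard:2.0"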

OllyTheNinja
  • A link to a solution is welcome, but please ensure your answer is useful without it: [add context around the link](//meta.stackexchange.com/a/8259) so your fellow users will have some idea what it is and why it’s there, then quote the most relevant part of the page you're linking to in case the target page is unavailable. [Answers that are little more than a link may be deleted.](//stackoverflow.com/help/deleted-answers) – g00glen00b Dec 13 '17 at 07:44
  • But the question is about CodePipeline, not CodeBuild. – CodyBugstein Jan 02 '18 at 03:29
  • As far as I understand, AWS CodePipeline will use the preconfigured AWS CodeBuild as a source. – Loki Jan 20 '18 at 17:37
  • I've tried the proposed solution. It's true that you can configure AWS CodeBuild to use Bitbucket as a source, but it won't allow you to use this configured CodeBuild step as part of the CodePipeline. The message is: "AWS CodePipeline does not support Bitbucket. AWS CodePipeline cannot build source code stored in Bitbucket. If you want AWS CodePipeline to use this build project, choose a different source provider." – Loki Jan 20 '18 at 17:59
  • You don't know whether the OP, or whoever is reading this, is using Cloud Bitbucket or their own Bitbucket server. While the first case is supported, the second isn't. – eco Sep 10 '19 at 01:32
  • @einarc I'm pretty sure the self-hosted version of Bitbucket - previously Atlassian Stash - was rebranded Bitbucket Server well after the OP posted this question. Pretty sure the OP would have referred to it as "Atlassian Stash" if that's what they were using at the time. – OllyTheNinja Sep 11 '19 at 05:05
  • @OllyTheNinja You're correct. The webhook response from Bitbucket Server is similar, so you can use a Lambda to parse it and put it on S3, and then use that as a CodePipeline source, albeit somewhat complex and losing most of the advantages of git. So you're better off using Bamboo to push to ECR and using ECR as the source for CodePipeline; Bitbucket Server integration with Bamboo suffices for the build part of the process, and it's the deploy part of CI/CD that Bamboo lacks (chicken-and-egg problem). – eco Dec 20 '19 at 21:44
  • @einarc why are you still talking about Bitbucket Server? We've established OP was referring to what is now Bitbucket Cloud. – OllyTheNinja Dec 22 '19 at 06:17
7

If you are looking for a way to automate your build and deploy process using AWS CodePipeline with Bitbucket as the source, without using Lambdas, follow these steps.

  1. Create a CodeBuild project, which supports Bitbucket as of now: https://docs.aws.amazon.com/codebuild/latest/userguide/sample-bitbucket-pull-request.html Also create a webhook which rebuilds every time code is pushed to the repository. You cannot use a webhook if you use a public Bitbucket repository.
  2. CodeBuild will trigger automatically on commit, create a zip file, and store it in an S3 bucket (see the buildspec sketch at the end of this answer).
  3. Create a CodePipeline with S3 as the source, and deploy it using CodeDeploy, as S3 is a valid source.

Note: 1. In order to create a webhook, you need to have Bitbucket admin access, so the process from commit to deployment is totally automated. 2. As of now (April '19), CodeBuild does not support webhooks on pull request merge. If you want, you can create a trigger which will trigger CodeBuild, say, every day.

You can also create triggers to build code periodically: https://docs.aws.amazon.com/codebuild/latest/userguide/trigger-create.html

Update (June '19) - Pull request builds for PR_MERGE are now supported in CodeBuild. Reference: https://docs.aws.amazon.com/codebuild/latest/userguide/sample-bitbucket-pull-request.html#sample-bitbucket-pull-request-filter-webhook-events.
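Here is a hedged sketch of the buildspec.yml for step 2, assuming a .NET Core project (the zipping and the destination bucket are configured in the CodeBuild project's artifact settings, not in the buildspec itself):

version: 0.2

phases:
  build:
    commands:
      - dotnet restore
      - dotnet publish -c Release -o ./publish

artifacts:
  files:
    - '**/*'
  base-directory: 'publish' # CodeBuild zips this directory into the artifact S3 bucket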

Shubham Chopra
5

An alternative to @binary's answer, and a clarification of @OllyTheNinja's answer:

In short: let CodeBuild listen to Bitbucket's webhook and write to an S3 object; in the pipeline, listen to the update event of the latter.

In the AWS Code Suite:

  1. define a CodeBuild project, with

    • Source: Bitbucket, using its webhook to listen to git-push events.
    • Buildspec: build the project according to buildspec.yml
    • Artifacts: store the output of the build directly in an S3 container (see the fragment after this list).
  2. define the pipeline:

    • Source: listen to updates to the previously defined S3 object
    • remove the Build step
    • add other steps, and configure the Deploy step
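To make the artifact hand-off concrete, here is a hedged CloudFormation fragment for the CodeBuild project's artifact settings (LandingBucket and the zip name are placeholders; the pipeline's S3 source then watches the same bucket and key, as in the template earlier in this thread):

Artifacts:
  Type: "S3"
  Location: !Ref "LandingBucket" # the bucket the pipeline's S3 source watches
  Name: "build-output.zip" # hypothetical key for the zipped build output
  Packaging: "ZIP"
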
Michael Kraxner
  • 121
  • 3
  • 5
4

AWS CodeBuild now supports building Bitbucket pull requests, and we can make use of this for a better solution without using webhooks/API Gateway/Lambda.

You can use CodeBuild to zip your code to S3 and use that as a source in your CodePipeline.

https://lgallardo.com/2018/09/07/codepipeline-bitbucket

binary
  • You can have a Bitbucket PR trigger a CodeBuild now but, at this time, I think you can only do this from the UI, not from CloudFormation. – Russell Keane Dec 17 '18 at 16:50
2

For me, the best way to integrate Bitbucket with any AWS service is to use Pipelines to mirror every commit into a (mirror) AWS CodeCommit repo. From there, you have prime integration into any service on AWS.
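A minimal sketch of what such a mirroring bitbucket-pipelines.yml could look like (CODECOMMIT_URL is a placeholder for the CodeCommit HTTPS clone URL, with the Git credentials stored as secured repository variables; BITBUCKET_BRANCH is a built-in Pipelines variable):

pipelines:
  default:
    - step:
        script:
          # push the current branch to the CodeCommit mirror
          - git remote add codecommit "$CODECOMMIT_URL"
          - git push codecommit "$BITBUCKET_BRANCH"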

proeben
0

In 12/2019, AWS launched support for Atlassian Bitbucket Cloud (in beta mode at launch).

So now you can natively integrate your AWS CodePipeline with Bitbucket Cloud.
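With the native integration, the pipeline's Source stage uses a CodeStar connection instead of an S3 hand-off. Here is a sketch in the style of the CloudFormation template earlier in this thread (your-team/your-repo is a placeholder, BitbucketConnection refers to an AWS::CodeStarConnections::Connection resource, and the connection handshake must be completed in the console before the pipeline can use it):

- Name: "Source"
  Actions:
    - Name: "BitbucketSource"
      ActionTypeId:
        Category: "Source"
        Owner: "AWS"
        Provider: "CodeStarSourceConnection"
        Version: "1"
      Configuration:
        ConnectionArn: !Ref "BitbucketConnection"
        FullRepositoryId: "your-team/your-repo"
        BranchName: "master"
      OutputArtifacts:
        - Name: "SourceOutput"
      RunOrder: "1"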

Orest