This documentation provides the steps to integrate AWS CodePipeline, Azure DevOps, SonarQube, and JFrog Artifactory.
The following prerequisites must be completed before setting up the AWS CodePipeline:
- Azure DevOps Repository
- Sonarqube Project
- Sonarqube Configuration
- Azure DevOps WebHooks with AWS Services
- Configure Azure DevOps Repo WebHook Trigger
## Azure DevOps Repository Setup

In this section, we will set up the Azure DevOps repository by cloning an existing Spring Boot project from GitHub.
1. Log into the Azure DevOps portal: https://dev.azure.com/
2. Click the `New Organization` link to create an organization.
3. Enter the organization name, select `Central US` from the list under `We'll host your projects in`, and click `Next`.
4. Under `Create a project to get started`, enter the project name and click the `+ Create Project` button.
5. Import the existing GitHub project into the new repo following the steps in the link.
6. Click `Files` under the repo; you should see the project files listed.
### Generate Personal Access Token

After the repository is created, create a personal access token so that AWS CodePipeline can pull the latest code from the Azure DevOps repository whenever a code change triggers the pipeline. Follow the steps below to generate the token.
1. Click the `User Settings` icon and select `Personal Access Token`. AWS CodePipeline uses this token to download the repo codebase as a zip file.
2. Click the `+ New Token` button and provide a user-friendly name (for example, `aws-codepipeline-access-token`).
3. Under `Scopes`, select `Read` under the `Code` section to grant read access to the token consumer.
4. Click the `Create` button to complete the setup.
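Azure DevOps REST calls authenticate a personal access token with HTTP Basic auth, using an empty username and the token as the password. The sketch below (token value is a placeholder for illustration) shows how such an `Authorization` header is built:

```python
import base64

def pat_auth_header(pat: str) -> str:
    """Build the Basic auth header Azure DevOps expects for a PAT.

    The username is left empty; the PAT is the password.
    """
    raw = f":{pat}".encode("utf-8")
    return "Basic " + base64.b64encode(raw).decode("ascii")

# Placeholder token value for illustration only.
header = pat_auth_header("example-pat-value")
print(header.startswith("Basic "))  # -> True
```

The webhook stack configured later uses the same token to call the Azure DevOps items API on your behalf.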
Now the Azure DevOps repository configuration is complete. Let's move on to setting up the SonarQube project for the same repository.
## Sonarqube Project Setup

In this section, we will use SonarCloud, the hosted version of SonarQube, to create an account and set up a project that captures the repository's code-quality details.
- Go to https://sonarcloud.io
- Log in using your GitHub, Bitbucket, or other supported credentials
After the account setup is complete, create the SonarQube project by following these steps:
1. Click the `+` icon in the top-right corner.
2. Select the `Analyse New Project` option.
3. Select the repo from the list and click the `Setup` button.
4. Under the `Configure` tab, select the `With Other CI Tools` option from the `Choose another analysis method` options.
5. Select the appropriate codebase `language` under `What option best describes your build?`.
6. Select the `Operating System` name from the list under `What is your OS?`.
7. Click the icon next to `+` and select the organization name under `My Organizations`.
8. Click `Administration` and select `Organization settings`.
9. In the right corner, copy the value of the `Key:` label and store it somewhere.
### SonarQube Token Setup

After the project setup is complete, generate a token that the pipeline will use to publish the code-analysis results for quality analysis.
1. Click the `Profile` icon and select the account name.
2. Click the `Security` tab.
3. Enter a friendly name for the repo access token in the `Generate Token` field and click `Generate`.
4. Copy the token string by clicking the `Copy` button and save it somewhere. You cannot retrieve it again.
## SonarCloud Secrets Manager Setup

Let's store the SonarCloud endpoint configuration in AWS Secrets Manager. It is a best practice to keep sensitive details in a secrets manager to prevent hard-coding and leaking credentials.
In this section, we will create a new secret to store the SonarCloud endpoint details: URL, token, and organization.
1. Log into the AWS Management Console and select `Secrets Manager`.
2. Click the `Store a new secret` button.
3. Select `Other types of secrets` under `Select secret type`.
4. Under the secret key/value, add the following key/value pairs:
   a. key: `token`, value: the SonarQube token from the section `SonarQube Token Setup`, step 4
   b. key: `host`, value: `https://sonarcloud.io`
   c. key: `organization`, value: the key obtained in the section `Sonarqube Project Setup`, step 9
5. Click the `Next` button.
6. Enter the secret name `dev/sonarcloud` and click the `Next` button.
7. Click the `Next` button.
8. Click the `Store` button to complete the setup.
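Secrets Manager stores these key/value pairs as a single JSON string. A CodeBuild buildspec entry such as `LOGIN: dev/sonarcloud:token` resolves to one key inside that JSON. The sketch below (placeholder values, standard `json` module only) mimics that lookup:

```python
import json

# What Secrets Manager returns as the SecretString for dev/sonarcloud
# (placeholder values for illustration).
secret_string = json.dumps({
    "token": "example-sonar-token",
    "host": "https://sonarcloud.io",
    "organization": "example-org-key",
})

def resolve(secret_string: str, key: str) -> str:
    """Mimic how a `secret-id:json-key` reference resolves to an env var value."""
    return json.loads(secret_string)[key]

print(resolve(secret_string, "token"))  # -> example-sonar-token
print(resolve(secret_string, "host"))   # -> https://sonarcloud.io
```

The buildspecs later in this guide rely on exactly this mapping when they declare `secrets-manager:` environment variables.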
## Azure DevOps WebHooks with AWS Services

1. Log into the AWS Management Console.
2. Click on this link.
3. Click the `Next` button.
4. Enter the Output S3 Bucket Name as `azure-repo-codebase`.
5. In `Allowed IPs`, enter the `Azure DevOps Services IPs for the Regional Identity Service - Central United States` value: 13.89.236.72,52.165.41.252,52.173.25.16,13.86.38.60,20.45.1.175,13.86.36.181,52.158.209.56 (refer to the link for other regions).
6. In `Git Personal Access Token`, paste the Azure DevOps personal access token created in the sub-section `Generate Personal Access Token`.
7. In `Quick Start S3 Bucket Name`, enter the value `Azure-DevOps-WebHooks`.
8. In `Quick Start S3 Key Prefix`, enter the value `Assets/`.
9. Click `Next`.
10. Click `Next`.
11. In the Review screen, under the `Capabilities` section, select the checkbox for `I acknowledge that AWS CloudFormation might create IAM resources.`
12. Click `Create stack` to complete the setup.
13. After the stack creation is complete, go to the `Outputs` tab and copy the value of the key `ZipDownloadWebHookApi`.
14. Go to the section `Configure Azure DevOps Repo WebHook Trigger` and follow steps 1-12.
15. Go to the `Lambda` service and select the `AzureRepo-to-Amazon-S3-ZipDlLambda` lambda function to edit.
16. In the code editor, replace the following existing code:

```python
if 'X-Hub-Signature' in event['params']['header'].keys():
    hostflavour = 'githubent'
elif 'X-Gitlab-Event' in event['params']['header'].keys():
    hostflavour = 'gitlab'
elif 'User-Agent' in event['params']['header'].keys():
    if event['params']['header']['User-Agent'].startswith('Bitbucket-Webhooks'):
        hostflavour = 'bitbucket'
    elif event['params']['header']['User-Agent'].startswith('GitHub-Hookshot'):
        hostflavour = 'github'
    elif 'Bitbucket-' in event['params']['header']['User-Agent']:
        hostflavour = 'bitbucket-server'
elif event['body-json']['publisherId'] == 'tfs':
    hostflavour = 'tfs'
```

with this new code snippet, which checks for the Azure DevOps (`tfs`) publisher first:

```python
if event['body-json']['publisherId'] == 'tfs':
    hostflavour = 'tfs'
elif 'X-Hub-Signature' in event['params']['header'].keys():
    hostflavour = 'githubent'
elif 'X-Gitlab-Event' in event['params']['header'].keys():
    hostflavour = 'gitlab'
elif 'User-Agent' in event['params']['header'].keys():
    if event['params']['header']['User-Agent'].startswith('Bitbucket-Webhooks'):
        hostflavour = 'bitbucket'
    elif event['params']['header']['User-Agent'].startswith('GitHub-Hookshot'):
        hostflavour = 'github'
    elif 'Bitbucket-' in event['params']['header']['User-Agent']:
        hostflavour = 'bitbucket-server'
```

17. Similarly, replace the following line:

```python
archive_url = event['body-json']['resourceContainers']['account']['baseUrl'] + 'DefaultCollection/' + event['body-json']['resourceContainers']['project']['id'] + '/_apis/git/repositories/' + event['body-json']['resource']['repository']['id'] + '/items'
```

with this code, which drops the `DefaultCollection/` path segment:

```python
archive_url = event['body-json']['resourceContainers']['account']['baseUrl'] + event['body-json']['resourceContainers']['project']['id'] + '/_apis/git/repositories/' + event['body-json']['resource']['repository']['id'] + '/items'
```

18. Click the `Save` button to complete the code change.
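The reordered checks can be exercised locally as a standalone function. This is a sketch of the same logic with sample event shapes, not the full Lambda:

```python
def detect_host_flavour(event):
    """Mirror of the reordered webhook-source detection in the Lambda.

    Checking the Azure DevOps ('tfs') publisherId first prevents the
    generic 'User-Agent' branch from misclassifying Azure DevOps pushes.
    """
    headers = event.get('params', {}).get('header', {})
    body = event.get('body-json', {})
    if body.get('publisherId') == 'tfs':
        return 'tfs'
    if 'X-Hub-Signature' in headers:
        return 'githubent'
    if 'X-Gitlab-Event' in headers:
        return 'gitlab'
    if 'User-Agent' in headers:
        ua = headers['User-Agent']
        if ua.startswith('Bitbucket-Webhooks'):
            return 'bitbucket'
        if ua.startswith('GitHub-Hookshot'):
            return 'github'
        if 'Bitbucket-' in ua:
            return 'bitbucket-server'
    return None

# An Azure DevOps push event carries publisherId 'tfs'; the sample
# User-Agent value here is hypothetical.
azure_event = {'params': {'header': {'User-Agent': 'VSServices/1.0'}},
               'body-json': {'publisherId': 'tfs'}}
print(detect_host_flavour(azure_event))  # -> tfs
```

Note that with the original ordering, any event carrying a `User-Agent` header would be examined before the `tfs` check ever ran.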
## Configure Azure DevOps Repo WebHook Trigger

1. Log into the Azure DevOps portal and select the repo.
2. Click `Project Settings` at the bottom of the left navigation.
3. Select `Service hooks`.
4. Click `+` to add a new webhook.
5. Select `Web Hooks` from the list of services and click `Next`.
6. Select `Code pushed` from the list under `Trigger on this type of event`.
7. Under `Repository`, select the repo name from the list.
8. Select `master` from the list under the branch.
9. Leave the default value `[Any]` for `Pushed by a member of group` and click the `Next` button.
10. In the `Action` screen, paste the `ZipDownloadWebHookApi` value obtained in the section `Azure DevOps WebHooks with AWS Services`.
11. Click the `Test` button to test the webhook.
12. Click the `Finish` button to complete the setup.
## CodePipeline Setup

1. Log into the AWS account and select the `CodePipeline` service.
2. Click the `Create Pipeline` button.
3. Under `Pipeline Settings`, enter the pipeline name.
4. Expand `Advanced settings`.
5. Make sure the `Default Location` option is selected under `Artifact Store` and the `Default AWS Managed Key` option under the `Encryption key` section.
6. Under `Source`, select `Amazon S3`, enter the bucket name `azure-repo-codebase` (specified in the section `Azure DevOps WebHooks with AWS Services`), and enter the S3 object key as `<Azure Repo Organization Name>/<repo name>/master/<repo name>.zip`.
7. Under `Build`, select the `AWS CodeBuild` option.
8. Click the `Create project` button and follow the steps in the section `CodeBuild Project Setup for Unit Test`.
9. Click the `Next` button.
10. Click the `Skip deploy stage` button to skip the deployment step.
11. Review the pipeline details and click the `Create Pipeline` button to complete the initial pipeline setup.
12. Go to the IAM service and search for the service role associated with the unit-test build project.
13. Click `+ Add inline policy` to add an inline policy that grants read access to the Secrets Manager key.
14. Click the `JSON` tab and paste the following JSON snippet, replacing `<secret ARN>` with the ARN of the `dev/sonarcloud` secret:

```json
{
    "Version": "2012-10-17",
    "Statement": [
        {
            "Sid": "VisualEditor0",
            "Effect": "Allow",
            "Action": "secretsmanager:GetSecretValue",
            "Resource": "<secret ARN>"
        }
    ]
}
```

15. To verify the unit-test setup, go to the Azure repo, edit and update the readme.md file, and click `Commit`.
16. Check that the pipeline is triggered by the code change above.
17. Select the pipeline name link from the list.
18. In the pipeline screen, select `Edit` to add additional stages.
19. Below the Unit Test stage, click the `+ Add Stage` button to add the `Quality-Gate` stage.
20. In `Add Stage`, enter the stage name `Quality-Gate`.
21. Click the `+ Add action group` button to add the steps.
22. Enter `code-quality` in the `Action Name`.
23. Select `AWS CodeBuild` in the `Action provider`.
24. Select the region where the pipeline S3 bucket is located.
25. Select `OutputArtifact` from the list under `Input Artifacts`.
26. Click `Create project` and follow the steps in the section `CodeBuild Project Setup for Quality Gate`.
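The S3 object key entered in the Source step follows the pattern `<Azure Repo Organization Name>/<repo name>/master/<repo name>.zip`. A small helper (the organization and repo names below are hypothetical) that builds it:

```python
def source_object_key(org: str, repo: str, branch: str = "master") -> str:
    """Build the S3 key for the repo zip, following the pattern
    <org>/<repo>/<branch>/<repo>.zip used by the pipeline source stage."""
    return f"{org}/{repo}/{branch}/{repo}.zip"

# Hypothetical names for illustration.
print(source_object_key("my-azure-org", "spring-boot-demo"))
# -> my-azure-org/spring-boot-demo/master/spring-boot-demo.zip
```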
## CodeBuild Project Setup for Unit Test

1. Enter the build project name (preferably the pipeline name suffixed with `-unit-test`).
2. Under the `Environment` section, select `Managed Image`.
3. Select `Ubuntu` for the operating system.
4. Select `Standard` for the runtime.
5. Select `aws/codebuild/standard:3.0` for the image. (Refer to the link for which OS and image should be selected based on the language version.)
6. Under `Buildspec`, select the `Insert build commands` option and click the `Switch to editor` link.
7. In the build commands text editor, update with the following code:

```yaml
version: 0.2
env:
  variables:
    Project: "<Sonarqube project name goes here>"
  secrets-manager:
    LOGIN: dev/sonarcloud:token
    HOST: dev/sonarcloud:host
    Organization: dev/sonarcloud:organization
phases:
  install:
    # If you use the Ubuntu standard image 2.0 or later, you must specify runtime-versions.
    # If you specify runtime-versions and use an image other than Ubuntu standard image 2.0, the build fails.
    runtime-versions:
      java: openjdk8
  pre_build:
    commands:
      - apt-get update
      - wget https://binaries.sonarsource.com/Distribution/sonar-scanner-cli/sonar-scanner-cli-4.3.0.2102-linux.zip
      - unzip ./sonar-scanner-cli-4.3.0.2102-linux.zip
      - export PATH=$PATH:$PWD/sonar-scanner-4.3.0.2102-linux/bin/
  build:
    commands:
      - mvn clean install
      - mvn sonar:sonar -Dsonar.login=$LOGIN -Dsonar.host.url=$HOST -Dsonar.projectKey=$Project -Dsonar.organization=$Organization -Dsonar.jacoco.reportPath=target/coverage-reports/jacoco-unit.exec
artifacts:
  files:
    - '**/*'
  base-directory: 'target'
```

8. Click the `Continue to pipeline` button. It will take you back to the `CodePipeline Setup` section.
## CodeBuild Project Setup for Quality Gate

1. Enter the build project name (the pipeline name suffixed with `-quality-gate`).
2. Under the `Environment` section, select `Managed Image`.
3. Select `Ubuntu` for the operating system.
4. Select `Standard` for the runtime.
5. Select `aws/codebuild/standard:4.0` for the image. (Refer to the link for which OS and image should be selected based on the language version.)
6. Under `Buildspec`, select the `Insert build commands` option and click the `Switch to editor` link.
7. In the build commands text editor, update with the following code and set the `Project` variable to the SonarQube project name:

```yaml
version: 0.2
env:
  variables:
    Project: "<Sonarqube project name goes here>"
phases:
  install:
    # If you use the Ubuntu standard image 2.0 or later, you must specify runtime-versions.
    # If you specify runtime-versions and use an image other than Ubuntu standard image 2.0, the build fails.
    runtime-versions:
      java: corretto8
  build:
    commands:
      - curl "https://sonarcloud.io/api/qualitygates/project_status?projectKey=$Project" > result.json
      - if [ "$(jq -r '.projectStatus.status' result.json)" = "ERROR" ]; then exit 1; fi
```

8. Click the `Continue to pipeline` button. It will take you back to the `CodePipeline Setup` section.
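The gate check above fails the build when SonarCloud reports an `ERROR` quality-gate status. The same decision can be sketched in Python against sample payloads that mirror the `api/qualitygates/project_status` response shape (sample JSON only, not a live API call):

```python
import json

def quality_gate_passed(response_body: str) -> bool:
    """Return False when the SonarQube quality-gate status is ERROR."""
    status = json.loads(response_body)["projectStatus"]["status"]
    return status != "ERROR"

# Sample payloads mirroring the endpoint's JSON shape.
ok = '{"projectStatus": {"status": "OK"}}'
bad = '{"projectStatus": {"status": "ERROR"}}'
print(quality_gate_passed(ok), quality_gate_passed(bad))  # -> True False
```

In the buildspec the same outcome is produced by `exit 1`, which marks the CodeBuild action as failed and stops the pipeline at the Quality-Gate stage.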
## Test the Pipeline

1. Log into the Azure DevOps portal and select the repo.
2. Select any file (for example, readme.md) and click the `Edit` button.
3. Make some changes (any change will do).
4. Click the `Save` button.
5. Click the `Commit` button in the commit dialog to trigger the pipeline.
6. Go to AWS CodePipeline and select the pipeline name.
7. Check that the build is triggered.
## JFrog Artifactory Setup

In this section, follow the steps to set up a JFrog instance using the open-source JFrog Artifactory from the AWS Marketplace and start the Artifactory instance.
1. Log into the AWS account and select the region where the AWS CodePipeline project will be created.
2. Go to AWS Marketplace and click `Discover products`.
3. Search for `JFrog Open Source` and select the listing from the publisher `Miri Infotech`.
4. Click the `Continue to Subscribe` button.
5. Click the `Continue to Configuration` button.
6. Leave the default selections under `Delivery Method` and `Software Version`.
7. Select the desired region from the list and click the `Continue to Launch` button. Wait for the instance status to change to the READY state.
8. Select the EC2 instance and copy its private IP address and hostname.
9. Click the `Connect` button and follow the chmod and ssh commands (use `ubuntu` instead of `root`) to log into the EC2 instance.
10. Type the command `sudo vi /etc/hosts` to open the hosts file.
11. Below the first line, add the private IP and hostname separated by a tab, then save and exit vi.
12. Start Artifactory by entering the following commands:

```shell
sudo su
cd /home/ubuntu/artifactory-oss-6.8.2/bin
./artifactory.sh
```

13. After the `Artifactory successfully started` message, open a browser and go to http://ec2-instance-public-hostname:8081/artifactory (replace `ec2-instance-public-hostname` with the actual value). Log in with the admin credentials, admin/password.
14. Open the EC2 instance security group and check the inbound rules: port 22 must be allowed from a specific IP, and port 8081 must be open to all (0.0.0.0/0).
### Maven Repository Setup

After JFrog Artifactory is set up and running, a Maven repository needs to be set up to resolve and deploy artifacts and plugins.
1. Log in with the admin credentials, admin/password.
2. Click the `Welcome, admin` link and select `Quick Setup` to create the Maven repository.
3. Select `Maven` from the repositories and click the `Create` button.
4. Under the `Set Me Up` section, you should see the following repository keys for snapshot and release:
   - libs-snapshot
   - libs-snapshot-local
   - libs-release
   - libs-release-local
5. The Maven repository setup is now complete.
## Maven pom.xml Configuration

Maven project configuration is stored in the pom.xml file. In this section, we will configure the Maven-Artifactory sections to integrate with JFrog Artifactory.
First, we will configure the plugin and dependency settings for Artifactory and Maven.

1. Go to the Azure DevOps repo, select the pom.xml file, and click the `Edit` button.
2. Add the following configuration inside the `<properties>` tag to target JDK 1.8 for the build and to configure the JaCoCo code-coverage plugin:
```xml
<maven.compiler.source>1.8</maven.compiler.source>
<maven.compiler.target>1.8</maven.compiler.target>
<jacoco.version>0.8.3</jacoco.version>
<sonar.java.coveragePlugin>jacoco</sonar.java.coveragePlugin>
<sonar.dynamicAnalysis>reuseReports</sonar.dynamicAnalysis>
<sonar.jacoco.reportPath>${project.basedir}/target/coverage-reports/jacoco-unit.exec</sonar.jacoco.reportPath>
<sonar.language>java</sonar.language>
```

3. Add the plugin configurations below for jacoco-maven-plugin and maven-compiler-plugin under the `<plugins>` section:
```xml
<plugin>
  <groupId>org.apache.maven.plugins</groupId>
  <artifactId>maven-compiler-plugin</artifactId>
  <configuration>
    <source>1.8</source>
    <target>1.8</target>
  </configuration>
</plugin>
<plugin>
  <groupId>org.jacoco</groupId>
  <artifactId>jacoco-maven-plugin</artifactId>
  <version>${jacoco.version}</version>
  <configuration>
    <skip>${maven.test.skip}</skip>
    <destFile>${basedir}/target/coverage-reports/jacoco-unit.exec</destFile>
    <dataFile>${basedir}/target/coverage-reports/jacoco-unit.exec</dataFile>
    <output>file</output>
    <append>true</append>
    <excludes>
      <exclude>*MethodAccess</exclude>
    </excludes>
  </configuration>
  <executions>
    <execution>
      <id>jacoco-initialize</id>
      <goals>
        <goal>prepare-agent</goal>
      </goals>
      <phase>test-compile</phase>
    </execution>
    <execution>
      <id>jacoco-site</id>
      <phase>verify</phase>
      <goals>
        <goal>report</goal>
      </goals>
    </execution>
  </executions>
</plugin>
```

4. Add the following XML configuration below the `</properties>` line to configure the Artifactory publishing details for the SNAPSHOT and RELEASE stages:
```xml
<distributionManagement>
  <snapshotRepository>
    <id>snapshots</id>
    <name>ip-172-31-42-114-snapshots</name>
    <url>http://${internal.repo.server.url}/artifactory/libs-snapshot-local</url>
  </snapshotRepository>
  <repository>
    <id>central</id>
    <name>ip-172-31-42-114-releases</name>
    <url>http://${internal.repo.server.url}/artifactory/libs-release-local</url>
  </repository>
</distributionManagement>
```

5. As a final step, create a new empty file named `settings.xml` in the root directory of the repository and add the following XML configuration to specify the server and repository configuration:
```xml
<settings>
  <servers>
    <server>
      <username>${internal.repo.username}</username>
      <password>${internal.repo.password}</password>
      <id>central</id>
    </server>
    <server>
      <username>${internal.repo.username}</username>
      <password>${internal.repo.password}</password>
      <id>snapshots</id>
    </server>
  </servers>
  <profiles>
    <profile>
      <repositories>
        <repository>
          <snapshots>
            <enabled>false</enabled>
          </snapshots>
          <id>central</id>
          <name>libs-release</name>
          <url>http://${internal.repo.server.url}/artifactory/libs-release</url>
        </repository>
        <repository>
          <snapshots />
          <id>snapshots</id>
          <name>libs-snapshot</name>
          <url>http://${internal.repo.server.url}/artifactory/libs-snapshot</url>
        </repository>
      </repositories>
      <pluginRepositories>
        <pluginRepository>
          <snapshots>
            <enabled>false</enabled>
          </snapshots>
          <id>central</id>
          <name>libs-release</name>
          <url>http://${internal.repo.server.url}/artifactory/libs-release</url>
        </pluginRepository>
        <pluginRepository>
          <snapshots />
          <id>snapshots</id>
          <name>libs-snapshot</name>
          <url>http://${internal.repo.server.url}/artifactory/libs-snapshot</url>
        </pluginRepository>
      </pluginRepositories>
      <id>artifactory</id>
    </profile>
  </profiles>
  <activeProfiles>
    <activeProfile>artifactory</activeProfile>
  </activeProfiles>
</settings>
```

## Publish Artifacts Stage Setup

In this section, a new stage for building and publishing artifacts to JFrog Artifactory will be added to the existing pipeline.
1. Log into the AWS Management Console and select the CodePipeline service.
2. Select the pipeline name link from the pipelines list and click the `Edit` button.
3. Go to the Code-Quality stage and click the `+ Add Stage` button below it to add a new build stage.
4. Enter the stage name `Publish-Artifacts` and click the `Add Stage` button.
5. Click the `+ Add Action Group` button.
6. Enter the action name `publish-artifacts`, select `AWS CodeBuild` under `Action Provider`, select `Source Artifact` under `Input Artifacts`, and click the `Create Project` button, then follow the steps in the section `CodeBuild Project Setup for Publish Artifacts`.
## CodeBuild Project Setup for Publish Artifacts

1. Enter the build project name (the pipeline name suffixed with `-publish-artifacts`).
2. Under the `Environment` section, select `Managed Image`.
3. Select `Ubuntu` for the operating system.
4. Select `Standard` for the runtime.
5. Select `aws/codebuild/standard:3.0` for the image. (Refer to the link for which OS and image should be selected based on the language version.)
6. Under `Buildspec`, select the `Insert build commands` option and click the `Switch to editor` link.
7. In the build commands text editor, update with the following code:

```yaml
version: 0.2
env:
  secrets-manager:
    USER: dev/artifactory:user
    PASSWORD: dev/artifactory:password
    URL: dev/artifactory:url
phases:
  install:
    # If you use the Ubuntu standard image 2.0 or later, you must specify runtime-versions.
    # If you specify runtime-versions and use an image other than Ubuntu standard image 2.0, the build fails.
    runtime-versions:
      java: openjdk8
  pre_build:
    commands:
      - cp settings.xml ~/.m2
  build:
    commands:
      - mvn -Dinternal.repo.username=$USER -Dinternal.repo.password=$PASSWORD -Dinternal.repo.server.url=$URL clean deploy
artifacts:
  files:
    - '**/*'
  base-directory: 'target'
```

## JFrog Artifactory Secrets Manager Setup

In this section, we create a Secrets Manager secret to store the JFrog Artifactory configuration.
### Get the Admin Encrypted Password

Follow the steps below to obtain the admin user's encrypted password, which will be stored in Secrets Manager.
1. Log into Artifactory using the admin credentials.
2. Click the `Welcome, admin` link in the right corner and select `Edit Profile`.
3. Enter the admin password in the `Current Password` field and click the `Unlock` button.
4. Under `Authentication Settings`, copy the text in the `Encrypted Password` field and save it somewhere for the Secrets Manager configuration.
### Store the Artifactory Secret

1. Log into the AWS Management Console and select `Secrets Manager`.
2. Click the `Store a new secret` button.
3. Select `Other types of secrets` under `Select secret type`.
4. Under the secret key/value, add the following key/value pairs:
   - key: `url`, value: the JFrog Artifactory EC2 Public DNS (IPv4) name and port 8081 (for example, ec2-54-146-7-13.compute-1.amazonaws.com:8081)
   - key: `user`, value: `admin`
   - key: `password`, value: the admin `Encrypted Password` value copied above
5. Click the `Next` button.
6. Enter the secret name `dev/artifactory` and click the `Next` button.
7. Click the `Next` button.
8. Click the `Store` button to complete the setup.
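The publish-artifacts buildspec feeds these three secret keys into Maven as `-D` properties. A sketch (placeholder values) of how that command line is assembled:

```python
# Placeholder contents of the dev/artifactory secret.
secret = {
    "url": "ec2-54-146-7-13.compute-1.amazonaws.com:8081",
    "user": "admin",
    "password": "example-encrypted-password",
}

def deploy_command(secret: dict) -> list:
    """Assemble the mvn deploy invocation run by the publish buildspec."""
    return [
        "mvn",
        f"-Dinternal.repo.username={secret['user']}",
        f"-Dinternal.repo.password={secret['password']}",
        f"-Dinternal.repo.server.url={secret['url']}",
        "clean", "deploy",
    ]

print(" ".join(deploy_command(secret)))
```

These `-Dinternal.repo.*` properties fill in the matching `${internal.repo.*}` placeholders in settings.xml and the pom's distributionManagement URLs.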
### IAM Access for the Publish Artifacts Role

1. Go to the IAM service and search for the service role associated with the `Publish Artifacts` build project.
2. Click `+ Add inline policy` to add an inline policy that grants read access to the Secrets Manager key.
3. Click the `JSON` tab and paste the following JSON snippet:

```json
{
    "Version": "2012-10-17",
    "Statement": [
        {
            "Sid": "VisualEditor0",
            "Effect": "Allow",
            "Action": "secretsmanager:GetSecretValue",
            "Resource": "<dev/artifactory secret ARN>"
        }
    ]
}
```