Pipeline artifacts
Introduction
Artifacts are files that are produced by a step. Once you've defined them in your pipeline configuration, you can share them with a following step or export them to keep the artifacts after a step completes. For example, you might want to use reports or JAR files generated by a build step in a later deployment step. Or you might like to download an artifact generated by a step, or upload it to external storage.
There are some things to remember:
- Files that are in the BITBUCKET_CLONE_DIR at the end of a step can be configured as artifacts. The BITBUCKET_CLONE_DIR is the directory in which the repository was initially cloned.
- You can use glob patterns to define artifacts. Glob patterns that start with a * will need to be put in quotes. Note: as these are glob patterns, path segments “.” and “..” won’t work. Use paths relative to the build directory (see the short example after this list).
- Artifact paths are relative to the BITBUCKET_CLONE_DIR.
- Artifacts that are created in a step are available to all the following steps.
- Artifacts created in parallel steps may not be accessible to other steps within the same group of parallel steps. If another step in the parallel group requests the artifact, it may or may not exist when it's requested.
- Artifacts will be deleted 14 days after they are generated.
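For instance, a step's artifacts section using glob patterns might look like the following minimal sketch; the step name, build command, and paths are placeholders rather than part of the full example later on this page:

  - step:
      name: Build
      script:
        - ./gradlew build            # hypothetical build command
      artifacts:
        - build/libs/**              # no quotes needed when the pattern doesn't start with *
        - "*.log"                    # patterns that start with * must be quoted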
By default, all available artifacts are downloaded at the start of a step. You can control this behavior using the download field:
- Set download: true to download all artifacts from previous steps.
- Set download: false to skip downloading any artifacts (see the sketch after this list).
- Set download to a list of specific artifact names to download only those artifacts. Be sure to specify the exact artifact names. For example, to download artifacts with names Artifact 1 and Artifact 2, use:

  download:
    - "Artifact 1"
    - "Artifact 2"
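As a further illustration, a step that needs none of the earlier artifacts can skip the download entirely. This is a minimal sketch with a placeholder step name and script:

  - step:
      name: Print build info
      artifacts:
        download: false              # nothing is downloaded at the start of this step
      script:
        - echo "Build $BITBUCKET_BUILD_NUMBER finished"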
Artifact types
Artifact types identify artifacts with distinct behaviour.
The following artifact types are available:
Shared artifacts: Accessible across multiple steps. Use shared artifacts for workflows that require sharing data between steps.
Scoped artifacts: Scoped to each step and can’t be downloaded across steps. Scoped artifacts are ideal for files like log-files, test reports, screenshots, or videos generated during testing.
Using artifact types is optional, but specifying them can help you save build minutes and improve artifact organization.
Artifact fields for shared and scoped artifacts
When uploading shared or scoped artifacts in Pipelines, provide details using the following fields:
- Name (required): The name of the artifact displayed in the user interface.
- Type: The artifact type, either shared or scoped. The default is shared.
- Paths (required): A list of glob patterns to include as artifacts.
- Ignore-paths: A list of glob patterns to exclude from artifacts.
- Depth: The maximum depth to search for artifacts in the file hierarchy. By default, the search includes the entire hierarchy.
- Capture-on: The step condition for uploading the artifact. Possible values are:
  - success: Upload the artifact if the step passes.
  - failed: Upload the artifact if the step fails.
  - always: Upload the artifact regardless of the step outcome (default).
Define artifact types under the upload section within the artifacts section of your YAML file.
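For illustration, an upload definition that uses all of these fields might look like the following sketch; the artifact name, paths, and depth value are placeholders, and the keys follow the lowercase spelling used in the full example later on this page:

  - step:
      name: Run tests
      script:
        - npm test                   # hypothetical test command
      artifacts:
        upload:
          - name: test-results       # required: name shown in the user interface
            type: shared             # shared (default) or scoped
            paths:                   # required: glob patterns to include
              - reports/**
            ignore-paths:            # optional: glob patterns to exclude
              - reports/**/*.tmp
            depth: 3                 # optional: limit how deep the file hierarchy is searched
            capture-on: success      # success, failed, or always (default)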
Use artifacts
In the example bitbucket-pipelines.yml file that follows, we show how to configure artifacts to share them between steps.
- When the script for 'Build and test' completes, all files under the dist folder and the txt files in the reports folder (both found under the BITBUCKET_CLONE_DIR) are kept as artifacts, with the same path.
- 'Integration test' and 'Deploy to beanstalk' can access files in dist and reports, created by the first step.
- Any changes to dist or reports by 'Integration test' will not be available in later steps because they have not been specified as artifacts in 'Integration test'. If you wanted to keep the changes, you would need to define them as artifacts in this step, too.
- Artifacts will not be downloaded, and thus not available, during 'Display success message'. This step will still produce the artifact success.txt, making it available for download in later steps.
- When the 'Run post deployment tests' step completes:
  - all files in the test-reports folder, except for HTML files, are saved as a shared artifact and made available to subsequent steps
  - files in the logs folder are saved as a scoped artifact if the step fails; these scoped artifacts are not passed to later steps
- In the final step, 'Run coverage report', only the reports/*.txt and pdv-test-reports artifacts from previous steps are downloaded. This targeted download helps reduce build minutes by avoiding unnecessary artifact downloads.

Artifacts that are downloaded in a given step have the default file permissions set to 644 (-rw-r--r--). These permissions may need to be changed depending on the commands being run, such as pipes.
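If a later step needs to run a file that was downloaded as an artifact, you may need to restore the execute permission first. A minimal sketch, assuming a hypothetical script kept as an artifact by an earlier step:

  - step:
      name: Run downloaded script
      script:
        - chmod +x dist/run.sh       # downloaded artifacts default to 644 (-rw-r--r--)
        - ./dist/run.sh              # hypothetical script produced by an earlier step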
Artifacts have a 1 GB size limit.
Example bitbucket-pipelines.yml
pipelines:
  default:
    - step:
        name: Build and test
        image: node:10.15.0
        caches:
          - node
        script:
          - npm install
          - npm test
          - npm run build
        artifacts: # defining the artifacts to be passed to each future step
          - dist/**
          - reports/*.txt
    - step:
        name: Integration test
        image: node:10.15.0
        caches:
          - node
        services:
          - postgres
        script: # using one of the artifacts from the previous step
          - cat reports/tests.txt
          - npm run integration-test
    - step:
        name: Deploy to beanstalk
        image: python:3.5.1
        script:
          - python deploy-to-beanstalk.py
    - step:
        name: Display success message
        artifacts:
          download: false # do not download artifacts in this step
          paths: # defining artifacts to be passed to each future step
            - success.txt
        script:
          - echo "Deployment for $BITBUCKET_COMMIT successful!" > success.txt
    - step:
        name: Run post deployment tests
        caches:
          - node
        script:
          - npm run pdv-tests
        artifacts:
          upload:
            - name: pdv-test-reports # shared artifact
              type: shared
              paths:
                - test-reports/**
              ignore-paths:
                - test-reports/*.html
            - name: pdv logs # scoped artifact
              type: scoped
              paths:
                - "logs/**"
              capture-on: failed
    - step:
        name: Run coverage report
        caches:
          - node
        artifacts:
          download: # download only specific artifacts defined in previous steps
            - reports/*.txt
            - pdv-test-reports
        script:
          - npm run coverage-report
definitions:
  services:
    postgres:
      image: postgres:9.6.4
Manual steps
Manual steps will have build artifacts produced by any previous steps copied into their working directory, similar to automatic steps.
Artifact downloads and expiry
You can download artifacts generated by a step:
1. Select the Artifact tab of the pipeline result view.
2. Click the download icon.
Artifacts are stored for 14 days following the execution of the step that produced them. After this time, the artifacts are expired and any manual steps later in the pipeline can no longer be executed.
If you need artifact storage for longer than 14 days (or more than 1 GB), we recommend using your own storage solution, like Amazon S3 or a hosted artifact repository like JFrog Artifactory. Setting a reasonable time limit for build artifacts allows us to manage our costs so we don't have to charge for storage and transfer costs of build artifacts in Pipelines.
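For example, a step could copy build output to your own S3 bucket using the AWS CLI. This is a sketch only: the Docker image, bucket name, and local path are placeholders, and the AWS credentials (AWS_ACCESS_KEY_ID, AWS_SECRET_ACCESS_KEY, AWS_DEFAULT_REGION) are assumed to be configured as secured repository variables:

  - step:
      name: Archive artifacts to S3
      image: amazon/aws-cli          # any image that includes the AWS CLI will do
      script:
        - aws s3 cp dist/ "s3://my-artifact-bucket/$BITBUCKET_BUILD_NUMBER/" --recursive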