Job artifacts are files and directories that are attached to a job
after it completes successfully. This feature is enabled by default in all
GitLab installations. Keep reading to learn how to configure or disable it.

If you don't want to use the local disk where GitLab is installed to store the
artifacts, you can use an object storage service such as AWS S3 instead.
This configuration relies on valid AWS credentials already being configured.

Object Storage Settings

In source installations, the following settings are nested under `artifacts:` and then `object_store:`. In Omnibus installations, they are prefixed by `artifacts_object_store_`.

| Setting | Description | Default |
|---------|-------------|---------|
| `enabled` | Enable/disable object storage | `false` |
| `remote_directory` | The bucket name where artifacts are stored | |
| `direct_upload` | Set to `true` to enable direct upload of artifacts without the need for local shared storage. This option may be removed once only a single storage is supported for all files. | `false` |
| `background_upload` | Set to `false` to disable automatic upload. This option may be removed once uploads go directly to S3. | `true` |
| `proxy_download` | Set to `true` to proxy all files served through GitLab. When `false`, clients download directly from the remote storage instead of having all data proxied, which reduces egress traffic. | `false` |
| `connection` | Various connection options described below | |

S3 compatible connection settings

The connection settings match those provided by Fog, and are as follows:

| Setting | Description | Default |
|---------|-------------|---------|
| `provider` | Always `AWS` for compatible hosts | `AWS` |
| `aws_access_key_id` | AWS credentials, or compatible | |
| `aws_secret_access_key` | AWS credentials, or compatible | |
| `aws_signature_version` | AWS signature version to use. `2` or `4` are valid options. DigitalOcean Spaces and some other providers may need `2`. | `4` |
| `region` | AWS region | `us-east-1` |
| `host` | S3-compatible host when not using AWS, e.g. `localhost` or `storage.example.com` | `s3.amazonaws.com` |
| `endpoint` | Can be used when configuring an S3-compatible service such as MinIO, with a URL such as `http://127.0.0.1:9000` | (optional) |
| `path_style` | Set to `true` to use `host/bucket_name/object`-style paths instead of `bucket_name.host/object`. Leave as `false` for AWS S3. | `false` |
| `use_iam_profile` | Set to `true` to use an IAM profile instead of access keys | `false` |
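As an illustration, an Omnibus installation pointing at a local MinIO server might use a connection hash along these lines. The endpoint, keys, and `path_style` value below are assumptions for a default MinIO setup, not values from this document:

```ruby
gitlab_rails['artifacts_object_store_connection'] = {
  'provider' => 'AWS',                            # MinIO speaks the S3 API
  'region' => 'us-east-1',
  'aws_access_key_id' => 'MINIO_ACCESS_KEY',      # placeholder credential
  'aws_secret_access_key' => 'MINIO_SECRET_KEY',  # placeholder credential
  'endpoint' => 'http://127.0.0.1:9000',          # local MinIO endpoint
  'path_style' => true                            # host/bucket_name/object paths
}
```

For AWS S3 itself, `endpoint` and `path_style` are omitted and the real region and credentials are used instead.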

In Omnibus installations:

The artifacts are stored by default in
/var/opt/gitlab/gitlab-rails/shared/artifacts.

Edit /etc/gitlab/gitlab.rb and add the following lines, substituting
the values you want:
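A minimal sketch of those lines, using the `artifacts_object_store_` prefix described earlier; the bucket name, region, and credentials here are placeholders to replace with your own:

```ruby
gitlab_rails['artifacts_object_store_enabled'] = true
gitlab_rails['artifacts_object_store_remote_directory'] = "artifacts"  # The bucket name
gitlab_rails['artifacts_object_store_connection'] = {
  'provider' => 'AWS',                              # Only AWS supported at the moment
  'region' => 'eu-central-1',
  'aws_access_key_id' => 'AWS_ACCESS_KEY_ID',       # placeholder credential
  'aws_secret_access_key' => 'AWS_SECRET_ACCESS_KEY' # placeholder credential
}
```

Save the file and run `sudo gitlab-ctl reconfigure` for the changes to take effect.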

In installations from source:

The artifacts are stored by default in
/home/git/gitlab/shared/artifacts.

Edit /home/git/gitlab/config/gitlab.yml and add or amend the following
lines:

```yaml
artifacts:
  enabled: true
  object_store:
    enabled: true
    remote_directory: "artifacts" # The bucket name
    connection:
      provider: AWS # Only AWS supported at the moment
      aws_access_key_id: AWS_ACCESS_KEY_ID
      aws_secret_access_key: AWS_SECRET_ACCESS_KEY
      region: eu-central-1
```

Expiring artifacts

If an expiry date is set on the artifacts, they are marked for deletion
right after that date passes. Artifacts are cleaned up by the
expire_build_artifacts_worker cron job, which Sidekiq runs every hour at
50 minutes past the hour (50 * * * *).

To change the default schedule on which the artifacts are expired, adjust the
expire_build_artifacts_worker cron schedule in your GitLab configuration.
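In an Omnibus installation, for example, the worker's schedule can be overridden in /etc/gitlab/gitlab.rb; the "every seven minutes" schedule below is only an illustration:

```ruby
# Run the artifact expiry worker every seven minutes instead of hourly
gitlab_rails['expire_build_artifacts_worker_cron'] = "*/7 * * * *"
```

Save the file and run `sudo gitlab-ctl reconfigure` afterwards so the new schedule is picked up.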

Set the maximum file size of the artifacts

Provided artifacts are enabled, you can change their maximum file size
through the Admin area settings.

Storage statistics

You can see the total storage used for job artifacts on groups and projects
in the administration area, as well as through the groups
and projects APIs.
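As a sketch of the API route, requesting a project with statistics included returns its storage figures; the URL, token, and project ID below are placeholders, and viewing statistics requires sufficient permissions:

```shell
# Fetch project details with storage statistics included; the response's
# "statistics" object reports artifact storage as "job_artifacts_size".
curl --header "PRIVATE-TOKEN: <your_access_token>" \
  "https://gitlab.example.com/api/v4/projects/<project_id>?statistics=true"
```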

Implementation details

When GitLab receives an artifacts archive, an archive metadata file is also
generated by GitLab Workhorse. This metadata file describes all the entries
that are located in the artifacts archive itself.
The metadata file is in a binary format, with additional GZIP compression.

GitLab does not extract the artifacts archive, in order to save space, memory,
and disk I/O. It instead inspects the metadata file, which contains all the
relevant information. This is especially important when there are many
artifacts, or when an archive is a very large file.

When a user clicks a specific file, GitLab Workhorse extracts just that file
from the archive and the download begins. This implementation saves space,
memory, and disk I/O.