
Using Artifacts: Passing Data Between GitHub Actions Jobs

TopicTrick Team

Because each job in a GitHub Actions workflow executes on a fresh, isolated virtual machine, jobs cannot see one another's files. If your "Build" job creates a compiled binary, your "Test" job cannot access it by default. Artifacts provide the bridge: you use actions/upload-artifact to save files to GitHub's storage, and actions/download-artifact to pull them onto a different runner, passing data across the isolation boundary.



The Isolation Boundary Problem

Imagine a simple CI workflow:

  • Job A (Build): Compiles your TypeScript code into a /dist folder.
  • Job B (Lint): Checks your source code for errors.
  • Job C (Deploy): Takes the /dist folder and ships it to a server.

If these jobs run in parallel (the default), Job C will fail because it is on an entirely separate machine that doesn't have the /dist folder. To solve this, Job A must "hand off" the compiled results to GitHub's cloud storage so Job C can download them later.
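A minimal workflow skeleton for this shape (job names, the npm commands, and the dist/ path are illustrative) makes the problem visible:

```yaml
name: CI

on: [push]

jobs:
  build:
    runs-on: ubuntu-latest
    steps:
      - uses: actions/checkout@v4
      - run: npm ci && npm run build      # produces dist/ on THIS runner only

  lint:
    runs-on: ubuntu-latest
    steps:
      - uses: actions/checkout@v4
      - run: npm run lint                 # runs in parallel; never needs dist/

  deploy:
    needs: build                          # waits for build, but still runs on
    runs-on: ubuntu-latest                # a brand-new machine with no dist/
    steps:
      - run: ls dist/                     # fails without an artifact hand-off
```

Even with needs: build, the deploy job starts from an empty runner; ordering alone does not move files. That is the gap artifacts fill.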


Artifacts vs. Caching

This sounds similar to caching, but they have fundamentally different use cases:

  • Caching: Used to speed up a single job by reusing dependencies (like node_modules) from a previous run. It persists across different workflow executions.
  • Artifacts: Used to pass files between different jobs in the same workflow run. Artifacts are typically unique to that specific commit (e.g., a compiled .apk file or a test coverage report).

Uploading Your First Artifact

To save a file as an artifact, you use the actions/upload-artifact action.

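A sketch of the upload step (the dist/ path, the build command, and the artifact name are assumptions for illustration):

```yaml
jobs:
  build:
    runs-on: ubuntu-latest
    steps:
      - uses: actions/checkout@v4
      - run: npm ci && npm run build        # emits dist/
      - uses: actions/upload-artifact@v4
        with:
          name: dist-files                  # referenced later by download-artifact
          path: dist/                       # file or directory to zip and store
```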

As soon as this step finishes, GitHub zips up the dist/ folder and saves it. You will even see a link to download it at the bottom of the Summary page for that workflow run.


Downloading Artifacts in Subsequent Jobs

Now that the file is in GitHub's cloud storage, your "Deploy" job can retrieve it using actions/download-artifact.

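A matching sketch of the download side, assuming the dist-files name from the upload example above:

```yaml
  deploy:
    needs: build                        # guarantees the upload has completed
    runs-on: ubuntu-latest
    steps:
      - uses: actions/download-artifact@v4
        with:
          name: dist-files              # must match the uploaded artifact name
          path: dist/                   # where to unpack on this runner
      - run: ls -R dist/                # the compiled output is now available
```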

Important: Always declare the uploading job in needs: when downloading artifacts. Without it, the jobs run in parallel, and if Job B starts before Job A has finished uploading, the download will fail with an "Artifact not found" error.


Managing Artifact Retention

By default, GitHub keeps artifacts for 90 days. After that, they are permanently deleted to save space.

If you are a high-velocity team pushing code 50 times a day, your artifacts can consume a massive amount of storage quickly. You can (and should) lower the retention period in your YAML to save on your storage quota:

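The retention period is set per upload step with the retention-days input (the name and path below are illustrative):

```yaml
      - uses: actions/upload-artifact@v4
        with:
          name: nightly-build
          path: dist/
          retention-days: 5     # delete after 5 days instead of the 90-day default
```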

Frequently Asked Questions

Can I upload multiple files to one artifact? Yes. You can provide glob patterns in the path field, such as path: bin/*.exe or path: release/*. GitHub will bundle all matching files into a single zip file.
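For instance (the release/ layout is an assumption), path accepts multiple lines of glob patterns:

```yaml
      - uses: actions/upload-artifact@v4
        with:
          name: release-binaries
          path: |
            release/*.exe
            release/*.tar.gz    # all matching files land in one zipped artifact
```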

Can I access artifacts from a different workflow run? Not easily with the standard actions. download-artifact is designed for the current run only. To fetch artifacts from a "latest successful run" of a different workflow, you typically need to use the GitHub API or specialized community actions.
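As a sketch of the API route, the GitHub CLI can fetch artifacts from an arbitrary run once you know its ID (the repository, workflow file, and artifact name below are placeholders):

```shell
# List recent runs of a workflow to find a run ID
gh run list --repo my-org/my-repo --workflow build.yml --limit 5

# Download a named artifact from that specific run
gh run download <run-id> --repo my-org/my-repo --name dist-files --dir dist/
```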


Key Takeaway

Artifacts are the essential communication protocol for multi-job workflows. By using upload-artifact to save build results and download-artifact to retrieve them on separate runners, you enable your CI/CD pipeline to be truly modular—decoupling your expensive build logic from your specialized deployment logic.

Read next: Matrix Strategies: Running Tests on Every OS Simultaneously →