Code Coverage for Rust Projects with GitHub Actions
I am not particularly fond of using code coverage as a metric to drive the tests I write. Nevertheless, there is a lot of value in knowing which parts of your system are covered by tests and which are not.
Rust projects have great test tools out of the box, like `cargo test`, but they do not include a code coverage tool. In this post, I will show how I introduced one to my project, NunDb.
The tarpaulin tool
Tarpaulin is the tool I chose. Tarpaulin is not in a stable version yet, so make sure to pick a version that works for your case. Read more about it in the tarpaulin repo.
Install tarpaulin
cargo install cargo-tarpaulin
In my case, I had to install the nightly version of the tool, so I ran the following:
cargo +nightly install cargo-tarpaulin
At the time of writing this post, the latest version of tarpaulin is v0.30.0.
Running it locally
Starting from the simplest case, to run all the tests, run the following command at the root of your project:
cargo tarpaulin
That may work for your case, but it did not work in mine. I have many tests that are not well suited to running in parallel because they write to disk (NunDb is a database, and all tests run with the default config, saving to the same place), so I had to run the tests with only one thread, like this:
cargo tarpaulin -- --test-threads=1
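To illustrate why parallel runs are a problem here, the following is a minimal sketch (with a hypothetical file name, not NunDb's actual layout) of two tests writing to the same path. Run in parallel, they can race between the write and the read; with --test-threads=1 they run one at a time:

```rust
use std::fs;

// Hypothetical illustration: two tests that share the same on-disk state,
// like the NunDb tests sharing the default data directory.
#[test]
fn writes_default_db_file_a() {
    fs::write("dbs-test.data", b"state-a").unwrap();
    // Flaky in parallel: the other test may overwrite the file here.
    assert_eq!(fs::read("dbs-test.data").unwrap(), b"state-a");
}

#[test]
fn writes_default_db_file_b() {
    fs::write("dbs-test.data", b"state-b").unwrap();
    assert_eq!(fs::read("dbs-test.data").unwrap(), b"state-b");
}
```

A longer-term fix would be giving each test its own temporary directory, but serializing the tests is the least invasive option.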
Ignoring performance tests
It still did not run all tests successfully, because tarpaulin has to calculate the coverage, which slows down the tests quite a bit. Therefore, I had to skip some performance tests by adding #[cfg(not(tarpaulin))] to the tests I wanted to skip, like the following:
#[test]
#[cfg(not(tarpaulin))]
fn restore_should_be_fast() {
    // ...
}
Checking the e2e tests
That was progress, but my E2E tests were still not working: to run all tests for NunDb, I have to run multiple instances of the database and kill the process at the end of each test. To support that, you have to enable the `--engine llvm` flag in the tarpaulin command, since the default engine "relies on the sigtrap signal to catch when the instrumentation points are hit"; read more in the tarpaulin docs.
Additionally, to report the code coverage from the processes spawned by my E2E tests, I had to add the --follow-exec flag.
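To make the pattern concrete, here is a minimal sketch of a multi-process test like the NunDb E2E tests: each test spawns a real server process and kills it at the end. A plain `sleep` command stands in for the database binary here, and `start_server_process` is a hypothetical helper, not NunDb's actual test code:

```rust
use std::process::{Child, Command};

// Hypothetical helper: spawn a separate server process, as the E2E tests
// do. A shell `sleep` stands in for the real database binary.
fn start_server_process() -> Child {
    Command::new("sleep")
        .arg("30")
        .spawn()
        .expect("failed to spawn server process")
}

fn stop_server_process(mut server: Child) {
    // Killing the child mid-run is what trips up the default ptrace-based
    // engine (--engine llvm avoids it), and coverage inside the spawned
    // process is only collected when tarpaulin runs with --follow-exec.
    server.kill().expect("failed to kill server process");
    server.wait().expect("failed to wait for server process");
}
```

A test would call `start_server_process`, exercise the server, and finish with `stop_server_process`.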
Here is the final command I used to run the tests:
cargo tarpaulin --engine llvm --follow-exec -- --test-threads=1
All tests were running locally as expected now \o/.
By default, tarpaulin forces a clean build, which is most likely what you want for CI, but it slows things down quite a bit locally, so you may want to add the --skip-clean flag to speed up local runs.
It would look like the following:
cargo tarpaulin --engine llvm --follow-exec --skip-clean -- --test-threads=1
Running it on GitHub Actions
At this point, I was happy locally. Let’s start the work to get it to run on GitHub Actions.
The first step is to create a .github/workflows/coverage.yml file.
That file will be used to run the tests and push the result to Codecov.
For that, you will need three steps:
1. Checkout the repository
This is the most straightforward step. You need to add the following lines to your workflow file:
- name: Checkout repository
  uses: actions/checkout@v2
2. Run the tests with tarpaulin (where you can pass the flags you want)
This one is a little more complex. First, I set the environment variables I need to run the tests, TIME_TO_START and NUN_ELECTION_TIMEOUT, and then I use the command we came up with before:
- name: Generate code coverage
  env:
    TIME_TO_START: 4
    NUN_ELECTION_TIMEOUT: 3000
  run: |
    mkdir dbs && cargo +nightly tarpaulin --follow-exec --engine llvm --verbose --all-features --workspace --timeout 120 --out xml -- --test-threads=1
3. Upload the results to Codecov (requires prior setup on the Codecov website)
Now we are close; it is time to upload the result to Codecov. For that, you will need to add the following lines to your workflow file:
- name: Upload to codecov.io
  uses: codecov/codecov-action@v2
  with:
    token: ${{ secrets.CODECOV_TOKEN }} # not required for public repos
    fail_ci_if_error: true
To get the CODECOV_TOKEN, go to the Codecov website, authenticate with your GitHub account, find your repo in the list, and click Configure; on the next page, you will see the token to copy and paste into the GitHub secrets.
To set up the secret in GitHub, go to your repository, click on Settings -> Secrets and variables -> Actions, and add a new secret.
The final file should look like this:
name: Code Coverage

on:
  push:
    branches:
      - master
  pull_request:

jobs:
  coverage:
    name: coverage
    runs-on: ubuntu-latest
    container:
      image: xd009642/tarpaulin:develop-nightly
      options: --security-opt seccomp=unconfined
    steps:
      - name: Checkout repository
        uses: actions/checkout@v2
      - name: Generate code coverage
        env:
          TIME_TO_START: 4
          NUN_ELECTION_TIMEOUT: 3000
        run: |
          mkdir dbs && cargo +nightly tarpaulin --follow-exec --engine llvm --verbose --all-features --workspace --timeout 120 --out xml -- --test-threads=1
      - name: Upload to codecov.io
        uses: codecov/codecov-action@v2
        with:
          token: ${{ secrets.CODECOV_TOKEN }} # not required for public repos
          fail_ci_if_error: true
Now, when you push code, you should see the tests running and the coverage being uploaded to Codecov.
Adding the badge to the README
In your Codecov project page, you will find a link to Settings and then Badge; there, you will find the markdown to add to your README file.
Something like this:
[![codecov](https://codecov.io/gh/YOUR_REPO/branch/master/graph/badge.svg?token=YOUR_TOKEN)](https://codecov.io/gh/YOUR_REPO)
Conclusion
Adding code coverage to your project can help you monitor the modules your automated tests are neglecting. Keep in mind that a single number does not define the quality of your project. Enjoy coding and use the tools for what they are good for.