Three Ways To Publish Sphinx Documentation


Integrate with GitHub Actions and publish automatically

Publish Sphinx documentation to three different platforms.

In the previous article, we discussed how to quickly generate documentation with Sphinx CLI tools. In this article, I will demonstrate three different ways to publish your Sphinx documentation, so that anyone with the URL can view it.

I will also integrate the publishing steps into GitHub Actions, so the whole process is fully automated.

GitHub Pages is a good tool for hosting a static website such as Sphinx documentation. I use the actions-gh-pages GitHub Action to deploy my documentation to GitHub Pages.

It’s really simple to use. You just need to provide the path to your documentation, e.g. docs/build, and this action will handle the rest!

- name: Publish to GitHub Pages
  uses: peaceiris/actions-gh-pages@v3
  with:
    github_token: ${{ secrets.GITHUB_TOKEN }}
    publish_dir: docs/build

When it is finished, you can view the published documentation at https://<username>.github.io/<repository_name>/.

Quite easy, right? Let’s move to the next platform.

I recommend using the upload-cloud-storage GitHub Action to upload your documentation to Google Cloud Storage (GCS). There are three steps to using this action. First, set up the permissions for your job so it can access the GitHub secrets it needs. Since I also use git-auto-commit-action to commit changes, contents should also be set to write.

permissions:
  contents: 'write'
  id-token: 'write'

Second, authenticate the workflow with google-github-actions/auth so we can upload files to GCS. Last, specify the directory of your documentation and the target bucket, e.g. my-sample-bucket-123.

- name: Gcloud Auth
  uses: 'google-github-actions/auth@v0'
  with:
    credentials_json: ${{ secrets.GCP_CREDENTIALS }}
- name: Upload documentation to GCS
  uses: 'google-github-actions/upload-cloud-storage@v0'
  with:
    path: 'docs/build'
    destination: 'my-sample-bucket-123'

Next, we need to make all objects under my-sample-bucket-123 public so that people can read the documentation on the internet. You can find the detailed steps in the GCS documentation on making data public; a sketch of the step is shown below. Once that is done, you can view your documentation through https://storage.googleapis.com/<bucket-name>/build/index.html.
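As a sketch of that step inside the same workflow, a run step can use gsutil to grant the allUsers principal read access to every object in the bucket. The step name here is hypothetical, the bucket name is the example from above, and this assumes gsutil is available on the runner (for example via google-github-actions/setup-gcloud):

- name: Make documentation public
  # Grant public read access to all objects in the bucket (assumed bucket name)
  run: gsutil iam ch allUsers:objectViewer gs://my-sample-bucket-123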

However, GCS does not support custom domains directly. You can set up a Load Balancer in front of GCS, but that is outside the scope of this article, so I won’t discuss it here.

Here’s the complete workflow. Whenever documentation is generated, this workflow automatically publishes our documentation to GitHub Pages and GCS.
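A minimal sketch of that workflow, assembled from the snippets above, could look like the following. It assumes the generated documentation is already committed under docs/build (as in the previous article’s setup), and the workflow name, trigger, and job name are mine, so adapt them to your project:

name: Publish documentation

on:
  push:
    branches: [main]
    paths:
      - 'docs/build/**'

permissions:
  contents: 'write'
  id-token: 'write'

jobs:
  publish-docs:
    runs-on: ubuntu-latest
    steps:
      # Check out the repository, including the committed docs/build directory
      - uses: actions/checkout@v3
      - name: Publish to GitHub Pages
        uses: peaceiris/actions-gh-pages@v3
        with:
          github_token: ${{ secrets.GITHUB_TOKEN }}
          publish_dir: docs/build
      - name: Gcloud Auth
        uses: 'google-github-actions/auth@v0'
        with:
          credentials_json: ${{ secrets.GCP_CREDENTIALS }}
      - name: Upload documentation to GCS
        uses: 'google-github-actions/upload-cloud-storage@v0'
        with:
          path: 'docs/build'
          destination: 'my-sample-bucket-123'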

Unlike the other two approaches, once you follow the instructions to import a project into Read the Docs, it will build the documentation automatically and trigger a new build whenever you change the repository.

After you import a project, go to Admin > Advanced Settings and, under Default settings, specify the path to the requirements file and the path to conf.py so Read the Docs can build the documentation properly.
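If you prefer keeping this configuration in the repository, Read the Docs can also read a .readthedocs.yaml file at the project root. Here is a minimal sketch expressing the same two settings; the paths are assumptions that should match your own docs layout:

version: 2

build:
  os: ubuntu-22.04
  tools:
    python: "3.10"

# Requirements file used to install documentation dependencies (assumed path)
python:
  install:
    - requirements: docs/requirements.txt

# Path to the Sphinx configuration file (assumed path)
sphinx:
  configuration: docs/source/conf.py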


When you see Passed under the Builds tab, it means the documentation has been built successfully! Click View Docs to view the documentation. The public URL will be https://<project-name>.readthedocs.io.


In the last article, we built a script to generate nice-looking documentation and integrated it with GitHub Actions. In this article, we demonstrated three ways to publish your documentation automatically. We now have a complete pipeline for generating and hosting our documentation! I hope you find these two articles helpful for building your own documentation pipeline.

You can find the complete examples in this repository.


