Build pipeline with repository: is it advisable to build both on the repo and on the end server?
I am doing a classic build with install (pip/python), lint, test, and format on my GitHub repository with GitHub Actions, then deploying over SSH (copying the repo to the server and deploying with docker/docker-compose).
I'm wondering: is it advisable to re-run the install, lint, test, and format steps on the server?
Note that the install here is not the app install, which happens inside Docker, but the install of the dev tooling: yapf, pytest, pylint.
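For context, the repo-side CI steps described above might look roughly like this (the tool names come from the question; the package path and flags are illustrative assumptions):

```shell
# Sketch of the CI steps run in GitHub Actions (not on the server).
# Install the dev tooling only -- the app itself is installed inside Docker.
pip install yapf pytest pylint

# Format check (fails if any file would be reformatted), lint, and tests.
yapf --diff --recursive .
pylint src/   # "src/" is a placeholder for the actual package path
pytest
```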
Alberto:
Usually, your pipeline would produce a Docker image as an artifact. For a feature branch, it might be tagged with the branch name.
For changes to the default branch (main/master), you should tag the image with a version string, typically a semantic version (https://semver.org/).
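A minimal sketch of that tagging scheme. `GITHUB_REF_NAME` is a real GitHub Actions environment variable holding the branch name; the registry host, image name, and version number are placeholders:

```shell
# Build once in the pipeline; tag feature-branch builds by branch name.
docker build -t registry.example.com/myapp:"$GITHUB_REF_NAME" .

# On the default branch, add a semver tag to the same image
# (the version string here is illustrative):
docker tag registry.example.com/myapp:"$GITHUB_REF_NAME" \
           registry.example.com/myapp:1.4.2
```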
You should not be copying the repo to, or git-pulling it on, the server, because the Docker build is not guaranteed to be reproducible: tests that passed in the pipeline might suddenly fail against an image rebuilt on the server.
Instead, after you have tested the Docker image, push it to a registry, then pull that exact image onto the application server when you want to deploy it.
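The push/pull flow might look like this, assuming the same placeholder registry and tag as above, and a compose file that references the image by tag rather than building it:

```shell
# In the pipeline, after tests have passed against the built image:
docker push registry.example.com/myapp:1.4.2

# On the application server at deploy time -- pull the exact tested
# image and restart the stack. No source code or build step on the server.
docker pull registry.example.com/myapp:1.4.2
docker compose up -d
```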
You shouldn't install or run any development tooling on your production server.