Private PyPI Server Setup
For one of our bigger clients we’re developing a suite of web applications that all share an authentication system. That system is a perfect candidate to be extracted into a Python package shared between them.
However, distributing packages that you want to keep off PyPI gets tricky. There are paid solutions, but we wanted to avoid them due to their lack of flexibility (a low initial package quota and rising costs when scaling beyond it).
Initially, we just included zipped packages in the apps repository, which works, but has a lot of drawbacks, such as:
- Being unable to easily check the version of the package included
- Git doesn’t play well with binary files when merging branches
- “Release process” becomes rather tedious (commit to GitHub, download the zip, save it to the target repository)
After being bored to death by the process and then some, I decided to research possible solutions for deploying a private PyPI-compatible server. Stumbling upon pypiserver and PyPICloud, I went for the latter, since there’s an Ansible playbook available for deployment, and we can use S3 as the storage backend, which fits nicely with the rest of the stack sitting in AWS.
After adjusting said Ansible playbook a little bit to include Let’s-Encrypt-powered SSL (since HTTP basic auth is used), I had to make some configuration choices. Initially I added PostgreSQL, which caused some server errors that turned out to be EOF errors from reading a closed database connection. Instead of wasting any more time on debugging this, I went back to using SQLite, which should be perfectly fine for a low-traffic server.
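For reference, the relevant part of the PyPICloud server config ended up looking roughly like this. The bucket name is hypothetical, and the option names are as I recall them from the version we deployed; check the PyPICloud documentation for your version:

```ini
; server.ini (excerpt) -- SQLite for metadata, S3 for package storage
pypi.db = sql
db.url = sqlite:///%(here)s/db.sqlite

pypi.storage = s3
storage.bucket = our-package-bucket
```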
I created separate users with read/write access for each package we planned to host, then moved on to setting up the process of uploading packages. This involves creating a .pypirc file containing the server URL along with the required credentials:
```ini
[distutils]
index-servers =
    internal

[internal]
repository: https://pypi.myserver.com/
username: username
password: password
```
Then the package can be uploaded with:

```shell
python setup.py sdist bdist_dumb upload -r internal
```
Being lazy, I used Fabric to simplify that into a `fab publish` command that pushes a new version of the package to the server.
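The task itself is a thin wrapper around the upload command above. Here is a minimal sketch of the idea using only the standard library (Fabric 1.x's `local()` boils down to the same subprocess call); the `dry_run` flag and function signature are my additions for illustration:

```python
import subprocess

# The fabfile equivalent (Fabric 1.x style) would be roughly:
#
#     from fabric.api import local
#
#     def publish():
#         local("python setup.py sdist bdist_dumb upload -r internal")

def publish(index="internal", dry_run=False):
    """Build the package and upload it to the named .pypirc index."""
    cmd = ["python", "setup.py", "sdist", "bdist_dumb", "upload", "-r", index]
    if dry_run:
        # Return the command instead of running it, handy for testing.
        return " ".join(cmd)
    subprocess.check_call(cmd)

print(publish(dry_run=True))
```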
Installing the packages
PyPICloud doesn’t provide an index that pip can consume via `index-url`, but it does serve an HTML index that works with pip’s `find-links` option. Each package needs its own entry in `pip.conf` (note that the URLs include HTTP basic auth credentials):
```ini
[global]
find-links =
    https://read-only-user:firstname.lastname@example.org/simple/package1
    https://read-only-user:email@example.com/simple/package2
```
After that, installing packages is as simple as specifying the package name and version in the requirements file.
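With that in place, the private packages can be pinned in a requirements file just like public ones. The names below match the `pip.conf` example above; the version numbers are hypothetical:

```
package1==1.2.0
package2==0.4.1
```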