This post focuses on packaging Python modules into distribution packages using "setuptools" and publishing them onto PyPI. To better understand these concepts we will first clarify the concept of module and, since most people use the term "package" interchangeably for both "import package" and "distribution package", we will clarify the term "package" too, to avoid confusion. In addition to that, we will highlight the differences, pros and cons of source, binary and wheel distribution packages. All of this while taking care to "style" things so that they can easily be used within a Continuous Integration environment.

The operating environment used in this post is Red Hat Enterprise Linux 8 or CentOS 8 - using a different environment may lead to having to adapt or even change things.

This is the second of a set of three posts aimed at showing how to create a full featured Python3 project: the prerequisite for going on reading it is having read the Python Full Featured Project post first.

The third and last post Packaging a Python Wheel as RPM is about how to pack this project into two RPM packages:

  • the first installs the modules provided by the Python package
  • the second installs the example script that implements a sample application that imports these modules. This last package also performs post installation tasks so as to reconfigure rsyslog

In addition to that, in the third and last post we will also see how to digitally sign these RPM packages and, in general, how to set up an RPM development environment, with a good overview of how to exploit the RPM building tools.

If you have not read it yet, please start from that post, since you need the project with all of the code that is necessary to follow along.

Modules

The very first thing is having a clear understanding of terms such as "module", "package" and "import": it is very easy to mix things up, since people very often use the terms "package" and "module" interchangeably, and the term "package" is used both as the name of a container of modules and as the cabinet used to package it.

Python Modules

Modules provide a handy way to split the code into multiple files, logically grouping related objects and functions within a namespace: the namespace has the same name as the file of the module itself - you can easily get the name of a module by accessing its __name__ attribute.

This means that Python modules are nothing but files containing Python code that the main program can import if needed: this of course promotes maintainability and code re-usability.

Actually, even the main file is a special module called "__main__": this means that its code lives within the "__main__" namespace.

We can easily see this by launching Python interactively:

python3

and print the value of the "__name__" attribute:

>>> print(__name__)
__main__

As an example let's create a "foomodule.py" file with the following contents:

def cheer(name):
    print(f"Hello {name}")

def _say_goodbye(name):
    print(f"Goodbye {name}")

we can easily test it by launching Python interactively:

python3

load the module and print its name:

>>> import foomodule
>>> print(foomodule.__name__)
foomodule

as expected the name is "foomodule"; let's list all of the objects and attributes defined within it by using the "dir()" function (a function that enumerates the contents of the namespace - here, the module - provided as argument):

>>> dir(foomodule)
['__builtins__', '__cached__', '__doc__', '__file__', '__loader__', '__name__', '__package__', '__spec__', '_say_goodbye', 'cheer']

as you see, besides several built-in objects, the module also has the "_say_goodbye(name)" and "cheer(name)" functions.

Here is an example application that loads the module and runs the cheer() function:

#!/usr/bin/env python3
import foomodule

foomodule.cheer("Marco Carcano")

when used within the main program, the above statement imports all of the objects defined within the "foomodule.py" file into the "foomodule" namespace: indeed, we need to prepend "foomodule." to be able to execute the "cheer(name)" function.

We can even import all or a subset of the objects from the module directly into the "__main__" namespace:

#!/usr/bin/env python3
from foomodule import *

cheer("Marco")

as you see, this time we can directly access the cheer(name) function without prepending the "foomodule." namespace. Let's have a look at the list of objects of the __main__ module: launch Python

python3

then load the module and list its contents:

>>> from foomodule import *
>>> dir()
['__annotations__', '__builtins__', '__doc__', '__loader__', '__name__', '__package__', '__spec__', 'cheer']

cheer is actually listed, but this time the _say_goodbye() function is not in the list.

Python's default behavior is not to import objects with a name starting with an underscore (_). You can alter this behavior by explicitly listing the objects you want to be imported.

this means that, to be able to use the "_say_goodbye(name)" function, we must explicitly import it as follows:

#!/usr/bin/env python3
from foomodule import *
from foomodule import _say_goodbye

cheer("Marco Carcano")
_say_goodbye("Marco Carcano")

Modules Search Path

When an import statement is found, the interpreter first searches for a built-in module with the name of the module to import: if none is found, the interpreter searches for a ".py" file with the same name as the module in the list of paths specified by the sys.path variable.

sys.path defaults to these locations:

  • the directory the input script is stored in (or the current directory when no file is specified)
  • PYTHONPATH (a list of directory names, with the same syntax as the shell variable PATH)
  • the installation-dependent default (includes a site-packages directory, handled by the site module)
When dealing with symlinks, the directory containing the input script is calculated after following the symlink. This means that the directory containing the symlink is not added to the module search path.

Note that the directory containing the script being run takes precedence over the standard library path: this is a handy way to perform overrides.
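As a quick illustration, here is a minimal sketch that prints the search path and prepends a directory to it - the "/opt/fooproject/lib" path is just a made-up example:

#!/usr/bin/env python3
import sys

# print the module search path, in lookup order
for path in sys.path:
    print(path)

# prepend a hypothetical directory: modules located there now take precedence
sys.path.insert(0, "/opt/fooproject/lib")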

Extension Modules

Extension Modules are modules written in the low-level language of the Python implementation: C/C++ for CPython, Java for Jython. Typically they are contained in a single dynamically loadable pre-compiled object:

  • shared object (".so") on Unix
  • DLL on Windows - note that these DLLs have the ".pyd" extension
  • Java class file for Jython extensions.

Python Packages

Import Packages

Import packages are a way of structuring Python’s module namespace by using "dotted module names": this is a convenient way that avoids name collisions caused by modules with the same name. Modules are grouped within a parent object - the package - and the outcome is a "dotted" namespace with the package name as the first token and the module name as the second one. For example, the module name "foo.bar" designates a module named "bar" in a package named "foo" - the outcome is that the namespace of the objects provided by the package is "foo.bar".

Packages are a handy way to group related modules together; the only real limit is that problems arise when multiple entities - for example different vendors - develop a package with the same name: neglecting to handle this causes naming collisions, meaning that users can install only one of the packages within the same Python instance. If you do not want to maintain multiple Python instances, you can overcome this by scoping packages using a corporate namespace, as shown below.
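For example, scoping the "foolist" package of this trilogy under the "carcano" corporate namespace leads to the layout below - the same one we will meet again later on when listing the wheel contents; note that "carcano" is a namespace directory, so it does not need its own "__init__.py":

carcano/
    foolist/
        __init__.py
        foolist.py
        foolistitem.py

modules are then imported using the dotted name, for example "from carcano.foolist import foolist".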

Python considers a directory as a package if it contains an "__init__.py" file: this file often contains the initialization code of the package, but it can also just be an empty file.

It can also be used to set the "__all__" variable: as we previously saw, when using "from package import *" Python does not import objects with a name starting with an underscore. It is however possible to provide Python with the list of objects that we want to be loaded when using such a statement: it is as simple as listing the objects in the __all__ list, for example:

__all__ = ["FoolistItem", "Foolist","_an_hidden_object"]
This feature can be exploited both to have Python load objects with a name starting with an underscore and to prevent some objects from being loaded by default. Note that __all__ alters the behavior only of the "from package import *" statement: the "import package" statement imports everything defined within the module, even objects with a name starting with an underscore.

Distribution Packages

Distribution Packages are versioned archive files that contain import packages, modules, and other resource files that are used to distribute a Release.

Installing Distribution Packages using PIP

PIP is the official Python tool to download and install Python distribution packages. It downloads them from the online repository PyPI - we'll see more on this topic later on.

Installing a distribution package with pip is as easy as typing "pip install" followed by the name of the package you want to install.

For example:

pip install foopkg

when you have to deal with a lot of packages, you can list them into a requirements file like the following one:

###### Requirements without Version Specifiers ######
PyYAML

###### Requirements with Version Specifiers ######
jupyter == 1.0.0
nose >= 1.0
ansible >=2.8, <2.9

and install everything as a whole as follows:

sudo python3 -m pip install -r requirements.txt
Note how this time we invoked pip as a Python module ("-m pip")

as you see, the syntax of the requirements file is really easy and recalls Python's syntax:

  • # marks the beginning of a comment
  • conditionals:
    • == equal
    • != not equal
    • < lower
    • > greater
    • <= lower or equal
    • >= greater or equal
    • ~= compatible release - for example ~= 1.1 is equivalent to >= 1.1, == 1.*
  • two conditions can be specified by separating them with a comma (,)
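By the way, a handy way to generate such a file from an already working environment is freezing the currently installed packages:

python3 -m pip freeze > requirements.txt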

Creating Distribution Packages using Setuptools

First and foremost, let's get back to the Python3 project we created in the previous post - change directory to the root of Carcano's foolist project:

cd ~/fooproject

Distribution packages are created using "setuptools", so the first thing to do is ensuring it is installed and as up to date as possible:

sudo python3 -m pip install -U setuptools
Note how this time we invoked pip as a Python module ("-m pip")

Setuptools relies on the "setup.py" settings file: we already met this file in the first post of this trilogy, where we used it only to have setuptools run unit tests.

Now, a clean and tidy approach is creating a Makefile so that we can handle the whole project using make - yes, there are still ancient people like me using it out there.

Create the "Makefile" file into "~/fooproject" directory with the following contents:

all: unittests

clean:
	$(info -> Makefile: cleanup previous builds ... )
	@(rm -rf src/test/__pycache__ src/carcano/foolist/__pycache__ src/bin/__pycache__)
	@(rm -rf src/carcano_foolist.egg-info src/.eggs)
	@(rm -rf src/dist src/build)

unittests:
	$(info -> Makefile: Launching unit tests ...)
	@(cd src; python3 setup.py nosetests >/dev/null)

Let's have a go - just issue:

make

since "unittests" is listed in the "all" target, it runs all of the integration tests we defined in the previous post:

-> Makefile: Launching unit tests ...
testFoolistAppend (test_foolist.Foolist) ... ok
testFoolistIsIterable (test_foolist.Foolist) ... ok
testFoolistRemove (test_foolist.Foolist) ... ok

----------------------------------------------------------------------
Ran 3 tests in 0.005s

OK

Now let's extend it a little bit, adding the minimal metadata necessary to create a package: our new "src/setup.py" must look as follows:


from setuptools import setup, find_namespace_packages

setup(
    name='carcano_foolist',
    version='0.0.1',
    author='Marco Antonio Carcano',
    author_email='myemailaddress@mydomain.tld',
    url= 'https://github.com/mac-grimoire/python-spells.git',
    packages=find_namespace_packages(include=['carcano.*']),
    setup_requires=['nose>=1.0'],
    test_suite="nose.collector",
    tests_require=["nose"],
)

here:

  • we import the find_namespace_packages function, since we use it to find the packages we want to include in our distribution package (note that find_packages also exists, if you are working without a namespace)
  • add the minimum required metadata - I do not explain them since they are self-explanatory:
    • version
    • author
    • author_email
    • url
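If you want to quickly double-check the metadata without building anything, setup.py also supports display-only options - for example:

cd src
python3 setup.py --name --version

this should print the package name and the version on two separate lines.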

We are now ready to explore how to create distribution packages: the very first thing is to add the logic necessary to handle the version of the releases to the Makefile - it must look as follows:

all: unittests

clean:
	$(info -> Makefile: cleanup previous builds ... )
	@(rm -rf src/test/__pycache__ src/carcano/foolist/__pycache__ src/bin/__pycache__)
	@(rm -rf src/carcano_foolist.egg-info src/.eggs)
	@(rm -rf src/dist src/build)

release:
ifndef RELEASE 
	$(error Makefile: RELEASE is not set - please set it with a value with the following format: major.minor[.patch][sub] - for example 0.0.1 or 1.0.1-a2)
endif
	$(info -> Makefile: validating RELEASE=${RELEASE} format)
	@(echo ${RELEASE} |grep -qE "[0-9]+\.[0-9]+\.*[0-9]*[a-zA-Z]*[0-9]*") || (echo " ${RELEASE} is not in compliance with our version format"; exit 1)	
	@(cd src; sed -i -E "s/version='[0-9]+.[0-9]+\.*[0-9]+[a-zA-Z]*[0-9]*'/version='${RELEASE}'/g" setup.py)

unittests:
	$(info -> Makefile: Launching unit tests ...)
	@(cd src; python3 setup.py nosetests >/dev/null)

we added:

  • a conditional to check that the "RELEASE" environment variable has been set: this variable is used to set the version of the package we are about to build
  • the "release" target: it's purpose is to validate the format of the value of the "RELEASE" environment variable
Python packages usually adhere to the version format "major.minor[.patch][sub]". The major number is 0 for initial releases of the software (the experimental ones), and it is incremented for releases that are major milestones for the package. The minor number is instead incremented when important new features are added to the package. The patch number is incremented when bug-fix releases have to be made. Sub-releases are additional trailing version information that can be added to clarify things: for example "a1,a2,...,aN" for alpha releases (when functionality or API has changed), "b1,b2,...,bN" for beta releases (when some bug fixing has been made), "pr1,pr2,...,prN" for pre-release testing.
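To make the format concrete, here is a hedged Python sketch of a validator for it - note that it is stricter than the grep pattern used by the Makefile above, which matches substrings:

import re

# major.minor[.patch][sub] - e.g. 0.0.1, 1.0 or 1.0.1a2
VERSION_RE = re.compile(r'^[0-9]+\.[0-9]+(\.[0-9]+)?([a-zA-Z]+[0-9]*)?$')

for candidate in ("0.0.1", "1.0", "1.0.1a2", "not-a-version"):
    print(candidate, bool(VERSION_RE.match(candidate)))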

Source Distributions (sdist)

As the name itself suggests, a source distribution contains source code: this is not only Python code, but also the source code of any extension modules (usually in C or C++) bundled with the package.

This kind of package requires users to compile extension modules on their side, if there are any. This of course implies requirements, such as having development tools installed, which is not always a viable choice, especially when dealing with servers.

Source distributions also contain a bundle of metadata files sitting in a directory called "<package-name>.egg-info". These metadata help with building and installing the package, but users don’t really need to do anything with them.

From the developer’s perspective, a source distribution is what gets created when you run the following command:

python3 setup.py sdist

We can of course add a new target to the Makefile:

sdist: release clean
	$(info -> Makefile: building the sdist distribution package ...)
	@(cd src; python3 setup.py sdist)

let's make the "sdist" target:

export RELEASE="0.0.1"
make sdist

the output is as follows - I cut it to keep it short:

-> Makefile: validating RELEASE=0.0.1 format
-> Makefile: cleanup previous builds ... 
-> Makefile: building the sdist distribution package ...
running sdist
running egg_info
...
creating dist
Creating tar archive
removing 'carcano_foolist-0.0.1' (and everything under it)

so now we have our brand new "src/dist/carcano_foolist-0.0.1.tar.gz" sdist file.

Built Distributions (Bdist)

It is the "classic" distribution format: despite it’s not necessarily binary, though, because it might contain only Python source code and/or byte-code. This kind of distribution relies on the "bdist" family of command options - for example:

python3 setup.py bdist

bdist has several options that let you create platform specific packages, such as gztar on Linux, executable zip on Windows and so on. It even lets you generate an RPM package on Linux.

Be wary that, because of its very few options, it is suitable only for building very basic RPM packages: if you have to deal with a big and structured project that requires generating a main RPM package with sub-packages, you must rely on the rpmbuild command line utility and write a SPEC file as usual - this, by the way, will be the topic of the next and last post of this trilogy dedicated to Python.
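For the record, the RPM flavour mentioned above is generated by the "bdist_rpm" command option - keeping in mind the caveats just stated:

python3 setup.py bdist_rpm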

We can of course add a new target to the Makefile:

bdist: release clean
	$(info -> Makefile: building the bdist distribution package ...)
	@(cd src; python3 setup.py bdist)

let's make the "bdist" target:

export RELEASE="0.0.1"
make bdist

the output is as follows - I cut it to keep it short:

-> Makefile: validating RELEASE=0.0.1 format
-> Makefile: cleanup previous builds ...
-> Makefile: building the bdist distribution package ...
running bdist
running bdist_dumb
...
creating /home/grimoire/fooproject/src/dist
Creating tar archive
removing 'build/bdist.linux-x86_64/dumb' (and everything under it)

so now we have our brand new "src/dist/carcano_foolist-0.0.1.linux-x86_64.tar.gz" bdist file.

Wheel Distributions (Bdist Wheel)

Wheel is a built distribution format that lets you skip the build stage required with source distributions. The straightforward benefits are:

  • increased delivery and installation speed
  • avoidance of the requisite of having development tools and libraries installed on the system

For these reasons, unless there's a strict constraint to distribute packages as sources, it is always preferable to build wheel packages.

There's only one thing you should really be aware of: since wheels can incorporate shared libraries too, there may be security concerns, because those bundled libraries never get updated when you update the system libraries.

In order to build wheels you need to install the "wheel" package: since it's always good to have an up-to-date "setuptools" version, we install wheel and update "setuptools" with the same command:

sudo python3 -m pip install -U wheel setuptools

Building a wheel requires using the "bdist_wheel" command option of setup.py:

python3 setup.py bdist_wheel

We can of course add a new target to the Makefile:

wheel: release clean
	$(info -> Makefile: building the wheel distribution package ...)
	@(cd src; python3 setup.py bdist_wheel)

let's build the wheel:

export RELEASE="0.0.1"
make wheel

the output is as follows - I cut it to keep it short:

-> Makefile: validating RELEASE=0.0.1 format
-> Makefile: cleanup previous builds ...
-> Makefile: building the wheel distribution package ...
running bdist_wheel
running build
...
creating 'dist/carcano_foolist-0.0.1-py3-none-any.whl' and adding 'build/bdist.linux-x86_64/wheel' to it
adding 'carcano_foolist-0.0.1.dist-info/METADATA'
adding 'carcano_foolist-0.0.1.dist-info/WHEEL'
adding 'carcano_foolist-0.0.1.dist-info/top_level.txt'
adding 'carcano_foolist-0.0.1.dist-info/RECORD'
removing build/bdist.linux-x86_64/wheel

so now we have our brand new "src/dist/carcano_foolist-0.0.1-py3-none-any.whl" bdist file

This is a ZIP package with a name ending in ".whl": this name is specially crafted to tell installers which Python versions and platforms the wheel is geared to.

We can of course have a look at its contents:

unzip -l src/dist/carcano_foolist-0.0.1-py3-none-any.whl
You may have to install unzip RPM package ("sudo dnf install -y unzip") before issuing the above command

the output looks like as follows:

Archive:  src/dist/carcano_foolist-0.0.1-py3-none-any.whl
  Length      Date    Time    Name
---------  ---------- -----   ----
      141  09-17-2021 17:44   carcano/foolist/__init__.py
     2857  09-17-2021 17:43   carcano/foolist/foolist.py
     2195  09-17-2021 17:43   carcano/foolist/foolistitem.py
      992  09-17-2021 17:53   carcano_foolist-0.0.1.data/data/bin/logging.conf
       76  09-17-2021 17:54   carcano_foolist-0.0.1.data/data/share/doc/fooapp/rsyslog/fooapp.conf
     1187  10-06-2021 20:52   carcano_foolist-0.0.1.data/scripts/fooapp.py
      674  10-06-2021 20:52   carcano_foolist-0.0.1.dist-info/METADATA
       92  10-06-2021 20:52   carcano_foolist-0.0.1.dist-info/WHEEL
        8  10-06-2021 20:52   carcano_foolist-0.0.1.dist-info/top_level.txt
      909  10-06-2021 20:52   carcano_foolist-0.0.1.dist-info/RECORD
---------                     -------
     9131                     10 files

talking thoroughly about wheels would require too much space, but I can at least tell you the "must know" concepts: the filename is broken down into parts separated by hyphens as follows:

{dist}-{version}(-{build})?-{python}-{abi}-{platform}.whl

For example the package name cryptography-2.9.2-cp35-abi3-macosx_10_9_x86_64.whl is made of the following tokens:

  • cryptography: the package name
  • 2.9.2: the package version (it must be a PEP 440-compliant string)
  • cp35: the Python tag - it claims the Python implementation and version required by this wheel (in this case CPython 3.5) - this means for example that this wheel won't work with Jython
  • abi3:  the Application Binary Interface (ABI) tag - it claims the level of binary compatibility of the Python C API.
  • macosx_10_9_x86_64 is the platform tag: it can be broken down in the following tokens:
    • macosx: the macOS operating system
    • 10_9: the minimum macOS version supported by this wheel (macOS 10.9)
    • x86_64:  reference to x86-64 instruction set architecture.
For more information on the wheel naming convention see PEP 425; for the manylinux platform tags see PEP 513, PEP 571 and PEP 599

As for the manylinux platform tag, be wary that it poses requirements on the target host (the host that installs the wheel you built). This means that very often it is necessary to update the pip version so as to support it:

  • manylinux1 requires pip 8.1.0 or later
  • manylinux2010 requires pip 19.0 or later
  • manylinux2014 requires pip 19.3 or later
Concerning the platform tag, please consider also that, since Alpine Linux uses "musl" instead of the standard glibc, manylinux wheels from PyPI don’t work on Alpine Linux (or BusyBox).
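As a quick illustration, here is a hedged Python sketch that splits the example filename above into its tags - real tools rely on the "packaging" library, and this naive split does not handle build tags:

# break a wheel filename into its tags (no build tag in this example)
name = "cryptography-2.9.2-cp35-abi3-macosx_10_9_x86_64.whl"
dist, version, python_tag, abi_tag, platform_tag = name[:-len(".whl")].split("-")
print(dist, version, python_tag, abi_tag, platform_tag)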

More on Packaging with setuptools

the time has eventually come to complete our "setup.py": here it is, enriched with additional content:

from setuptools import setup, find_namespace_packages

setup(
    name='carcano_foolist',
    version='0.0.1',
    author='Marco Antonio Carcano',
    author_email='myemailaddress@mydomain.tld',
    description='An example list object that exploits Python iterable facilities',
    classifiers=[
        'License :: OSI Approved :: GNU Lesser General Public License v3 or later (LGPLv3+)',
        'Development Status :: 3 - Alpha',
        'Intended Audience :: Developers',
        'Programming Language :: Python :: 3'
    ],
    url= 'https://github.com/mac-grimoire/python-spells.git',
    license= 'GNU Lesser General Public License v3 or later',
    keywords = 'iterable list',
    project_urls={
        'Bug Tracker': 'https://github.com/mac-grimoire/python-spells.git',
        'Documentation': 'https://github.com/mac-grimoire/python-spells.git',
        'Source Code': 'https://github.com/mac-grimoire/python-spells.git',
    },    
    packages=find_namespace_packages(include=['carcano.*']),
    scripts=["bin/fooapp.py"],
    data_files=[
        ("bin", ["bin/logging.conf"]),
        ("share/doc/fooapp/rsyslog", ["share/doc/fooapp/rsyslog/fooapp.conf"])
    ],
    setup_requires=['nose>=1.0'],
    test_suite="nose.collector",
    tests_require=["nose"]
)

the fields name, version, author, author_email, description and url are self-explanatory.

As for the others:

  • classifiers: a list of classifiers that apply to the distribution package -  the full list of the available classifiers can be found here
  • license: a custom string you can use to describe the license
  • keywords: a string with a space separated list of keywords
  • project_urls: a dictionary of additional relevant URLs
  • packages: the list of packages we want to pack together: although you can list them manually, it is far better to have them discovered automatically by find_namespace_packages or find_packages
  • scripts: a list of scripts that have to be included with the package
  • data_files: a list of tuples, each with the destination path files get copied to as the first element, and the list of files to be copied there as the second
  • setup_requires: used to require a version greater than or equal to 1.0 of the nose test suite at build time
  • test_suite and tests_require are used to state that unit tests are performed using nose
Using find_namespace_packages() with no arguments can potentially result in unwanted packages being included. This can happen, for example, if an "__init__.py" in test/ has been included. Although it is possible to explicitly prevent the inclusion of tests in the package by using the exclude argument, this is less robust than whitelisting with include.
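For example, a hedged sketch of the exclude approach mentioned in the note above:

packages=find_namespace_packages(include=['carcano.*'], exclude=['test', 'test.*'])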

For the sake of completeness, note that the distribution package resulting from the "setup.py" file can easily be built and installed as a whole simply by using pip as follows:

cd src;
sudo pip install -e .
Although you can install it by using Python alone ("python setup.py install"), using pip provides several benefits, such as automatic dependency resolution as well as being able to install, update or remove the package. We'll see in the last post of this trilogy that we will use "python setup.py install" when building the RPM package

Be wary however that this last method is more suited to experimenting than to a clean and tidy way of working.

Listing requirements

if the scripts or packages have requirements, we can list them in the install_requires list:

    install_requires=[
        'PyYAML',
        'jupyter'
    ]

sometimes some of the requirements are optional: in this case use the extras_require dictionary.

For example, if "dump" is an optional feature provided by package "carcano.bar" that requires "PyYAML" >= 2.2.0 and  jupyter:

    extras_require={
        'dump': ['PyYAML>=5.3.1', 'jupyter']
    }

create the package as previously shown: when it comes to installing the "dump" option, all of its requirements will be automatically installed as well. The installation command to be issued is as follows:

python3 -m pip install carcano.bar[dump]

Automatically Generating SHELL Scripts

If needed, we can even have shell scripts automatically generated to wrap our Python scripts: just specify them using the entry_points dictionary as follows:

    entry_points={
        'console_scripts': ['my-command=carcano.bar:main']
    }
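For this to work, the module referenced on the right-hand side of the equals sign must expose the named function - a minimal, hypothetical "carcano/bar.py":

def main():
    # hypothetical entry point wrapped by the generated "my-command" script
    print("Hello from my-command")

at install time, pip generates a small "my-command" wrapper script on the PATH that imports "carcano.bar" and calls its "main()" function.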

Additional Files

a list of additional files can be specified using:

  • data_files: files that will be installed system-wide, outside the site-packages directory - e.g. desktop icons, fonts, configurations
  • package_data: files shipped within the import package itself - e.g. documentation, static image files

For example:

    data_files=[
        ("/etc/rsyslog.d", ["share/doc/fooapp/rsyslog/fooapp.conf"]),
    ]
Note that you can also specify the files to be included using wildcards. For example, to include all files with a trailing ".conf" you can specify "*.conf"
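For completeness, here is a hedged sketch of the package_data counterpart, using the wildcard syntax just mentioned to ship every ".conf" file bundled within the "carcano.foolist" import package:

    package_data={
        'carcano.foolist': ['*.conf']
    }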

Publishing Distribution Packages onto a Repository

Once the package is built, we must publish it to a repository, so that other people can easily know that it exists and download it.

PyPI

Python's online package repository is called PyPI (Python Package Index): when it comes to installing or updating files, pip looks for them on PyPI and grabs them from there.

There's also an instance of PyPI dedicated to testing - TestPyPI: we can register an account for test purposes on its website.

Twine

Twine is the tool that has to be used to publish packages on PyPI.

You can install it via "dnf" or "yum" from the "EPEL" repository:

sudo dnf install -y epel-release
sudo dnf install -y twine

by default it pushes to PyPI, but we can force it to publish to the testing PyPI repository (TestPyPI) by specifying the "--repository-url" parameter as follows:

twine upload --repository-url https://test.pypi.org/legacy/ dist/*

as we should expect, it prompts us for our username and password.

As you can easily guess, the statement to push to PyPI is:

twine upload dist/*
Please mind that TestPyPI and PyPI use different accounts, so you may have to use different credentials when logging in to each of them.

Let's go on and complete our Makefile - we must add two new targets:

  • push-test manages the push to TestPyPI
  • push manages the push to PyPI

push-test:
ifndef TWINE_USERNAME
	$(error Makefile: TWINE_USERNAME is not set - please set it or you won't be able to  authenticate to the PyPI server)
endif
ifndef TWINE_PASSWORD
	$(error Makefile: TWINE_PASSWORD is not set - please set it or you won't be able to  authenticate to the PyPI server)
endif
	$(info -> Makefile: pushing to Test PyPI...)
	@(cd src; twine upload --repository-url https://test.pypi.org/legacy/ dist/*)

push:
ifndef TWINE_USERNAME
	$(error Makefile: TWINE_USERNAME is not set - please set it or you won't be able to  authenticate to the PyPI server)
endif
ifndef TWINE_PASSWORD
	$(error Makefile: TWINE_PASSWORD is not set - please set it or you won't be able to  authenticate to the PyPI server)
endif
	$(info -> Makefile: pushing to PyPI ...)
	@(cd src; twine upload dist/*)

since "twine" can read credentials from TWINE_USERNAME and TWINE_PASSWORD environment variables, avoiding to prompt for them, we added two additional conditionals to check them.

For your convenience, here is the full Makefile:

all: unittests

clean:
	$(info -> Makefile: cleanup previous builds ... )
	@(rm -rf src/test/__pycache__ src/carcano/foolist/__pycache__ src/bin/__pycache__)
	@(rm -rf src/carcano_foolist.egg-info src/.eggs)
	@(rm -rf src/dist src/build)

release:
ifndef RELEASE 
	$(error Makefile: RELEASE is not set - please set it with a value with the following format: major.minor[.patch][sub] - for example 0.0.1 or 1.0.1-a2)
endif
	$(info -> Makefile: validating RELEASE=${RELEASE} format)
	@(echo ${RELEASE} |grep -qE "[0-9]+\.[0-9]+\.*[0-9]*[a-zA-Z]*[0-9]*") || (echo " ${RELEASE} is not in compliance with our version format"; exit 1)	
	@(cd src; sed -i -E "s/version='[0-9]+.[0-9]+\.*[0-9]+[a-zA-Z]*[0-9]*'/version='${RELEASE}'/g" setup.py)

unittests:
	$(info -> Makefile: Launching unit tests ...)
	@(cd src; python3 setup.py nosetests >/dev/null)

sdist: release clean
	$(info -> Makefile: building the sdist distribution package ...)
	@(cd src; python3 setup.py sdist)

bdist: release clean
	$(info -> Makefile: building the bdist distribution package ...)
	@(cd src; python3 setup.py bdist)

wheel: release clean
	$(info -> Makefile: building the wheel distribution package ...)
	@(cd src; python3 setup.py bdist_wheel)

push-test:
ifndef TWINE_USERNAME
	$(error Makefile: TWINE_USERNAME is not set - please set it or you won't be able to  authenticate to the PyPI server)
endif
ifndef TWINE_PASSWORD
	$(error Makefile: TWINE_PASSWORD is not set - please set it or you won't be able to  authenticate to the PyPI server)
endif
	$(info -> Makefile: pushing to Test PyPI...)
	@(cd src; twine upload --repository-url https://test.pypi.org/legacy/ dist/*)

push:
ifndef TWINE_USERNAME
	$(error Makefile: TWINE_USERNAME is not set - please set it or you won't be able to  authenticate to the PyPI server)
endif
ifndef TWINE_PASSWORD
	$(error Makefile: TWINE_PASSWORD is not set - please set it or you won't be able to  authenticate to the PyPI server)
endif
	$(info -> Makefile: pushing to PyPI ...)
	@(cd src; twine upload dist/*)

we can finally have a go and push the wheel to TestPyPI:

first we have to export the TWINE_USERNAME and TWINE_PASSWORD environment variables, or the Makefile will fail:

export TWINE_USERNAME='myusername'
export TWINE_PASSWORD='mypassword'

we can eventually push:

make push-test

the output is as follows:

-> Makefile: pushing to Test PyPI...
Uploading distributions to https://test.pypi.org/legacy/
Uploading carcano_foolist-0.0.1-py3-none-any.whl
100%|█████████████████████████████████████████████████████████████████████████| 8.35k/8.35k [00:02<00:00, 4.22kB/s]

View at:
https://test.pypi.org/project/carcano-foolist/0.0.1/
If you are already used to Continuous Integration tools such as Jenkins, you can easily figure out how all of this can be put in the context of a Continuous Integration workflow.

Now that our package has been published on the test PyPI, anybody can easily install it using pip by supplying the "--index-url" command option. For example:

sudo python3 -m pip install -i https://test.pypi.org/simple/ carcano-foolist

the only drawback is that if a package has dependencies that cannot be resolved within TestPyPI, we must explicitly tell pip to fall back to the regular PyPI repository: in this case we must also provide the "--extra-index-url" parameter.

For example:

sudo python3 -m pip install -i https://test.pypi.org/simple/ --extra-index-url https://pypi.org/simple/ carcano-foolist
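once installed, a quick smoke test - assuming the "carcano.foolist" import layout used throughout this post:

python3 -c "import carcano.foolist; print(carcano.foolist.__name__)"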

We can then go on with the regular development cycle, fixing errors and adding features when needed.

Footnotes

Here ends the second post of this trilogy dedicated to Python: I hope you enjoyed it. You now know how to deal with modules, import packages, distribution packages and their formats. You now have the necessary skills to create distribution packages leveraging "setuptools", to create unit tests and to put everything into a project managed by the "make" utility, so as to easily include everything in a Continuous Delivery toolchain.

In the next post we will see how to package this project into two RPM packages: the first installs the modules provided by the Python package, whereas the second installs the example script that implements a sample application that imports these modules. This last package also performs post installation tasks so as to reconfigure rsyslog.
In addition to that, we will also see how to digitally sign these RPM packages and, in general, how to set up an RPM development environment, with a good overview of how to exploit the RPM building tools. Don't miss it.

